Original Link: https://www.anandtech.com/show/438

ATI Rage Fury MAXX

by Mike Andrawes on January 7, 2000 4:47 PM EST


A few weeks ago, AnandTech had its first experience with the ATI Rage Fury MAXX. That first article focused primarily on performance, and as such, a few issues were left out for the time being. In this second look, we'll address those issues, which include the "theoretical lag" that results from ATI's AFR technology, driver features, DVD support, and more. Since that first article, GeForce DDR boards have also found their way into the AnandTech lab, so we've included updated benchmarks as well.

In case you've forgotten, here's a quick review (taken from our original article) of just what the Rage Fury MAXX is and where it came from.

Telling a worthy manufacturer that they cannot compete in the gaming market is much like telling a nice guy that he simply can't play basketball. While sitting in a car with three other members of the ATI team, we were having a nice discussion about the present graphics card market. When one of the ATI representatives asked for our opinion on a higher clock speed Rage Fury Pro, possibly in the TNT2 Ultra range of speeds, we were taken by surprise. Here, for the first time since the true introduction of 3D accelerated gaming on the PC, we had ATI talking about assuming a leading role in the gaming market. Although it is true that just one year ago ATI had the potential to take the gaming market with their Rage 128 chip, delays in getting the part to market quickly snatched that gold medal away from them. This conversation took place just under six months ago, and as shocked as we were back then to hear ATI talking about taking on NVIDIA, one of the leaders in the 3D accelerated PC gaming market, we were just as shocked when they dropped the news about project Aurora.
Project Aurora started out as an ambiguous page on ATI's site and shortly thereafter turned into a skeptically received press release as Aurora morphed into ATI's latest offering, the Rage Fury MAXX card. The Rage Fury MAXX revisited an idea that was first introduced to the gaming market with the advent of 3dfx's Voodoo2: the idea of putting two standalone graphics chipsets together in order to provide a desirable performance boost with minimal added engineering time.

The idea of using multiple processors to quickly achieve a performance boost without having to wait for the technology to improve is something that is presently all around the industry. 3dfx's Scan Line Interleave (SLI) on the Voodoo2 was a quick and easy way to gain a nice performance boost simply by adding on another graphics card. The SLI technology split the horizontal lines being rendered evenly between the two cards in the configuration, so one card would handle every even line while the other card would handle every odd line. Because both cards worked on the same scene, the textures present in the scene had to be duplicated in the frame buffers of both cards. This was a highly inefficient manner of improving performance, but, then again, at the time, the 8/12MB of memory on a single Voodoo2 was more than enough for the games of the day.
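
To make the scheme concrete, here's a minimal sketch (our own illustration, not 3dfx code) of how SLI hands out scanlines:

```python
# Illustrative sketch of Voodoo2-style Scan Line Interleave: each card
# renders alternating horizontal lines of the same frame, which is why
# both cards must keep a full copy of every texture locally.

def sli_split(frame_height):
    """Assign each scanline of a frame to one of two cards."""
    card_a = [y for y in range(frame_height) if y % 2 == 0]  # even lines
    card_b = [y for y in range(frame_height) if y % 2 == 1]  # odd lines
    return card_a, card_b

even_lines, odd_lines = sli_split(480)
print(len(even_lines), len(odd_lines))  # 240 lines per card at 640x480
```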

On the other hand, this manner of improving performance was very appealing to gamers because they could absorb the cost of owning a single Voodoo2 board, enjoy the performance, and, when they came across a little more cash, make the upgrade to a Voodoo2 SLI configuration for an immediate performance increase. The key to the success of 3dfx's Voodoo2 SLI was the fact that you never threw away your initial investment, something very rare in the graphics accelerator market.

The success of the SLI technology led to the question of whether or not 3dfx's Voodoo3 supported SLI. Another company, Metabyte, stepped forth with a technology that was unofficially dubbed SLI, yet, with a few modifications, it could be used on any card. Metabyte officially called this technology their Parallel Graphics Configuration (PGC). The PGC technology split any given frame into two parts, with each card/chip handling one part of the screen. This approach required quite a bit of elegance in the drivers, which had to account for factors like what would happen if the chip rendering the top half of the screen (generally less complex than the bottom half) finished before the other chip was done rendering the bottom half. At the same time, the end result would be much more efficient than 3dfx's SLI design because the textures did not have to be duplicated and the polygon throughput of the setup was effectively doubled, whereas it remained equal to that of a single card in the Voodoo2 SLI situation. Unfortunately, Metabyte's PGC never made it to market, an unfortunate reality as the expensive product could have been quite a success -- can you imagine laughing at a GeForce's 480 Mpixel/s fill rate while running dual Voodoo3 3500's (732 Mpixel/s) or dual TNT2 Ultras (600 Mpixel/s)?

ATI turned project Aurora into their take on the same idea, and thus ATI's Alternate Frame Rendering (AFR) Technology was born. As the name implies, AFR divides the load between the two chips in the configuration by frames, instead of parts of frames. One chip will handle the current frame while the second chip is handling the next frame. ATI's AFR is the basis for the Rage Fury MAXX and future cards which will carry the MAXX name.
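
For comparison with the SLI sketch above, here's the same kind of illustration (again our own, not ATI's code) of how AFR distributes the work:

```python
# Illustrative sketch of Alternate Frame Rendering: whole frames are
# handed out round-robin, so each chip performs its own triangle setup
# and the pair's geometry throughput is effectively doubled.

def afr_assign(frame_number, num_chips=2):
    """Return which chip renders a given frame under AFR."""
    return frame_number % num_chips

for frame in range(4):
    print(f"frame {frame} -> chip {afr_assign(frame)}")
# frames 0 and 2 go to chip 0; frames 1 and 3 go to chip 1
```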

ATI actually claims AFR is more efficient than both SLI and PGC, for different reasons. With SLI, both cards must perform triangle setup for each frame, which is redundant. PGC can, under certain circumstances, become unbalanced such that one chip is doing considerably more work than the other. The best example of such a situation is a player standing in an area where the sky takes up the top half of the screen; rendering the sky obviously takes considerably less work than rendering the other players, objects, etc. on the lower half.

The Rage Fury MAXX was ATI's only chance at competing with what 3dfx, NVIDIA, and S3 hoped to have released by the end of the 1999 holiday shopping season. ATI had no new chip that would allow them to compete with the big boys; all they had was the Rage 128 Pro, which delivered performance somewhere between that of a TNT2 and a TNT2 Ultra for about the price of the latter. The Rage 128 Pro itself is a 0.25-micron chip clocked at 125MHz, resulting in a 250 Mpixel/s fill rate; put two of these together and you have a setup capable of beating NVIDIA's recently launched GeForce 256 (500 Mpixel/s versus 480 Mpixel/s). The Rage 128 Pro was featured on ATI's recently released Rage Fury Pro, and the combination of two of these chips using ATI's AFR technology is the product known as the Rage Fury MAXX. With less than three weeks left in 1999, ATI will be pushing for the sale of the Rage Fury MAXX within the next 10 days, pitting it head to head with NVIDIA's GeForce, which has been dominating the store shelves. Not only is ATI attempting to compete with NVIDIA on a performance level, but on the issue of price as well, as they have vowed to match the price of the GeForce with the Rage Fury MAXX. Bold claims from a company that isn't known as a present competitor in the gaming community.



Specifications

First, a quick analysis of the Rage Fury MAXX specs, taken from our original article:

The specs of the Rage Fury MAXX are quite impressive as the card features a whopping 64MB of on-board SDRAM clocked at 143MHz. The 64MB of SDRAM is split into 32MB per Rage 128 Pro chip on the board. As in a Voodoo2 SLI setup, textures must be duplicated for each chip, but framebuffer memory can be shared between the two memory banks. Nevertheless, this provides for a tremendous amount of available memory bandwidth. Taking the 128-bit (16-byte) memory pipeline and multiplying it by the 143MHz memory bus frequency of the Rage Fury MAXX results in an available memory bandwidth of about 2.3GB/s. But we're not done yet! In order to finish the memory bandwidth calculations, you must double that number because of the two dedicated memory buses, one per chip, present on the Rage Fury MAXX. This increases the available memory bandwidth to an incredible 4.6GB/s, which outweighs the 2.7GB/s of NVIDIA's GeForce 256 and even the 3.2GB/s of the Matrox G400MAX. The only card the Rage Fury MAXX isn't able to beat on paper in terms of available memory bandwidth is a GeForce 256 equipped with DDR SDRAM, which has approximately 5.3GB/s of available memory bandwidth.
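
For those who want to check the math, all of these figures fall out of a one-line calculation (the DDR GeForce entry assumes the 166MHz DDR clock implied by the 5.3GB/s figure above):

```python
# Peak memory bandwidth = bus width (bytes) x memory clock x number of
# buses x data rate (2 for DDR). Results are in GB/s (decimal).
BUS_BYTES = 128 // 8  # 128-bit bus = 16 bytes per transfer

def peak_bandwidth_gbps(mem_clock_mhz, buses=1, data_rate=1):
    return BUS_BYTES * mem_clock_mhz * buses * data_rate / 1000

print(peak_bandwidth_gbps(143, buses=2))      # Rage Fury MAXX:  ~4.6
print(peak_bandwidth_gbps(166))               # GeForce 256 SDR: ~2.7
print(peak_bandwidth_gbps(200))               # G400MAX:         ~3.2
print(peak_bandwidth_gbps(166, data_rate=2))  # GeForce DDR:     ~5.3
```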

Why does offering a greater amount of memory bandwidth matter? As you increase the resolution and color depth you're playing your game at, you begin to require much more memory per frame of data that you are displaying. The greater the memory bandwidth your graphics subsystem or video card allows for, the smaller the performance hit you'll be forced to take as you increase the resolution and color depth. So, while the performance advantage of the Rage Fury MAXX's 4.6GB/s peak memory bandwidth over the GeForce SDR's 2.7GB/s won't be seen at 640x480 in 16-bit color, the MAXX will begin to pull away from its NVIDIA-born counterpart at 1024x768 in 32-bit color and at higher resolutions.
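
A rough calculation of frame buffer size alone shows how quickly the demand grows (this ignores Z-buffer and texture traffic, so it understates the real load):

```python
# Bytes required just to store one frame of color data at a given
# resolution and color depth.

def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 2**20

print(framebuffer_mb(640, 480, 16))    # ~0.6 MB per frame
print(framebuffer_mb(1024, 768, 32))   # ~3.0 MB per frame
print(framebuffer_mb(1600, 1200, 32))  # ~7.3 MB per frame
```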

As we mentioned before, the dual Rage 128 Pro chips not only double the available memory bandwidth but also the peak fill rate of the card. This transforms the unsurprising 250 Mpixel/s fill rate of a single Rage 128 Pro into the monstrous 500 Mpixel/s fill rate of the Rage Fury MAXX. The fill rate of the Rage Fury MAXX can be considered its raw horsepower to all you car fanatics out there (don't worry, the first requirement to work at AnandTech is that you must be a car guy or gal), and it is what helps drive the high frame rates in Quake 3 or Unreal Tournament. Currently, the Rage Fury MAXX remains unmatched in raw fill rate among released products; the only other products we are aware of that will be able to beat out the Rage Fury MAXX won't be seen until sometime in the first half of next year.

In addition to offering good performance on paper (we'll get to the actual benchmarks in a bit to see if it lives up to those claims), ATI brings to the table their usual set of DVD enhancements and features that help separate them from the crowd. The hardware-assisted DVD playback of the Rage Fury MAXX is obviously identical to that of the Rage Fury Pro since they are both based on the same chip.

There are a few things we didn't note the first time around. Currently, the Rage Fury MAXX only supports Windows 98, so Windows NT 4.0, Windows 2000, and Linux users are out of luck for now. Undoubtedly, we'll see drivers for Windows 2000 eventually, once the OS is officially released and gains popularity, but for now there's no guarantee when that will happen. It should also be noted that one chip is disabled for 3D applications running in a window, resulting in performance identical to the single chip Rage Fury Pro under those circumstances.



Texture Compression (taken from our Rage Fury Pro Review)

When S3 first introduced support for S3TC, the market was stunned by the potential that was seemingly just sitting there. Since that announcement, there hasn't been too much support for the standard, also known as DirectX Texture Compression (DXTC), which is natively supported in Direct3D and, through an extension, supported in OpenGL. However, support is growing, and since the Rage 128 Pro supports DXTC with its 6:1 compression algorithm, you can add ATI to that list.

The Rage 128 Pro, like the original Rage 128, features support for textures of up to 2048 x 2048 pixels in size. Although there isn't a game that makes use of textures that large, texture sizes will continue to increase as games become more and more realistic. For this reason, the chip's support for texture compression is definitely a move in the right direction. While it won't be the deciding factor in your buying decision, it does help the industry move forward.
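
A quick bit of arithmetic using the 6:1 ratio quoted above shows why compression matters as textures grow (a sketch based on that figure, assuming 32-bit texels):

```python
# Storage for a maximum-size 2048 x 2048 texture, uncompressed vs.
# compressed at the 6:1 DXTC ratio quoted for the Rage 128 Pro.

def texture_mb(width, height, bytes_per_texel=4, compression_ratio=1):
    return width * height * bytes_per_texel / compression_ratio / 2**20

print(texture_mb(2048, 2048))                       # 16.0 MB uncompressed
print(texture_mb(2048, 2048, compression_ratio=6))  # ~2.7 MB with DXTC
```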

In this respect, texture compression can be equated with features such as hardware transform & lighting or Environment Mapped Bump Mapping (EMBM). While they may not be features that are fully taken advantage of in current games, hardware support for them is a step forward for the industry as a whole. The ball must start rolling somewhere, and once it does, the entire industry benefits. It's refreshing to see ATI contribute to the industry by supporting DXTC, and, as we already know, they aren't the only ones that have pledged support for it. Even 3dfx, with their own texture compression algorithms, has already announced support for DXTC in their upcoming next generation product.

DVD Decoding

Since the original Rage 128 chip, ATI has been a leader in DVD decoding on a single 2D/3D card. Their secret? The addition of Inverse Discrete Cosine Transform, or iDCT for short, support in hardware. The iDCT is simply one stage of decoding MPEG-2, the standard by which DVDs are encoded. By offloading this function to the graphics card, the CPU is freed to perform other tasks. ATI is the only graphics maker to support iDCT in hardware on something other than a dedicated MPEG-2 decoder card.
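
For the curious, here's a minimal sketch of the operation being offloaded; this is just an illustration using SciPy's DCT routines, not ATI's implementation:

```python
# The 8x8 inverse DCT at the heart of MPEG-2 decoding: it converts a
# block of frequency-domain coefficients back into pixel values.
import numpy as np
from scipy.fftpack import idct

def idct_8x8(coeffs):
    """2-D inverse DCT of one 8x8 MPEG-2 coefficient block."""
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

block = np.zeros((8, 8))
block[0, 0] = 400.0           # DC coefficient only -> a flat 8x8 patch
print(idct_8x8(block)[0, 0])  # every sample comes out as 400/8 = 50.0
```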

The next key to ATI's famous DVD support is hardware support for DVD subpicture, a very commonly used feature that allows graphics, such as subtitles or menu highlights, to be displayed over video.

In order to place a subpicture over a DVD stream, the subpicture (a compressed bitmap) is decompressed and output to the screen on the fly. While this isn't the most CPU intensive part of playing back a DVD, it does contribute to the CPU load associated with DVD playback when there is heavy use of the subpicture feature. Hardware support allows the subpicture to be decompressed on the fly as it is sent to the DAC for output. This only reduces memory bandwidth and CPU utilization by a small percentage, but it does help. More importantly, without hardware support for DVD subpicture, you get a dithered approximation of an image that is supposed to be translucent.
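
Conceptually, the overlay boils down to per-pixel alpha blending; a minimal sketch of the operation (our illustration, with made-up values) looks like this:

```python
# Blending a decompressed subpicture (e.g. a subtitle) over decoded
# video: out = alpha * subpicture + (1 - alpha) * video, per pixel.
import numpy as np

def blend_subpicture(video, subpicture, alpha):
    """alpha is per-pixel opacity in [0, 1]; images are (H, W, 3) arrays."""
    return (alpha * subpicture + (1.0 - alpha) * video).astype(np.uint8)

video = np.full((480, 720, 3), 80, dtype=np.uint8)   # dark video frame
sub = np.full((480, 720, 3), 255, dtype=np.uint8)    # white subtitle
alpha = np.full((480, 720, 1), 0.5)                  # 50% translucent
print(blend_subpicture(video, sub, alpha)[0, 0])     # -> [167 167 167]
```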

The Rage 128 series also has built-in hardware motion compensation, filtered XY scaling, etc., just like virtually every other card on the market today. All of this combined helps ensure a constant frame rate, even on lower end systems, with virtually no dropped frames. Not to mention quality that rivals standalone DVD players.

Of course, with the Rage Fury MAXX, there's no TV output, so you'll be stuck watching those DVD movies on your monitor. It's really too bad, because the playback quality is quite good and would have teamed nicely with the TV-out of the Rage Theater chip used on the Rage Fury Pro.



The Card



With essentially two Rage Fury Pro cards in one, the MAXX's PCB is a monster. In fact, it's one of the largest we've seen here in the AnandTech labs, eclipsed only by the 3DLabs Oxygen GVX1. Adjacent to each Rage 128 Pro chip are four Samsung 7ns 8MB SDRAM chips, for a total of 64MB, clocked at 143 MHz. Each Rage 128 Pro chip is clocked at 125 MHz for 250 Megapixels/s per chip, resulting in the 500 Megapixel/s total theoretical output of the card.

3dfx has already noted that their Voodoo5 products will not require an AGP bridge for the two chip models, and apparently ATI noticed the same thing and has not included any such device. However, we did find that the MAXX was detected as two devices under Windows 98, with each device requiring its own IRQ. Juggling those IRQs could be a bit of a hassle inside a system packed with devices that don't like to share IRQs.

Interestingly enough, unlike NVIDIA's GeForce, we did not encounter any problems with motherboards supplying enough power to the AGP slot to keep the MAXX running smoothly. This is despite the fact that the MAXX is a two chip solution with 64MB RAM. On the other hand, the GeForce does sport over 22 million transistors.

One of the first things that came to mind when we heard the plans for the MAXX was the possibility of dual monitor support from one card, much like the Matrox G400's DualHead technology. Indeed, ATI must have had the same idea, as evidenced by the board space dedicated to a second monitor connector, with the word "secondary" silk screened right next to it. With two completely separate Rage 128 Pro chips already on board, ATI didn't really need any additional hardware to accomplish such a feat. So why didn't they do it? Vowing to match the price of NVIDIA's GeForce despite the added cost of double the RAM, additional PCB space, etc., ATI needed to cut costs wherever possible. The two most obvious cuts are the deletion of the second monitor output and the lack of a software bundle beyond ATI's own DVD player.



Overclocking

One of the biggest lingering questions with the MAXX is "what happens when you overclock it?" More importantly, we had to find out if it could be overclocked at all, since it's a very unique configuration. The Rage 128 Tweaker was unsuccessful in such endeavors and simply locked up. However, when in doubt, just go with Entech's Powerstrip, which came to the rescue once again. We used version 3.6, the latest at the time of this review, which detected the MAXX as just a regular Rage 128 Pro. Regardless, Powerstrip's method of going directly to the hardware seems to have worked on the MAXX.

Overclocking the MAXX is a somewhat more delicate endeavor than overclocking traditional, single chip cards. You now have two chips that must reach the desired speed, rather than just one. Further, and possibly more importantly, since the two chips are working in tandem, slight errors from one chip could adversely affect the other.

Nevertheless, we pushed ahead to see how far our retail Rage Fury MAXX sample would go. Thanks to the fairly cool running 0.25 micron Rage 128 Pro core, which is also covered by a quality heatsink and fan, we were able to push the core to 150 MHz without breaking a sweat. The Samsung 7ns/143 MHz SDRAM could only hit 160 MHz, typical of such memory from past experience.

The performance increase from overclocking the MAXX should be significant, since the theoretical fill rate is calculated by multiplying the core clock speed by the total number of pixel pipelines (four on the MAXX, two per chip). This means that an increase of the core clock speed by 5MHz results in a 20 Mpixel/s increase in peak fill rate, so our MAXX running at 150/160 has a theoretical fill rate of 600 Mpixel/s and 5.1GB/s of memory bandwidth (a quick check of the arithmetic appears below). Very impressive power, but how much would it affect real world performance? Let's take a look:
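
First, the promised sanity check on those theoretical numbers:

```python
# Fill rate = core clock x total pixel pipelines (two per chip, four on
# the MAXX); bandwidth = 16-byte bus x memory clock x two buses.

def fillrate_mpixels(core_mhz, pipelines=4):
    return core_mhz * pipelines

def bandwidth_gbps(mem_mhz, bus_bytes=16, buses=2):
    return mem_mhz * bus_bytes * buses / 1000

print(fillrate_mpixels(125), bandwidth_gbps(143))  # stock:       500 ~4.6
print(fillrate_mpixels(150), bandwidth_gbps(160))  # overclocked: 600 ~5.1
```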

Under Quake III Arena, the performance increase is noticeable and beneficial at 1024x768, where the MAXX is more fill rate bound. The effect is less pronounced than it could be due to the limited memory overclock we were able to accomplish. At 640x480, we see no increase in performance since everything is CPU and driver limited at such a low resolution.

In Unreal Tournament, however, the situation is quite different. Overclocking the MAXX had virtually no effect on performance, whether running at 640x480 or 1024x768. This is related to the Unreal Tournament engine more than anything else, as we've seen similar results with other cards.



Drivers

As mentioned above, the Rage Fury MAXX is only supported under Windows 98 at this point in time. Support for Windows 2000 is likely if the OS gains acceptance in the gaming market. Under Windows 98, support is quite good, with a full OpenGL ICD and DirectX 7 support. In the drivers, ATI has included just about every tweaking option you could ask for. The exception, of course, is that there is no overclocking utility.

Driver Pictures



The Test

Windows 98 SE Test System

Hardware

CPU(s)

Intel Pentium III 700
Intel Pentium III 600E
Intel Pentium III 450

Intel Celeron 500
Intel Celeron 366
AMD Athlon 700

Motherboard(s)

ABIT BX6 R2 (Pentium III)
ABIT BX6 R2 + "Sloket" Adapter (Celeron)
Gigabyte GA-7IX (Athlon)

Memory

128MB PC133 Corsair SDRAM

Hard Drive

IBM Deskstar 22GXP 22GB Ultra ATA 66 HDD

CDROM

Philips 48X

Video Card(s)

3dfx Voodoo3 2000 16MB (default clock - 143/143)
3dfx Voodoo3 3000 16MB (default clock - 166/166)
3dfx Voodoo3 3500 16MB (default clock - 183/183)
ATI Rage Fury Pro 32MB (default clock - 125/143)
ATI Rage Fury MAXX 64MB (default clock - 125/143)
Diamond Viper II Z200 Savage 2000 (default clock - 125/155)
Matrox Millennium G400 MAX 32MB (default clock - 166/200)
NVIDIA GeForce 256 32MB (default clock - 120/166)
NVIDIA RIVA TNT2 32MB (default clock - 125/150)
NVIDIA RIVA TNT2 Ultra 32MB (default clock - 150/183)

Ethernet

Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System

Windows 98 SE

Video Drivers

3dfx Voodoo3 - 1.03.04
ATI Rage Fury Pro - Retail Shipping Drivers (N/A for download)
ATI Rage Fury MAXX - Retail Shipping Drivers (N/A for download)
Diamond Viper II Z200 Savage 2000 - 1.09 (N/A for download)
Matrox Millennium G400 - 5.41.008
NVIDIA GeForce 256 - Detonator 3.53
NVIDIA RIVA TNT2 - Detonator 3.53

Benchmarking Applications

Gaming

id Software Quake III Arena demo001.dm3
GT Interactive Unreal Tournament 4.00 UTbench.dem




You can expect the GeForce equipped with DDR SDRAM to dominate in all the benchmarks, but the lead it holds over the competition is naturally reduced when running at 640 x 480, since this is primarily a CPU test and the effects of fill rate limitations on the slower cards can't really be seen at such a low resolution.

One thing to take note of is that the performance drop from the switch to 32-bit color is only about 3% on the DDR GeForce, whereas it is about 20% on the SDR version of the card. The effects of the greater memory bandwidth provided by the DDR SDRAM can already be seen even at this low resolution. There are three other cards that exhibit behavior similar to that of the DDR GeForce when moving to 32-bit color, each for different reasons.

The S3 Savage 2000 deviates from the rules we have been preaching for so long. With available memory bandwidth virtually equal to that of the SDR GeForce (2.5GB/s vs 2.65GB/s), the Savage 2000 should take a similar performance drop when switching to 32-bit color rendering. There are two reasons why this does not happen. The first is that the latest drivers (v1.09) deliver a 24-bit Z-buffer to Quake III Arena without providing an 8-bit stencil buffer, unlike the GeForce, which provides the 24-bit Z and 8-bit stencil buffer as requested by the application. This frees up a noticeable amount of memory bandwidth, which lessens the impact of switching to 32-bit color rendering, a bandwidth hog in itself. The second factor that helps the Savage 2000 was pointed out to us by a helpful reader posting on AnandTech's Forums. If you recall, one of the most talked about features of the Savage3D was its support for S3TC, S3's texture compression algorithm. S3TC made its way into the Savage4 and now the Savage 2000, and fortunately id has included a provision for S3TC in Quake III Arena. Rather than substituting higher resolution textures whenever S3TC is enabled, Quake III Arena simply uses the compressed textures, which are physically smaller without sacrificing image quality. A decent amount of memory bandwidth is thus freed up once again, making the transition to 32-bit color less painful on the Savage 2000 than it is on the SDR GeForce.

Matrox's G400MAX and ATI's Rage Fury MAXX also perform quite well in 32-bit color, primarily because of their inherently massive memory bandwidth.

These are all factors you'll want to keep in mind as we further investigate performance.

Now, at 1024 x 768, a more realistic resolution for someone with a Pentium III 700 and a $300+ video card, we see the DDR GeForce truly pull ahead of the competition in the 16-bit color tests. Beating the next fastest card, the Rage Fury MAXX, by over 24 fps, the DDR GeForce even delivers a playable 64 fps at 1024 x 768 x 32-bit color. If frame rate is indeed king, then NVIDIA takes the crown here while delivering superb image quality.

Most of the previous generation of cards have a very hard time competing at this resolution, with the exception of the G400MAX, which performs at close to a next generation level. The chart is topped by the latest and greatest from NVIDIA, ATI, and S3.

The massive memory bandwidth of the MAXX shows here as it is almost able to keep up with the DDR GeForce and easily beats out the SDR GeForce, especially in 32-bit color.





All of the cards lose some of their horsepower as we make the move down to the Pentium III 600, but the standings remain the same. Once again, the Savage 2000 loses very little performance in the move to 32-bit color thanks to the aforementioned memory bandwidth savings. There also seems to be some driver tweaking in play here, as S3/Diamond must have really concentrated on performance in situations of high memory bandwidth usage.

The performance of the Rage Fury MAXX here is definitely being limited by its drivers, but these are the final shipping drivers, so don't expect performance to change until the next driver update.

The performance situation at 1024 x 768 and 1600 x 1200 with the Pentium III 600E is virtually identical to that with the Pentium III 700.



As CPU performance drops, so does the gap between the 11 cards compared here, as the performance bottleneck shifts to the CPU rather than the video card.

Once again, the MAXX's 32-bit performance is able to eclipse the SDR GeForce, but can't quite catch the Savage 2000 or the DDR GeForce. Interestingly, the Voodoo3 3500 almost catches the MAXX in 16-bit color.

Although the DDR GeForce is still on top, its 16-bit color performance advantage is nowhere near as large as it was in the past two performance tests with the 600 and 700MHz CPUs. The only reason the DDR GeForce would make for a good buy in this case is for future performance with faster CPUs, as well as 32-bit color performance since, at 1024 x 768 x 32, it still offers close to that magical 60 fps mark.

The SDR GeForce starts to choke in 32-bit color, especially in comparison to the Savage 2000, which brings forth very promising performance. Still, the history of S3's driver support, as well as the previous entries in the Savage line of graphics chipsets, makes a purchase here a risky one at best. S3's history has come back to haunt them, and it will take a lot of proof before this market accepts them as a viable alternative to the likes of NVIDIA and 3dfx, who have been delivering, more or less, on their promises of performance in recent times.








Since Unreal Tournament is affected by more factors than just video card driver quality, the DDR GeForce and Rage Fury MAXX are virtually tied for the first place spot at 640 x 480. All of the cards perform respectably here; the Savage 2000 is held back by the poor D3D performance of its drivers.

The Voodoo3 cards excel in this test because we were forced to use the Glide setting for the UT tests with the 3dfx cards, as the D3D setting would not complete our tests properly. In any case, if you're into playing UT, then the Voodoo3 with its native Glide support will give you excellent gameplay simply because UT was designed with superb Glide support. UT's D3D, and especially its OpenGL, performance is not up to par with its Glide performance at all, giving the 3dfx cards the edge in the UT tests.

For those of you wondering about Athlon 700 performance, the standings are identical to that of the Pentium III 700.

Once again there is very little difference between the DDR GeForce and the Rage Fury MAXX, making a recommendation for the more expensive DDR GeForce very difficult unless you play games other than UT (e.g. Quake III Arena, in which case the DDR GeForce would give you the better overall experience).

The Voodoo3 is once again performing quite respectably as a result of its native Glide support.

The Savage 2000 brings up the rear, begging S3 for updated drivers. Fortunately, S3 has recently released MeTaL support for the Savage 2000, allowing it to take advantage of the S3TC compressed textures found on the second Unreal Tournament CD.

The 1600 x 1200 scores you see above won't change all that much as we cycle through the various processors. The DDR GeForce finally separates itself from the Rage Fury MAXX but not by a huge amount.

The Savage 2000 finally moves up the ladder as a result of its efficient memory management; with some driver tweaks for D3D performance, it wouldn't be surprising to see the Savage 2000 perform even better. Keep an eye on this card: if S3 can get their act together with their next driver update (due out in January), the Savage 2000 may end up being the "low-cost" (compared to the GeForce and other cards in its class) high performing card that we were promised on paper a few months ago.

The native Glide support can only do so much for the Voodoo3 cards; at 1600 x 1200, the cards begin to show their limits as they fall below the average performing TNT2 Ultra.




With the Pentium III 600E, we get a breakdown of performance similar to what we experienced on the 700, just a bit slower.



The range of frame rates on the Pentium III 450 shrinks to such a level that an upgrade for a Pentium III 450 owner from any one of these cards to the MAXX is not justified at all.



We have a similar situation with the two Celeron CPUs: the benefit of the MAXX is noticeable but not worth the added cost. You'd be better off upgrading your CPU first and then worrying about a video card upgrade if you're serious about improving UT performance.






The Infamous "Lag" Issue

There's been some talk around the internet regarding the issue of "lag" or "latency" introduced by the Rage Fury MAXX due to the nature of its dual chip solution and ATI's AFR technology. The theory goes like this: the response to an input will not occur until two frames have passed because those two frames are already being rendered when the input was made. This is in contrast to a single chip solution which renders one frame at a time, so the response should occur with the next frame.

As with just about anything in life, it's a bit more complicated than that simple analysis.

ATI's response was posted over at Gamer's Depot and went like this:

Q: There are rumors that the Fury MAXX causes lag in fast-paced action games because of its dual chips. Is this true?

A: No, the rumors centered around the claim that since each Rage 128 Pro chip on the Fury MAXX takes almost twice as long to draw a frame as a GeForce does (although the overall frame rate is the same or higher for the Fury MAXX because it has two chips working together), that there would be a noticeable delay between the time the computer receives input from the user and the time the resulting movement is displayed on the screen. This argument is simplistic and fails to take into account factors such as CPU processing time, scene complexity, frame rate, and screen refresh rate. Extensive testing has found that there is no noticeable difference when playing games on a Fury MAXX or a GeForce if they are running at similar frame rates.

We'll first state that a number of gamers were given the opportunity to play a few games of Quake 3 on the MAXX, followed by a game on an NVIDIA GeForce. None of them noticed any lag issues, whether they were on a LAN game or a modem game (of course the modem itself introduced lag). Playing at high resolutions or low resolutions did not have any effect either. In fact, even with all this talk on the internet, we've yet to see claims of anyone actually feeling the lag - it has all been theoretical so far.

Our take on the situation is this: a "normal" 3D card is actually already rendering the next frame while the current one is being displayed. This technique is called double buffering and is used by all current 3D accelerators. An optional mode on many cards is triple buffering, where one frame is being displayed while two more are being rendered. Sound kind of like the MAXX? You bet. In essence, you can think of the MAXX as performing a complex version of triple buffering. Of course, there have never been any complaints or issues with triple buffering, so there shouldn't be any with the MAXX either. As far as we're concerned, lag is no longer an issue with the MAXX.
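
To put some rough numbers behind that reasoning, here's a simplified latency model (our own construction, not ATI's math):

```python
# Frames already in flight when input arrives must be displayed before
# a frame reflecting that input can appear, so input-to-display delay
# is roughly frames_in_flight divided by the frame rate.

def latency_ms(frames_in_flight, fps):
    """Approximate wall-clock delay before new input shows on screen."""
    return frames_in_flight * 1000.0 / fps

print(latency_ms(1, 45))  # single chip, double buffered, 45 fps: ~22 ms
print(latency_ms(2, 90))  # MAXX AFR, two frames in flight, 90 fps: ~22 ms
```

Because the MAXX roughly doubles the frame rate, the extra frame in flight cancels out, which is exactly the point ATI is making.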



Final Words

The Rage Fury MAXX plays an interesting role in the video card market right now. Overall, especially in high-resolution/32-bit color scenarios, it handily beats the SDR GeForce at an equal price. So, for those of you with a fast enough CPU (Pentium III 450 or above), the Rage Fury MAXX makes for an excellent GeForce alternative while adding excellent DVD playback support.

Those of you with slower CPUs may want to opt to stay away from the MAXX/GeForce debate entirely and just go after one of the recently price-reduced Voodoo3's or TNT2 Ultras which are still very serious performers.

Athlon owners won't find the highest performing solution in the Rage Fury MAXX, but then again they won't be disappointed by the card. The Athlon's match still seems to be the TNT2 Ultra, as it consistently performs noticeably better on the Athlon than it does on any Intel platform.

The true question boils down to how long you will be able to go without upgrading. If you are going to upgrade in another 6 - 8 months, then going with the Rage Fury MAXX over the competition won't be too big of a problem, simply because once games begin to take advantage of hardware T&L, you will be ready to upgrade to the next generation of video cards with more advanced hardware T&L support. On the other hand, if you are determined to keep your next video card for much longer than that 6 - 8 month period, you may want to consider the GeForce or the Savage 2000, as having hardware T&L support on your card will increase the longevity of your investment. While the Savage 2000 currently doesn't have an enabled hardware T&L engine, well before the end of Q1 2000 we will see hardware T&L support for the Savage 2000 under OpenGL, making it a viable option for the longevity category.

Although ATI promised to match the price of the GeForce cards, a quick look around the internet shows the MAXX going for more than the SDR version of the GeForce and nearly matching the price of the DDR model. The DDR GeForce seems to be a better all around solution for just a bit more. If ATI could push the price of the MAXX down towards the level of the Savage 2000 or below the SDR GeForce, they would definitely have a winner on their hands. Unfortunately, the use of 64MB of RAM and multiple chips will make that feat a difficult one for ATI.
