Original Link: https://www.anandtech.com/show/277
April 1999 Super7 3D Video Accelerator Comparison
by Anand Lal Shimpi on April 10, 1999 1:15 AM EST - Posted in GPUs
Like a child unwilling to share, Intel has kept its Slot-1 architecture from the rest of the industry (for perfectly logical business reasons, mind you), and that decision has taken its toll on the quality of systems built upon non-Intel processors. The standard originally developed in 1998 by AMD is widely known among AMD supporters around the world as Super7, an extension to the formerly Intel-dominated Socket-7 motherboard architecture. The differences between a Super7-based system and a Slot-1-based system are minimal in terms of the performance one can theoretically achieve on each platform. Unfortunately, there is one critical differentiating factor that has greatly stunted the growth of the AMD-dominated Super7 market in comparison to the Intel-controlled Slot-1 market. The Achilles' heel of the Super7 standard? Poor chipset support.
Since the first day of the official release of the Super7 standard, the efforts of chipset manufacturers Acer Labs Incorporated (ALi) and VIA Technologies have been lackluster in comparison to their big brother, Intel, in terms of chipset compatibility. One could make the argument that, due to Intel's incredible size as a company, it is in the best interest of the market to make sure the best-selling products are compatible with Intel's chipsets; the responsibility then falls on ALi and VIA to make sure that their chipsets are just as compatible. In the past, ALi and VIA chipset solutions have been at fault for compatibility problems with the latest AGP graphics accelerators due to the nature of the AGP support provided by their two respective chipsets. Only recently have most of the "kinks" in the software drivers been ironed out, raising the level of compatibility between Super7 motherboards and the latest graphics cards. Unfortunately, there remains another limitation of the Super7/Socket-7 standard that has kept the lives of gamers who own such computers a little less happy than those running equally priced Celeron systems on the Intel side of things. The second tragic flaw of the standard? Poor 3DNow! support by video card manufacturers.
Unveiled at the E3 conference in 1998, 3DNow! was the bridge that was supposed to eliminate the gaming performance gap between Intel and AMD based systems. In the form of an extension to the processor's native instruction set, AMD's 3DNow! enhancements were the talk of the town; however, they never grew to be much more than that in the eyes of most video card manufacturers. To date, there has been no independent effort by any party other than AMD to bring a nearly complete 3DNow! implementation into the drivers of the latest graphics accelerators, and even then, AMD's efforts are only truly visible on the graphics cards manufactured by a single company. Not encouraging at all.
Consider all of that, plus the fact that the latest graphics accelerators are not all guaranteed to work on Super7 systems, and the fact that the gaming performance of a high-end K6-2 or K6-3 450 system usually barely comes close to that of an equivalently configured Intel Celeron 300A, a $60 chip (3dfx-based systems excluded), and it makes almost perfect sense that the Super7 market is essentially ignored when it comes to comparisons of the latest in 3D accelerators. Such a comparison is usually considered to be a) too much work, and b) not worth the time, simply because of the common misconception that a true gamer would never purchase an AMD-based system.
Reality Sets in: There are Super7 Gamers Out There
Just as there are Porsche owners that don't speed, the reality of the matter is that there are gamers out there that play on non-Intel systems, specifically AMD K6, K6-2, and even the newer K6-3 based systems. There isn't a law that states that if you want to play a game you must purchase an Intel processor, so there's nothing wrong with fragging a few in Quake 2 or running from the assassins in a game of Half-Life TFC if you happen to have an AMD-based system. This comparison is not going to deal with the technical "morality" of purchasing either an Intel or an AMD based system; rather, it will attempt to tend to the needs of Super7 users. They, just like all Pentium II owners out there, want to know what the best 3D accelerator for their system happens to be, and just like all Pentium II owners out there (Celery supporters included), they deserve an answer.
Setting It Up
Installing an AGP accelerator on a Super7 motherboard is a bit more complicated than doing so on a Socket-370 or Slot-1 motherboard based on an Intel chipset. This is simply because you have to take into account the configuration and setup of the drivers that enable a feature of the AGP specification known as the Graphics Address Remapping Table, or GART for short. The importance of GART support to a true AGP accelerator is this: if you happen to have an incredibly large texture that cannot fit within your graphics card's local memory, the AGP bus allows it to be transferred quickly to system memory for storage and later retrieval. But the AGP bus can only transfer the textures; how are they stored and addressed? The Graphics Address Remapping Table essentially allows the video card to address texture maps as single data objects, a process critical to getting the full benefit of AGP texturing (the storage/retrieval of large textures to and from system memory), one of the major benefits of the Accelerated Graphics Port.
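To make the remapping idea more concrete, here is a toy sketch (in C) of how a GART lets scattered pages of system memory appear as one contiguous block to the video card. It is purely illustrative; the entry format, page size, and addresses below are assumptions for the example, not any chipset's actual layout.

```c
/* Toy model of a Graphics Address Remapping Table: the card sees one
   contiguous "aperture" of addresses, while each 4KB page actually
   lives somewhere else in system memory. Illustrative only. */
#include <stdint.h>
#include <stdio.h>

#define PAGE_SIZE    4096u
#define GART_ENTRIES 8

/* Each entry holds the physical base address of one 4KB page. */
static uint32_t gart[GART_ENTRIES];

/* Translate a linear aperture offset into a physical address. */
static uint32_t gart_translate(uint32_t aperture_offset)
{
    uint32_t page   = aperture_offset / PAGE_SIZE;
    uint32_t within = aperture_offset % PAGE_SIZE;
    return gart[page] + within;
}

int main(void)
{
    /* Scattered physical pages appear contiguous to the accelerator. */
    uint32_t scattered[GART_ENTRIES] =
        {0x1F000, 0x08000, 0x32000, 0x04000,
         0x2A000, 0x11000, 0x3D000, 0x27000};
    for (int i = 0; i < GART_ENTRIES; i++)
        gart[i] = scattered[i];

    /* Offset 0x1800 lands 0x800 bytes into the *second* mapped page. */
    printf("aperture 0x1800 -> physical 0x%X\n",
           (unsigned)gart_translate(0x1800));
    return 0;
}
```

This is why a large texture can be treated as a single data object even though the operating system never hands the driver one contiguous chunk of RAM.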
Unfortunately for Super7 users, GART support is natively provided and optimized by Windows 98 (and Windows 95 OSR2) for Intel AGPSets, or Intel chipsets with AGP support, such as the i440LX, i440BX, and the i440GX. Although Windows 98 does offer support for VIA and ALi based AGP solutions, the optimizations are not nearly as thorough as those provided for their Intel counterparts, for the same reasons discussed in the opening of the article. The responsibility then fell upon VIA and ALi to produce updated virtual device drivers that would provide full GART support among other features to users of motherboards based upon their chipsets under Windows 98, and this they did.
The most common cause of Super7 AGP video card incompatibilities appears to be the drivers, not the chipset itself; a lack of proper support for the specification as defined by Intel (since Intel is the dominant force in the industry, all graphics chipset manufacturers pursue 100% compatibility with Intel's chipsets first) often results in stability and compatibility problems. Accordingly, the most commonly overlooked step in setting up a high-performing yet stable Super7 system with an AGP graphics accelerator is the simple act of loading the AGP drivers from the chipset manufacturer.
A second problem AnandTech ran into when experimenting with the latest 3D accelerators and two of the most popular Super7 chipsets (the Aladdin V and MVP3) was the ambiguous setting referred to as AGP Turbo Mode. This feature, common to both Aladdin V and MVP3 based motherboards, showed a direct correlation to the performance of the AGP graphics card installed. Enabling AGP Turbo Mode (accomplished through the BIOS on Aladdin V based boards and through the VIA AGP setup utility on MVP3 based boards) increases the performance of your system, as you might assume; however, it also happens to be one of the most commonly overlooked steps in setting up a Super7 system. Many Aladdin V motherboard owners may not be aware of the setting, which should be present in the latest revisions of their BIOS setup files; VIA, on the other hand, wisely chose to include the option to enable/disable Turbo mode in their AGP setup utility. This is not the same AGP Turbo Mode that BX motherboard owners may be familiar with, as it does not simply run your AGP bus at the FSB frequency. The setting seems to enable the full set of AGP functions as defined by the specification, and appears directly related to the GART: AGP video cards with a poor implementation of the specification generally exhibit erratic behavior with this option enabled, as AnandTech's findings will soon illustrate.
If you take the above precautions into account while setting up a Super7 system with an AGP video accelerator, you'll end up with the highest chance of a successful install, a rate which has increased tremendously thanks to the presence of more mature AGP drivers from the two major Super7 chipset manufacturers.
For purposes of benchmark integrity, each video card compared received a completely formatted test hard drive without any foreign video drivers present, and the latest revisions of the video manufacturer's drivers as well as the motherboard chipset manufacturer's drivers were installed, current as of April 9, 1999.
2D Image Quality Comparison
Just like Intel owners, everyone else out there deserves to know which card/chipset will provide the best 2D output quality. There are a number of factors that go into determining the 2D output quality of a particular video chipset/board. The thing to remember is that the quality of the 2D signal produced by a video card is dependent on the speed/quality of the board's RAMDAC (Random Access Memory Digital-to-Analog Converter, the unit that converts the digital signal from your computer into an analog signal capable of being displayed on a standard monitor); however, the images you see on your screen are also influenced by the quality of the filters present between the RAMDAC and the card's VGA output. Just as on a motherboard, video card manufacturers place capacitors and other "filters" between the video chipset (if the card has an integrated RAMDAC) or the RAMDAC (if the card has an external RAMDAC, which is quite rare now) and the VGA port on the back of the card, in order to keep the quality of the signal being transmitted as high as possible. Chances are that the signal your video card's chipset sends out and the signal that leaves the VGA output port are quite different, with the latter being a much noisier signal than the former.
Most video card manufacturers will skimp on the type, quantity, and quality of the filters used in this critical spot on a video card in order to save a few dollars on the retail cost of their board. The motivation behind this is simple: if you see two TNT cards sitting side by side, with their specifications and game bundles exactly the same, yet the one on the left is priced $5 cheaper, you'll naturally go for the cheaper card. Because of this, the 2D image quality of two cards based on the same chipset yet manufactured by different companies can be drastically different, as has been the case with many nVidia TNT based cards. The difference usually can't be seen at resolutions of 1024 x 768 and below; the true test comes at 1280 x 1024 and above, especially when using higher refresh rates, where the screen is updated more frequently from the RAMDAC (more information is passed through those filters in the same amount of time). If you never run your video card at anything above 1024 x 768, then you probably won't notice any difference in the 2D image quality of the video cards mentioned here; and if your monitor won't allow you to run at 1280 x 1024 or higher at a refresh rate greater than 60Hz, for example, the same applies.
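To put some rough numbers behind that claim, consider the pixel clock a RAMDAC must sustain at various settings. The calculation below is a back-of-the-envelope sketch; the 1.32 blanking-overhead factor is a common rule of thumb, not a figure from any specification.

```c
/* Rough pixel-clock estimates showing why high resolutions and refresh
   rates stress the RAMDAC and its output filters. */
#include <stdio.h>

int main(void)
{
    const double blanking = 1.32;  /* approx. overhead for sync/blanking */
    struct { int w, h, hz; } modes[] = {
        {1024,  768, 75}, {1280, 1024, 85}, {1600, 1200, 75},
    };
    for (int i = 0; i < 3; i++) {
        double mhz = (double)modes[i].w * modes[i].h * modes[i].hz
                     * blanking / 1e6;
        printf("%4d x %-4d @ %dHz needs roughly a %.0f MHz pixel clock\n",
               modes[i].w, modes[i].h, modes[i].hz, mhz);
    }
    return 0;
}
```

Even a rough estimate like this shows 1600 x 1200 at 75Hz demanding nearly 200MHz from the RAMDAC, which is why the higher-clocked parts hold up better at those settings.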
If you do happen to have a higher-end monitor, and if you do happen to run at those higher resolutions, then this may interest you. Out of the video cards compared, the best 2D quality came from the 3dfx Voodoo3 3000, whose 350MHz integrated RAMDAC produced the clearest pictures at 1280 x 1024 and 1600 x 1200 in AnandTech's test lab. The 2000 model, as discussed in the Voodoo3 Review, features a 300MHz RAMDAC and provided an almost equal level of 2D image quality. Next in line for best 2D image quality in the roundup is yet another 3dfx entry, the good ol' Banshee, whose integrated 2D core inspired the improvements found in the Voodoo3's integrated 2D. Relatively speaking, 3dfx has excellent 2D output on all of their newer 2D/3D cards (Banshee/Voodoo3); however, in comparison to the Matrox G200, 3dfx still has a bit of climbing to do to reach the top of the ladder.
ATI's Rage 128 provides what I'd like to refer to as average 2D image quality: it isn't nearly as crisp as that of the Matrox G200, but it doesn't utterly disgust you while you're working with white text on a black background, etc. If the Rage 128 were to be called average, the Voodoo3 would be dubbed slightly above average, the G200 excellent, and the TNT would follow with a not-so-dramatic slightly-below-average rating. The TNT is unique in this comparison because nVidia is the only manufacturer in the aforementioned informal roundup that doesn't manufacture their own cards; therefore, the 2D image quality can vary greatly from one board to the next. nVidia has no control over this portion of the manufacturing process, so if you're purchasing a TNT based card and want the best possible 2D image quality, your best bet is to stay away from cards that don't adhere to nVidia's reference design (such as the Canopus Spectra 2500). Generally speaking, Diamond's V550 has had some of the greatest success in terms of 2D output quality; however, even with the V550, the TNT is only on par with the Rage 128, and no one wants to have just "average" quality.
Chipset Compatibility Comparison
Here's the killer: it doesn't matter whether or not a manufacturer has the fastest video card on paper; unless it is 100% compatible with your system, you're not going to enjoy that speed as much as you could. This has unfortunately been the case with a number of AGP accelerators and Super7 chipsets, for the same reasons discussed earlier. Which cards boast the best compatibility, and which require a little elbow grease to get rolling?
If you recall, the problem with Super7 chipsets doesn't lie in the hardware, but rather in the drivers put together by their manufacturers. Remember what the drivers are used for? Properly enabling the AGP functions of the chipset in software. So wouldn't it make sense that the video cards that rely the most upon the functions of the AGP specification, as well as those that have problems with some functions of the specification, would be the most problematic? Makes sense, right? Well, that happens to be the case.
The cards with the worst AGP implementation, all 3dfx cards, which don't really use the AGP bus for texture storage/retrieval at all, happen to be the easiest to install and the most problem-free of the graphics cards on Super7 systems. Of course, PCI cards would be the easiest to install; however, out of the latest batch of 2D/3D accelerators, very few are available in PCI versions, and those that are happen to be available in very limited quantities. It's ironic that the very feature 3dfx is mocked for not having in the Slot-1 market is the very feature that gives them the edge in the Super7 industry.
The cards with the best AGP implementation, particularly the nVidia TNT and the Matrox G200, happen to be the cards with the greatest set of problems with Super7 chipsets. Although Matrox has considerably improved their driver support for Super7 users, nVidia's TNT continues to plague many Super7 users, as a number of complaints about TNT/Super7 compatibility are still surfacing on BBSes and newsgroups all over the net. Once again, compatibility has improved significantly since the first release of the Super7 standard; however, there are still cases where users are left without a clue as to what to do next while setting up their Super7 systems.
By far the worst of all of the cards compared was ATI's Rage 128, whose flawed implementation of the AGP specification (the drivers don't seem to enable GART/AGP texturing properly, resulting in poor performance with large textures) was amplified by barely compatible drivers. The Rage 128 seemed to work much better on AnandTech's MVP3 based FIC PA-2013 test bed after disabling AGP Turbo Mode; however, on the ALi Aladdin V based ASUS P5A test bed, the Rage 128 was not nearly as reliable as the rest of the cards. The setup of the card required disabling AGP Turbo Mode as well as the VGA Frame Buffer option in the BIOS Setup, followed by a reinstall of ATI's drivers. The card refused to work at all on AnandTech's IWill XA-100Plus Aladdin V based test bed, whose BIOS did not feature the AGP Turbo Mode setting, so be very careful if you're pursuing the Rage 128 as an option for your Super7 system. When AnandTech first looked at the card in 1998, it seemed to be a very promising solution for Super7 users, as it offered an excellent combination of performance, quality, and features. Unfortunately, now, 4 months later, ATI's poor driver support is more than disappointing and keeps all Rage 128 based cards on the "beware of" list for Super7 users, at least until ATI can iron out their driver issues.
3D Image Quality Comparison
Image quality has always been 3dfx's downfall, dating back to the days of the original Voodoo accelerator. How have things changed? Let's do a head-to-head comparison of the Voodoo3 and a fair representative of the rest of the 3D accelerators out there, the ATI Rage 128, whose 3D image quality is truly top notch.
The topmost image is taken from the ATI Rage 128, the bottom from the Voodoo3. Can you tell the difference between the two? Click each one to view a full-sized, uncompressed JPEG file (approx. 900KB). During fast-paced gameplay, the Voodoo3 is indiscernible from anything else.
Here we have the same comparison, ATI Rage 128 on top, and the Voodoo3 on the bottom. Notice any difference?
Then what is the difference between the Voodoo3 and the rest of the competition? Here is an illustration of the 256 x 256 texture limitation of the Voodoo3:
This is what the texture should look like.
This is what the texture looks like on a 3dfx accelerator.
The problem with 3dfx's 3D image quality? If you are the type of person that can't stand anything that isn't crisp and clear, then the Voodoo3 (along with the Voodoo2, Banshee, etc.) will definitely bother you. All other graphics accelerators have support for larger texture sizes; unfortunately, 3dfx's don't. In 3dfx's words, it's a small price to pay for performance. What matters the most to you?
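For the curious, the blurring above is the natural result of the downsampling a driver has to perform when the hardware caps texture dimensions at 256 x 256. Below is a minimal sketch of the sort of 2x box filter involved, written for a grayscale 512 x 512 texture; it illustrates the technique, not 3dfx's actual driver code.

```c
/* Average every 2x2 block of a 512x512 source into one texel of a
   256x256 destination: the fine detail is irrecoverably smeared. */
#include <stdint.h>
#include <stdio.h>

#define SRC 512
#define DST 256

static uint8_t src[SRC][SRC], dst[DST][DST];

static void downsample_2x(void)
{
    for (int y = 0; y < DST; y++)
        for (int x = 0; x < DST; x++) {
            int sum = src[2*y][2*x]     + src[2*y][2*x + 1]
                    + src[2*y + 1][2*x] + src[2*y + 1][2*x + 1];
            dst[y][x] = (uint8_t)(sum / 4);  /* average the 2x2 block */
        }
}

int main(void)
{
    /* Fill the source with a fine checkerboard, the worst case for a
       box filter: the detail averages away to flat gray. */
    for (int y = 0; y < SRC; y++)
        for (int x = 0; x < SRC; x++)
            src[y][x] = ((x + y) & 1) ? 255 : 0;
    downsample_2x();
    printf("src %d/%d -> dst %d\n", src[0][0], src[0][1], dst[0][0]);
    return 0;
}
```

The checkerboard's alternating 0/255 texels come out as a uniform 127, which is exactly the kind of lost detail you see in the 3dfx screenshot.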
3DNow! Support Comparison
What is the AMD K6-2? Nothing more than a faster version of the K6 that runs at a lower core voltage and features AMD's 3DNow! instructions. What is the K6-3? Nothing more than a K6-2 with integrated L2 cache that dramatically improves business application performance. What happens if your hardware has no 3DNow! support? Your K6-3 becomes nothing more than a K6-2, which is nothing more than a faster K6 while playing games, and all of a sudden your 450MHz "powerhouse" is cranking out a measly 40 fps at 800 x 600 in Quake 2. We've already discussed 2D image quality, 3D image quality, and chipset compatibility, but what about raw performance?
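Before looking at who actually supports it, it's worth seeing what 3DNow! looks like at the source level. The sketch below is a minimal example assuming GCC's <mm3dnow.h> intrinsics (compile with -m3dnow, on a 3DNow! capable CPU); it shows the packed single-precision math the instructions provide and is not code from any vendor's driver.

```c
/* Minimal 3DNow! sketch: add two float arrays two elements at a time
   using the packed single-precision PFADD instruction. */
#include <stdio.h>
#include <string.h>
#include <mm3dnow.h>

/* n is assumed even here; an odd tail element would be left untouched. */
static void add_3dnow(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i + 1 < n; i += 2) {
        __m64 va, vb;
        memcpy(&va, &a[i], sizeof va);   /* load two packed floats */
        memcpy(&vb, &b[i], sizeof vb);
        va = _m_pfadd(va, vb);           /* packed single-precision add */
        memcpy(&dst[i], &va, sizeof va); /* store both results */
    }
    _m_femms();  /* fast clear of MMX/3DNow! state before normal FPU use */
}

int main(void)
{
    float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8}, c[4];
    add_3dnow(c, a, b, 4);
    printf("%.0f %.0f %.0f %.0f\n", c[0], c[1], c[2], c[3]);
    return 0;
}
```

Geometry transformation and lighting are full of exactly this kind of repetitive floating-point math, which is why a driver that bothers to use these instructions can pull noticeably ahead on a K6-2 or K6-3.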
On Super7 systems, the performance of the fastest graphics chipset on paper can easily be humbled by a "slower" chipset with a stronger 3DNow! implementation. Of the three major manufacturers included in this comparison, 3dfx, nVidia, and ATI, only two claim 3DNow! support in their drivers. Of the two that claim the support, only one actually follows through with a truly noticeable increase in performance, and only under a select few games. So which manufacturers took the time to work on 3DNow! driver optimizations?
None. Truthfully speaking, it was AMD who spent the countless hours re-working the Quake 2 OpenGL drivers in order to show off the incredible power of their 3DNow! instructions. The company to receive the most noticeable benefit in this respect has been 3dfx, whose dedication to continuing the 3DNow! support initiated by AMD has given them yet another edge over the competition in the Super7 market. Unfortunately, there is much confusion as to how to properly take advantage of AMD's 3DNow! Quake 2 patch with the 3dfx Banshee and Voodoo3, which require the use of 3dfx's MiniGL driver rather than AMD's 3dfxglam.dll driver under Quake 2 (and all other games based on the Q2 engine). The process is actually quite simple; however, it isn't documented anywhere on 3dfx's site (a small helper that automates the file shuffling appears after the steps below):
- Extract the Quake 2 MiniGL file provided by 3dfx to your root Quake 2 directory (X:\Quake2\). The MiniGL is available for download at www.3dfx.com if you don't already have it.
- Rename the MiniGL file (3dfxgl.dll) to the following: opengl32.dll. Be sure to delete any previous opengl32.dll files that were present in your root Quake 2 directory before doing so.
- Extract the AMD Quake 2 3DNow! patch to your Quake 2 directory as documented in AMD's installation FAQ.
- Start Quake 2. Under the video options menu, choose 3DNow! OpenGL as your rendering device, not 3DNow! 3dfxGL.
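For those who would rather not shuffle files by hand, here is a tiny sketch in C that performs the delete-and-rename in step two. The C:\Quake2 path is an assumption for illustration; adjust it to match your own install.

```c
/* Replace Quake 2's opengl32.dll with 3dfx's MiniGL, assuming the
   MiniGL (3dfxgl.dll) has already been extracted per step one.
   The C:\Quake2 path below is an assumption; change it as needed. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Remove any previous opengl32.dll first (it may not exist,
       in which case remove() harmlessly fails). */
    remove("C:\\Quake2\\opengl32.dll");

    /* Rename the MiniGL so Quake 2 loads it as its OpenGL driver. */
    if (rename("C:\\Quake2\\3dfxgl.dll", "C:\\Quake2\\opengl32.dll") != 0) {
        perror("rename 3dfxgl.dll");
        return EXIT_FAILURE;
    }
    puts("MiniGL installed as opengl32.dll");
    return EXIT_SUCCESS;
}
```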
Because of the current state of the Banshee/Voodoo3 drivers, the performance of a single Voodoo2 and a Voodoo2 SLI setup is generally around that of a Banshee/Voodoo3, if not greater, due to a stronger 3DNow! implementation in the Voodoo2 drivers. Soon enough, 3dfx should improve the Voodoo3 drivers with better 3DNow! support; also keep in mind that it wasn't until recently that the Banshee's drivers received native 3DNow! support.
nVidia claims to be working on support for 3DNow!, but even while using AMD's 3DNow! OpenGL rendering device in Quake 2, the measured performance increase over the standard OpenGL renderer is barely noticeable, at most 5 fps under Quake 2. Under Direct3D things have improved considerably, as all Direct3D games will take advantage of 3DNow! as long as they require DirectX 6.0 to operate.
As of the time of publication, the Rage 128 failed to support 3DNow! in any measurable form; however, support from ATI for the growing standard should follow in the future. But as you already know, ATI has bigger driver problems on their hands before they can get to 3DNow! optimizations.
The Test
AnandTech received a final revision Voodoo3 2000 AGP and a pre-release Voodoo3 3000 AGP for benchmarking purposes. AnandTech's Super7/Socket-7 test configuration was as follows:
- AMD K6-3 350 CXT, AMD K6-266
- Kryotech Cool K6-3 500
- ASUS P5A Aladdin V based Super7 Motherboard w/ 512KB Cache
- FIC PA-2013 MVP3 based Super7 Motherboard w/ 2MB Cache (for compatibility tests)
- 64MB of Memman/Mushkin SEC Original SDRAM was used in each test system
- Western Digital 5.1GB Ultra ATA/33 HDD
- Microsoft Windows 98
The benchmark software used was as follows:
- id Software's Quake 2 Version 3.20 using demo1.dm2 and 3Finger's crusher.dm2
- Monolith's Shogo using 3Finger's RevDemo
- 3DMark 99 for Image Quality Comparisons
Quake 2 and Shogo were singled out as the two games used for benchmarking for two primary reasons: 1) they offered a sample not only of OpenGL and Direct3D based games, but also of games with and without native 3DNow! drivers, and 2) they provide a simple way of comparing each individual card's performance with today's games as the industry is about to witness an influx of new titles, including id Software's Quake 3 Arena and Unreal Tournament Edition.
Quake 2's demo1.dm2 was run at four resolutions: 640 x 480, 800 x 600, 1024 x 768, and 1600 x 1200. The cards were run at 640 x 480 to show their maximum theoretical performance on each individual CPU, basically illustrating the CPU limitations of the card/chipset. The benchmarks were then conducted at 800 x 600 and 1024 x 768 to illustrate real-world gaming performance, and finally at 1600 x 1200 to illustrate the performance of a high-end system at that resolution, basically an indicator of whether or not resolutions that high were in fact playable. All resolutions between 1024 x 768 and 1600 x 1200 were omitted because they are not standard across most 3D games.
For the in-depth gaming performance tests, Brett "3 Fingers" Jacobs' crusher.dm2 demo was used to simulate the worst-case scenario in terms of Quake 2 performance, the point at which your frame rate will rarely drop any further. In contrast, the demo1.dm2 demo was used to simulate the ideal situation in terms of Quake 2 performance, the average high point for your frame rate in normal play. The range covered by the two benchmarks can be interpreted as the range in which you can expect average frame rates during gameplay. Due to the nature of the crusher.dm2 benchmark, only real-world scenarios were depicted, and the benchmark was only run at 800 x 600 and 1024 x 768.
The RevDemo Shogo benchmark was run at the same resolutions as Quake 2's demo1.dm2.
Quake 2 demo1.dm2 - AMD K6-3 500 Performance
Quake 2 demo1.dm2 - AMD K6-2 300 Performance
Quake 2 demo1.dm2 - AMD K6 266 Performance
Quake 2 demo1.dm2 CPU Scaling Performance
Quake 2 crusher.dm2 - AMD K6-3 500 Performance
Quake 2 crusher.dm2 - AMD K6-2 300 Performance
Quake 2 crusher.dm2 - AMD K6 266 Performance
Quake 2 crusher.dm2 CPU Scaling Performance
Quake 2 Performance Conclusions
The surprising winner here? The Voodoo2 SLI. The more mature 3DNow! drivers of the Voodoo2 give it a performance advantage over every other setup in the roundup, including 3dfx's newly released Voodoo3. Just as AnandTech illustrated in the Voodoo3 Review, the Voodoo2 SLI and the Voodoo3 are virtually equal performers at 800 x 600. The Voodoo3 2000 was omitted due to the relatively small performance difference between it and both the Voodoo3 3000 and the Voodoo2 SLI.
Super7 users that have already invested in a Voodoo2 SLI solution will definitely want to hang on to their Voodoo2 SLI cards until the next wave of graphics cards makes its way into the market. There is no real benefit to the Voodoo3 right now in the Super7 arena, especially considering that the immaturity of the drivers may end up giving you slower performance than what you're used to in some situations.
If you don't have a 3DNow! processor, the Voodoo3 will end up being the fastest overall performer; however, it won't hold a large lead over the Voodoo2 SLI, as the K6-266 benchmarks illustrate.
The Voodoo2 also performs quite well for a single-card solution. For those users craving more performance out of their setup and contemplating purchasing the Voodoo3, you're probably better off adding a second Voodoo2 to your setup and running it in SLI, as the cost of a Voodoo2 should be quite affordable by now.
One of the most affordable and well-rounded 2D/3D accelerators happens to be one of the most commonly overlooked ones: the 3dfx Banshee. While its lack of single-pass multi-texturing keeps it below the performance of the other 3dfx cards in Quake 2 and games based on the same engine, its affordable price and single-card nature make it ideal for an affordable Super7 setup. 3dfx's latest Banshee drivers boast 3DNow! support, and using the trick to enable 3DNow! support with 3dfx's MiniGL driver, the Banshee easily outsteps the Riva TNT and ATI Rage 128, which ended up being the two worst choices for a Super7 system.
The Voodoo3 is limited too greatly in the Super7 arena, especially by its own current driver support. If you can live with decent 3D acceleration, you may want to consider a single Voodoo2, or a Banshee if you need a 2D/3D card, for your Super7 setup for now; in a few months you can pick up a Voodoo3 when prices drop a little lower in preparation for the next wave of graphics cards.
The only real benefit the Voodoo3 offers here is support for higher resolutions at playable frame rates. You can expect the Voodoo3 to rise to the top of the performance charts as soon as 3dfx finishes optimizing the drivers for the card. In a recent conversation with a few 3dfx representatives, AnandTech was told that the company is committed to providing further 3DNow! enhancements in their future drivers. Let's hope that nVidia and ATI can learn from 3dfx's example in the future, as they are simply lagging behind in this race for a champion.
Shogo RevShogo - AMD K6-3 500 Performance
Shogo RevShogo - AMD K6-2 300 Performance
Shogo RevShogo - AMD K6 266 Performance
Shogo RevShogo CPU Scaling Performance
Shogo Performance Conclusions
There's no question about it: the Voodoo3 is the fastest 3D performer on a Super7 system in Shogo, but what does that mean? Shogo is representative of the Direct3D community in that it doesn't have an enormous amount of time invested in 3DNow! support, and at the same time, it's not the greatest exhibition of a game with outstanding single-pass multi-texturing support. Although the latest patch, which enabled single-pass multi-texturing, was applied, the best overall winner here seems to be the 3dfx Banshee, whose higher clock speed gives it the edge over the Voodoo2 in games such as Shogo.
The Voodoo2 SLI once again proves itself to be nearly as powerful as 3dfx's latest, the Voodoo3. Once again, if you are a Super7 Voodoo2 SLI owner, you probably want to stick with your current setup unless you happen to have money just burning a hole in your pocket. You're better off spending the money on a memory upgrade, a CPU upgrade (if you need one), or a few extra gigs of hard drive space now that EIDE drives are so cheap (a CD burner wouldn't hurt either). The bottom line is that the Voodoo2 SLI is just as good, performance-wise, for a Super7 user as a Voodoo3 3000.
The standalone Voodoo2 doesn't fare too poorly here either; however, if you're looking for a well-rounded accelerator, you're better off with the Banshee.
The TNT and the Rage 128 prove that they are not as welcome among Super7 graphics accelerators due to their heavy reliance on raw CPU power, as you can tell from the CPU scaling performance charts. While the nVidia TNT might have been one of the most popular 2D/3D combo cards in the Slot-1 market, the world of Super7 seems to be dominated by 3dfx.
The Answer?
So you're a Super7 user...what do you buy?
The absolute fastest thing you can buy today happens to be the Voodoo3. Is it worth it? Absolutely not. The performance does not justify the added cost at all; unless you crave high performance at resolutions above 800 x 600, the Voodoo3 probably isn't what you're looking for.
The best overall option, in terms of 2D image quality, 3D image quality, performance, and compatibility, is the 3dfx Banshee. Since it isn't a true AGP solution, you won't have any real problems with Super7 chipset incompatibilities, and the chipset happens to be very well rounded, to the point that it should be able to offer you just about everything you need. This is assuming that you don't have a previous video card; if you do, then the suggested course of action may be a Voodoo2, unless you're absolutely disgusted with the 2D performance/quality of your current video card.
Current Voodoo2 owners will want to either stick with their current setup or simply add on another Voodoo2; an upgrade from a single Voodoo2 to anything else is not the most desirable or cost-effective one in terms of how much bang you'll get for your buck.
Naturally, you'll want to stay away from the TNT and Rage 128 unless you absolutely must have 32-bit color support. But be warned: you'll be trading quite a large portion of your soul for that 32-bit rendering support if you're running a Super7 system and a TNT/Rage 128.
One thing you'll need to remember is this: if you have a CPU slower than a K6-2 300, the performance difference between all of the chipsets/cards mentioned here will be negligible, so there's no point in teaming up a Voodoo3 with a K6-233; your CPU would end up being the limiting factor.
You know what kind of performance you want, and now that you've been armed with the performance figures, it's time to make a decision. Just remember that a major investment in a video card now will probably be regretted in a few months, so your best bet is to play it safe with a decent card now (like the Banshee) and put the extra money towards other system upgrades, then go all out when the real accelerators hit the streets. Also remember that nVidia has their TNT2 on the way in under a month, not to mention S3's Savage4 and Matrox's G400. With 3dfx, nVidia, S3, and Matrox stockpiling their ammunition once again, the four superpowers are ready to face off... who will come out on top? Let's see if history repeats itself this time around...