Original Link: https://www.anandtech.com/show/332



Dedicated to the memory of Robert DiGiacamo. A great man and a good friend. May his dreams continue to grow and flourish with time.

Much like basketball teams, individual companies have attracted their own groups of followers who wouldn’t dream of betraying their loyalty at any cost. The greatest example of this in recent times would be the 3dfx/NVIDIA debates that engulfed so many unsuspecting gamers in flame wars across bulletin boards and newsgroups online. It seems almost a fad to support a single company and no other these days, an unfortunate trend, since a close-minded approach to an open market such as the video accelerator market will greatly limit you. For this very reason, AnandTech’s coverage of Super7 video solutions comes with one prerequisite: that you, the reader, understand one fundamental principle. There is no one "best" video accelerator available on the market today. The fact of the matter is that there is a different video accelerator for each individual’s needs, so instead of wasting time arguing why 3dfx is better than NVIDIA or why S3 is the savior of the market, spend the time researching which card is right for your needs. Now that we’ve established that prerequisite, let’s take a look at the question at hand: what’s the best video card for your Super7 system?

A Call to Action

The Super7 industry has never been a flawless platform for hardware manufacturers to support; ask any motherboard or video card manufacturer and they will undoubtedly agree with you on that point. Does that mean that the Super7 industry doesn’t deserve any attention? Absolutely not. As long as there are users alive that support the standard, there will be a desire for information about Super7 related topics, one of the most prevalent being video card performance. The only problem with delivering such information is that, until recently, most newer graphics accelerators had problems with the two main Super7 chipsets, the ALi Aladdin V and the VIA MVP3.

Before the release of a number of the latest video chipsets, AnandTech was given the opportunity to preview the technology by a number of manufacturers. Each time AnandTech took a look at a pre-release card/chipset solution, we would ask the particular card manufacturer whether or not they had experimented with any Super7 tests. And each time, the answer AnandTech received was the same "no" we had been hearing ever since the dawn of the Super7 AGP video incompatibilities, dating back to the release of the first Super7 motherboards.

A call to action was essentially issued by 3dfx to the rest of the market to begin actively supporting Super7 platforms, as 3dfx’s solutions, not being fully AGP compliant, experienced virtually no problems with Super7 systems and tended to perform much better than the competition’s products. Not wanting to give up the entire Super7 market to 3dfx, and for once listening to the demands of the Super7 industry, NVIDIA took the first step towards a brighter day among Super7 gamers with the release of their Detonator 1.88 drivers for the TNT/TNT2. These drivers, with full 3DNow! support, gave the Super7 market possibly one of the last breaths of hope the market will allow it before the official release of the AMD Athlon (K7).

With AnandTech’s last Super7 video accelerator comparison performed seemingly ages ago, this was a call to action for all reviewers as well. Needless to say, after days of benchmarking, the comparison is ready, the winners have been crowned, and the graphs have been made. Due to length constraints, the Super7 Video Comparison has been divided into four separate sections: High-End Gaming performance, Mid-Range/Low-End Gaming performance, High-End Professional Application performance, and a final look at all three of those performance categories.

This, the first of four comparisons, concentrates on High-End Gaming performance, comparing the latest graphics accelerators from 3dfx, 3DLabs, Matrox, NVIDIA, and S3 on two of the fastest and most popular Super7 processors on the market. The Mid-Range/Low-End comparison will fill in the gaps with older graphics accelerators and slower Super7 processors, followed by the High-End Professional Application comparison which will concentrate on NT performance/stability. So without further ado, let’s proceed to the next step.

The Problems

For those of you that have never owned a Super7 system and are curious about the problems associated with Super7 systems and AGP video cards, the problems are actually not related to the fact that AMD makes the CPUs; unfortunately, they lie in the chipset solutions for Super7 systems. The two main manufacturers mentioned before, ALi and VIA, are the driving forces behind motherboard support for the Super7 platform. Unfortunately, the maturity of their Super7 solutions isn’t nearly as great as that of Intel’s 440BX chipset, which has been giving Slot-1 and now Socket-370 users solid performance for quite some time.

The first true Super7 chipset to hit the market was the Aladdin V by ALi (for the sake of simplicity, we are not addressing the SiS solutions available prior to the Aladdin V). The Aladdin V, however, is the best example of the rule of thumb that first isn’t necessarily best, as it unfortunately has a compatibility problem with the current batch of TNT2 accelerators. The nature of the current issues with the chipset is quite bizarre, as some motherboards based on the Aladdin V will work just fine with the TNT2 while others won’t. Before its official release, the Aladdin V went through over 10 different chip revisions; however, there still seem to be some problems with the chipset, so anyone looking for a new Super7 system will probably wish to stay with VIA for the time being.

The VIA MVP3 is the only other remaining solution for Super7 users looking for good gaming performance (the MVP4 has built-in video). But a question that quite a few users ask is how to properly set up a Super7 system based on the VIA MVP3 chipset. By far, the absolute easiest way to start off setting up a Super7 system is to perform a clean install of your operating system. While this is not always possible, it is highly encouraged, as it does tend to simplify the rest of the process in case you were using any conflicting drivers prior to installing your new video card.



Setting it Up

Before you even begin to install your video card's drivers, you'll want to prep your system for the installation. VIA makes five helpful files available for this setup procedure:

  • VIA IDE Busmaster Driver (2.1.33)
  • VIA AGP GART Driver (3.3)
  • VIA IRQ Routing Driver (1.3a)
  • USB Filter Driver (1.04)
  • PCI Bridge Patch (1.4)

As a rule of thumb, all of these patches and drivers should be installed to obtain the most compatibility and performance from your MVP3 Super7 system. There are a couple of warnings, however. For one, users that happen to have DVD decoder cards installed with DVD drives may want to forego the Busmaster driver installation, as some cards tend to cause compatibility problems with the drivers.

The order of installation does hold some importance: the Busmaster, IRQ routing, USB Filter, and PCI Bridge drivers should be installed first, and a clean reboot should be performed before either the AGP GART driver or your video card's drivers are installed.

Installing an AGP accelerator on a Super7 motherboard is a bit more complicated than doing so on a Socket-370 or Slot-1 motherboard based on an Intel chipset. This is simply because you have to take into account the configuration and setup of the drivers that enable a feature of the AGP specification known as the Graphics Address Remapping Table, or GART for short. The importance of GART support to a true AGP accelerator is this: if you happen to have an incredibly large texture that cannot fit within your graphics card’s local memory, the AGP bus allows it to be transferred quickly for storage in and later retrieval from system memory. But the AGP bus can only transfer the textures; how are they stored? The Graphics Address Remapping Table essentially allows the video card to address texture maps as single data objects, a process critical to getting the full benefit from AGP texturing (the storage/retrieval of large textures to and from system memory), one of the major benefits of the Accelerated Graphics Port.
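
To make the remapping idea concrete, here is a minimal sketch (hypothetical code, not VIA's or ALi's actual driver logic) of what a GART does: the system memory pages holding texture data are physically scattered, but the table makes them appear as one contiguous block within the AGP aperture, so the video card can address a texture as a single object.

    /* Hypothetical GART sketch: one table entry per 4KB page maps a
       contiguous AGP aperture offset onto scattered physical pages. */
    #include <stdint.h>

    #define PAGE_SIZE      4096u
    #define APERTURE_BYTES (64u * 1024u * 1024u)        /* 64MB aperture */
    #define APERTURE_PAGES (APERTURE_BYTES / PAGE_SIZE)

    static uint32_t gart_table[APERTURE_PAGES];  /* physical page bases */

    /* Translate an offset within the AGP aperture to a physical address. */
    uint32_t gart_translate(uint32_t aperture_offset)
    {
        uint32_t page   = aperture_offset / PAGE_SIZE;
        uint32_t within = aperture_offset % PAGE_SIZE;
        return gart_table[page] + within;
    }

The chipset performs this lookup in hardware; the GART driver's job is to build and maintain the table, which is why skipping the driver install breaks AGP texturing.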

GART support is natively provided and optimized by Windows 98 (and Windows 95 OSR2) for Intel AGPSets, or Intel chipsets with AGP support, such as the i440LX, i440BX, and the i440GX. Although Windows 98 does offer support for VIA and ALi based AGP solutions, the optimizations are not nearly as thorough as those provided for their Intel counterparts, for the same reasons discussed in the opening of the article. The responsibility then fell upon VIA and ALi to produce updated virtual device drivers that would provide full GART support among other features to users of motherboards based upon their chipsets under Windows 98, and this they did.

The most common cause of Super7 AGP video card incompatibilities appears to be the drivers, not the chipset itself: a lack of proper support for the specification as defined by Intel (since Intel is the dominant force in the industry, all graphics chipset manufacturers pursue 100% compatibility with Intel's chipsets first) often results in stability and compatibility problems. Likewise, the most commonly overlooked step in setting up a high performing yet stable Super7 system with an AGP graphics accelerator is the simple act of loading the AGP drivers from the chipset manufacturer.

The next two files that must be installed are the AGP GART drivers and the video card drivers, but be sure that you have DirectX installed before installing either of those. The order of installing these two doesn't matter all too much; in some cases installing the AGP drivers first and then your video card drivers will work fine, while in other cases the opposite will apply. Just use whichever works for you.

A second problem AnandTech ran into when experimenting with the latest 3D accelerators and two of the most popular Super7 chipsets (the Aladdin V and MVP3) was the ambiguous setting referred to as AGP Turbo Mode. This feature, which is common to both Aladdin V and MVP3 based motherboards, has a direct effect on the performance of the AGP graphics card installed. Enabling AGP Turbo Mode (accomplished through the BIOS on Aladdin V based boards and through the VIA AGP setup utility on MVP3 based boards), as you can easily assume, increases the performance of your system; however, it also happens to be one of the most commonly overlooked steps in setting up a Super7 system. Most Aladdin V motherboard owners may not be aware of the setting, which should be present in the latest revisions of their BIOS setup files, while VIA wisely chose to include the option to enable/disable Turbo mode in their AGP setup utility. This is not the same AGP Turbo Mode that BX motherboard owners may be familiar with, as it does not simply run your AGP bus at the FSB frequency. AGP Turbo mode is essentially the same thing as AGP 2X mode; enabling it therefore enables AGP 2X mode and increases performance, while possibly leading to decreased stability, as any problems with the AGP implementation of the motherboard's chipset are more prone to appear when running in "Turbo" mode.
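
For a sense of what 2X mode buys you, here's a back-of-the-envelope calculation (a sketch of the published AGP figures, not a measurement of any particular board): AGP moves 32 bits per transfer on a roughly 66MHz clock, and 2X mode transfers data on both clock edges.

    /* Rough AGP bandwidth arithmetic: 32-bit path on a ~66MHz clock,
       with 2X mode clocking data on both edges. */
    #include <stdio.h>

    int main(void)
    {
        const double clock_mhz = 66.6;  /* AGP base clock */
        const double bytes     = 4.0;   /* 32-bit data path */

        printf("AGP 1X: ~%.0f MB/s\n", clock_mhz * bytes);        /* ~266 MB/s */
        printf("AGP 2X: ~%.0f MB/s\n", clock_mhz * bytes * 2.0);  /* ~533 MB/s */
        return 0;
    }

Doubling the transfer rate helps most when textures are streaming over the bus, which is also exactly when a flaky AGP implementation is most likely to show its instability.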

If you take the above precautions into account while setting up a Super7 system with an AGP video accelerator, you’ll end up with the highest chance of achieving a successful install, a chance which has increased tremendously due to the presence of more mature AGP drivers from the two major Super7 chipset manufacturers.

For purposes of benchmark integrity, each video card compared received a freshly formatted test hard drive without any foreign video drivers present; the latest revisions of the video manufacturer’s drivers as well as the motherboard chipset manufacturer’s drivers, current as of June 28, 1999, were installed.



Defining a High-End Gaming Solution

In the past, the words High-End Gaming and AMD never mixed; unfortunately, this was the case for AMD users prior to the release of the K6-2. The AMD K6-2 marked the introduction of the first truly competitive non-Intel CPU that could excel at games, and that very processor has evolved into what has finally become a viable High-End Gaming Solution. While there are some users that won't drop their Celeron or Pentium III systems for anything without an Intel Inside sticker, the fact of the matter is that AMD is much more of an alternative now than they were just a year ago. But what is AMD's version of a High-End Gaming Solution?

Currently there are two flavors of AMD CPUs that can fit this description, the K6-2 and K6-3. The fastest AMD CPU currently available for retail purchase is the K6-3 500; however, at a price tag of over $400, it doesn't fit the budget of most Super7 users. The next best options are the K6-3 450 and the K6-2 475. Both processors share the same requirements: a 2.4v core voltage setting and support for the 100MHz FSB (K6-3 450) or the 95/105MHz FSB (K6-2 475). The price difference between the two is under $100; a quick search on the net will reveal that the real world difference is around $60 - $80, which is a considerable amount of money when you're already dropping a few hundred down on an upgrade. This brings us to the next question: what's the difference between the two?

The K6-2 475 has the clock speed advantage over the K6-3; however, the K6-3 450's on-chip 256KB of L2 cache operates at the full 450MHz core clock, nearly five times the 95MHz FSB frequency at which the K6-2 475's motherboard-level L2 cache runs. For gamers, L2 cache isn't normally an important consideration; with most of today's games relying heavily on FPU calculations, the speed of your processor's L2 cache isn't that critical. Graphics accelerators that are heavily CPU dependent, however, may see a benefit of upwards of 10% from the K6-3's increased L2 cache frequency over the K6-2.

If you want the fastest overall solution, the K6-3 450 is faster than the higher clocked K6-2 475; unfortunately, that performance boost comes at a cost to you. Another thing you may want to keep in mind is that the power drawn by the K6-3 (courtesy of its on-die L2 cache) is considerably greater than that of the K6-2. An unfortunate fact associated with quite a few Super7 motherboards is that they cannot reliably supply ample current to the K6-3 450 in the most stressful of conditions (i.e. when all DIMM banks are populated, etc…), which can lead to stability problems. As Tom's Hardware recently confirmed and as AnandTech reaffirmed, there are issues with the FIC PA-2013 with 2MB of L2 cache and the K6-3 that are related to this very issue. For this reason, AnandTech's standard Super7 test bed was modified to use an Epox MVP3G-M (1MB) instead of the standard FIC PA-2013 (2MB) MVP3 based motherboard.

This first article concentrates on the performance of the latest graphics cards on the two fastest AMD processors (excluding the K6-3 500, which was just recently made available). The test systems were configured identically, and a full system configuration disclosure is made in The Test section of the article. Let's get on to the performance…



2D Image Quality & Performance Comparison

Four years ago, 2D performance was a big deal, as the latest batch of 3D accelerators barely deserved the 3D label in their titles. Back then, companies like S3, Matrox, and Number Nine were what hardware enthusiasts like yourselves were drooling over; there were no TNT2 Ultras or Voodoo3 3000s to lose sleep over, just pure unadulterated 2D performance.

Since then, the 2D performance issue was quick to die, as improved technology and architecture led to the quick demise of the question "what 2D card is the fastest?" among gamers. While 2D performance is still a topic of discussion among high end graphics professionals, the desktop 2D market is at a point where virtually any current generation accelerator will do just fine.

While the 2D performance of all current generation products is virtually identical, the respective image quality isn't. 2D image quality is a factor that still remains ignored by most manufacturers, simply because addressing the problem would mean spending time and money on something that the competition isn't doing. The first misconception about 2D image quality, and one that seems to be the general train of thought, is that a faster RAMDAC means a sharper 2D image.

As taken from the AnandTech Number Nine Revolution IV Review, here's a quick primer on the controversy behind 2D image quality:

Since its release, NVIDIA's TNT chipset has become a bit more than just a 2D/3D card for gamers. It seems as if the TNT is being crammed down everyone's throat, even if they have no intention of touching a frame of Quake 2 or even picking up the crowbar in Half-Life. Now, the TNT is a fairly affordable graphics solution considering it is a 2D/3D combo card, and its success is good news for NVIDIA. Being a successful chipset isn't a bad thing; where the TNT does get a bad reputation is when someone with a 21" monitor unravels the TNT's dark secret and tries to run their card at 1600 x 1200 x 32bpp at a high refresh rate under Windows. Look around the newsgroups, ask TNT owners, or try it for yourself: the TNT, as well as many other 2D/3D combo cards, doesn't provide the best 2D image quality when it comes to driving large monitors (i.e. 21") at high resolutions. The most common occurrence is that when viewing black text on a white background (or vice versa), the characters will begin to seem a bit fuzzy, and, especially after hours of staring at the screen, your eyes will begin to feel the wrath of a poorly constructed card.

Keep in mind that this scenario only really affects those with larger monitors running at resolutions above 1024 x 768 (most likely above 1280 x 1024). The assumption being made here by most manufacturers is that their customers won't use their products for professional purposes (i.e. intensive image editing, publishing, etc...) and that as long as their 2D quality and performance is top notch at resolutions under 1280 x 1024 at refresh rates under 75Hz (which most users do tend to stay under, simply due to monitor size and refresh rate limitations), they'll be perfectly fine. This holds true in a great percentage of the cases, which is why you'll hear people saying that the 2D image quality on the TNT or on the Savage3D is "top-notch" or "beautiful." However, when you happen to push your TNT card to the limits at 1600 x 1200, or when you give the Savage3D a run for its money at the same resolution, and you see some "fuzzy" text, it's quite difficult to believe that just about every single TNT/Savage3D owner out there could be wrong in saying that the 2D image quality is astounding...but in your case, they are.

The reason behind this is simple: in order to cut costs, the number of filters placed between the analog VGA output on your video card and the RAMDAC is cut down to the bare minimum. This sacrifice is made simply because of the assumption made above. The RAMDAC on a video card is the device that takes the digital signal from the local graphics memory (RAM) and converts it into an analog signal for the monitor using a Digital to Analog Converter (DAC), since most displays are in fact analog devices, with the exception of a relative few digital LCD displays (not all LCD displays are digital; in fact, most are analog as well). The speed of the RAMDAC defines the maximum resolutions and refresh rates your card can drive; it is the filtering on the output that largely determines how crisp the 2D image actually is.

Since most of these cards will be used for 3D games, and since there isn't a next-generation 2D/3D combo card out there capable of running any 3D game at 1600 x 1200 in a high performing fashion, most manufacturers figure that it's better to keep costs low and satisfy a greater percentage of the population than increase the costs to satisfy a smaller percentage.  That is the unfortunate truth, however if you're a gamer, using a 15" or maybe even a 17" monitor, chances are that you'd rather pay $130 for a card that suits your needs instead of paying $160 for a card that suits your needs as well as your neighbor with a 21" monitor.  At the same time, if you put yourself in the shoes of your neighbor with the 21" monitor, chances are that your neighbor would rather pay $160 for a card that does everything they need it to do rather than pay $130 for a card of noticeably lesser quality. 

So why the trend to push faster RAMDACs?  Well, as mentioned before, there isn't a 2D/3D combo card capable of running 3D games at 1600 x 1200 in a high performing fashion...but the current crop of cards are getting there.  With frame rates of around 30 fps, 1600 x 1200 is now a possibility although not much of a playable one most of the time.  For that reason, manufacturers had to bump up the speeds of their RAMDACs to support higher, more comfortable refresh rates at resolutions above the once unheard of 1024 x 768 resolution for games.
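
As a rough illustration of why those RAMDAC speed bumps are needed (a rule-of-thumb sketch assuming a typical ~32% CRT blanking overhead, not any vendor's spec), the required pixel clock is simply resolution times refresh rate plus blanking:

    /* Rule-of-thumb RAMDAC (pixel clock) requirement:
       horizontal x vertical x refresh rate, plus blanking overhead. */
    #include <stdio.h>

    int main(void)
    {
        const double blanking = 1.32;   /* typical CRT blanking overhead */
        const double h = 1600, v = 1200, hz = 85;

        double mhz = h * v * hz * blanking / 1e6;
        printf("1600 x 1200 @ %.0fHz needs a ~%.0f MHz RAMDAC\n", hz, mhz);
        /* prints ~215 MHz -- which is why current cards are moving to
           300MHz-class RAMDACs for comfortable high-res refresh rates */
        return 0;
    }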

It all depends on your perspective as a consumer, and instead of allowing users to have two options (a professional and a home use version), most manufacturers will go after the "one-size fits all" market and hope to succeed. In terms of 2D quality, at resolutions up to 1280 x 1024 the G400, Permedia3, Savage4, TNT2, and Voodoo3 are generally the same. While some cards are worse than others, the differences between the cards are minimal; however, once you pass 1280 x 1024, things quickly become more divided. The G400 and Permedia3 quickly take the lead at resolutions of 1600 x 1200 and greater, followed by a toss up between Savage4, TNT2, and Voodoo3 cards. Keep in mind that each card, even if they are made by the same manufacturer and are seemingly identical, can have a completely different level of 2D image quality. Generally, cards that follow reference designs set by the manufacturer have superior 2D image quality to those that follow a proprietary design, as the number of filters placed between the RAMDAC and VGA-out is not controlled/suggested by the chip manufacturer in the case of a non-reference design.



Driver Quality & Stability Comparison

A very important consideration when pursuing one of the hottest 3D accelerators on the market for your Super7 system is the quality and stability of that card's drivers. Until recently, 3dfx had been the only strong supporter of 3DNow! in their drivers; however, now that Matrox, NVIDIA, and S3 have either announced or are boasting both SSE and 3DNow! compliance, the focus shifts to the quality and stability of the drivers that are being provided.

The first issue is whether or not the particular chipset manufacturer has a working OpenGL ICD (Installable Client Driver). The compared manufacturers, 3dfx, 3DLabs, Matrox, NVIDIA, and S3, all have "functional" OpenGL ICDs for their products. While they aren't complete or optimized for performance just yet, this means that all of the cards featured here can run Quake 3 (a question which quite a few people probably wanted answered long ago) ;)

Unfortunately, not all of these manufacturers have quality OpenGL ICDs for their products. In the case of 3dfx, a beta OpenGL ICD was made available to the public just prior to the release of id Software's Quake3 Arena test to allow for compatibility (and to cut down on the amount of hate mail to 3dfx tech support for not supporting Q3A). This beta ICD provides for performance in OpenGL games at anywhere from 50 - 85% of the performance of the 3dfx MiniGL, which is essentially a small driver intended to be used for a few games in particular, not a fully functional OpenGL ICD. For this reason, AnandTech benchmarked with both the current OpenGL ICD and the MiniGL drivers to show the current performance of 3dfx products in both cases. Once the OpenGL ICD is optimized, the performance of 3dfx products should be back up close to the levels of the MiniGL drivers; until then, you'll be playing Quake3 Arena at speeds slower than the TNT2 Ultra (it's a fact, accept it ;) ). Once the OpenGL ICD is complete, things will most likely change, and the Voodoo3 will grow much more competitive in arenas where the MiniGL cannot be used and the OpenGL ICD must stand on its own.

The current Voodoo3 OpenGL ICD is only available under Windows 9x, meaning that professional users by day/gamers by night won't want a Voodoo3 for some heavy 3DStudio Max rendering under Windows NT4. In AnandTech's tests, the beta OpenGL ICD also seemed to have stability problems running at 1600 x 1200.

On the opposite end of things, 3DLabs has the most complete and optimized OpenGL ICD for their Permedia3 out of the manufacturers compared here; unfortunately, poor gaming performance will keep the Permedia3 out of the hands of most Super7 users. NVIDIA probably has the next best OpenGL ICD of the bunch, second only to 3DLabs. Their OpenGL ICD is solid, stable, and has been improving and maturing since the days of the original TNT. The same can't be said about the latest Detonator 1.88 drivers; during AnandTech's tests, the Detonator 1.88 drivers with the TNT2 crashed more than all of the other cards combined. NVIDIA has some serious issues to work out with the 1.88 drivers before the majority of Super7 users can begin using them without getting frustrated. There are a number of unofficial releases available online that claim to fix these problems, so you may want to take a look at those if you're interested, at your own risk of course.

In the middle of it all are Matrox and S3; S3's OpenGL ICD for the Savage4 could use some performance tweaking, as could Matrox's G400 OpenGL ICD. Both OpenGL ICDs seem to only support 16-bit Z-buffering in OpenGL applications/games, which is oftentimes a trick implemented to provide the illusion of a relatively nonexistent performance drop when going from 16-bit to 32-bit color. NVIDIA's OpenGL ICD, on the other hand, does enable 32-bit Z-buffering (24-bit Z + 8-bit stencil) by default, and therefore exhibits a larger performance drop when moving from 16 to 32-bit color.

The difference made by 32-bit Z-buffering is noticeable once you move into the 32-bit color arena, so be sure that when you're comparing performance benchmarks in 32-bit color you know the Z-buffer depths before jumping to any conclusions. The G400 and Savage4 both exhibited problems at 1600 x 1200 x 32-bit color, both seemingly driver related.
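
To see why the Z-buffer depth matters visually and not just on the benchmark charts, here's a small sketch (using the standard perspective depth-buffer approximation and hypothetical scene values) of the worst-case depth step each buffer can resolve; surfaces closer together than that step will flicker and Z-fight:

    /* Approximate smallest resolvable eye-space depth difference at
       distance z for an n-bit Z-buffer (valid when far >> near). */
    #include <stdio.h>
    #include <math.h>

    double depth_step(double z, double znear, int zbits)
    {
        return (z * z) / (znear * pow(2.0, zbits));
    }

    int main(void)
    {
        const double znear = 1.0, z = 100.0;  /* hypothetical scene units */
        printf("16-bit Z step at z=100: %.4f\n", depth_step(z, znear, 16));
        printf("24-bit Z step at z=100: %.4f\n", depth_step(z, znear, 24));
        /* ~0.15 units vs ~0.0006 units: a 24-bit Z-buffer separates
           surfaces that a 16-bit buffer would merge into artifacts. */
        return 0;
    }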



3DNow! Support & the Future

In terms of performance, 3dfx and NVIDIA have the best implementations of 3DNow! support in their drivers. While both Matrox and S3 claim support, there is virtually no performance increase that can be attributed to 3DNow! when testing with the G400/Savage4 on Super7 systems.
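
For those wondering what "3DNow! support in the drivers" actually means: the driver's geometry and transform code is rewritten to use AMD's packed single-precision instructions, which operate on two floats per 64-bit MMX register. Below is a minimal, hypothetical sketch of the idea using GCC's 3DNow! intrinsics, not code from any vendor's driver:

    /* Hypothetical 3DNow! sketch: add two float arrays two elements at
       a time. Compile with gcc -m3dnow; intrinsics from <mm3dnow.h>. */
    #include <mm3dnow.h>

    void add_pairs(const float *a, const float *b, float *out, int n)
    {
        const __m64 *pa = (const __m64 *)a;
        const __m64 *pb = (const __m64 *)b;
        __m64 *po = (__m64 *)out;

        for (int i = 0; i < n / 2; i++)
            po[i] = _m_pfadd(pa[i], pb[i]);  /* PFADD: two adds per op */

        _m_femms();  /* FEMMS: clear MMX state before regular FPU code */
    }

A driver that vectorizes its transform and lighting paths this way is what separates a real 3DNow! implementation from a checkbox on the spec sheet, consistent with the gap measured here between 3dfx/NVIDIA and Matrox/S3.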

The big question is what happens a couple of months down the road when AMD debuts the Athlon processor; driver support for the Athlon seems to be shaping up much better than it did for the K6-2/3. AMD has been working with most manufacturers, including Matrox and NVIDIA, to make sure that their products receive the greatest benefit from the Athlon processor and its enhanced version of 3DNow!

After months of waiting, 3DNow! support is finally being taken more seriously by the graphics card manufacturers, but is what they have to offer too little too late?

The Test

AnandTech's Super7 test bed was configured as follows:

  • AMD K6-2 475 running at 95MHz x 5.0 at 2.4v core on an Epox MVP3G-M (1MB)
  • AMD K6-3 450 running at 100MHz x 4.5 at 2.4v core on an Epox MVP3G-M (1MB)

Both processors were cooled using Cooler Master fans provided by Boston Limited.

  • 128MB Mushkin Samsung Original PC100 SDRAM
  • Western Digital Caviar 5.1GB EIDE HDD
  • 48X Philips EIDE CDROM Drive provided by Boston Limited
  • Windows 98 SE + DirectX 6.1a
  • Quake 2 using demo1.dm2 and 3Finger's crusher.dm2 demos
  • Expendable using -timedemo option

The Cards

The Drivers



OpenGL Performance: Quake 2 demo1.dm2

At the higher resolutions, 3dfx and NVIDIA duke it out for the top positions with the K6-2 475. Clock for clock (core clock, not memory clock), the Voodoo3 beats the TNT2; unfortunately, 3dfx lacks the 32-bit color rendering support that NVIDIA provides with its TNT2, albeit at a 10 - 20% drop in performance. If true color (32-bit) support doesn't matter to you, then the Voodoo3 is just dandy; however, there is one more consideration that you must take into account. Once you take away 3dfx's precious MiniGL and use the beta OpenGL ICD that all Quake3 test players are forced to use, not only does NVIDIA topple 3dfx in performance, but the stability of the solution decreases, as the beta OpenGL ICD is still not a solid solution. It will take some time for 3dfx to perfect their OpenGL ICD to the point where it can compete head to head with NVIDIA's; how long are you willing to wait?

With this current crop of 3D accelerators, 1600 x 1200 is a much more playable resolution than it once was. Unfortunately, demo1.dm2 is only an indication of the best possible frame rate you can hope to achieve with your card, and with the highest score here being 36 fps by an overclocked Voodoo3 3000 with the MiniGL drivers, chances are that your frame rate will often drop below the 15 fps mark during gameplay, especially in deathmatches/online play. Keep a close eye on the Matrox Millennium G400MAX and note its position in the ranking as we decrease the resolution.

Although not a replacement for the now comfortable 1024 x 768 resolution, 1152 x 864 provides an intermediate step between 1024 x 768 and 1600 x 1200 for those users that are looking for a little more out of their screen area. The most noticeable performance drop occurs between 1024 x 768 and 1152 x 864, before performance completely drops off at 1600 x 1200. The Voodoo3 and TNT2 still battle for first place with 3dfx edging out the competition once again, but keep in mind, once you switch from the fast performing MiniGL to the beta OpenGL ICD, 3dfx doesn't look too happy anymore. If you're looking for the best overall performance now, NVIDIA is the way to go, unless you never plan to touch Quake3 test or any other OpenGL game that can't use 3dfx's MiniGL.

These are the scores you probably want to pay the most attention to: 1024 x 768 is one of the most popular resolutions to run your new graphics card at, and the performance winner is 3dfx. The same MiniGL vs OpenGL ICD argument comes up again; Quake3 test will not work with 3dfx's MiniGL, meaning you need to use their beta OpenGL driver, which performs much worse than the MiniGL, dropping the performance of the card to well below the TNT2 Ultra. Eventually 3dfx's OpenGL ICD should perform much closer to the MiniGL; however, the day when 3dfx releases a full OpenGL ICD in final form seems far away.



At the lower resolutions, Matrox almost drops off the face of the earth, barely beating the Savage4 PRO+. The poor performance is most likely related to driver issues with their current OpenGL ICD; however, Matrox isn't known for high quality OpenGL ICD development, so we'll have to wait and see if anything develops there. On the other hand, 800 x 600 is on its way out as a playable resolution, so there's really no reason to invest time in investigating performance at it. 3dfx's weakness continues to be its poor OpenGL ICD, as performance with the MiniGL driver is over twice as fast as with the OpenGL ICD.



The Matrox G400MAX takes advantage of the faster L2 cache with the K6-3 450 and jumps to the top of the 1600 x 1200 performance charts.  Unfortunately, an unstable OpenGL ICD kept it from completing benchmarks in 32-bit color at this resolution.

 

The standings remain the same with the K6-3 450 as they were with the K6-2 475; however, the K6-3 450 is, overall, the faster processor, even for gaming, in spite of the K6-2's higher clock rate.





3dfx's beta OpenGL ICD continuously crashed during the crusher.dm2 demo at 1600 x 1200, explaining the absence of any scores using the OpenGL ICD from the chart below. The TNT2 Ultra takes the gold here with an extremely high score of 27.8 fps. The crusher.dm2 benchmark is generally representative of the worst case scenario as far as performance is concerned, meaning that the frame rate generally won't drop too far below a card's crusher.dm2 score. With the TNT2 achieving a crusher.dm2 score of close to 28 fps, and a demo1.dm2 score of 29 fps, you'll probably get very playable performance with the TNT2 at 1600 x 1200, so long as you don't mind 30 fps performance levels.

3dfx's lack of support for AGP texturing chokes them in this test, as the Voodoo3 3000 is plagued by texture swapping during the texture intensive crusher.dm2 test.

At 1152 x 864, 3dfx does manage to pull ahead; however, switching from the MiniGL to the OpenGL ICD puts the Voodoo3 3000 behind the Matrox Millennium G400MAX, which attempts to remain somewhat competitive.





The same standings are present with the K6-3 450, with the only difference really being that the G400MAX performs noticeably faster due to the faster L2 cache of the K6-3 450. In AnandTech's tests, crusher.dm2 would not successfully complete at 1600 x 1200 x 32-bit color with any of the cards on a regular basis, making it far from a viable option.





For months now, there hasn't been a decent Direct3D game that could measure performance accurately. Shogo is probably one of the worst benchmarks ever used on AnandTech as a Direct3D benchmark; however, it was the best/only option for quite some time. At the same time, Incoming/Forsaken were quickly becoming highly outdated benchmarks. Luckily Rage Software, the makers of Incoming, included a nice benchmark in their latest release, Expendable (which is a pretty cool game, actually). Unfortunately, Expendable is very CPU dependent and isn't the best way of comparing one card to another; nevertheless, it is currently the best thing to benchmark Direct3D with, so we'll have to use it. The game to keep an eye out for is Unreal Tournament; if Epic includes a decent benchmark in UT, you can expect it to become the next Direct3D game used for benchmarking.

To run the Expendable benchmark, simply run the go.exe file with the '-timedemo' switch, i.e. C:\Expendable\go.exe -timedemo. All Expendable benchmarks were performed with bumpmapping turned off and texture detail set to high. The performance illustrated by the Expendable demo did not vary greatly between 16-bit and 32-bit color, making those results redundant, so only 16-bit color tests are shown. At the same time, the performance drop versus resolution increase was negligible, making 1024 x 768 in 16-bit color the only performance represented.

[Chart: Expendable timedemo results, 1024 x 768, 16-bit color]

For Direct3D performance, Expendable provides a decent benchmark when comparing among processors; however, for video card comparisons, Expendable isn't the best benchmark to use. Direct3D performance is a little different than OpenGL performance, since there is no ICD tweaking necessary to achieve good performance; simple driver tweaking and raw power contribute to a card's Direct3D performance. The leaders in this area are the G400MAX and the TNT2, as indicated, albeit poorly, by the Expendable benchmark. Expendable is much like crusher.dm2 in terms of a benchmark: the minimal differences between the four cards compared here translate into noticeable although not prominent performance differences in real world gaming scenarios.



Conclusion

With the conclusion of round one of the comparison, NVIDIA has taken the overall lead due to 3dfx's poor performing OpenGL ICD. Among the important points illustrated by the benchmarks, one of the most important is that the clock speed of your graphics processor increases in importance as the resolution of your games increases. At 640 x 480 the TNT2, TNT2 Ultra, and overclocked TNT2 Ultra performed virtually identically to one another; however, at 1024 x 768 the differences grow to 12% and 5% respectively. If you're debating whether or not a Voodoo3 3000 is worth the added cost over a 2000, or whether the TNT2 Ultra is all that much faster than a regular TNT2 for your Super7 system, ask yourself this question: what resolutions do I plan on running at?

Let's see what happens when the clock speed of the systems drops, in addition to the inclusion of some old school performers on the scene...how valuable are those Voodoo2s now? Till next time...
