Original Link: https://www.anandtech.com/show/429

NVIDIA GeForce 256 DDR

by Anand Lal Shimpi on December 25, 1999 7:23 PM EST


On October 11, 1999, NVIDIA introduced their brand-new graphics chipset to the world, the GeForce 256. Just about a month later, the GeForce started appearing on store shelves and in the hands of online vendors in cards made by, among other companies, Creative Labs and Guillemot. The price was and still is quite steep for a single peripheral, especially considering that we are currently in a period where a relatively powerful gaming system can be had for around $1000 - $1500.

But then again, everything is relative, right? So the actual severity of the price depends on the overall experience, correct? If you pay $300 for a graphics card and barely get an extra 5 fps in your favorite game, then the card wasn't really worth $300 to you in the first place, now was it?

Then again, if you purchase a $300 card and, immediately after firing up your favorite game, it feels like you've never even seen the title before this moment, the card quickly begins to earn its value.

For many users, the feelings they were left with after seeing what NVIDIA's GeForce could do were sort of in limbo between the "eh, whatever" and "I have seen the light" reactions. Even in our own tests, the conclusion after reviewing the GeForce was that it's the fastest chip in its class, but nothing more. While the GeForce boasts a fairly powerful hardware transform & lighting engine, which NVIDIA cutely dubbed their GPU, the overall experience with the GeForce ended up being nothing more than faster gameplay at 1024 x 768, while still not capable of driving higher resolutions at 32-bit color.

For the minimum $249 cost of a card based on the GeForce, many TNT2 Ultra owners found themselves truly not interested in throwing away around $300 on a card that wouldn't improve the gaming experience significantly. While it is true that games with a high polygon count would take advantage of the GeForce's hardware T&L and could potentially perform much better on a GeForce than on any other card, it is also true that we have yet to see a game of that nature, and when one does become available (it's inevitable), the GeForce will also carry a noticeably lower price tag (this is also inevitable). So for the NVIDIA loyalists that spent the $300 on a TNT2 Ultra the minute it became available, was the GeForce going to be another $300 "well spent?" Not this time around, not unless NVIDIA could offer a little more for the money.

It's alright to produce a product with flaws; just don't plan on it selling as well as one that is closer to achieving that flawless state that engineers strive for. NVIDIA's GeForce had an inherent flaw: its relatively low memory clock (in comparison to a TNT2 Ultra) and single 128-bit memory bus left it with a noticeable bottleneck from the start -- memory bandwidth.



Why is the memory bandwidth of the GeForce a limitation? When its memory is running at its default clock of 166MHz, the GeForce has an available 2.7GB/s of memory bandwidth. While this is a large amount of available bandwidth, as the resolution and color depth of any particular application or game increase, so does the amount of memory bandwidth required to render that particular scene. As you begin to saturate the GeForce's memory bus, you will notice a large drop in frame rate. The best way to illustrate this is by comparing the performance of a GeForce running at default clock in Quake III Arena at 16-bit and 32-bit color depths at 1024 x 768.

As you can see by the above graph, the performance difference is definitely noticeable. Now compare this to the performance drop at 640 x 480 when switching from 16-bit to 32-bit color modes under Q3A, and we see much less of a drop, since at 640 x 480 the memory bandwidth requirements are significantly lower than at 1024 x 768 (the higher the resolution, the more memory bandwidth is needed).
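
To put some rough numbers behind this, here is a minimal back-of-the-envelope sketch (our own simplification, not a figure from NVIDIA; it counts only color and Z writes per frame, while real scenes also burn bandwidth on texture reads, overdraw and blending):

    # Back-of-the-envelope memory math for the GeForce 256 (SDR).
    # Simplified model: one color write and one Z write per pixel per frame;
    # texture reads, overdraw and blending are ignored, so real demand is
    # considerably higher -- but the scaling with resolution is the point.

    BUS_WIDTH_BYTES = 128 // 8  # 128-bit memory bus = 16 bytes per transfer

    def peak_bandwidth_gbs(mem_clock_mhz, transfers_per_clock=1):
        """Peak memory bandwidth in GB/s."""
        return mem_clock_mhz * 1e6 * transfers_per_clock * BUS_WIDTH_BYTES / 1e9

    def frame_traffic_gbs(width, height, color_bytes, fps, z_bytes=2):
        """Approximate framebuffer traffic in GB/s at a given resolution and fps."""
        return width * height * (color_bytes + z_bytes) * fps / 1e9

    print(peak_bandwidth_gbs(166))              # ~2.7 GB/s: stock SDR GeForce
    print(frame_traffic_gbs(640, 480, 4, 60))   # ~0.11 GB/s at 640 x 480 x 32
    print(frame_traffic_gbs(1024, 768, 2, 60))  # ~0.19 GB/s at 1024 x 768 x 16
    print(frame_traffic_gbs(1024, 768, 4, 60))  # ~0.28 GB/s at 1024 x 768 x 32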

There are a number of solutions to this problem, but only a few are practical. You could always increase the memory bus width to 256 bits, but that isn't very realistic because it would require a significant change in the architecture of the chip. The tweaker's method of getting around this problem is simply purchasing a GeForce with faster memory and overclocking the memory bus to higher levels, such as 183MHz or 200MHz. If you are lucky enough to have a GeForce board with 5ns SDRAM, then increasing your memory frequency to 200MHz will increase the available memory bandwidth to 3.2GB/s, an increase of roughly 20% over a regularly clocked GeForce.

The more elegant solution is to use a memory technology that inherently offers greater bandwidth than conventional SG/DRAM. One technology in particular, Double Data Rate SG/DRAM, is capable of effectively doubling the memory bandwidth without significantly altering the design of the card itself. By transferring data on both the rising and falling edges of the clock (normally transfers only occur on the rising edge), DDR SDRAM (and DDR SGRAM) operating at a frequency of 150MHz, for example, can effectively transfer data as if the clock frequency were twice that, or 300MHz.

So, with a simple switch of memory types, the GeForce's memory bandwidth rockets from 2.7GB/s to an unparalleled 4.8GB/s*, which gives the GeForce a healthy boost in high resolution scenarios. Without further ado, let's get to the benchmarks of a GeForce 256 equipped with DDR SDRAM.

*150MHz x 2 x 16-byte memory bus width = 4.8GB/s
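
Extending the footnote's math to the three memory configurations discussed above (a quick sketch; the 200MHz case assumes the 5ns SDRAM overclock mentioned earlier):

    # Peak bandwidth = memory clock x transfers per clock x bus width (16 bytes).
    def peak_bandwidth_gbs(mem_clock_mhz, transfers_per_clock=1):
        return mem_clock_mhz * 1e6 * transfers_per_clock * 16 / 1e9

    print(peak_bandwidth_gbs(166))     # ~2.7 GB/s: stock SDR GeForce
    print(peak_bandwidth_gbs(200))     #  3.2 GB/s: overclocked 5ns SDR board
    print(peak_bandwidth_gbs(150, 2))  #  4.8 GB/s: DDR, both clock edges used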



The Specs

Feature: Single-Chip GPU (Graphics Processing Unit)
Benefit: On-chip integration of the entire 3D pipeline (transformation, lighting, setup and rendering) offers the lowest possible component and board design cost.

Feature: Integrated Transform and Lighting
Benefit: Delivers 2-4X the triangle rate for 2-4X more detailed 3D scenes. Frees up CPU bandwidth for physics and artificial intelligence (AI), which results in more realistic object behaviors and character animation.

Feature: Independent Pipelined QuadEngine™
Benefit: Separate engines for transformation, lighting, setup and rendering provide a very powerful, highly efficient architecture that delivers 15 million triangles per second. Allows applications to represent 3D characters and environments with the highest degree of complexity possible.

Feature: 256-Bit QuadPipe™ Rendering Engine
Benefit: Four independent pixel-rendering pipelines deliver up to 480 million 8-sample fully filtered pixels per second. Guarantees the highest color quality and texturing special effects at maximum frame rate.

Feature: AGP 4X with Fast Writes
Benefit: Enables the CPU to send data directly to the GPU to maximize overall system performance. Avoids a costly data copy to and from valuable main memory bandwidth that graphics processors without Fast Writes must incur.

Feature: High-Quality HDTV Processor
Benefit: Delivers the highest quality DVD and HDTV playback and digital recording.

Feature: 350MHz RAMDAC
Benefit: Delivers the clearest, sharpest, most solid image quality at 2048 x 1536 resolution at 75Hz.

Feature: High-Speed Memory Interface
Benefit: Designed to support current SDRAM/SGRAM and upcoming DDR high-speed memory.

Feature: 256-Bit 2D Rendering Engine
Benefit: Delivers the industry's fastest 2D performance for ultra-fast screen refresh at high resolutions and 32-bit color depths.

Feature: Complete Support for New Microsoft® DirectX® 7 and OpenGL® Features
Benefit: Ensures that applications can leverage the new features without additional cost or support. Guarantees the best out-of-box end user experience.
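
Two of the headline figures above can be sanity-checked with simple arithmetic (a sketch on our part; the 120MHz core clock comes from the default clocks listed in the test configuration below):

    # Fill rate: four pixel pipelines, each producing one pixel per core clock.
    core_clock_mhz = 120  # GeForce 256 core clock, per the test setup below
    pipelines = 4
    print(pipelines * core_clock_mhz, "Mpixels/s")  # 480 Mpixels/s, as specified

    # RAMDAC: the pixel rate for 2048 x 1536 at 75Hz must fit within 350MHz.
    print(2048 * 1536 * 75 / 1e6, "MHz before blanking overhead")
    # ~236MHz; even with typical blanking overhead this stays under 350MHz.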



The Test

Windows 98 SE Test System

Hardware

CPU(s)

Intel Pentium III 700
Intel Pentium III 600E
Intel Pentium III 450

Intel Celeron 500
Intel Celeron 366
AMD Athlon 700
Motherboard(s)

ABIT BX6 R2
ABIT BX6 R2 + "Sloket" Adapter
Gigabyte GA-7IX
Memory

128MB PC133 Corsair SDRAM

Hard Drive

IBM Deskstar 22GXP 22GB Ultra ATA 66 HDD

CDROM

Philips 48X

Video Card(s)

3dfx Voodoo3 2000 16MB (default clock - 143/143)
3dfx Voodoo3 3000 16MB (default clock - 166/166)
3dfx Voodoo3 3500 16MB (default clock - 183/183)
ATI Rage Fury Pro 32MB (default clock - 125/143)
ATI Rage Fury MAXX 64MB (default clock - 125/143)
Diamond Viper II Z200 Savage 2000 (default clock - 125/155)
Matrox Millennium G400 MAX 32MB (default clock - 166/200)
NVIDIA GeForce 256 32MB SDR (default clock - 120/166)
NVIDIA GeForce 256 32MB DDR (default clock - 120/150 DDR)
NVIDIA RIVA TNT2 32MB (default clock - 125/150)
NVIDIA RIVA TNT2 Ultra 32MB (default clock - 150/183)

Ethernet

Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System

Windows 98 SE

Video Drivers

3dfx Voodoo3 - 1.03.04

ATI Rage Fury Pro - Retail Shipping Drivers (NA for download)

ATI Rage Fury MAXX - Retail Shipping Drivers (NA for download)

Diamond Viper II Z200 Savage 2000 - 1.09 (NA for download)

Matrox Millennium G400 - 5.41.008

NVIDIA GeForce 256 - Detonator 3.65

NVIDIA Riva TNT2 - Detonator 3.65

Benchmarking Applications

Gaming

idSoftware Quake III Arena demo001.dm3
GT Interactive Unreal Tournament 4.00 UTbench.dem



You can expect the GeForce equipped with DDR SDRAM to dominate in all the benchmarks, but the lead it holds over the competition is naturally reduced when running at 640 x 480, since this is primarily a CPU test and the effects of fill rate limitations on the slower cards can't really be seen at this low of a resolution.

One thing that you can take note of is the fact that the performance drop from the switch to 32-bit color on the DDR GeForce is only about 3%, whereas the drop is about 20% on the SDR version of the card. The effects of the greater memory bandwidth provided by the DDR SDRAM can already be seen at this low of a resolution. There are three other cards that exhibit a behavior similar to that of the DDR GeForce when moving to 32-bit color, each for different reasons.

The S3 Savage 2000 deviates from the rules we have been preaching for so long. With an available memory bandwidth virtually equal to that of the SDR GeForce (2.5GB/s vs 2.65GB/s), the Savage 2000 should take a similar drop in performance when switching to 32-bit color rendering. There are two reasons why it doesn't. The first is that the latest drivers (v1.09) deliver a 24-bit Z-buffer to Quake III Arena without providing an 8-bit stencil buffer, unlike the GeForce, which provides the 24-bit Z and 8-bit stencil buffer as requested by the application. This frees up a noticeable amount of memory bandwidth, which lessens the effect of the switch to 32-bit color rendering, a bandwidth hog in itself.

The second factor that helps the Savage 2000 out was pointed out to us by a helpful reader posting on AnandTech's Forums. If you recall, one of the most talked about features of the Savage3D was its support for S3TC, S3's texture compression algorithm. S3TC made its way into the Savage4 and now the Savage 2000, and id has fortunately included a provision for S3TC in Quake III Arena. Instead of using higher resolution textures whenever S3TC is enabled, Quake III Arena simply uses the compressed textures, which are physically smaller in size without sacrificing image quality. A decent amount of memory bandwidth is thus freed, once again making the transition to 32-bit color on the Savage 2000 less painful than it is on the SDR GeForce.
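
To illustrate the scale of those savings, here's a rough sketch (our own numbers, not S3's; it assumes the common S3TC/DXT1 mode, which packs each 4 x 4 texel block into 64 bits):

    # S3TC's common mode (DXT1) stores a 4x4 block of texels in 64 bits,
    # i.e. 0.5 bytes per texel, versus 4 bytes per texel at 32-bit color.
    def texture_kb(width, height, bytes_per_texel):
        return width * height * bytes_per_texel / 1024

    print(texture_kb(256, 256, 4))    # 256KB: uncompressed 32-bit 256x256 texture
    print(texture_kb(256, 256, 0.5))  # 32KB: the same texture under S3TC (8:1)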

Matrox's G400MAX and ATI's Rage Fury MAXX are also performing quite well in 32-bit color, primarily because of their inherently massive available memory bandwidth figures.

These are all factors you'll want to keep in mind as we further investigate performance.



Now, at 1024 x 768, a more realistic resolution for someone with a Pentium III 700 and a $350 video card, we see the DDR GeForce truly pull ahead of the competition in the 16-bit color tests. Beating out the next fastest Rage Fury MAXX by over 24 fps, the DDR GeForce even delivers a playable 64 fps at 1024 x 768 x 32-bit color. If frame rate is indeed king, then NVIDIA takes the crown here while delivering superb image quality.

Most of the previous generation of cards have a very hard time competing at this resolution, with the exception of the G400MAX, which performs at close to a next generation level. The chart is topped by the latest and greatest from NVIDIA, ATI and S3.

1024 x 768 x 32 at 60 fps? It's possible, and the DDR GeForce brings it to you.

Even the DDR GeForce isn't capable of delivering a playable frame rate at 1600 x 1200 x 32, but in 16-bit color the 35.5 fps it does deliver is quite impressive. It's clear that 60 fps at 1600 x 1200 is still some time away, and monitors will have to drop in price before you see an influx of gamers demanding 60 fps at such a high resolution. For now, 60 fps at 1024 x 768 or insanely high frame rates at 640 x 480 is all that is needed to fulfill the needs of most of the gamers out there.



All of the cards lose some of their horsepower as we make the move down to the Pentium III 600, but the standings remain the same. Once again, the Savage 2000 loses very little performance from the move to 32-bit color because of the aforementioned points of conserving memory bandwidth. There also seems to be some driver tweaking in play here as S3/Diamond must have really concentrated on performance in situations of high memory bandwidth usage.

The performance of the Rage Fury MAXX here is definitely being limited by its drivers, but these are the final shipping drivers so don't expect performance to change until the next driver update.

The performance situation at 1024 x 768 and 1600 x 1200 with the Pentium III 600E is virtually identical to that with the Pentium III 700.



As CPU performance drops, so does the gap between the 11 cards compared here, as the limiting factor becomes the CPU rather than the video card.

Although the DDR GeForce is still on top, its 16-bit color performance advantage is nowhere near as large as it was in the past two performance tests with the 600 and 700MHz CPUs. The only reasons the DDR GeForce would make for a good buy in this case are future performance with faster CPUs and 32-bit color performance, since, at 1024 x 768 x 32, it still offers close to that magical 60 fps mark.

The SDR GeForce starts to choke in 32-bit color, especially in comparison to the Savage 2000, which brings forth very promising performance. However, the history of S3's driver support, as well as the previous entries in the Savage line of graphics chipsets, makes a purchase here a very wary one at best. S3's history has come back to haunt them, and it will take a lot of proof before this market accepts them as a viable alternative to the likes of NVIDIA and 3dfx, who have been delivering, more or less, on their promises of performance in recent times.







Since Unreal Tournament is affected by more factors than just video card driver quality, the DDR GeForce and Rage Fury MAXX are virtually tied for the first place spot at 640 x 480. All of the cards perform respectably here; the Savage 2000 is held back by the poor D3D performance of its drivers.

The Voodoo3 cards excel in this test because we were forced to use the Glide setting for the UT tests with the 3dfx cards; the D3D setting would not complete our tests properly. In any case, if you're into playing UT, then the Voodoo3 with its native Glide support will give you excellent gameplay simply because UT was designed with superb Glide support. The D3D and especially the OpenGL performance under UT is not up to par with its Glide performance at all, giving the 3dfx cards the edge in the UT tests.

For those of you wondering about Athlon 700 performance, the standings are identical to that of the Pentium III 700.



Once again there is very little difference between the DDR GeForce and the Rage Fury MAXX, making a recommendation for the more expensive DDR GeForce very difficult unless you play games other than UT (e.g. Quake III Arena, in which case the DDR GeForce would give you the better overall experience).

The Voodoo3 is once again performing quite respectably as a result of its native Glide support.

The Savage 2000 brings up the rear while begging S3 for updated drivers. Not only is its performance under UT below par, but the Savage 2000 also can't take advantage of Epic/GT's second CD of S3TC compressed textures, which would normally be where the S3 cards shine. S3 dropped support for their MeTaL API with the Savage 2000, and unfortunately the only way to take advantage of S3TC in UT is apparently through the MeTaL API, as there are some texture management issues that Epic ran into while attempting to use DXTC, which is a part of DirectX. The only cards with support for the MeTaL API are the Savage3D and Savage4, so to make a long story short, the Savage 2000 is not the card for those that are hooked on Unreal Tournament. (Note from the Gamer within Anand: and if you've ever played Assault on UT then you're probably already hooked, although you can't beat DM on Q3A.)



The 1600 x 1200 scores you see above won't change all that much as we cycle through the various processors. The DDR GeForce finally separates itself from the Rage Fury MAXX but not by a huge amount.

The Savage 2000 finally moves up the ladder as a result of its efficient memory management; with some driver tweaks for D3D performance, it wouldn't be surprising to see the Savage 2000 perform even better. Keep an eye on this card: if S3 can get their act together with their next driver update (due out in January), the Savage 2000 may end up being the "low-cost" (compared to the GeForce and other cards in its class) high performing card that we were promised on paper a few months ago.

The native Glide support can only do so much for the Voodoo3 cards; at 1600 x 1200 the cards begin to show their limits as they fall below the average performing TNT2 Ultra.



With the Pentium III 600E we get a breakdown of performance similar to the one we experienced on the 700, just a bit slower.



The range of frame rates on the Pentium III 450 shrinks to such a level that an upgrade for a Pentium III 450 owner from any one of these cards to the DDR GeForce is not justified at all.



We have a similar situation with the two Celeron CPUs: the benefit of the DDR GeForce is noticeable but not worth the added cost. You'd be better off upgrading your CPU first and then worrying about a video card upgrade if you're serious about improving UT performance.





Conclusion

The DDR GeForce is everything we expected from the original GeForce; unfortunately, its high price tag will keep it out of the hands of many. The performance of the DDR GeForce so completely overshadows that of the SDR version that it makes almost no sense to opt for the SDR version.

With the price difference between the two measuring out to around $100 or less, your best bet is either to opt for the DDR GeForce now or to wait until prices drop and then go after the DDR version. The SDR version does not perform at a high enough level in high resolution situations to justify its already high price tag.

The DDR cards are just now becoming available on the market, and in a few months' time even they will be outperformed by what NVIDIA likes to call their "Spring Refresh," which is basically the next product in their 6-month product cycle calendar. The next NVIDIA product, codenamed NV15, should boast a more powerful T&L engine (the number 15 million triangles/s comes to mind) as well as a higher fill rate.

What will become of DDR memory as a performance booster in video cards then? It would be a smart move by NVIDIA to make their NV15 a DDR product and leave SDR memory a thing of the past; either that, or increase the memory bus width, though the latter seems unlikely. While the prospect of the NV15 being a DDR product may seem equally unlikely, it makes the most sense for NVIDIA's next generation product to offer a significant advantage over their current generation.

The Savage 2000 is also a contender to watch out for. Currently retailing at under $200, the Savage 2000 could become a powerful competitor if S3 gets the drivers tweaked to the point that they offer solid D3D performance as well as compatibility as close to 100% as possible. Currently the compatibility of the Savage 2000 with the latest games is questionable. Visually, it runs Q3A and UT just fine, but there are many more games out there that gamers crave speed in; and until S3 can prove to the world that they have changed, and that their Savage 2000 resembles their previous two cards in name alone, you should approach the Savage 2000 with caution.

Currently, the SDR GeForce is not worth upgrading to; if you're going to spend that much money, you're better off either going with the DDR version or sticking with something like a TNT2 Ultra, which is still a very high performing competitor. The Voodoo3 is still a fill rate monster at lower resolutions and can be considered a viable alternative unless you're big on visual quality, in which case the lack of 32-bit color support may turn you away from the 3dfx line and toward the competing solutions.

ATI's Rage Fury MAXX is also a very promising competitor; if you aren't in the position to pay the extra $70 - $100 for the DDR GeForce, then the Rage Fury MAXX may be exactly what you need. While it isn't as fast as the DDR GeForce under Quake III Arena, its larger available memory bandwidth keeps it above the SDR GeForce in some situations and on par with the DDR in others.
