Original Link: https://www.anandtech.com/show/633



In the past two decades, the way the public accepts the PC's presence has changed dramatically. What was once reserved for a very small percentage of the population has now become increasingly commonplace. Just two years ago there were 43 million adults with Internet access in the United States, and today there are well over 100 million. Back then only 35% of US households had at least one computer in them, and as you can guess, that number has increased dramatically as the computer becomes more of a home appliance and less of a feared bit of technology.

With this welcome adoption of the computer, and in our case the PC, have come a number of applications that the PC hadn't seen prior to this recent boom in public acceptance. While continuing to be a tool for professionals in virtually all conceivable fields, the PC has also taken on an alter ego, that of a home entertainment appliance, much like the TVs, game consoles and VCRs that had previously held that market to themselves.

By entering this market, the PC was destined to take on some of the features that its relatives in the home entertainment industry have boasted for quite some time.  We saw the introduction of TV tuner cards so you could watch TV on your PC, and let’s not forget about the advent of powerful 3D graphics accelerators that would pave the way for a gaming revolution on the PC. 

While all of this was going on, the assimilation of yet another technology into the home entertainment industry was taking place.  Once called the Digital Video Disc or Digital Versatile Disc, the DVD started to become the perfect addition to any home theatre collection.  Destined to replace the VCR as the ideal playback medium because of its high capacity and long lasting nature as a medium, the DVD was also set on a crash course with the PC. 



DVD meets the PC

When DVD drives began appearing on the shelves of popular computer retailers, they were generally bundled with much more than the cable and occasional driver disk they ship with now. Instead, DVD drives generally came as a package that included some software but, more importantly, a hardware decoder card.



The reason for this was simple: CPUs weren't powerful enough back then to handle the very intensive task of DVD playback, yet there was a demand for DVD support on the PC. The solution to this problem was to offload the task of decoding DVD streams from the CPU onto a PCI card, in this case the decoder cards that were bundled in these DVD packages.

This solution obviously had its downsides. For starters, it required that yet another PCI slot be occupied by a card. This was during a period when motherboards with 5 or 6 PCI slots were not common, and in many cases were unheard of. Combine that with up to two PCI slots for your video cards (we all remember the two-card Voodoo2 SLI setups) and maybe a SCSI controller, and the fact that you had to have an additional PCI card just so you could watch DVDs was a bit of a discouragement.

However, the fact that you'd lose a slot wasn't the biggest concern at all. If you remember back to the days of the aforementioned, incredibly popular Voodoo2s, or even back to 3dfx's first victory with the original Voodoo, you will recall that one of the biggest complaints about these 3D-only cards was their pass-through cable. While the Voodoo2 was much better about this than the original Voodoo, the fact of the matter was that in order to use the accelerator you had to plug your monitor into the Voodoo card and then use a pass-through cable to connect to your 2D card. Oftentimes, especially at higher resolutions, this resulted in some pretty nasty 2D output because of the signal degradation caused by the pass-through.

DVD decoder cards were the same way: they required a pass-through cable as well. And if you had a 3D card such as the original Voodoo or the Voodoo2 in conjunction with a DVD decoder card, things got even worse.

Last but not least, DVD decoder cards added what would eventually become an unnecessary cost to the price tag associated with being able to play DVDs on your computer. 

It didn't take much foresight to realize that the life of hardware DVD decoder cards was limited, but without the power of today's CPUs, where else could the DVD decoding process be offloaded to?



S3’s Return – Hardware Motion Compensation

In 1998, AnandTech, as well as the rest of the eager hardware community, saw the return of an industry giant to the limelight. With the 3D(fx) revolution of the preceding couple of years, manufacturers such as S3, which had no real 3D solution and had depended on their line of 2D graphics accelerators to carry them, had quickly lost purpose in the eyes of the rapidly changing add-in graphics card market. However, S3 made more than one attempt at regaining the ground they had lost; one such attempt came in 1998 with the Savage3D.

While the Savage3D ended up being an overall disappointment once it actually hit the streets, it did bring one very interesting feature to the table: Hardware Assisted DVD Playback or, as S3 called it back then and as we call it today, Hardware Motion Compensation.

Part of why MPEG-2 compression works so well in terms of compression efficiency is that it removes a lot of the data that would otherwise be unnoticeable to the viewer. For example, let's take two scenes, one where an individual is standing still in front of a relatively stationary background and one where the individual is shifted to the left without any real changes to the background. In an uncompressed video stream, all of the data from those two scenes would be kept in its entirety. However, any intelligent compression algorithm would recognize that it makes sense to keep one copy of the background data and simply shift the data corresponding to the individual being moved. It's the equivalent of refraining from reinventing the wheel, applied to video compression.

This is what the motion compensation aspect of MPEG-2 compression entails: the translation of that object, or in the case of the aforementioned example, the translation of the individual from one position to the next.
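For those curious what this looks like in practice, here is a rough sketch in Python (purely our own illustration, not any decoder's actual code) of how a block of the current frame can be rebuilt by copying pixels from the previous frame at the offset given by its motion vector. The block size, the function names and the omission of the residual (error) data are all simplifying assumptions.

```python
def reconstruct_block(reference, motion_vector, block_pos, block_size=16):
    """Rebuild one block of the current frame by copying samples from the
    previous (reference) frame, shifted by the block's motion vector.
    `reference` is a 2D list of luma samples; sub-pixel interpolation and
    the residual data a real MPEG-2 decoder adds on top are omitted."""
    dx, dy = motion_vector   # where the matching block sits in the reference frame
    bx, by = block_pos       # top-left corner of the block in the current frame
    block = []
    for y in range(by, by + block_size):
        row = [reference[y + dy][x + dx] for x in range(bx, bx + block_size)]
        block.append(row)
    return block
```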

Normally, without any hardware decoder present in a system, this part of the MPEG-2/DVD decoding process is handled entirely by the CPU. But since it is such a prevalent part of the decoding process, it only makes sense to offload it onto some other part of the system, and this is the opportunity that S3 saw when they introduced the Savage3D.

While it wasn't focused on nearly as much as it should have been at the time, the S3 Savage3D featured one of the first implementations of Hardware Motion Compensation on a PC graphics accelerator. This allowed one step of the decoding process to be offloaded onto the graphics chip, leaving more headroom for the CPUs of the time (Pentium II 400s were the fastest things around back then) to handle the rest of the fairly complex DVD decoding process.

At the release of the Savage3D, no competing solution offered any sort of Hardware Motion Compensation, not even the NVIDIA TNT2 that would quickly overshadow the Savage3D as the card to have.



The next big step – ATI’s Rage 128

Another big contributor to the evolution of DVD playback on the PC was ATI. ATI's Rage Pro chipset, one of the first chipsets ever to be used on an AGP card, also featured a Hardware Motion Compensation engine, even predating the Savage3D's HWMC engine.

However, ATI did not stop there. In the next incarnation of the Rage family, the Rage 128, ATI took two more DVD decoding tasks off the host CPU and placed them under the control of their graphics chip.

The first feature the Rage 128 chip boasted was hardware sub-picture acceleration. During DVD playback, there are times when a sub-picture (a compressed bitmap) must be decompressed and displayed; such situations include displaying subtitles or menu features. The Rage 128 was able to handle this relatively simple decompression task in hardware.

Secondly, the Rage 128 implemented a feature known as inverse Discrete Cosine Transformation (iDCT) in hardware as well.  In the MPEG-2 decoding process, this corresponds to subdividing the translated image we mentioned in our explanation of HWMC and dealing with the removal of data that isn’t noticeable to the viewer.  Having hardware iDCT support gave the Rage 128 an advantage over the competition that ATI would continue to hold even through the release of their Radeon graphics chip. 
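To give an idea of just how much arithmetic is being offloaded, here is the textbook 8 x 8 inverse DCT written out directly as a small Python sketch. Real decoders, and presumably ATI's hardware, use fast factored forms; treat this only as an illustration of the math involved, not as anyone's actual implementation.

```python
import math

def idct_8x8(coeffs):
    """Naive 8x8 inverse DCT: turns a block of frequency coefficients back
    into spatial samples. coeffs[v][u] holds the coefficient for vertical
    frequency v and horizontal frequency u."""
    N = 8
    def c(u):  # DC normalization factor
        return math.sqrt(0.5) if u == 0 else 1.0
    pixels = [[0.0] * N for _ in range(N)]
    for y in range(N):
        for x in range(N):
            s = 0.0
            for v in range(N):
                for u in range(N):
                    s += (c(u) * c(v) * coeffs[v][u]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            pixels[y][x] = s / 4.0   # the 2/N scale factor, with N = 8
    return pixels
```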



NVIDIA gets in on the action

With ATI and S3 both attempting to raise the bar for hardware assisted DVD performance on the PC, it was about time for NVIDIA to jump on the bandwagon as well. Finally, around 7 months after the release of the Rage 128, and over a year after the release of S3's Savage3D, the "king" of the graphics card industry came out with its own hardware motion compensation engine.

NVIDIA's GeForce 256 featured the company's first HWMC engine, and in spite of its late entrance into a market where both ATI and S3 had offered the feature for quite some time, it was actually a very well placed introduction. If you recall, the GeForce 256 was the first card from NVIDIA to boast a 'GPU', or hardware T&L unit, designed to improve performance on slower systems by offloading the transformation and lighting stages of the rendering process from the CPU onto the GPU. Going along with this idea of taking some of the load off the host CPU, the GeForce 256's HWMC support does the exact same thing, only with reference to DVD decoding instead of 3D rendering.

The HWMC engine we saw debut with the GeForce 256 has been with NVIDIA since then, and is also present on the GeForce2 MX as well as the GeForce2 GTS. 

Matrox, where are you?

Amidst all of this criticism towards NVIDIA for being slow to adopt a HWMC engine, Matrox, a strong promoter of DVD playback on the PC, continued to show no support for any sort of motion compensation or iDCT in hardware. Their reasoning was simple: CPUs were getting fast enough that such hardware support wasn't necessary, and thus the Millennium G200, G400 and G450 did not feature any sort of HWMC.

Matrox can claim support for video port overlay and Colorspace Conversion (CSC) in hardware, but so can all other manufacturers, so there's not much point in touting those features.



More than just performance

It doesn't take much to realize that producing a desirable DVD decoding solution isn't solely about producing the highest performing one. While performance is a major factor, it isn't the only one.

Image quality is quite possibly just as important as DVD playback performance, if not more so: what good is the ability to play back a DVD on your PC at full speed if the image quality of the video is horrendous? It is the same argument that was once the topic of heated debates in the 3D accelerator arena between 3dfx and NVIDIA supporters: performance versus image quality, which matters most and why.

And just like in the 3D graphics arena, there is no single way to benchmark image quality when it comes to DVD playback. It is relatively simple to measure performance visually, since you can easily notice any dropped frames or general choppiness during playback, but there is no standard benchmark for image quality. At least there wasn't for quite a while.

MadOnion, the creators of 3DMark, put together one of the first comprehensive video related benchmarks called Video2000.  As we will come to see, the benchmark definitely does have its downsides, but at the same time helps to provide us with a solid metric for measuring DVD image quality as well as performance. 

Video2000 measures three aspects of your graphics card and system’s ability to play and manipulate complex video streams such as those from MPEG-2/DVD sources.  The suite measures the Quality, Features and finally Performance of your setup.  The three areas carry different weights, with quality and performance pulling 40% of the final score each, and the remaining 20% of the total Video2000 score belonging to the features tests. 
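As a quick illustration of that weighting, the composite score works out as shown below; the function and argument names are our own, not MadOnion's.

```python
def video2000_overall(quality_mark, performance_mark, feature_mark):
    """Combine the three Video2000 sub-scores using the suite's stated
    weighting: 40% quality, 40% performance, 20% features."""
    return 0.40 * quality_mark + 0.40 * performance_mark + 0.20 * feature_mark

# A hypothetical card scoring 1000 in every area scores 1000 overall.
print(video2000_overall(1000, 1000, 1000))
```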



The Test

DVD Test System

Hardware

CPU(s): AMD Athlon 1.1GHz
Motherboard(s): ASUS A7V
Memory: 128MB PC133 Corsair SDRAM (Micron -7E Chips)
Hard Drive: Western Digital 153BA Ultra ATA 66 7200 RPM
Video Card(s):
3dfx Voodoo3 3500TV
3dfx Voodoo5 5500
ATI Radeon 64DDR
Matrox Millennium G200
Matrox Millennium G400MAX
NVIDIA GeForce SDR
NVIDIA GeForce2 MX
NVIDIA GeForce2 GTS (32MB)
S3 Savage 2000
Ethernet: Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System: Windows 98 SE w/ WinDVD 2.2
Video Drivers:
3dfx Voodoo3 3500TV - 1.04.02
3dfx Voodoo5 5500 - 1.01.03
Matrox G400 - 6.01.015
Matrox G200 - 5.41.008
All NVIDIA cards - Detonator 6.18
S3 Savage 2000 - 95103

Benchmarking Applications

Video: MadOnion Video2000


Quality Comparison

The quality of the various cards was compared across four categories: upscaling/downscaling, colorspace conversion, de-interlacing and tearing.

Upscaling & Downscaling

One of the most important factors in video playback quality is the filtering algorithm the graphics chipset employs when displaying video at a non-native size.

For example, DVDs are intended to be played back at a standard 720 x 480 resolution, but no one runs their PC desktop at 720 x 480. Instead you have 640 x 480, 800 x 600, 1024 x 768 and larger desktops.

Playing back a DVD on a 640 x 480 desktop, at full screen, would require the video to be scaled down to fit the desktop, and depending on the graphics chipset, some video data may be lost during this downscaling process. 

Video2000 measures the downscaling accuracy of the graphics chipset by forcing the graphics card to display a test image containing a series of lines, then downscaling that image and allowing the test operator to select which lines have disappeared from the original image. The more lines that remain visible, the more accurate the graphics chipset is at downscaling, and the less video data will be lost when you shrink the DVD playback window to something smaller than its native resolution.
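To illustrate why thin lines can vanish, here is a small sketch of our own (not Video2000's actual method): a chip that downscales by simply dropping rows loses any one-pixel line that falls on a dropped row, while one that averages neighbouring rows merely dims it.

```python
def downscale_point_sample(image, step):
    """Downscale vertically by keeping only every `step`-th row (step=2 is a
    50% downscale). One-pixel lines on skipped rows are lost completely."""
    return image[::step]

def downscale_box_filter(image, step):
    """Downscale vertically by averaging each group of `step` rows instead;
    a thin line is dimmed but still contributes to the output."""
    out = []
    for i in range(0, len(image) - step + 1, step):
        rows = image[i:i + step]
        out.append([sum(col) / step for col in zip(*rows)])
    return out

# A white one-pixel line on the second row of an otherwise black image:
img = [[0, 0, 0], [255, 255, 255], [0, 0, 0], [0, 0, 0]]
print(downscale_point_sample(img, 2))  # [[0, 0, 0], [0, 0, 0]] -- line gone
print(downscale_box_filter(img, 2))    # [[127.5, 127.5, 127.5], [0.0, 0.0, 0.0]] -- line dimmed
```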





With the advent of DTV and video resolutions of up to 1920 x 1080, the ability of a graphics card to accurately downscale a video stream from that extremely high resolution to something that can fit on most desktops will become very important.

The opposite of downscaling then, is upscaling, where the video being played back is displayed at a resolution greater than its native resolution.  Testing upscaling quality is a bit more complicated than testing downscaling accuracy, but the principles remain the same. 

Two tests comprise the upscale quality test: the Jagged Edges test, which accounts for 2/3 of the upscale quality score, and the Moiré test, which makes up the remaining 1/3. The Jagged Edges test simply compares an upscaled test image to two reference images, one featuring jagged edges and the other illustrating a much smoother curve.



If a graphics chipset implements a high quality upscaling algorithm the Jagged Edges test should be simple to pass, and the image will look more like the smooth reference image on the right.  If, however, a graphics chipset simply replicates the pixels at the higher resolution, the upscaled image will end up looking like the picture on the left and the solution will fail the test.
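To make the difference concrete, here is a small sketch of our own showing the two approaches on a single row of pixels: plain pixel replication, which produces the blocky, stair-stepped result, versus simple linear interpolation, which produces the smoother one.

```python
def upscale_replicate(row, scale):
    """Pixel replication: each source pixel is simply repeated `scale`
    times, preserving hard steps and producing jagged edges."""
    return [p for p in row for _ in range(scale)]

def upscale_linear(row, scale):
    """Linear interpolation between neighbouring source pixels, which
    smooths the transitions instead of stair-stepping them."""
    out = []
    for i in range(len(row) - 1):
        a, b = row[i], row[i + 1]
        for s in range(scale):
            t = s / scale
            out.append(a * (1 - t) + b * t)
    out.append(row[-1])
    return out

print(upscale_replicate([0, 255], 4))  # [0, 0, 0, 0, 255, 255, 255, 255]
print(upscale_linear([0, 255], 4))     # [0.0, 63.75, 127.5, 191.25, 255]
```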

The Moiré test is much more difficult to pass, and it carries half as much weight as the Jagged Edges test in determining the upscaling quality score of the solutions being tested. The goal of this test is to stress the importance of the higher quality filters defined by the PC99 specification and proposed for the PC2001 spec; if such filters are implemented, there will be no evidence of a moiré pattern in the test image.



Upscaling & Downscaling – Results

The upscaling and downscaling tests together make up over half of the total quality score, so a card’s performance in these two tests has a dramatic impact on the overall quality score of the solution.

Blitter Upscale Quality Tests: Jagged Edges

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 600% | Fail | Fail | Pass | Fail | Pass | Pass | Pass | Fail |
| Blitter 400% | Fail | Fail | Pass | Fail | Pass | Pass | Pass | Fail |
| Blitter 356% | Fail | Fail | Pass | Fail | Pass | Pass | Pass | Fail |
| Blitter 286% | Fail | Fail | Pass | Fail | Pass | Pass | Pass | Fail |

ATI, NVIDIA and Matrox (with the G400 only) show that they use a higher quality upscale filter, as the upscaled test image was not nearly as jagged in any of those three cases when compared to the test image on all of the cards that failed the test.

The 3dfx cards, the Matrox G200 and the Savage 2000 all seemed to do nothing more than reproduce the low resolution pixels at the higher, nonnative upscaled resolution in the test.

Blitter Upscale Quality Tests: Moiré

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 600% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 600% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 400% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 400% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 356% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 356% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 286% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |
| Blitter 286% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Pass |

Only the Savage 2000 with its 16-tap x 16-tap filter was able to pass this very difficult test. The rest of the cards produced an undesirable moiré pattern on the test image.

According to MadOnion, only the highest order filters with between 8 and 16 taps will be able to pass this test.

Even the ATI Radeon only features a 4-tap x 3-tap filter which prevents it from passing any of the 8 tests in this category. Luckily for the rest of the manufacturers, this test only accounts for 3.3% of the total quality score, so while S3 may have passed with flying colors, it doesn't buy the Savage 2000 much of a lead.
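For reference, a "tap" is simply the number of neighbouring source pixels blended into each output pixel. The sketch below shows a generic one-dimensional N-tap resampler using a triangle weighting that we picked for illustration; it is not the filter any of these chips actually implements, but it shows why more taps mean more multiply-accumulate work per pixel and a smoother, less moiré-prone result.

```python
def resample_1d(row, out_len, taps=8):
    """Resample `row` to `out_len` samples. Each output pixel blends the
    nearest `taps` source pixels using triangle (tent) weights."""
    ratio = len(row) / out_len
    half = taps / 2
    out = []
    for j in range(out_len):
        center = (j + 0.5) * ratio - 0.5   # position of this output pixel in source space
        acc = wsum = 0.0
        for k in range(int(center - half), int(center + half) + 1):
            if 0 <= k < len(row):
                w = max(0.0, 1.0 - abs(k - center) / half)
                acc += row[k] * w
                wsum += w
        out.append(acc / wsum if wsum else 0.0)
    return out
```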

Overlay Upscale Quality Tests: Jagged Edges

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 600% | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |
| Blitter 400% | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |
| Blitter 356% | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |
| Blitter 286% | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |

All of the solutions emerged victorious here.

Overlay Upscale Quality Tests: Moiré

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 600% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 600% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 400% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 400% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 356% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 356% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 286% Horizontal | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |
| Blitter 286% Vertical | Fail | Fail | Fail | Fail | Fail | Fail | Fail | Fail |

Not even the Savage 2000 is able to pass a single part of this last upscale quality test, illustrating that there is still room to improve, even among the latest graphics accelerators.



Blitter Downscale Quality Tests (# of Visible Lines)

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 60% Vertical | 10/10 | 10/10 | 10/10 | 10/10 | 10/10 | 10/10 | 10/10 | 0/10 |
| Blitter 50% Vertical | 0/10 | 0/10 | 10/10 | 0/10 | 10/10 | 10/10 | 10/10 | 0/10 |
| Blitter 35% Vertical | 5/10 | 5/10 | 10/10 | 1/10 | 9/10 | 9/10 | 9/10 | 10/10 |
| Blitter 33% Vertical | 3/10 | 3/10 | 10/10 | 4/10 | 8/10 | 8/10 | 9/10 | 10/10 |
| Blitter 25% Vertical | 5/10 | 5/10 | 10/10 | 0/10 | 5/10 | 8/10 | 8/10 | 10/10 |
| Blitter 20% Vertical | 0/10 | 0/10 | 0/10 | 10/10 | 10/10 | 7/10 | 7/10 | 0/10 |
| Blitter 60% Horizontal | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 |
| Blitter 50% Horizontal | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 0/15 |
| Blitter 35% Horizontal | 0/15 | 0/15 | 15/15 | 7/15 | 7/15 | 15/15 | 15/15 | 15/15 |
| Blitter 33% Horizontal | 0/15 | 0/15 | 15/15 | 7/15 | 7/15 | 14/15 | 14/15 | 15/15 |
| Blitter 25% Horizontal | 0/15 | 0/15 | 15/15 | 7/15 | 7/15 | 14/15 | 14/15 | 15/15 |
| Blitter 20% Horizontal | 0/15 | 0/15 | 15/15 | 15/15 | 15/15 | 11/15 | 12/15 | 15/15 |

As we mentioned before, downscale quality will only increase in importance as time goes on, especially once video sources start readily being offered in resolutions greater than 720 x 480.

The ATI Radeon has what it takes to meet the demands of the future, as it perfectly displayed all lines in every situation except for the 20% vertical downscale test. The NVIDIA cards came close, and the G400 can be considered a distant third; other than that, there were no runners-up in this test.

The Voodoo5 was quite disappointing as it, being a recently released card, was still unable to reproduce any lines in a number of the tests.

Overlay Downscale Quality Tests (# of Visible Lines)

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Blitter 60% Vertical | 6/10 | 0/10 | 10/10 | 10/10 | 10/10 | 10/10 | 10/10 | 0/10 |
| Blitter 50% Vertical | 5/10 | 0/10 | 10/10 | 0/10 | 0/10 | 8/10 | 9/10 | 10/10 |
| Blitter 35% Vertical | 3/10 | 6/10 | 10/10 | 4/10 | 4/10 | 8/10 | 9/10 | 10/10 |
| Blitter 33% Vertical | 3/10 | 6/10 | 8/10 | 3/10 | 3/10 | 8/10 | 8/10 | 10/10 |
| Blitter 25% Vertical | 2/10 | 6/10 | 5/10 | 2/10 | 2/10 | 7/10 | 7/10 | 10/10 |
| Blitter 20% Vertical | 2/10 | 2/10 | 10/10 | 3/10 | 3/10 | 5/10 | 8/10 | 0/10 |
| Blitter 60% Horizontal | 9/15 | 0/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 | 15/15 |
| Blitter 50% Horizontal | 7/15 | 0/15 | 15/15 | 14/15 | 14/15 | 15/15 | 15/15 | 15/15 |
| Blitter 35% Horizontal | 5/15 | 11/15 | 15/15 | 10/15 | 10/15 | 15/15 | 15/15 | 15/15 |
| Blitter 33% Horizontal | 4/15 | 9/15 | 15/15 | 11/15 | 11/15 | 15/15 | 15/15 | 15/15 |
| Blitter 25% Horizontal | 3/15 | 7/15 | 15/15 | 7/15 | 7/15 | 15/15 | 14/15 | 15/15 |
| Blitter 20% Horizontal | 2/15 | 7/15 | 15/15 | 4/15 | 4/15 | 15/15 | 14/15 | 15/15 |

ATI once again took the lead in terms of quality as it was able to perfectly reproduce all of the original lines in all but two downscale quality tests.

The NVIDIA cards are somewhat close to the Radeon however other than that there are, once again, no real runners up.



Colorspace Conversion

The Colorspace Conversion tests should be passed by any current generation graphics chipset, which is why they only account for 4% of the overall quality score. The test simply makes sure that the graphics adapter can perform a colorspace conversion on the fly (YUV to RGB). Colorspace conversion is necessary when the video data is stored in YUV form and needs to be converted into RGB data for display output.
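The conversion itself is just a small set of fixed multiply-adds per pixel. Here is a sketch using common BT.601-style coefficients; the exact constants and value ranges a given chip or decoder uses may differ.

```python
def yuv_to_rgb(y, u, v):
    """Convert one YCbCr (YUV) sample to RGB using BT.601-style
    coefficients, with U and V centred around 128."""
    d = u - 128
    e = v - 128
    r = y + 1.402 * e
    g = y - 0.344136 * d - 0.714136 * e
    b = y + 1.772 * d
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)

# A mid-grey sample with neutral chroma stays grey:
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
```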



CSC – Results

Overlay Color Space Conversion Quality (Color Banding)

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Overlay Red | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |
| Overlay Blue | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |
| Overlay Yellow | Pass | Pass | Pass | Pass | Pass | Pass | Pass | Pass |

All decent cards should pass these tests, and as you can tell from the chart above, all of our contenders did. The only thing we'd like to note here is that the 3dfx cards did seem to produce a much more washed out rendition of the test image than the rest of the cards; however, they did pass the test.



De-Interlacing

Before we get into this quality test you’ll have to understand the difference between interlaced scan and progressive scan video.  In order to do this, let’s take a look at how a regular (non-digital) television works.

A TV set has an electrically charged device inside it called the picture tube that shoots streams of electrons against the chemically coated backside of the TV screen. When the electrons hit individual portions of the screen, the chemical coating (usually a type of phosphor) reacts and produces an individual point of light. This tube (more technically referred to as a Cathode Ray Tube, or CRT for short) shoots its streams of electrons in horizontal lines, but it performs two sweeps of the entire screen surface in order to create the 525 scan lines NTSC TVs feature. On the first pass every other horizontal line is scanned (the first field), and on the second pass the remaining lines are scanned (the second field). Thanks to the physical properties of the phosphor coating combined with the way the human eye works, these two fields appear blended together into one frame of video.

This type of video is known as interlaced scan, since the two fields are interlaced together.  The main benefit of this is you get a high refresh rate (60Hz for NTSC) while only having to display half the amount of data. 

Progressive scan is the method by which your computer screen, as well as DTV, displays video: the full resolution (all of the lines) is displayed at once in each frame. This is a much sharper way of displaying video.

Most DVDs that you purchase are in the 480i format, which is interlaced scan video (480 scan lines); however, the film source they were taken from was generally in progressive scan format. The process of de-interlacing converts this interlaced scan video into progressive scan video using one of two basic methods, commonly known as bob and weave.

Bob de-interlacing is used when the video was originally in an interlaced form, and in the simplest of terms, it basically doubles the number of lines in each field. This can unfortunately result in quite a bit of blur because of the line doubling (you don't really have any new data to use, you're simply doubling what's there), much like the blurred effect that results from Full Scene Anti-Aliasing (FSAA).

Weave de-interlacing is used when the source video was originally in progressive scan form, in which case this method of conversion simply weaves the two fields together into one. 

With the introduction of ATI’s Radeon we saw a third type of de-interlacing come into focus: adaptive de-interlacing.  Adaptive de-interlacing dynamically switches between bob and weave de-interlacing methods on a per-pixel basis by examining multiple fields and applying the necessary de-interlacing algorithm to certain sections of the video.
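Here is a minimal sketch of the three approaches, written in terms of two fields (`top` holding one set of scan lines, `bottom` the other); the adaptive version is simplified to a per-line decision and is only meant to show the principle, not ATI's actual per-pixel algorithm.

```python
def weave(top, bottom):
    """Weave: interleave the two fields back into one full frame. Ideal for
    film-sourced material, but moving objects show 'feathering' because the
    two fields were captured a fraction of a second apart."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

def bob(field):
    """Bob: line-double a single field. No feathering on motion, but the
    halved vertical resolution shows up as blur."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

def adaptive(top, bottom, prev_top, threshold=10):
    """Motion-adaptive sketch: weave where the image is static, fall back
    to bob where consecutive same-parity fields differ too much (motion)."""
    frame = []
    for t, b, pt in zip(top, bottom, prev_top):
        motion = sum(abs(a - c) for a, c in zip(t, pt)) / len(t)
        frame.extend([t, t] if motion > threshold else [t, b])
    return frame
```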





De-Interlacing – Results

De-Interlacing plays a fairly important role in the overall quality score: it accounts for 25% of the quality rating.

De-Interlacing Quality

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Line Flicker | Pass | Pass | Pass | Pass | Pass | Pass | Crash | Fail |
| Feathering | Fail | Fail | Fail | Pass | Pass | Pass | Fail | Pass |

From what we've seen it doesn't appear that Video2000 takes full advantage of ATI's adaptive de-interlacing algorithm thus causing the Radeon to fail the Feathering test. This unfortunately hurts the Radeon quite a bit since that test accounts for 5% of the total quality score.

The TNT2 wouldn't complete the De-Interlacing quality tests without crashing, which unfortunately prevents it from obtaining a final overall quality score.

With the exception of the TNT2, the Savage 2000 was the only card to fail the Line Flicker test. This test simply checks the functionality of the bob de-interlacing method, and it's very odd that any card would fail it, especially the Savage 2000, which has otherwise proved to be a decent solution for video playback.

Video2000 has two other De-Interlacing tests that are run if the card can pass both the Line Flicker and Feathering tests, however none of the three cards that met that requirement went on to pass any of the remaining tests.

Tearing

This test is just what you think it is: it checks for the same type of artifacts that you may notice when you leave V-Sync off in certain games. This test accounts for 15% of the total quality score.

Page Flipping

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Page Flipping | Pass | Pass | Pass | Pass | Pass | Pass | Fail | Pass |

Only the TNT2 fails the tearing test; everything else passes without any problems.



Overall Quality

Now that we've analyzed all of the parts individually, it's time to look at the final quality score taken from Video2000:

The GeForce2 GTS comes out on top, even higher than the ATI Radeon which was a bit of a surprise to us at first. But if you look at the difference in overall quality marks between the GeForce2 and the Radeon you'll notice that it translates to approximately a 5% difference. Remember the De-Interlacing test that the Radeon failed? That kicked it down the 5% necessary to give NVIDIA the lead.

We mentioned that the TNT2 couldn't complete any of the De-Interlacing tests which prevented the final production of a quality mark score.

In this comparison, the Savage 2000 would be the cutoff point for ideal quality, so anything over 1000 in this test seems to be ideal for now. It's disappointing to see the Voodoo3 and more importantly the Voodoo5 down at the bottom at levels lower than the two year old G200. In 3dfx's defense, the Voodoo series was never positioned as a home entertainment card, but then again neither was the GeForce2.



Features Comparison

This part of the Video2000 benchmark is a simple set of tests that detect the presence of certain DirectX features.  This accounts for 20% of the total Video2000 score. 

Overlay Features

| Feature | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Scaling: Min Stretching | 10.00% | 10.00% | 3.10% | 3.20% | 3.20% | 0.10% | 0.10% | 6.40% |
| Scaling: Max Stretching | 1000% | 1000% | 51200% | 1638400% | 1638400% | 2000% | 2000% | 1600% |
| Shrinking in X | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| Shrinking in Y | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| Stretching in X | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| Stretching in Y | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| BOB De-Interlacing | Supported | Supported | Supported | Not Supported | Not Supported | Supported | Supported | Supported |
| Interleaved BOB De-Interlacing | Supported | Supported | Supported | Not Supported | Not Supported | Supported | Supported | Supported |
| Non-Interleaved BOB De-Interlacing | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| RGB Destination Color Keying | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| YUV Destination Color Keying | Not Supported | Not Supported | Supported | Not Supported | Not Supported | Not Supported | Not Supported | Supported |
| RGB Source Color Keying | Not Supported | Not Supported | Supported | Not Supported | Not Supported | Not Supported | Not Supported | Supported |
| YUV Source Color Keying | Not Supported | Not Supported | Supported | Not Supported | Not Supported | Not Supported | Not Supported | Supported |
| Multiple Color Keys | Not Supported | Not Supported | Supported | Not Supported | Not Supported | Not Supported | Not Supported | Supported |
| Auto-Flipping | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| Color Control | Not Supported | Not Supported | Supported | Not Supported | Not Supported | Supported | Supported | Supported |
| FourCC | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |
| Clipping | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported |

The important things to note here are the lack of BOB De-Interlacing support on the Matrox cards, as well as the superior feature sets of the Radeon and the now defunct Savage 2000.

All of the cards supported the Blitter scaling features of Video2000, so there's no reason to crowd the page with another table for that.



Video Port Features

| Feature | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| Scaling Taps | 0 | 0 | 12 | 0 | 0 | 3 | 6 | 32 |
| Pre-Shrinking in X | Supported | Supported | Supported | Not Supported | Not Supported | Not Supported | Not Supported | Supported |
| Pre-Shrinking in Y | Supported | Supported | Supported | Not Supported | Not Supported | Supported | Supported | Supported |
| Pre-Stretching in X | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported |
| Pre-Stretching in Y | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported | Not Supported |
| Max Width | 4096 | 4096 | 4096 | 2048 | 2048 | 4096 | 4096 | 2048 |
| Max Height | 2048 | 2048 | 4096 | 1024 | 1023 | 4096 | 640 | 2048 |
| System Memory Surfaces | Supported | Supported | Supported | Not Supported | Not Supported | Supported | Supported | Supported |
| Cropping | Supported | Supported | Supported | Supported | Supported | Supported | Supported | Supported |

Now we begin to see why the Savage 2000 was the only card that could pass the Blitter Moiré quality test courtesy of its 32 scaling taps (16 x 16).

Out of all of the cards, the Matrox Gx00 cards seem to be the least feature-filled, which is disappointing since the DVDMax feature of the G400 is perfect for DVD playback on a TV from your computer.

The ATI and S3 solutions pull ahead in terms of features as we've noticed throughout this article, and it's represented in the feature marks score.

The Matrox cards pull up the rear yet again, this time being outpaced by the 3dfx cards and once again by the NVIDIA cards.



Performance Comparison

The overall performance score is split into five major sections: blitter, data transfer, MPEG-2 encoding, and default and reference decoder performance.

The Blitter and Data transfer tests are mainly dependent on local video memory performance and AGP performance. The MPEG-2 encoding test is almost exclusively dependent on the CPU performance of the system and was thus not included, as it didn't really serve any purpose in these results. The default decoder test measures CPU utilization using the currently installed software decoder while playing back 3Mbit, 6Mbit and 9Mbit MPEG-2 streams.
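If you want to informally reproduce the CPU utilization side of this yourself (we used Wintop under Windows 98), the idea is simply to sample overall CPU load while the player is running. The sketch below does that with the third-party psutil package; the sampling interval and duration are arbitrary choices of ours.

```python
import psutil  # third-party package: pip install psutil

def average_cpu_during_playback(seconds=30, interval=1.0):
    """Sample overall CPU utilization once per `interval` for `seconds`
    while a DVD plays in another window, then return the average."""
    samples = []
    for _ in range(int(seconds / interval)):
        # cpu_percent blocks for `interval` seconds and returns the
        # utilization measured over that window
        samples.append(psutil.cpu_percent(interval=interval))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average CPU utilization: {average_cpu_during_playback():.1f}%")
```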

Blitter Performance (MB/s)

| Test | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce2 | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| 100% Large Blitting | 1079 | 836 | 1747 | 326 | 877 | 1741 | 1102 | 631 |
| 60% Downscaling | 411 | 341 | 359 | 211 | 323 | 399 | 198 | 167 |
| 50% Downscaling | 338 | 300 | 391 | 196 | 241 | 849 | 334 | 117 |
| 35% Downscaling | 242 | 213 | 40 | 171 | 174 | 309 | 117 | 30 |
| 33% Downscaling | 229 | 206 | 35 | 162 | 165 | 277 | 117 | 26 |
| 25% Downscaling | 173 | 156 | 21 | 129 | 127 | 166 | 70 | 15 |
| 20% Downscaling | 138 | 125 | 13 | 106 | 114 | 96 | 39 | 9 |
| 100% Small Blitting | 714 | 718 | 359 | 279 | 444 | 602 | 596 | 558 |
| 286% Upscaling | 694 | 625 | 1182 | 304 | 735 | 1695 | 851 | 221 |
| 356% Upscaling | 693 | 626 | 1222 | 311 | 771 | 1934 | 933 | 281 |
| 400% Upscaling | 696 | 626 | 1247 | 313 | 799 | 2252 | 978 | 249 |
| 600% Upscaling | 696 | 626 | 1317 | 323 | 828 | 2607 | 1036 | 523 |

For the most part the GeForce2 remains in the lead, and in a few cases the Radeon comes pretty close as well. The biggest disappointment here is the Savage 2000 but then again we've never expected the Savage 2000 to be the best performer in any situation. The two 3dfx cards perform very well in these tests, in some situations even besting the GeForce2.

The Data transfer tests were pretty much dependent on a card and driver's ability to transfer data at a high rate; the Radeon ended up coming out in front in Non-Local to Local memory copies, while the Matrox solutions came out ahead in the Local to Non-Local and Video to System memory copies.



Decoder Performance (CPU Utilization)

| Video Stream | 3dfx Voodoo3 | 3dfx Voodoo5 | ATI Radeon | Matrox G200 | Matrox G400 | NVIDIA GeForce2 | NVIDIA TNT2 | S3 Savage 2000 |
|---|---|---|---|---|---|---|---|---|
| 3Mbit/s | 35% | 35% | 10% | 34% | 34% | 31% | 31% | 18% |
| 6Mbit/s | 38% | 38% | 13% | 37% | 37% | 32% | 32% | 28% |
| 9Mbit/s | 42% | 42% | 17% | 41% | 41% | 38% | 38% | 24% |

Here's where things begin to get interesting. The WinDVD 2.2 player we used as the Default Decoder in this test took advantage of ATI's hardware iDCT and HWMC, and thus gave ATI the lowest CPU utilization numbers.

The Savage 2000 wasn't too far behind; however, what was odd about the Savage 2000 was that the test system's CPU utilization was higher when decoding a 6Mbit stream than when decoding a 9Mbit stream, and this phenomenon was reproducible.

The Matrox and 3dfx cards, without any sort of Hardware Motion Compensation, pulled up the rear, but what's truly interesting were the NVIDIA numbers.

As we mentioned before, the GeForce was the first NVIDIA chip to boast HWMC; however, as the Video2000 test shows, the CPU utilization on the GeForce (as well as the GeForce2 and GeForce2 MX) was identical to that of the TNT2, which doesn't have a HWMC engine. So why is the GeForce with HWMC taking up just as much CPU time as the TNT2 without it?

Apparently, most DVD players have two modes of operation: strict DirectShow compliance for WHQL qualification, and a private mode for vendor specific operations. Such operations may include enabling features like HWMC or iDCT support. It turns out that Video2000 is a DirectShow-only benchmark, meaning it forces the decoder into that first mode of operation. This is obviously fine for ATI and S3, since their hardware acceleration is still taken advantage of, and it doesn't carry any meaning for 3dfx and Matrox, who don't have any hardware assisted playback, but it does pose a problem for NVIDIA. So according to NVIDIA, the benchmark is at fault, and HWMC actually does make a difference on the GeForce, GeForce2 MX and GeForce2 GTS cards.

The only way for us to confirm this was to fire up a copy of WinDVD 2.2 and try playing some DVDs. We used a Celeron 366 for these informal tests as we wanted to make sure that any differences caused by HWMC wouldn't be so small that they could be ignored. Using Wintop to monitor CPU utilization during playback, NVIDIA's claims were substantiated, as the TNT2 was consistently 10 - 20% higher in CPU usage than any of the GeForce cards. However, since the NVIDIA cards lack iDCT support in hardware, the ATI Radeon (as well as the Rage 128 and Rage 128 Pro) still boasts lower CPU usage numbers.



With its iDCT and HWMC taken advantage of, it isn't a surprise that the Radeon pulls ahead by a noticeable amount; in fact, it is the only real lead present among the contenders here. The Savage 2000 has a bit of a lead as well; however, the rest of the cards are pretty much on par with one another when it comes to overall performance. Remember that the NVIDIA cards (with the exception of the TNT2, which has no HWMC) are running in this benchmark without HWMC enabled for the reasons we explained before, basically regarding the way Video2000 conducts its performance tests. Enabling HWMC would probably kick the GeForce2 GTS up to the level of the Savage 2000, but definitely not up to the Radeon. However, most of the differences here come from differences in the Blitter and Data transfer performance numbers and not from CPU usage numbers.



Final Words

And thus we conclude our first entry into the world of DVD performance. We have seen some very interesting things in this investigation; for starters, it's obvious that in the quest to be the best, very little attention is often paid to video features that are becoming increasingly important as time goes by.

It is very surprising to see the Savage 2000 come out on top in terms of features since we are so used to seeing it placed at the bottom of the list. It is a shame that the Savage 2000 went down the path it did, because with solid driver support and the very low prices it got to, the Savage 2000 could have been an excellent overall solution.

3dfx and Matrox are far behind the times in terms of video features; you can only focus on a single aspect of the market for so long before your competitors begin to surpass you. For 3dfx it was sheer power and performance, for Matrox it was 2D quality, and for both of them, those are arguments that don't hold up as well as they once did when the competition was struggling to keep up.

ATI has shown once again that it is the king of the video world; in terms of features, quality and performance, the Radeon is the mark that 3dfx, Matrox and NVIDIA should aim for. For the first time since the 3D revolution took off, ATI doesn't only have video features to tout; it can also boast some very high gaming frame rates as well. If ATI can successfully follow up the Radeon's release with another hard hitting product, while continuing to improve its drivers, NVIDIA could begin to feel some serious heat from a company that just a year ago we would never have expected to come this far.

What will the future bring us? As CPUs continue to increase in power, hardware assisted MPEG-2/DVD decoding will become less of an issue, and serious multitasking while playing DVDs will become more commonplace. However, what is currently a very CPU intensive task is decoding MPEG-4 video. What will really be a benefit as time goes on will be hardware assisted MPEG-4 decoding, as even today's mainstream processors aren't too happy decoding these complex streams.
