Original Link: https://www.anandtech.com/show/298



Imagine, for a moment, a picture in the dark, revealed to you by a focused light only one square at a time. You have two options: either react to one of the squares of the picture that has already been revealed, or hold your breath until the final piece falls into place.


It's a gamble: you have no idea what the next piece will bring. It may be a tremendously beautiful image or the most gruesome sight you have ever laid eyes on. No, we're not talking about a silly Halloween game; rather, we're offering simpler terms by which you can understand the state of the video card industry up to this point.

Following the order the history of this market has shown us, 3dfx was first to the punch with its undoubtedly impressive Voodoo3 accelerator, followed by NVIDIA, whose TNT2 and its Ultra counterpart have already put smirks on the faces of NVIDIA supporters across the globe. While NVIDIA was stealing the spotlight from 3dfx's mammoth marketing campaign, S3 quietly released the Savage4, a somewhat underwhelming successor to the original flop, the Savage3D. With every review of the aforementioned chipsets, AnandTech essentially concluded by offering a third option: wait until the next competitor's part is released. You've waited long enough; the time is now. The only square left unrevealed in this picture? Matrox. Enter the G400, AnandTech-style.

The Stoning of Matrox

The time was August 1998, and the product was the Matrox G200: an extremely high quality alternative to the 3dfx/NVIDIA-dominated market with the potential to please more than a few users. The problem? Matrox's flagship card lacked a complete OpenGL ICD, something NVIDIA's competing cards already had. Loyal Matrox users insisted that the OpenGL ICD was surely "on the way," but what should have been another high point on Matrox's track record instead saw the company walking home with its head hung low as even its once faithful supporters dropped $150 on an NVIDIA TNT card. Matrox's attempt to please the business user as well as the hardcore gamer failed miserably; although the G200 did make its way into a number of systems, its potential was severely hurt by Matrox's inability to release a high-performing OpenGL ICD at the launch of the chip.

As soon as word got out about Matrox's next product, the G400, the only question users seemed to ask was whether Matrox would have an ICD ready for deployment; performance took a close second to the million-dollar question in this case. Basic common sense will tell you that it would be suicide for a company in Matrox's position to release another graphics accelerator and fail to address one of the most obvious weaknesses of its predecessor. After all, the reason for making a next-generation product is to fix the problems the previous generation posed, right? If there were no problems in the previous generation and no room for improvement, the market wouldn't work too well now, would it? Needless to say, an unspoken requirement for Matrox's G400 was that it ship with a fully functional OpenGL ICD. And will it? You'd better believe so.

The Death of the Mystique

Last year marked the division of graphics accelerators between gaming and business cards; you could never have guessed that 1998 would be the last year you'd see a manufacturer attempt to segment a lineup primarily by gaming versus business use. With the 3D market hitting everyone, 3D accelerators are no longer viewed as "only for gamers," although gamers do seem to get much more benefit from a Voodoo3 than business users do. This year the trend seems to be division according to clock speed: instead of worrying about releasing a single product and hoping the competition can't beat it in performance, why not release a variety of products, each offering a higher clock speed and theoretically faster performance? If the competition comes out on top, simply work towards better yields on your chips and release a higher clocked version of your product. Simple enough?

3dfx did it with the Voodoo3, offering 143MHz and 166MHz parts and calling them the 2000 and 3000 models respectively. NVIDIA did the same with the 125MHz and 150MHz TNT2 parts, the latter being the "Ultra" version. Even S3, the relatively quiet member of this family, announced two performance flavors of its Savage4, the regular and the Pro. Needless to say, Matrox has jumped on the bandwagon with the first clock-speed-segmented product release in the company's history: the Matrox Millennium G400 and the Millennium G400MAX.

Interestingly enough, the name "Mystique," which has so often been associated with the word "crap" since the original Matrox Mystique, is absent from the G400 product line, indicating a move away from the business/gamer classification of graphics cards and towards classification based on performance. Following the lead of 3dfx, Matrox chose to divide the two Millennium G400 products according to three factors: 1) core clock speed, 2) memory clock speed, and 3) internal RAMDAC speed.



Two Smiling Faces: G400 & G400MAX

Although Matrox has a strict (very strict) policy about not releasing the clock speeds of its products, basic math will reveal the core clocks of the two G400 models. According to Matrox's spec sheet, the regular G400 achieves a peak fill rate of 250 megatexels per second, and since the G400 is capable of the now commonplace single-pass multitexturing (2 textures in one clock), dividing the fill rate by 2 gives a core clock of 125MHz. The memory clock on the G400 happens to be an unusually high 166MHz (for such a low core speed), and AnandTech speculates (Matrox won't give out the exact numbers) that the MAX version features a memory clock of approximately 200MHz. The 333 megatexels per second fill rate of the G400MAX points to a 166MHz core clock (333 megatexels/sec ÷ 2 textures per clock ≈ 166MHz), which is a tad lower than what the competition is already offering.
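
For the curious, the same math in a few lines of Python (a throwaway sketch; the fill rates come from Matrox's spec sheet, and the 2-textures-per-clock divisor is the single-pass multitexturing described above):

```python
# Back-of-the-envelope math for inferring core clocks from fill rates.
# Fill rates are from Matrox's spec sheet; "2 textures per clock"
# reflects the G400's single-pass multitexturing described above.

def core_clock_mhz(fill_rate_mtexels_s: float, textures_per_clock: int = 2) -> float:
    """Peak fill rate (megatexels/sec) divided by textures laid down per clock."""
    return fill_rate_mtexels_s / textures_per_clock

print(core_clock_mhz(250))  # G400:    125.0 MHz
print(core_clock_mhz(333))  # G400MAX: 166.5 -> ~166 MHz
```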

[Photos: Matrox G400 (16MB) and Matrox G400MAX (32MB)]

As mentioned above, the third differentiating factor is the speed of the RAMDAC integrated on the chip. The regular G400 features a 300MHz internal RAMDAC, while the G400MAX, in an attempt to gain an edge over 3dfx, features a 360MHz internal RAMDAC to support higher resolutions. The G400 will be available in both 16MB and 32MB configurations, while the G400MAX comes exclusively in a 32MB configuration.
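
Why does RAMDAC speed cap your display modes? A mode needs a pixel clock below the RAMDAC's rating; here's a rough sketch of the estimate, with the ~1.3x blanking overhead being a generic assumption rather than a Matrox figure:

```python
# Rough pixel-clock estimate for a display mode. A mode is only usable
# if this figure fits under the RAMDAC's rated speed. The blanking
# overhead factor (~1.3) is a typical assumption, not an official number.

def pixel_clock_mhz(xres: int, yres: int, refresh_hz: int,
                    blanking_overhead: float = 1.3) -> float:
    return xres * yres * refresh_hz * blanking_overhead / 1e6

print(pixel_clock_mhz(1600, 1200, 85))  # ~212 MHz: fine on a 300MHz RAMDAC
print(pixel_clock_mhz(1920, 1440, 75))  # ~270 MHz: where the MAX's 360MHz part helps
```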

As far as pricing goes, the G400 should be shipping in mid-June (no official word on the MAX, but probably around then or shortly thereafter) at the following prices, directly from Matrox:

Matrox Millennium G400 16MB - $149

Matrox Millennium G400 32MB - $199

Matrox Millennium G400MAX 32MB - $249

There are already reports of pre-orders at prices up to $30 below Matrox's official quotes, which makes the 32MB G400 a bit more affordable, although it still leaves the MAX as expensive as the most expensive TNT2 Ultra; and as you're about to find out, the performance of the G400MAX isn't exactly TNT2 Ultra comparable.

While the launch of the G400 chip itself will be reserved solely for the two arriving Millennium G400 products, Matrox has assured AnandTech that towards the end of the summer, or possibly into Q4, the G400 will also be available in a Marvel version for video editing enthusiasts. The Marvel G400 should offer the same features as the Marvel G200 (check out AnandTech's review of the Marvel G200) with a few additions to make the ride worth it. With 3dfx entering the TV-in/out arena with its Voodoo3 3500TV, Matrox needs to make sure any competing product it releases can paint the walls with whatever features 3dfx can crank out. We'll have to wait and see about that, though.

The bottom line is that the Mystique is gone; the Millennium G400 will be the only G400 product for at least a few more months, followed by the release of the more expensive Marvel G400. Easy enough to understand? Perfect, let's complicate it ;)



Chip Specifications

[Chart: G400 chip specifications]



Exploring the G400

From the start, Matrox has had a unique presence in the industry: like ATI, and now 3dfx/STB, the company manufactures both its own boards and its own graphics chips. Following age-old Matrox tradition, the G400 will only be featured on Matrox-manufactured boards, and it offers quite a few significant advantages over its predecessor, the G200.

According to Matrox, the technology behind the G400 is very similar to that of the G200, making driver updates easier to maintain across both architectures. Matrox told AnandTech that they are still committed to releasing a full OpenGL ICD for the G200, and if anything, the similarities between the G400 and G200 architectures will make this even easier. For those of you who thought Matrox would dump support for the G200 just because they have a new flagship, here's a reason to give Matrox a chance (and if Matrox fails to deliver, which is highly doubtful, feel free to let 'em have it).

AGP 2X/4X Support

As with most of the newly released (or announced) graphics accelerators, the Matrox G400 boasts AGP 2X and 4X compatibility. While many companies are being quite ambiguous about whether their products support AGP 4X out of the box, Matrox's G400 will ship with both AGP 2X and 4X compliance from day one. What this means is that a few months down the line, when Intel's 820 chipset (aka Camino) is released, you can pop your G400 into an i820 board and enjoy increased performance thanks to the G400's ability to take advantage of AGP 4X transfer rates.
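
For a sense of scale, here's a quick sketch of the theoretical peak bandwidth of each AGP mode (32-bit bus at a ~66.66MHz base clock, moving 1, 2, or 4 transfers per clock):

```python
# Theoretical peak AGP bandwidth: a 32-bit (4-byte) bus at a ~66.66MHz
# base clock, moving 1, 2, or 4 transfers per clock tick.

AGP_BASE_CLOCK_MHZ = 66.66
AGP_BUS_BYTES = 4

def agp_bandwidth_mb_s(transfers_per_clock: int) -> float:
    return AGP_BASE_CLOCK_MHZ * AGP_BUS_BYTES * transfers_per_clock

for mode in (1, 2, 4):
    print(f"AGP {mode}X: ~{agp_bandwidth_mb_s(mode):.0f} MB/s")
# AGP 1X: ~267 MB/s, AGP 2X: ~533 MB/s, AGP 4X: ~1066 MB/s
```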

Vibrant Color Quality2

One of the most marketable features of the G200 was its Vibrant Color Quality (VCQ) rendering. Interestingly enough, VCQ isn't really a technology at all, rather a system Matrox defined. The G400, naturally, is back with a new "version" of VCQ, rightfully entitled Vibrant Color Quality2 (VCQ2). VCQ2 offers the same advantages as the original VCQ rendering system: basically, the ability to render all scenes with 32-bit accuracy internally, then dither the final image down to 16 bits of color per pixel. This gave Matrox the best looking 16-bit rendering available at the time, and needless to say, the G400 likewise renders all scenes internally with 32-bit accuracy and then dithers them down to 16 bits of color per pixel, provided you are set to render in 16-bit color mode.
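
To make the idea concrete, here's a minimal sketch of that final precision-reduction step. Plain truncation from 8:8:8 to 5:6:5 is shown for brevity; the actual hardware dithers, which preserves more apparent detail:

```python
# Minimal sketch of the final 32-bit -> 16-bit step: internal math keeps
# 8 bits per channel, and only the very last write drops to 5:6:5.
# Real VCQ-style hardware dithers rather than truncating.

def rgb888_to_rgb565(r: int, g: int, b: int) -> int:
    """Pack an 8-bit-per-channel pixel into a 16-bit 5:6:5 word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

# Blending is done at full 8-bit precision first, reduced only at the end:
blended = tuple((a + b) // 2 for a, b in zip((200, 150, 100), (40, 80, 160)))
print(blended, hex(rgb888_to_rgb565(*blended)))  # (120, 115, 130) 0x7b90
```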

If you're not running in 16-bit mode, then you have the option of enabling what Matrox likes to call VCQ2. Basically, VCQ2 is the combination of a 32-bit color mode, a 32-bit Z-buffer, and the same 32-bit accuracy in all internal calculations. This combination provides the absolute best visual experience available in a game; unfortunately for Matrox, the G400 isn't the only chipset with this capability. The NVIDIA Riva TNT2 is capable of achieving the same image quality here; the only limiting factors are the design of the board and the RAMDAC, which can make the picture look somewhat (albeit barely noticeably) less crisp than that of a G400. Matrox also boasts support for what is known as stencil buffering, or the ability to render only the visible part of a scene, a performance booster in ideal cases.

Matrox can't claim that the idea behind VCQ2 is 100% unique, but since the term is trademarked, they can always claim that no other company has VCQ2. Bottom line? Don't get fooled by the marketing; you're not getting anything special with VCQ2, as NVIDIA has had this capability for a while now.

What's the performance hit of running in VCQ2? The drop when going from 16-bit to 32-bit color rendering is next to nothing. If you're looking for numbers, the drop is only 6% in Quake 2 [demo1.dm2] on a 16MB G400 (not the MAX) at 1024 x 768, which isn't bad at all considering you only have 16MB of frame buffer to work with. The drop shrinks to under 3% with the G400MAX on the same Pentium III 500 system at the same resolution, the only difference being the 32MB frame buffer on the MAX. Under Direct3D tests, enabling the 32-bit Z-buffer doesn't seem to hinder performance to any noticeable degree; however, the G400's OpenGL ICD does not yet support 32-bit Z-buffering, so there is no way to test the performance hit under 32-bit-Z-happy games such as Quake 3 Arena. According to Matrox's engineers, the performance drop should be negligible, but it's always best to be sure. Matrox assured AnandTech that the OpenGL ICD will eventually support 32-bit Z-buffering; its current revision does not.



Environment-Mapped Bump Mapping - The Term We Never Knew Existed

You all remember the game Trespasser, right? It was supposed to be one of the most realistic gaming titles ever released due to its incredibly accurate (and, as we all eventually realized, incredibly unplayable) physics engine. However, I couldn't help but notice that the quality of the graphics in the game was completely horrible on my (at the time) Voodoo2 SLI setup, especially when looking at or swimming in water.

Although the NVIDIA Riva TNT did make the water look considerably better, there was still an obvious gap between what water actually looks like and what was present in the game. After talking with a friend, we both came to the conclusion that the quality of effects such as water, or the textures on walls, was still far from realistic. We've all noticed at one point or another that the water in our favorite first-person shooters doesn't look all too "water-like," and the wooden textures in our nerdy pastimes don't really look all that different from pieces of plastic on the screen; we've just never known what we were lacking all this time.

While Matrox can't claim exclusive rights to the technology behind VCQ2, they can claim to have the only desktop accelerator in this class that supports a technology known as Environment-Mapped Bump Mapping. Bump mapping is a technique that makes a texture appear more realistic through the inclusion of individual bumps in the texture. If you look closely at a painted wall in real life (not in the world of Q3A, guys), you'll notice that the wall isn't perfectly smooth; there are bumps and numerous imperfections. Bump mapping essentially allows those bumps to be put on walls and on virtually any texture in a game, and it truly makes your gaming experience something else. NVIDIA supports a type of bump mapping known as embossed bump mapping, which can be considered a form of what we're talking about, but not the best or most realistic form. Embossing is basically a cheap workaround to avoid Dot Product 3 or Environment-Mapped Bump Mapping, the latter being the implementation Matrox used in the G400 (3Dlabs' Permedia 3 will be the only card supporting Dot Product 3 bump mapping; it should be at least equal in quality to what the G400 can produce with Environment-Mapped Bump Mapping).

Instead of manipulating a standalone texture map to make it "look" like it has bumps on it, Environment-Mapped Bump Mapping adds a third texture to the rendering sequence (in cases where both an environment map and a texture map are present). The third texture, or bump map, is nothing more than the basic environment map with bumps and grooves. Since the G400 is capable of single-pass multitexturing (as well as 2 pixels per clock in the event that only a single texture is being processed), the environment map and texture map are processed first in one clock cycle, then the bump map is added on top of them in half a clock cycle. It's simple addition of textures; unfortunately, the added half clock cycle required by the bump map does tend to slow things down a little.
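
A highly simplified sketch of the per-pixel idea follows; nested lists stand in for textures, and the blend at the end is an arbitrary 50/50 average for illustration, not Matrox's actual combine math:

```python
# The bump map stores (du, dv) perturbations that offset the
# environment-map lookup, so a flat surface picks up reflections as if
# it had real relief. "Sampling" here is just nested-list indexing.

def shade_pixel(tex, env, bump, x, y):
    base = tex[y][x]                      # ordinary texture sample
    du, dv = bump[y][x]                   # perturbation from the bump map
    ey = max(0, min(len(env) - 1, y + dv))
    ex = max(0, min(len(env[0]) - 1, x + du))
    reflect = env[ey][ex]                 # *perturbed* environment sample
    return tuple((t + e) // 2 for t, e in zip(base, reflect))

tex  = [[(40, 80, 160)] * 4 for _ in range(4)]                     # bluish "water"
env  = [[(c * 60, c * 60, c * 60) for c in range(4)] for _ in range(4)]
bump = [[(1, 0)] * 4 for _ in range(4)]                            # uniform ripple
print(shade_pixel(tex, env, bump, 1, 1))  # (80, 100, 140)
```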

Using Rage Software's Expendable, an incredible looking game, there was a noticeable drop in frame rate with bump mapping enabled near points of extensive Environment-Mapped Bump Mapping use. At 1024 x 768 with 32-bit color and a 32-bit Z-buffer on a normal Pentium II 400, the frame rate dropped from extremely fluid to a point where the game was a little choppy. The game was still playable, but be warned that there is a drop in performance. If you look at it from the point of view of the G400 chip itself, it makes sense that there would be a performance drop, as you're making the processor churn through another half clock cycle just so you can look at pretty water - the things we make our computers do ;)

No Bump Mapping

[Screenshot: Expendable without bump mapping]

Environment Mapped Bump Mapping

[Screenshot: Expendable with Environment-Mapped Bump Mapping]

Will bump mapping catch on? Considering that the only requirements for a game to support Environment-Mapped Bump Mapping are that the developer use DirectX 6 or greater and include specific support for it (not too complex), don't be too surprised if the market leaves some doors open for G400 users. Hopefully other manufacturers will follow suit and we'll see Environment-Mapped Bump Mapping catch on in a big way; it would be sad to lose this kind of quality to poor market support.



256-bit DualBus Architecture

In AnandTech's original coverage of the Matrox G200, the first feature praised was the G200's 128-bit DualBus architecture, a technology that was much more than a marketing ploy. The success of the 128-bit DualBus inspired Matrox to make another quantum leap ahead of the quickly approaching competition with the new 256-bit DualBus Architecture (DBA), exclusive to the G400. The same analogy that applied to the G200's 128-bit DBA still applies to the 256-bit DBA, so let's first take a look at what the 128-bit DBA did for the G200:

Imagine that you are on an 8-lane highway. The 8 lanes allow more traffic to move from one end to the other; however, there is a catch: the cars can only move in one direction at a time, meaning all the cars must either be moving up the highway or down it, but not both at once (all 8 lanes move in the same direction). That is the limited functionality of an internal 128-bit data bus when applied to video cards: on any given clock cycle, the data being transferred via the internal 128-bit data bus can only flow in one direction (to the graphics engine). On the following clock cycle, the data can be transferred down the bus in the other direction (from the graphics engine). While this approach has its benefits, when dealing with 2D images and bitmaps, where each piece of data that must be transferred down the bus remains quite small (less than 128 bits), there is a much more efficient approach.

Let's take that highway example from above; instead of making it an 8-lane highway, let's split it up into 4 lanes going and 4 lanes coming. Now 4 lanes of cars can travel in one direction at the same time as 4 lanes of cars travel in the other (4 lanes can be leaving the city while 4 lanes are entering). If there is no need for 8 lanes to be open in any one direction, then the original 8-lane highway isn't as efficient as this modified 4/4-lane highway. The same theory applies to the Matrox G200.

Instead of occupying the entire width of a 128-bit bus to transfer data in 64-bit chunks, why not create a dual 64-bit setup, with one bus dedicated to sending data to the graphics engine and the other dedicated to receiving data from it? This is what the G200's 128-bit DualBus architecture is: in essence, two 64-bit buses offering the same combined bandwidth as a single 128-bit data bus while allowing data to be sent to and from the graphics engine in parallel.
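
A toy model makes the payoff clear. Assume, as the article's analogy does, a workload of small (64-bit or less) transactions headed in both directions, with no packing of two transactions side by side:

```python
# Toy model of the DualBus argument. A wide single bus still spends one
# cycle per small transaction and must pick one direction per cycle;
# two dedicated 64-bit buses handle a send and a receive simultaneously.

def cycles_single_128bit(sends: int, receives: int) -> int:
    return sends + receives          # one direction (and transaction) per cycle

def cycles_dual_64bit(sends: int, receives: int) -> int:
    return max(sends, receives)      # send and receive proceed in parallel

print(cycles_single_128bit(100, 100))  # 200 cycles
print(cycles_dual_64bit(100, 100))     # 100 cycles -- same total width, twice the work
```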

This time around, instead of splitting a 128-bit bus into two 64-bit buses, Matrox doubled the effective bandwidth by implementing dual 128-bit buses, making up a 256-bit I/O bus for transferring data between the G400's graphics engine and the memory buffers. Keep in mind that although the G400 features an internal 256-bit DBA, the external memory bus (the bus connecting the G400 chip to the memory on the board itself) is still 128 bits wide. This raises the need for extremely fast access to the memory, which is more than adequately taken care of by the 166/200MHz SDRAM on the G400/MAX.
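
In theoretical peak terms (remembering that the 200MHz memory clock on the MAX is AnandTech's estimate, not an official Matrox number), the sketch below shows what that 128-bit external bus delivers:

```python
# Peak theoretical memory bandwidth implied by the 128-bit external bus
# and the SDRAM clocks cited above.

def mem_bandwidth_gb_s(bus_bits: int, clock_mhz: float) -> float:
    return bus_bits / 8 * clock_mhz * 1e6 / 1e9

print(mem_bandwidth_gb_s(128, 166))  # G400:    ~2.7 GB/s
print(mem_bandwidth_gb_s(128, 200))  # G400MAX: ~3.2 GB/s (estimated clock)
```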



DualHead Display

While just about every other video chipset manufacturer is providing support for digital flat panel output, Matrox decided to put its innovative abilities to good use and come up with a technology known as DualHead Display.

Instead of outfitting the G400 (both the regular and MAX versions) with a digital LCD output port that most users won't end up using, Matrox decided to include dual VGA output ports. Why on earth would you do something like that? Here's where DualHead comes in.

The G400 features two independent Cathode Ray Tube Controllers (CRTCs - the cathode ray tube, or CRT, being the "picture" tube found in non-HD television sets): one outputs the primary signal to your VGA monitor, while the second controller is capable of performing a number of tasks. The second CRTC can output either to Matrox's on-board MGA-TVO controller or to a digital flat panel (via a separate add-on module). The MGA-TVO is more than a TV-output controller; it is able to output to an external NTSC or PAL TV, or to another monitor. The ability to output to two displays at once is the result of Matrox's DualHead Display technology on the G400 and G400MAX boards.

Using the supplied adapter, you can use either an S-Video or an RCA connector for the MGA-TVO-driven TV output. The Matrox G400MAX has a heatsink on the MGA-TVO chip itself, which, according to Matrox, is there as a precautionary measure and is not related to any heat problems; the regular G400, notably, does not feature the same heatsink. Because the MGA-TVO drives the secondary display output, your second monitor won't appear as sharp as your primary display. The second CRTC does not use the G400's integrated 300/360MHz RAMDAC and therefore only supports resolutions up to 1280 x 1024, which is still not bad for a second monitor.

With a second monitor, you have the option of spanning your desktop across both monitors as in a normal dual-monitor Windows 98/2000 setup, or you can use the second monitor to mirror your primary desktop at either its regular resolution or a zoomed one.

The TV output the MGA-TVO provides is of fairly decent quality, and courtesy of the DualHead technology, you can keep your desktop (primary monitor) at any resolution/color depth/refresh rate while outputting to a TV on the second display without causing any problems. If anything, DualHead is targeted at presentations just as much as at users who have a craving for more desktop space.

Another feature the DualHead setup allows is full-screen DVD output to a TV while running your desktop at a normal resolution on your primary display. This essentially turns your computer into a fully functional DVD player without losing any flexibility in when you can use your system to do work.

The G400 has wonderful DVD support; however, AnandTech will take a closer look at DVD support across all of the latest graphics accelerators in an upcoming article. If you're a DVD fanatic (as am I), the G400 won't let you down.



Drivers & 2D Image Quality

As mentioned before, the G400 has a functional OpenGL ICD. The OpenGL ICD is for Windows 9x only (no NT yet) and is not a final candidate, meaning performance and stability still have room for improvement. AnandTech noticed a few texture problems with the OpenGL ICD under Quake 2, but overall the ICD seemed quite solid. Super7 support in the OpenGL ICD is scarce, and stability under OpenGL applications on Super7 systems is not pleasing at all. Matrox informed AnandTech that they are working on the ICD's stability issues (there are a few) and assured us that the problems will be cleared up by the time the final drivers are complete.

[Screenshots: driver information, options, monitor settings, DualHead, and color control panels]

In an impromptu interview at this year's E3, Matrox confirmed that G400 users will have Windows 2000 drivers upon the release of Microsoft's upcoming operating system, although they regretted to inform us that no Win2K drivers are available as of now.

Support for AMD's 3DNow! instruction set will be provided in the final release candidates of the drivers, as will support for AMD's K7 and DirectX 6.2. Matrox is determined to make the G400 a please-all solution, and the Super7/K7 market truly needs a leader in the video industry with that sort of attitude.

The 2D image quality of the G400 boards AnandTech received was top notch, partially due to the higher speed RAMDAC on the G400MAX (it allowed higher resolutions to be achieved), but primarily because of the excellent manufacturing that went into the production of the boards themselves. Matrox takes the AnandTech Editor's Choice Award for Best Image Quality (2D & 3D) for its outstanding 2D image quality, coupled with superb 3D image quality as well. But what's an award without performance to back it up?

The Test

AnandTech received evaluation G400 and G400MAX boards from Matrox. The G400 was outfitted with 16MB of SDRAM, and the G400MAX with 32MB. The cards were compared to the 3dfx Voodoo3 2000/3000, the Leadtek WinFast 3D S320-II 16MB TNT2, and the Diamond Stealth III S540 32MB Savage4. AnandTech's Slot-1/Socket-370 test configuration was as follows:

  • Intel Pentium III 500, Intel Pentium II 400, Intel Pentium II 266, Intel Celeron 333, Intel Celeron 266 (0KB L2) on an ABIT BX6 Revision 2.0 or an ABIT ZM6 for the Socket-370 Celeron 333 tests.
  • 64MB of Memman/Mushkin SEC Original SDRAM was used in each test system
  • Western Digital 5.1GB Ultra ATA/33 HDD
  • Microsoft Windows 98

AnandTech's Super7 test configuration differed only in the processor/motherboard used:

  • AMD K6-3 500, AMD K6-2 300, AMD K6-266
  • ASUS P5A Aladdin V based Super7 Board w/ ALi AGP v1.54 drivers

The benchmark software used was as follows:

  • id Software's Quake 2 Version 3.20 using demo1.dm2 and 3Finger's crusher.dm2
  • Monolith's Shogo using 3Finger's RevDemo
  • Interplay's Descent3 Demo2 with AnandTech's Descent3 Torture Demo could not be used; Matrox is investigating a problem between Descent3 and the G400 at this time

Each benchmark was run a total of three times and the average frame rate taken. Vsync was disabled.

All scores were taken at a 16-bit color depth and 16-bit Z-buffer unless otherwise stated.

AnandTech used Matrox Millennium G400 DualHead driver revision 4.11.01.1100



OpenGL Performance - Quake 2 demo1.dm2

[Benchmark graphs: Quake 2 demo1.dm2]

OpenGL Performance - Quake 2 crusher.dm2

[Benchmark graphs: Quake 2 crusher.dm2]

Quake 2 Performance Conclusions

The first, and perhaps most obvious, conclusion you can draw from the Quake 2 performance numbers is that the G400 is extremely CPU dependent. Its performance at speeds below a Pentium II 400 borders on disappointing compared to what the TNT2 and Voodoo3 can accomplish; at the same time, it can be argued that the G400 isn't intended to compete with the TNT2 and Voodoo3 at that level of gaming. It depends entirely on how fast you're looking to play your games; the image quality is there with the G400, as is the performance, but how much of it is another question.

Owners of older Pentium IIs may want to steer clear of the G400 if they're remotely concerned with performance; 3dfx still seems to be king on slower systems, and Matrox is definitely far from it.

The instability of the OpenGL ICD on AnandTech's Super7 test bed was a bit disappointing; luckily, the Direct3D tests came through with not-too-disappointing numbers. Overall, the Quake 2 numbers speak for themselves: the G400 is geared towards the balanced user who needs a healthy dosage of games and work and enjoys getting the most out of both. Another key thing to keep in mind is the negligible performance drop when jumping from 16-bit to 32-bit color while running with a 16-bit Z-buffer. We'll have to wait for 32-bit Z tests under OpenGL; hopefully Matrox will have their updated drivers ready for release soon.



Direct3D Performance - Shogo RevShogo



[Benchmark graphs: Shogo frame rates across the tested CPUs and resolutions]

Shogo Performance Conclusions

The G400 once again shows its incredible CPU dependency (and thus incredible CPU scalability) in Shogo. A surprise is the G400MAX's position at the top of the 1600 x 1200 chart with the Pentium III 500, topping even the "faster" Voodoo3 3000 and the TNT2.

The rest of the results are pretty straightforward; the Super7 performance is an interesting topic to comment on. The G400 doesn't perform nearly as well as the 3dfx cards do in a Super7 situation due to 3DNow! optimizations (or the comparative lack thereof); however, the G400 brings much more to the Super7 platform than any other video card manufacturer's offering, giving Super7 users a nice alternative as well. The smart thing to do before making any final decisions, though, is to make sure that the OpenGL ICD is in its most stable form before slapping down a good $200 for a G400, especially if you're a Super7 user.

Final Words

The G400 is finally here, and it is definitely not a Voodoo3 or TNT2 killer. The hardcore gamer who simply wants performance will probably want to stay away from the G400; however, if you don't mind not having the absolute best in 3D performance, the G400 quickly becomes a viable option.

Owners of slower computers will want to stay away from the G400; instead, you'll probably want to explore 3dfx's solutions, or maybe NVIDIA's TNT2, depending on how "slow" your computer happens to be (in terms of CPU speed). Mid-range systems should be fine with the G400, but don't expect eyebrow-raising performance out of the card, even the MAX version. Higher-end systems will close the gap between the G400 and the more performance-oriented alternatives; the G400 has room to grow, so the faster your CPU, the better your G400 will perform. That's a given.

Matrox definitely has a winner on its hands. The G400 is everything the G200 should have been and more, and it's no surprise that such a combination of features, performance, and outstanding image quality will be making its way into the hands of quite a few anxious users who have renewed faith in Matrox. Myself included ;) Let's just hope that Matrox can iron out the last few bugs in the ICD and work on improving performance. Although the G400 will probably never reach TNT2 Ultra levels of performance, the closer Matrox gets, the better. The cards are ready and out in the open; you make the decision.
