tipoo - Friday, April 28, 2023 - link
Matrox! Intel is SAVED!

Exotica - Friday, April 28, 2023 - link
So is Voodoo coming back too?

blppt - Saturday, April 29, 2023 - link
I'm actually surprised that Nvidia hasn't tried to resurrect the name "3dfx" or "Voodoo" for its products, since nostalgia always sells.

sheh - Saturday, April 29, 2023 - link
I don't think it's the right market for that. Nostalgia might sell where it's more analog/vintagey in nature, like cars and guitars.

But 3D chips? People buying a new card are looking for cutting-edge tech, not Glide support and good analog output quality (well... maybe that was Matrox more than 3dfx). Dual-releasing Nvidia chips in 3dfx-branded cards would at best dilute the brand. A better strategy might be to add retro support in new drivers: official Glide support. :)
BTW, it's surprising to realize that 3dfx's heyday, or actually the whole of its life in the market, was only 3-4 years.
wumpus - Sunday, April 30, 2023 - link
They had a couple of years as "arcade only" video card (chip?) providers (they were seen in Atari's SF Rush). Then a year or two as extremely niche consumer video cards (4-5 times the cost of typical video cards), followed by more or less "mainstream-enthusiast" for 3-4 years. Basically from the time 2MB of low-latency DRAM was relatively cheap until the original "SST" design became obsolete.

If they ever had a successor to the "SST" design (that multi-rendering thing?), it didn't come in time.
Flying Aardvark - Monday, May 1, 2023 - link
3dfx lost its audience a long time ago. Some of us still revere 3dfx because we owned them and lived through it, but we're few and far between. You can't lose your audience and have any value. The most Nvidia could do with it at this point is a "3dfx Edition" or "Voodoo Edition" card. Nvidia Geforce 4070 Voodoo. Instead of Super or Ti. Something like that. I think Voodoo was actually a better name than Geforce. It's possible the 5000 series was an ode to 3dfx with the FX moniker.

Flunk - Monday, May 1, 2023 - link
I don't think there is any chance of Nvidia using the Voodoo branding as long as Geforce remains synonymous with top-level gaming performance. They'll only change branding if they do something to tank the cachet of the Geforce brand name.

wumpus - Sunday, April 30, 2023 - link
Did nvidia ever re-allow the "glide wrapper"? It looks like you can download it, even though 3dfx took it down originally. I'd have thought I'd have noticed (although I was rocking a Voodoo 3, and when I switched to the Radeon, Glide was dead).

Once they bought 3dfx, the glide wrapper could only help sell nvidia cards.
Scipio Africanus - Saturday, April 29, 2023 - link
Never had a Matrox card; my first PC card was an ATI Mach32. Funny how they've survived all the way till now, though technically it's AMD now.

Samus - Saturday, April 29, 2023 - link
They have always been very solid products. Obviously nothing GPU-competitive in decades, the last realistically being the G400 MAX, but you saw Matrox GPU cores a lot in servers for decades up until recently, as dedicated 2D graphics output is dying even in SMB servers with the integration of iGPUs into Xeons or reliance on IPMI for remote management. Occasionally there are still dedicated graphics chips, but these days they seem to mostly be Aspeed.

Anyway, the one thing you get, or got, was incredible 2D quality. Throughout the '90s, Matrox was regarded as having the cleanest analog output available thanks to high-quality chokes, circuit designs, filters, and class-leading RAMDACs. As everything went digital this all became less of a feature and more of a given, but it was one thing that really put them on the map.
Threska - Saturday, April 29, 2023 - link
Multi-monitor setups on the Wall Street trading floor, for example. Most modern GPUs give you about four outputs.

Findecanor - Saturday, April 29, 2023 - link
I have also seen Matrox in the broadcasting industry, as they have supported professional video formats and connectors.

MadAd - Saturday, April 29, 2023 - link
I remember complaining about having to buy a Matrox G100 on top of two Voodoo2s to replace a single Voodoo Rush, which never worked properly with new 3D games (primarily Unreal, with its reflective surfaces and online deathmatch mode). Ended up never having a problem with it and sold it years later in a PC I built from parts.

dubyadubya - Tuesday, May 2, 2023 - link
I ran two Voodoo 2s in SLI with a Tseng Labs ET6000 DOS accelerator, as any gamer of the day would not give up on Duke 3D and the like. A Sound Blaster AWE32 was also a must-have.

meacupla - Saturday, April 29, 2023 - link
Matrox, the SEGA of the graphics card industry.

Samus - Tuesday, May 2, 2023 - link
I laughed at this but hey, at least Matrox has remained profitable.

PeachNCream - Saturday, April 29, 2023 - link
Single slot, no power connectors required, and even a low-profile fanless SKU. It's a pity that these are niche instead of mainstream graphics adapters.

MadAd - Saturday, April 29, 2023 - link
There's nothing stopping team red or green from developing the 2D portion of their cards to as high a standard; it's just that not many gamers would need it, or pay for it if the costs were added on. It's a niche product for a reason: not many workstations need multi-head, full-bandwidth 4K-8K arrays just yet.

Threska - Saturday, April 29, 2023 - link
2D implemented using 3D.
https://gamedev.net/forums/topic/613700-2d-acceler...
https://www.tomshardware.com/reviews/2d-windows-gd...
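To make the idea concrete, here is a minimal, purely illustrative Python sketch of "2D implemented using 3D": a rectangle fill expressed through an orthographic projection to clip space and a viewport transform back to pixels, the special case of the 3D pipeline that modern GPUs route 2D work through. The function names and the software framebuffer are invented for the example, not any driver's actual code.

```python
# Illustrative only: a "2D" rectangle fill routed through "3D" machinery,
# i.e. an orthographic projection to clip space and a viewport transform
# back to pixels. Pure Python software framebuffer, no GPU or libraries.

WIDTH, HEIGHT = 320, 240

def ortho_to_clip(x, y):
    """Pixel coordinates -> clip space [-1, 1], as an ortho matrix would."""
    return (2.0 * x / WIDTH - 1.0, 1.0 - 2.0 * y / HEIGHT)

def clip_to_pixel(cx, cy):
    """Viewport transform: clip space -> integer pixel coordinates."""
    return (round((cx + 1.0) * 0.5 * WIDTH), round((1.0 - cy) * 0.5 * HEIGHT))

def fill_rect(fb, x0, y0, x1, y1, color):
    """Fill an axis-aligned rect; the corners round-trip through clip space
    the way a textured quad (two triangles) would in a real 3D pipeline."""
    corners = [clip_to_pixel(*ortho_to_clip(x, y))
               for (x, y) in ((x0, y0), (x1, y0), (x0, y1), (x1, y1))]
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    for py in range(min(ys), max(ys)):
        for px in range(min(xs), max(xs)):
            fb[py][px] = color

fb = [[0] * WIDTH for _ in range(HEIGHT)]
fill_rect(fb, 40, 30, 200, 120, 0xFFFFFF)
print(sum(p != 0 for row in fb for p in row))  # 160 * 90 = 14400 pixels
```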
PeachNCream - Sunday, April 30, 2023 - link
Did you reply to the wrong person? I didn't mention anything about 2D or 3D so I'm not sure what point it is that you're attempting to articulate.

Threska - Sunday, April 30, 2023 - link
I replied to MadAd, and that's the way the indentation shows.

PeachNCream - Tuesday, May 2, 2023 - link
Ugh, this reply system. I was replying to MadAd as well, not you.

dwillmore - Saturday, April 29, 2023 - link
Can the smaller A310-based cards be installed in the x1 slots of a standard PC, as you suggest in the article? You can't get 75W out of an x1 slot. Or were you positing the existence of a desktop PC with more than two x16 slots? If they had external power connectors, this wouldn't be an issue. So, while not requiring an external power connection is normally a feature, in this use case it seems that not having one as an *option* is more of a liability.

Eliadbu - Sunday, April 30, 2023 - link
Per the PCI-E specification, the maximum power that an x1 slot card may consume is 25W from 12V after software configuration as a high-power card, so unless the card pulls the remaining power from the 3.3V rail (which I do not see happening), the 30W LUMA A310 cannot run in an x1 slot (at least not officially). I guess if you need several cards, make sure your motherboard has enough x16 slots.
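For anyone wanting to sanity-check a card against those limits, a small Python sketch follows. The budget table uses the commonly cited PCIe CEM add-in-card figures (x1: 10 W at power-up, 25 W once configured as high-power; x16: 25 W / 75 W); treat the exact numbers as assumptions to verify against the spec revision you care about.

```python
# Rough power-budget check for slot-powered (no aux connector) cards.
# SLOT_POWER_W values are the commonly cited PCIe CEM limits in watts
# (default at power-up, after software configures the card as high-power);
# verify against the actual CEM spec revision before relying on them.

SLOT_POWER_W = {
    "x1":  (10, 25),
    "x4":  (25, 25),
    "x8":  (25, 25),
    "x16": (25, 75),
}

def fits_slot(board_power_w: float, width: str, configured: bool = True) -> bool:
    """True if the card's board power fits within the slot budget alone."""
    default_w, high_power_w = SLOT_POWER_W[width]
    return board_power_w <= (high_power_w if configured else default_w)

# The case from this thread: a 30 W card (e.g. the LUMA A310).
for width in ("x1", "x16"):
    verdict = "ok" if fits_slot(30, width) else "over budget"
    print(f"30 W card in {width} slot: {verdict}")
# -> x1: over budget; x16: ok
```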
dwillmore - Sunday, April 30, 2023 - link
Exactly. That's why I question the wording of 'desktop' when it clearly needs a bunch of at-least-x8 slots. Maybe a workstation or small server, but that's not a 'desktop'.
I fell for an A770 early this year, because the price was simply too attractive to ignore. I wanted to see if it was a proper upgrade for a GTX 1080 Ti that wasn't doing so well at 4K for "quickie games" (got another system for the serious stuff).

And while it was quite ok for gaming even at 4K, DP port compatibility was a nightmare with my Aten KVM and LG monitor (rarely a picture, and if I managed somehow, it wouldn't survive a KVM switch or a reboot); the only connection that worked was via HDMI (my KVM is dual DPx4, so that's not a productive option).
That was quite a surprise, given that the same setup has no issues with just about every generation of iGPU since Sandy Bridge. It has led me to think that there is perhaps much less synergy, or even shared technology, between the iGPU and dGPU teams at Intel, both in hardware and in software, than I had simply assumed from their marketing.
Since I then read about others having similar issues without resolution, I sent it back within the free return window and decided to stick with the green team, the old card for now, because anything new that actually performs better is too expensive for "quickie" stuff and too low-power for real gaming: exactly where the A770 is currently promising, but failing to deliver reliably.
With Matrox being all about multi-monitor and KVM compatibility, perhaps some of their testing/validation might help Intel solve such issues, unless it's a hardware thing after all.
And single-slot GPUs are a USP that deserves some attention, e.g. when you want to do GPU passthrough virtualization and don't happen to have an ASPEED console device or an iGPU on your workstation.
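On the passthrough point: before dedicating a GPU to a VM you generally want to confirm it sits in its own IOMMU group. Here is a minimal, Linux-only sketch that assumes the standard sysfs layout; the workflow is general VFIO practice, not specific to any card discussed here.

```python
# List IOMMU groups and the PCI devices in each, the usual first step
# before binding a GPU to vfio-pci for passthrough. Linux-only; requires
# the IOMMU to be enabled (e.g. intel_iommu=on or amd_iommu=on).

import os

GROUPS = "/sys/kernel/iommu_groups"

def list_iommu_groups():
    if not os.path.isdir(GROUPS):
        raise SystemExit("No IOMMU groups found; is the IOMMU enabled?")
    for group in sorted(os.listdir(GROUPS), key=int):
        dev_dir = os.path.join(GROUPS, group, "devices")
        for dev in sorted(os.listdir(dev_dir)):
            # vendor:device IDs are what vfio-pci binding wants
            with open(os.path.join(dev_dir, dev, "vendor")) as f:
                vendor = f.read().strip()
            with open(os.path.join(dev_dir, dev, "device")) as f:
                device = f.read().strip()
            print(f"group {group}: {dev} [{vendor}:{device}]")

if __name__ == "__main__":
    list_iommu_groups()
```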