42 Comments

  • r3loaded - Saturday, April 7, 2018 - link

    Goodbye Thermi, you were always the butt of many internet jokes and memes.
  • Samus - Saturday, April 7, 2018 - link

    I'll never forget the nights my GTX470 and I would cuddle up, the sensuous hum at idle ever so present, swiftly heating the cubby near my feet.
  • Hurr Durr - Saturday, April 7, 2018 - link

    One wonders why it didn't happen sooner.
  • madwolfa - Saturday, April 7, 2018 - link

    Long Term Support / Enterprise.
  • DanNeely - Sunday, April 8, 2018 - link

    It's been less than 4 years since the last Fermi products (a few low end 700 series branded cards) were released. Providing at least nominal support (new drivers work, but not much in the way of per game optimization is done) for old architectures until they're woefully obsolete is SOP for gaming cards.

    AMD did pull the plug on their similarly old VLIW4 cards a few years sooner; but that was mostly noteworthy because it happened as early as it did. Presumably that happened because the radical architecture change to GCN meant that they weren't able to piggyback on the work done with more modern designs and supporting them was a lot more expensive.
  • spaceship9876 - Saturday, April 7, 2018 - link

    Yet Nvidia still sells the Fermi GT710 for ~$35, which has no direct modern replacement at this price point. They also sell the GT610 and GT210.
  • cyrand - Saturday, April 7, 2018 - link

    I thought the Fermi GT710 was OEM-only and that the GT710 currently being sold is the Kepler-based one.
  • DanNeely - Saturday, April 7, 2018 - link

    Is it actually NV selling them, or just 3rd parties working through ancient inventory?

    The growing size of video en/decode blocks means that NV really can't go any smaller than the 1030 with current-generation tech. The GP107 (1050/Ti) was already down to only 50% GPU cores and 50% everything else in die area; dropping the PCIe link from 16x to 4x and halving the memory bus to 64-bit for GP108 means there's no room to cut further unless they start cutting into the higher-end en/decoding hardware. At that point, though, they might as well just dust off GM108 instead.
  • 427269616e - Sunday, April 8, 2018 - link

    Isn't the replacement integrated graphics? They are surprisingly decent now.
  • DanNeely - Sunday, April 8, 2018 - link

    Upgrading integrated graphics after 2 or 3 years requires a new CPU, often a new mobo, and potentially new ram; collectively that tends to be a lot more expensive than popping in a half sized card. Besides, there's still a significant chunk of the barely gaming capable system buying market that still 'knows' they need a separate GPU to game on because the IGP is horrible.
  • Flunk - Sunday, April 8, 2018 - link

    If you're upgrading a system, anything below a 1040 isn't worth buying at the moment (and even that is only really a worthwhile upgrade for older systems), because anything less can't really handle current-gen games, and even then the value isn't really there.

    Most current-gen low-to-mid-range cards cost too much to be a worthwhile purchase weighed against the readily available used cards. The GTX 970 and 960 have a lot of usable life left in them and can be had for around $200 and $100 respectively.
  • DanNeely - Sunday, April 8, 2018 - link

    I assume you mean a 1050, since there's no such thing as a 1040. One of those or a used card from the previous generation is going to be a lot more bang for the buck if it'll fit into your system and not need a PSU upgrade. But OEM systems often won't have the PSU headroom for even a 20 or 30W upgrade, and small-form-factor systems often can only fit single-slot or half-height cards (there were a few half-height 1050s released, but the stupidly high prices listed on Amazon suggest they've been sold out long enough to attract attention from scalpers).

    And of course, despite what some people in the commentariat always seem to think, not all gamers have an enthusiast budget. If you can't afford the $100 card, the $70 card that lets you go from can't-play to 720p low, or from 720p low to 720p high, is still a solid upgrade when the alternative is nothing at all.
  • RaistlinZ - Saturday, April 7, 2018 - link

    How will I heat my home in the winter now?
  • Raylit20 - Saturday, April 7, 2018 - link

    With a Q6600!
  • Samus - Saturday, April 7, 2018 - link

    Overclocked, no less! 105W is just the beginning!
  • Alexvrb - Saturday, April 7, 2018 - link

    That's no joke. I have a buddy with an overclocked Q6600, and if you were going by thermals you'd think it was an overclocked, overvolted BD-based architecture.
  • Alexvrb - Saturday, April 7, 2018 - link

    had*

    The buddy is still around but the CPU was retired a couple years back.
  • Achaios - Saturday, April 7, 2018 - link

    I retired (eBayed) my Gigabyte GTX 580 SOC in Nov 2014. Most gamers should have moved on from Fermi years ago.
  • stardude82 - Sunday, April 8, 2018 - link

    If $300 GTX 970s didn't move people to upgrade, $170 RX 470s should have. Sub-$100 cards like the GTX 950/1050 and RX 460/560 should have ensured Fermi had no real market, given its power consumption. Maybe now it's a viable stopgap card?
  • Dribble - Monday, April 9, 2018 - link

    My son is still using my old GTX 570, and it still plays everything fine.
  • ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Saturday, April 7, 2018 - link

    Hmm? Google must be broken.....

    Kepler Support ends when?

    Maxwell Support ends when?

    Pascal Support ends when?

    Volta Support ends when?
  • Ryan Smith - Sunday, April 8, 2018 - link

    NVIDIA doesn't have a hard lifecycle policy in place for GPU support. However no GPU family since the switch to unified shaders has been supported for less than 8 years. So that would be 2020, 2022, 2024, and who knows, respectively.
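Ryan's projected years follow from simple arithmetic on launch dates. A minimal sketch, assuming the approximate launch years and the "at least 8 years" floor described above (NVIDIA publishes no official lifecycle policy, so these are projections, not commitments):

```python
# Rough driver end-of-support projection, assuming every architecture
# gets at least the ~8 years of support that Fermi received.
# Launch years are approximate; NVIDIA publishes no hard policy.
LAUNCH_YEARS = {"Kepler": 2012, "Maxwell": 2014, "Pascal": 2016}
MIN_SUPPORT_YEARS = 8

for arch, launched in sorted(LAUNCH_YEARS.items(), key=lambda kv: kv[1]):
    print(f"{arch}: supported until at least {launched + MIN_SUPPORT_YEARS}")
```

Running this reproduces the 2020 / 2022 / 2024 estimates in the comment.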
  • ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Sunday, April 8, 2018 - link

    So.....8-10 years
    Good to know, even if there is no hard lifecycle policy in place

    Besides....
    Hard end-of-life support policies are only as good as the company that wrote them
    (Microsoft)
  • MelvisLives - Saturday, April 7, 2018 - link

    1000W and above PSU lines just had their sales predictions adjusted way down...
  • rocky12345 - Saturday, April 7, 2018 - link

    I'm surprised they didn't drop Fermi a long time ago. Just like AMD has done now, Nvidia is also dropping 32-bit Windows support. Someone should send MS a note and tell them to take the 32-bit install option away when installing Windows 10. I guess it does still give the option of upgrading an older Win 8.1 or Win 7 32-bit install to Win 10 32-bit, but if there will be no real up-to-date drivers, what's the point?
  • haukionkannel - Saturday, April 7, 2018 - link

    There still are 32-bit Atoms... So not yet...
    But in 5 more years, maybe the 32-bit version will die... Maybe...
  • wolrah - Saturday, April 7, 2018 - link

    Pretty much the entire point of Windows 10 32-bit is non-updated drivers. It exists to run old hardware which has never seen a 64-bit driver release, or 16-bit-era apps that don't run on 64-bit Windows.

    There are technically a few 32-bit Intel CPUs that are still supported to some extent, but anyone trying to actually use them as a desktop system is insane.

    Of course, in either case I say good riddance; anyone who still feels that they "need" to run 32-bit drivers or 16-bit apps has been ignoring the writing on the wall for over a decade that they need to replace their ancient trash. It's long past time to cut them off and force them to stop being cheap.
  • Alexvrb - Saturday, April 7, 2018 - link

    And if you have mission critical software that can't easily be replaced, VMs. The underlying OS should definitely be 64-bit, at least on semi-modern hardware.
  • DanNeely - Sunday, April 8, 2018 - link

    In the ideal case yes, but a lot of embedded/industrial/scientific control systems need more direct access to raw hardware than you can get in a VM. If you as the IT guy stamp your feet and try insisting to your bosses that a perfectly working six to eight figure piece of equipment needs to be replaced because you don't want to deal with the crappy old computer it runs on *something* will be replaced. The odds are overwhelmingly high it won't be the hardware though. Even in the low five figure range you'd be fighting a very uphill battle to retire it as long as it still worked.
  • deepblue08 - Saturday, April 7, 2018 - link

    It was a great card for its time, except that it ran at absolutely insane temperatures. Mine died after 2 years, although my gaming sessions during that time were quite insane in length as well.
  • Dragonstongue - Saturday, April 7, 2018 - link

    Back when they made "full fat" cards they still used under-spec component selection, like they currently do on the VRegs and capacitors, and even then they did not give full DX11/12 support ^.^
    (super high temperatures because of messy amps/volts)

    I have had my 7870 since about 4 months after launch, which was in 2013, and it's still running perfectly to this day. There is the difference between "quality" and "speed" when it comes to AMD and Nvidia.

    Seems to me, IMO, the only thing Nvidia have gotten better at is chopping more away and emulating where they can to keep thermals/power use (average) in check while ramping clock speeds up, versus building a better product over the years. Same crappy capacitor/VReg selections etc; they seem to use the "minimum" grade of components versus higher than needed.

    I stick with Radeon for many reasons. Even if they do not run at mach 2 speeds, at least they do not emulate things that end up looking "crappy" to my eyes, and they do what they are able to do via the hardware alone.

    anyways ^.^
  • eddman - Sunday, April 8, 2018 - link

    "emulate"? What are you on about?
  • OrbitingCow - Sunday, April 8, 2018 - link

    @Dragonstongue

    Strange post, to be honest. You are advocating AMD because of some random cap thing you have going that basically does nothing, while saying that GameWorks, PhysX, Ansel, OpenGL, and everything else Nvidia does better does not matter?

    That seems pretty crazy to me. I like AMD CPUs these days, but their GPUs, no thanks. Nvidia has way too many QoL things no gamer can ignore outright, if you ask me. No way am I playing games like Witcher 3 and all the rest of them without Nvidia cards.
  • Hurr Durr - Monday, April 9, 2018 - link

    People not right in the head are a significant chunk of the AMD userbase; what do you want.
  • RSAUser - Monday, April 9, 2018 - link

    I have been an AMD user since the R9 290X was a great deal; it still runs near everything at max 1080p.
    Price/performance is still my most important metric (in performance I include power usage and noise). Using the Chill feature is what makes me like it more than my 1060 machine.
  • Spunjji - Monday, April 9, 2018 - link

    Well done, you managed to make this section of the comment thread stupider than it was with just the AMD fanboy fanboying.
  • Marburg U - Sunday, April 8, 2018 - link

    Unfortunately nVidia bricked my GTX 460 in March 2016 with their 364.72 WHQL-certified drivers. That was quite a thing....

    Still using a GTX 540M on a Sandy Bridge laptop, and I won't retire it for quite some time.
  • 0ldman79 - Sunday, April 8, 2018 - link

    DanNeely knows of what he speaks.
  • HollyDOL - Monday, April 9, 2018 - link

    It was about time to let it go...

    While a certain amount of backward compatibility is nice, I can't wait to see the day when no NEW apps are released as 32-bit (looking especially at games and Visual Studio (with ReSharper) hitting the 2/2 or 3/1 RAM limit more and more often). Spend too much on backward compatibility and you won't have anything left to move forward.
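The "2/2 or 3/1" limit refers to how 32-bit Windows splits a process's 4 GiB virtual address space between the application and the kernel. A quick back-of-the-envelope check (the split values are standard 32-bit Windows behavior via the `/3GB` boot option and `/LARGEADDRESSAWARE` linker flag, not anything specific to the apps named above):

```python
GIB = 2 ** 30
total = 2 ** 32        # a 32-bit pointer can address 4 GiB in total

# Default Windows split: 2 GiB for the app, 2 GiB for the kernel.
# With the /3GB boot option plus /LARGEADDRESSAWARE-linked binaries,
# the split becomes 3 GiB app / 1 GiB kernel.
default_app = total // 2
large_addr_app = 3 * GIB

print(f"total: {total // GIB} GiB, "
      f"default app space: {default_app // GIB} GiB, "
      f"3/1 split app space: {large_addr_app // GIB} GiB")
```

So even in the best case, a 32-bit game or IDE tops out at 3 GiB of usable address space, which is the ceiling the comment is referring to.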
  • James5mith - Monday, April 9, 2018 - link

    According to the latest Steam Survey, this affects less than 2% of users in the Windows space. (The discontinuation of 32-bit support.)

    Good riddance to 32-bit OS support. It should have died nearly a decade ago, when all hardware was 64-bit capable and WoW64 was already pretty much solidified and fully functional.
  • DanNeely - Monday, April 9, 2018 - link

    There are more gamers than just those on Steam, and more people with discrete cards than just gamers. But yeah, pulling the plug on 32-bit drivers at this point isn't a surprise.

    I suspect 32-bit still has a larger share in the broader world, mostly due to people who are still using 7-10 year old machines for web browsing, but without StatCounter or NetMarketShare breaking down OS data by 32/64-bitness in their public data, it's hard to get a read on what their overall share is.
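Part of why those stats are murky: "32-bit" can describe the OS, the CPU, or just the process reporting the data. A small illustrative Python sketch of the distinction (using only the standard library; on 64-bit Windows, a 32-bit process under WoW64 would show a 32-bit process on a 64-bit machine):

```python
import platform
import struct

# struct.calcsize("P") is the size in bytes of a C pointer in the
# running interpreter: 4 for a 32-bit process, 8 for a 64-bit one.
process_bits = struct.calcsize("P") * 8

# platform.machine() reports the CPU/OS architecture, which can be
# 64-bit (e.g. "AMD64") even when the running process is 32-bit.
machine = platform.machine()

print(f"process: {process_bits}-bit, machine: {machine}")
```

Trackers that only see one of these two values will conflate 32-bit browsers on 64-bit Windows with genuine 32-bit OS installs.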
  • gggirlgeek - Monday, June 18, 2018 - link

    A 32-bit OS runs much faster on older computers. Some CPUs, like the Core 2 Duo, are still very relevant, especially paired with an inexpensive GT 640. My boyfriend's computer with these specs starts up faster than my 64-bit one (same startup apps), and he runs Steam games on it all day. He just can't play the higher-end games requiring an i5 CPU and better graphics, like Doom. We'd have to build an entire new system for that. Otherwise, my high-end desktop does nothing faster than his old 32-bit machine. The exception is 4K video and gaming. For "most" people that's not an issue.
