NVIDIA has diversified its investments, with a heavy bet on deep learning. In the coming years machine learning is going to be ubiquitous, and NVIDIA will play a huge role in it.
AMD needs to leverage its ability to develop both CPUs and GPUs in house and catch up. I sincerely hope NVIDIA doesn't become the Intel of GPUs.
"NVIDIA doesn't become Intel of GPU" ~ that usually happens when enthusiasts don't punish Nvidia for the stunt they pulled with the GTX 970, Gameworks, Kepler performance et al. Even with the latest Fury (Non X) you have an unlockable card & excellent VFM product, but as always the value proposition that Nvidia brand brings to the table for'em is hard to quantify.
For Enterprise you could assume the value of CUDA & Nvidia's software/hardware ecosystem playing a crucial role for their dominant role in that part of the market segment. However no Enterprise customer would let Nvidia off the hook with that ~3.5GB fiasco, the question is why do normal users put up with that kind of schtick?
Samsung was hit with a fine for astroturfing: hiring people to post comments favorable to its products and unfavorable to its competitors'. I think a certain other company is doing this more successfully.
Yep, I see certain green people posting multiple comments on multiple sites every day. Those people either don't need to work, don't have a life, or are just getting paid to do it.
But forum posters are one thing; the tech press is another. For the same kind of news, they will run damage control or downplay the importance of bad news if it has to do with Nvidia, but they will start shouting all over the place, posting articles and analysis, if it has to do with AMD.
Nvidia driver problems with browsers and Windows 10? Nothing. Shield problems with its battery? Nothing to see here. AMD's long period between WHQL drivers? Full analysis: AMD failed, a sign that AMD is going bankrupt, with no mention of the beta drivers that were posted and were also stable.
Nvidia lying about the specs of a card? A period of lies, six months at least, then damage control and full acceptance of Nvidia's excuses. AMD's pump problem in the first batch, fixed immediately? Full-scale articles, pump noise compared to a train; conclusion: AMD and the Fury X are a failure.
"Nolifers" are extremely common, much more common than you may think. A very large percentage of World of Warcraft gamers are "nolifers" for example. I can easily picture a WoW nolifer and NVIDIA fanboi alt+tabbing from WoW to write down a panegyric on his NVIDIA GPU.
"Shield problems with it's battery? Nothing to see here."
Outright lie. Every hardware news site has covered this. But when it's affected a grand total of FOUR people, it's hardly newsworthy.
"AMD long period between WHQL drivers. FULL analysis, AMD failed, a sign that AMD is going bankrupt. No mention about the beta drivers that where posted and where also stable."
I think you might want to check up on the definition of the word "beta" in relation to the word "stable".
"Nvidia lying about the specs of a card. Period of lies. 6 months at least. Damage control, full acceptance of Nvidia's excuses."
nVIDIA has apologised and said the same thing won't happen again. And the "3.5GB" issue doesn't affect performance anyway, so again it's not newsworthy. The GTX 970 was sold as a card with 4GB of addressable memory, and it is.
"AMD pump problem in the first batch. AMD fixes it immediately. Full scale articles, pump noise compared to a train, conclusion, AMD and Fury X a failure."
AMD promised that the liquid cooler would make the Fury X the quietest card ever. The pump noise rendered that claim false. AMD probably would've got a pass on that issue, if not for the fact that they outright lied about Fury X's performance and overclocking abilities as well.
About Shield: you downplay it yourself, and so did the press. They just reported it, no extra sauce. Excuses.
AMD's beta drivers didn't come with problems, while Nvidia's WHQL drivers were having problems with the Chrome browser, leading to crashes of the GPU driver. Maybe you should explain to us Nvidia's definition of a "WHQL driver". Double standards.
Nvidia apologized? REALLY? Should I sell you something different from what I advertised and then try to get away with a simple apology? Again you are covering for Nvidia.
About the pump: AMD fixed that problem instantly. Again you accept a simple apology from Nvidia while accusing AMD. Double standards again.
By the way, it was Cooler Master's mistake with the pump, but OK, let's make it AMD's. Even blaming AMD, this whole story was about the first cards, and it was fixed instantly. You are doing what the press did: taking a minority of cards and making it look like all the cards have noisy pumps. That's why the tech sites rushed to provide full analyses of the issue; they wanted to have articles ready before AMD fixed it. Now and in the future, all those articles will be there for people to google and read six months from now, when the problem with the Fury's pump is nonexistent. That was the whole idea. That's why there were full analyses and videos and everything.
"About Shield, you downgrade it yourself. The same did the press. Just reporting it, no extra sauce. Excuses."
So you're agreeing it's a nonissue.
"AMD's beta drivers didn't came with problems. WHQL drivers from Nvidia where having problems with Chrome browser, leading to crushes of the GPU driver. Maybe you should explain us what is Nvidia's definition for "WHQL driver". Double standards."
While I agree that the Chrome TDR issue was a nVIDIA f**kup, the fact is that Microsoft certified those buggy drivers, so they should share the blame.
"Nvidia apologize? REALLY? REALLY? Should I sell you something different from what I advertise you and then try to get away with a simple apology? Again covering Nvidia from your part."
Nothing excuses that nVIDIA c**ked up conveying the specs of the 970's memory partitioning, but it doesn't affect performance for the vast, vast majority of people (including me) who bought that card. Which makes it a nonissue unless you're routinely maxing out the GTX 970's VRAM, in which case you will almost certainly find a 980/Ti or Fury/X to be a better purchase.
"About the pump. AMD fixed that problem instantly. Again you accept a simple apology from Nvidia while accusing AMD. Double standards again.
By the way, it was CoolerMaster's mistake with the pump, but OK let's make it AMD's. Even blaming AMD here, this whole story was about the first cards."
If I buy a Ford vehicle with an nVIDIA Tegra computer in it, and the computer breaks because of a manufacturing defect, do I return the vehicle to nVIDIA? Does nVIDIA get the bad press? No, the blame (rightly or wrongly) falls on Ford. It's all about perception. Similarly, in the case of Fury X, AMD was perceived to have a "faulty" product, even though the fault lies with Coolermaster. Personally I find the pump noise to be a nonissue, but when AMD was selling Fury X as the quietest high-end card in town, and then not delivering the quietest card, it was low-hanging fruit for the tech press to grab. That doesn't make it right, but scandal sells...
Because real people care about real performance, and none of the benchmark results changed? The card was even tested in SLI @ 3840x2160 before the news broke, without any obvious signs of trouble: http://www.guru3d.com/articles_pages/geforce_gtx_9...
When you consider that AMD rebranded GCN 1.0 cards as new in 2015, I'd rather take an honest mistake over AMD's intentional dishonesty any day. The GTX 970 is still a kick ass card with corrected specs; why do you think it's the most popular dGPU on the market, with sales that keep accelerating? http://store.steampowered.com/hwsurvey/videocard/
As usual, all the AMD fans can do is wank over the PowerPoint presentations that show Zen and the R9 Nano will be great, now that we know Fury is barely 980 Ti-ish despite all the hype. Wake up and smell the coffee: people don't buy spec sheets and PowerPoints, they buy products. And there nVidia has delivered, while AMD has been an utter disappointment the last few years.
The GT 730 DDR3 version is based on Fermi. It is a rebranded GT 430, and it was also the GT 630 before becoming the GT 730.
The funny thing is that there are multiple versions of low-end Nvidia cards with the same model number that are NOT the same cards. There are two more versions of the GT 730 with Kepler GPUs. Nvidia has made a habit of misleading people about its cards' specs: you see three GT 730s and they all perform differently from each other. Until Nvidia provides a DX12 driver for the Fermi cards, one of those GT 730s isn't even DX12 compatible.
But ignore these things. It's Nvidia's rebranding, Nvidia's misleading, Nvidia's lies, so it is perfectly fine. I am pretty sure that you will respond with plenty of excuses.
The GT 730 is technically available at retail, but essentially no one buys it; it's effectively an OEM-only card, so its tech specs don't much matter. It's just there to provide low-end video output to systems that don't have an iGPU.
There's a big difference between rebranding one ultra-low-end product, and rebranding basically your entire product stack. No one cares if AMD wants to rebrand the old 5450 chip for $19.99 OEM cards, but when they bring back Pitcairn from 2012 and try to seriously position it as a viable gaming product in 2015, this is when they start to get pushback - and rightly so.
Pitcairn was an OEM rebrand, so according to you no one cares (although apparently you do). The Olands are part of the very low-powered R7 and R5 OEM lines as well. Also, the only retail Pitcairn card aimed at gaming was the R9 370, and that's a Pitcairn refresh; the R9 370 sits in the entry-level realm of GPUs, and it does fine there.
I get 35 models on a Greek price-comparison site; I bet if I try a US site I will find more. Prices start at over 50 euros and go past 90 euros. Not exactly $19.99.
There are Fermi-based models and Kepler-based models, DDR3 and GDDR5, 64-bit and 128-bit, with 1GB, 2GB or 4GB.
The difference in performance between the Kepler-based 64-bit DDR3 model (14.4 GB/s) and the Kepler-based 64-bit GDDR5 model (40 GB/s) is just huge.
The third DDR3 model, with the Fermi-based GPU (only 96 CUDA cores compared to 384 on the Kepler models), isn't even DX12-capable today.
The differences between these three models are scandalous, to say the least. You buy one card and you get something totally different if you don't know.
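For what it's worth, the bandwidth gap between those two Kepler variants falls straight out of bus width times memory data rate. A rough sketch (the data rates here are assumptions for typical boards; individual cards vary):

```python
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s: transfers/s times bytes per transfer."""
    bytes_per_transfer = bus_width_bits / 8
    return data_rate_mtps * 1e6 * bytes_per_transfer / 1e9

# Assumed effective data rates: 1800 MT/s DDR3, 5000 MT/s GDDR5,
# both on the same 64-bit bus.
ddr3 = mem_bandwidth_gbps(64, 1800)    # 14.4 GB/s
gddr5 = mem_bandwidth_gbps(64, 5000)   # 40.0 GB/s
print(ddr3, gddr5)
```

Same model number, same bus width, nearly 3x the memory bandwidth, which is exactly why the variants perform so differently.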
Well, the thing is, Pitcairn is a viable gaming product. That's why the "outrage" over the rebrands is ridiculous. All of the rebrands are still competitive. Why should AMD create a whole new architecture when they can just move the previous one down? They're all on the same process node anyway. Once we finally get a new process node the story may be different, but right now there is not really that much to be gained by pumping out a new architecture for the mid to low end.
"The GTX 970 is still a kick ass card with corrected specs"
Do you mean Nvidia has now finally changed its website to no longer claim the impossible combination of 224 GB/s and 4 GB of VRAM?
Because the last time I looked, it was still wrong.
As for being "kick-ass"... The 970 was intentionally hobbled with a partition that operates at a ridiculously slow 28 GB/s, half the speed of the VRAM on the 8800 GT from 2007. There is no excuse for that in an enthusiast card, period.
Nvidia probably has games fill that partition with cached data to hide the massive performance problem associated with it, effectively rendering the 970 a 3.5 GB card without people knowing it. Convenient for those who specifically purchased two for SLI on the theory that the 980 wasn't a good value because it shipped with the same amount of VRAM. Too bad for them!
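Those figures are easy to sanity-check: the 970 has eight 32-bit GDDR5 channels at an effective 7 GT/s, with the slow 0.5 GB partition hanging off a single channel. A back-of-the-envelope sketch (peak theoretical numbers only):

```python
def segment_bandwidth_gbps(channels: int, channel_width_bits: int = 32,
                           data_rate_gtps: float = 7.0) -> float:
    """Peak bandwidth of a group of memory channels, in GB/s."""
    return data_rate_gtps * channels * channel_width_bits / 8

full_card = segment_bandwidth_gbps(8)   # 224 GB/s, the advertised figure
fast_seg = segment_bandwidth_gbps(7)    # 196 GB/s, the 3.5 GB partition
slow_seg = segment_bandwidth_gbps(1)    #  28 GB/s, the 0.5 GB partition
# Note: the 224 GB/s headline assumes all eight channels are read in
# parallel; on the 970 the two segments cannot be accessed simultaneously,
# which is the crux of the complaint.
print(full_card, fast_seg, slow_seg)
```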
NVIDIA does rebrands too. A crazy example is the GT 640: there were at least five different variations, powered by at least three separate chips, one of them a rebranded Fermi part. At least with the current AMD lineup, you know you're buying a rebrand. A better-known example of NVIDIA rebranding is the 8800 GTS, which was rebranded across several generations.
What stunt? Even with 500MB less VRAM, the benchmark results speak for themselves. The GPU isn't powerful enough for 4K gaming anyway, which is the only area where more than 3.5GB of VRAM is necessary.
If NVidia should be punished for something, it should be its back-and-forth on locking down the overclocking capability of mobile GPUs, but in the end it did listen to the community and reintroduced the functionality.
Overall, NVidia is doing OK, but everyone here is right: it's important they do not become a monopoly, and frankly AMD's latest and greatest is competitive on performance and especially price. The only problem is that NVidia is substantially more power efficient, which speaks to the many people who consider efficiency a sign of superior engineering.
That's the problem with SLI: you always increase your GPU power beyond your VRAM capacity. SLI 970s are going to be memory limited at 3.5GB or 4GB, no matter how you look at it. Even 6GB isn't enough in most cases to balance the raw GPU resources of two 970s, which together are more powerful than a GTX 980 Ti (which comes with... you guessed it, 6GB of VRAM).
I ran into this problem on my last two SLI setups, which is why I gave up on SLI. On both my GTX 460 and GTX 570 SLI rigs, I hit the memory bottleneck (1GB and 1.25GB respectively) when enabling higher-quality textures. To scale, the GTX 970 with 3.5GB is very balanced, but by no means appropriate for SLI if your goal is to run higher resolutions or higher-quality textures (which is usually the whole point of SLI).
The counterargument is that most AAA DX12 games are going to be optimized for more than 4GB of VRAM going forward, because the PS4 and Xbox One have 8GB of RAM, the vast majority of which is used for textures and GPU resources. But that doesn't mean the GTX 970 isn't futureproof for at least another two years.
People specifically purchased the 970 in pairs in large numbers because it was supposed to have the same amount of useful VRAM as the 980.
Fraud. Period.
Stuffing dummy data into that partition so the game never uses it (to avoid stutter but make it look like all 4 GB is being used), or caching data that rarely gets accessed, are both cute, but they aren't adequate.
GTX 970 "stunt"? It was a good product with good performance at a good price. There was an initially undocumented performance limitation with the memory, but it generally didn't affect real-world performance. The card still compared favourably to AMD hardware even after the limitation came to light.
GameWorks on the other hand, they need to stop pushing that mess.
Because most users know it's still 4GB, and even AMD-loving AnandTech can't find where it hurts the 970. Would you have been happier if they had just left off the 512MB? I'd rather have them put it in and optimize the drivers to only stick data in there that isn't speed-dependent. They seem to be doing a great job if nobody can find its weak spot, right?
At the end of the day, did the card perform ANY differently than the reviews said? NOPE. That is WHY customers couldn't care less. Spec sheets don't matter if you win anyway... LOL.
If you buy hardware based on paper specs, you're setting yourself up for disappointment. Sure, it may mean the GTX 970 isn't going to be useful for as long as you'd like, but realistically that's 3+ years down the road, when the 900 series will be a footnote in the back of everyone's minds for being obsolete.
The GTX 970 delivered even without knowing about this issue. Even if we did know of this issue and we saw the results, then we'd only add a caveat that when 4K gaming finally takes off, the GPU may not be as suited as those with more memory. But then again, 4K gaming (the PCMR requirement of 4K@60FPS with maxed out settings on Crysis 3 level) isn't poised to take off for at least two generations at the rate things are going.
You bought the hype. Nvidia consistently lags in OpenCL performance and tries to push its own proprietary tools. It's already Intel, and it will also suffer the same fate: irrelevancy.
What Intel, Nvidia and others don't get is that the computing industry hasn't changed in 50 years: lower cost with greater performance has always driven the industry.
Companies' financial/fiscal years are not aligned with the calendar.
"July 21, 2015 Microsoft Fiscal Year 2015 Fourth Quarter Earnings" "Apple® today announced financial results for its fiscal 2015 third quarter ended June 27, 2015"
So Microsoft's fiscal year 2015 ran from the beginning of Q3 2014 to the end of Q2 2015, while Apple's fiscal year 2015 is still ongoing, running from Q4 2014 to the end of Q3 2015.
Nvidia goes even further: its fiscal year aligns with the calendar year but is labeled one year ahead, putting it half a year ahead of Microsoft's scheme and three quarters of a year ahead of Apple's.
Fiscal years are the inmates running the asylum in accounting. Don't try to understand it; just accept that companies make this shit up with no relation to the actual calendar.
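The labeling games above can still be captured in a few lines. A sketch at month granularity (real fiscal calendars end mid-month, e.g. Nvidia's FY closes in late January and Apple uses a 52/53-week year, so this is only an approximation):

```python
from datetime import date

def fiscal_year(d: date, start_month: int) -> int:
    """Fiscal-year label for date d, where the fiscal year starts in
    start_month and is labeled by the calendar year in which it ends."""
    return d.year + 1 if d.month >= start_month else d.year

# Microsoft's FY starts in July: July 2014 - June 2015 is "FY2015".
print(fiscal_year(date(2014, 8, 1), start_month=7))   # 2015
# Nvidia's FY starts (roughly) in February, so most of calendar 2015
# falls in "FY2016", one year "ahead" of the calendar.
print(fiscal_year(date(2015, 3, 1), start_month=2))   # 2016
```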
One thing that isn't mentioned here is that the Tegra division lost $57 million last quarter, and obviously more this quarter. Tegra has lost money every quarter since its inception (nine years ago?). Nvidia doesn't mention the Tegra operating loss in the press release, because it looks bad, but you can find it in the Form 10-Q.
They need to find some better markets for that thing. They should be pushing Chromebooks and stuff like that.
One of the problems I've noticed with the ARM stuff is that it takes way too long between a design being ready for production and it actually shipping in products. In some ways, the Tegra X1 (both CPU and GPU) is "old technology" by now. Compare this to Intel, where they essentially have partners shipping products the day they announce their chips.
But I do think that companies like nvidia are going to have to start pushing non-x86-tethered platforms if they want to open opportunities for Tegra (so Android, ChromeOS, etc.).
When you write these, you don't really need to mention the "non-GAAP" results, because they're total hogwash. If you don't follow GAAP, you can just make things up or manipulate the results. Our financial system is totally dependent on Generally Accepted Accounting Principles, and if you throw those out the window, your data is meaningless.
Non-GAAP is at least useful as a "we couldn't even manipulate our way into a profit" data point. Companies frequently use their non-GAAP results to highlight the strengths of their profitable decisions, even if they "officially" have to report their missteps in their GAAP numbers. That is, posting a profit in both is great; a loss in GAAP but a profit in non-GAAP should trigger some analysis; and a loss in both means that your management, your business model, or both are outright failing.
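That triage rule can be written down directly. A toy sketch, not accounting advice (the labels are mine, and the fourth quadrant is rare):

```python
def earnings_signal(gaap_income: float, non_gaap_income: float) -> str:
    """Rough triage of a quarter from its GAAP and non-GAAP bottom lines."""
    if gaap_income > 0 and non_gaap_income > 0:
        return "healthy"          # profitable even under the strict rules
    if gaap_income <= 0 and non_gaap_income > 0:
        return "needs analysis"   # profitable only after excluding charges
    if gaap_income <= 0 and non_gaap_income <= 0:
        return "failing"          # couldn't even adjust into a profit
    return "unusual"              # GAAP profit, non-GAAP loss (one-off gains)

print(earnings_signal(-26.0, 40.0))   # "needs analysis"
```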
GAAP is complete hogwash. If you don't report your numbers under IFRS, you might as well not report them. Using GAAP is like using furlongs per fortnight: it differs across countries, and no one knows where you are hiding that shell company.
This has not been a good year for acquisitions. AMD writing off SeaMicro, Microsoft writing off Nokia, and now Nvidia writing off Icera.
One might almost think that many corporate acquisitions are about empire-building and self-enrichment by executives, not about what actually makes sense for the corporation's core business plan.
Most acquisitions are about using money a corporation can't return to the country without paying tax on it. Then whatever they buy gets gutted into the mothership and the shell is written off. For good or bad, that's an accepted strategy with investors. They didn't even flinch when Microsoft had to write down aQuantive, which was deemed to be worth about two Nokias.
NVIDIA is a GPU company. They try to expand into unrelated markets, and it doesn't work out. About the only place Tegra is being used is in the company's own gaming tablet, which isn't exactly a stellar sales success. Purchasing a modem company? What does that have to do with computer graphics? Absolutely nothing.
When will these companies learn that doing one thing, and doing it really well, is the key to success. Trying to do everything, just because it seems like it is a profitable venture, usually doesn't work out.
NVIDIA, stick to doing what you do best. You do it well, and you won't have to worry about a major division, unrelated to your core business, dragging you down.
I think they rightly see that the rise of IGPs / APUs is directly impacting revenue, and doesn't bode well for the future. So they have to make up that revenue somehow. And that means taking risks on unknown markets.
You forgot to mention NV handily beat the street (1.01B expected revenue), which is why it was up ~10%! How can you leave out what everyone else had as the headline?
Integrated modems won't matter once die shrinks make the power draw of discrete modems drop just like everything else's. Many vendors throw in a Qualcomm or Intel modem even today.
Not so sure Intel's fees end in 2017, as they'll have to come up with something themselves that doesn't trample NV's patents again, or pay up anyway. Considering the Markman hearing showed the judge favoring NV on 6 of 7 patents, I'm not sure NV needs to make any money in mobile right now when they're likely to get billions from WILLFUL infringement, plus long-term license fees from Samsung/Qualcomm/Apple etc. It's just a matter of figuring out who's most responsible and for how much. When you consider MSFT makes ~$7-8B a year now at $4/unit sold with Android, you can easily see how NV has a valid argument for a few bucks themselves. Qualcomm charges a percentage of the ENTIRE device whenever their modem is inside (they're in court in China now over this, but it's a valid argument until the USA shoots it down too).
Anand himself didn't call it the wild west of patent infringement for nothing, right? Seeing Intel's $1.5B loss in the same type of suit (over NV's GPUs), I can't see how Samsung (not a US company), with profits dwarfing Intel's ($22-30B vs. Intel's ~$8B now), could get away with less than that plus license fees for a decade or more. Those patents are from 1999-2001, and companies will keep trampling desktop GPU patents for years after those, since NV and AMD hold pretty much every patent that puts a game on your screen from the last 15-20 years. I expect an AMD suit too, if they have anything, as soon as NV wins and gives them a leg to stand on [cheaply] in court. I doubt Intel will get out of the patents going forward either; they will either have to pay up again (a new contract) or remove GPUs from their CPUs at some point. Just like Qualcomm with modems and radio signals, there is no radically different way to get a game on your phone or tablet that desktop GPUs haven't already patented over the last 15 years.
My point is, they're already diversified; you're just not getting it. A license fee and a fine are just as good as selling an SoC. Just ask Microsoft about its $7-8B a year from 2B Android units (a billion alone from Samsung over the last two years... LOL). Also note it won't help Samsung that it was LATE paying MSFT what it had agreed to pay, and MSFT added a late fee... ROFL. A jury won't like a non-US company making $22-30B a year that doesn't pay its debts pilfering US patents, correct? I can't wait to see that sticker-shock price next to the $1.5B from a NON-willful case (Intel became infringing for 2-3 years when the chipset deal broke, not willfully). NV makes ~$600M a year; merely $1 per Android device nets you $2B of pure income. MSFT doesn't even have to keep pouring R&D into new lines of Android code for its $3.5-4 a pop per device. Again, a great reason for a jury to award NV big when NV has to keep pouring $1.2B a year into GPU tech. Google can't seem to figure out a way out of the code after years of Android.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
55 Comments
Back to Article
karthik.hegde - Friday, August 7, 2015 - link
NVIDIA has diversified its investments, a lot into Deep learning. In the upcoming years, Machine Learning is going to be ubiquitous and NVIDIA will play a huge role in it.AMD needs to leverage its capability to have CPU & GPU developed in house, and come up. I sincerely hope, NVIDIA doesn't become Intel of GPUs.
R0H1T - Friday, August 7, 2015 - link
"NVIDIA doesn't become Intel of GPU" ~ that usually happens when enthusiasts don't punish Nvidia for the stunt they pulled with the GTX 970, Gameworks, Kepler performance et al. Even with the latest Fury (Non X) you have an unlockable card & excellent VFM product, but as always the value proposition that Nvidia brand brings to the table for'em is hard to quantify.For Enterprise you could assume the value of CUDA & Nvidia's software/hardware ecosystem playing a crucial role for their dominant role in that part of the market segment. However no Enterprise customer would let Nvidia off the hook with that ~3.5GB fiasco, the question is why do normal users put up with that kind of schtick?
Oxford Guy - Friday, August 7, 2015 - link
Samsung was hit with a fine for astroturf, hiring people to post comments favorable to its products and unfavorable competitors'. I think a certain other company is doing this more successfully.yannigr2 - Friday, August 7, 2015 - link
Yeap, I see certain green people posting multiple comments and multiple sites every day. Those people either don't need to work, don't have a life, or they are just getting payed for doing that.But forum posters is one thing, the tech press is another. For the same things, they will take part in damage control or downgrade the importance of the bad news if it has to do with Nvidia, they will start shouting all over the place, post articles and analysis, if it has to do with AMD.
Nvidia driver problems with browsers and Windows 10. NOTHING.
Shield problems with it's battery? Nothing to see here.
AMD long period between WHQL drivers. FULL analysis, AMD failed, a sign that AMD is going bankrupt. No mention about the beta drivers that where posted and where also stable.
Nvidia lying about the specs of a card. Period of lies. 6 months at least. Damage control, full acceptance of Nvidia's excuses.
AMD pump problem in the first batch. AMD fixes it immediately. Full scale articles, pump noise compared to a train, conclusion, AMD and Fury X a failure.
I could write more. Never mind.
Achaios - Friday, August 7, 2015 - link
"Nolifers" are extremely common, much more common than you may think. A very large percentage of World of Warcraft gamers are "nolifers" for example. I can easily picture a WoW nolifer and NVIDIA fanboi alt+tabbing from WoW to write down a panegyric on his NVIDIA GPU.The_Assimilator - Friday, August 7, 2015 - link
"Shield problems with it's battery? Nothing to see here."Outright lie. Every hardware news site has covered this. But when it's affected a grand total of FOUR people, it's hardly newsworthy.
"AMD long period between WHQL drivers. FULL analysis, AMD failed, a sign that AMD is going bankrupt. No mention about the beta drivers that where posted and where also stable."
I think you might want to check up on the definition of the word "beta" in relation to the word "stable".
"Nvidia lying about the specs of a card. Period of lies. 6 months at least. Damage control, full acceptance of Nvidia's excuses."
nVIDIA has apologised and said the same thing won't happen again. And the "3.5GB" issue doesn't affect performance anyway, so again it's not newsworthy. The GTX 970 was sold as a card with 4GB of addressable memory, and it is.
"AMD pump problem in the first batch. AMD fixes it immediately. Full scale articles, pump noise compared to a train, conclusion, AMD and Fury X a failure."
AMD promised that the liquid cooler would make the Fury X the quietest card ever. The pump noise rendered that claim false. AMD probably would've got a pass on that issue, if not for the fact that they outright lied about Fury X's performance and overclocking abilities as well.
yannigr2 - Friday, August 7, 2015 - link
About Shield, you downgrade it yourself. The same did the press. Just reporting it, no extra sauce. Excuses.AMD's beta drivers didn't came with problems. WHQL drivers from Nvidia where having problems with Chrome browser, leading to crushes of the GPU driver. Maybe you should explain us what is Nvidia's definition for "WHQL driver". Double standards.
Nvidia apologize? REALLY? REALLY? Should I sell you something different from what I advertise you and then try to get away with a simple apology? Again covering Nvidia from your part.
About the pump. AMD fixed that problem instantly. Again you accept a simple apology from Nvidia while accusing AMD. Double standards again.
By the way, it was CoolerMaster's mistake with the pump, but OK let's make it AMD's. Even blaming AMD here, this whole story was about the first cards. It was fixed instantly. You are doing what the press did. You took a minority of some cards and make it look like all the cards are having pumps that make noise. That's why the tech sites rush to provide full analysis on the issue. They wanted to have ready articles about this before AMD fixes it. Now and in the future all these articles, will be there for people to google them and read them In 6 months from now when the problem on Fury's pump will be non existent. That was the whole idea. That's why there was full analysis and videos and everything.
The_Assimilator - Saturday, August 8, 2015 - link
"About Shield, you downgrade it yourself. The same did the press. Just reporting it, no extra sauce. Excuses."So you're agreeing it's a nonissue.
"AMD's beta drivers didn't came with problems. WHQL drivers from Nvidia where having problems with Chrome browser, leading to crushes of the GPU driver. Maybe you should explain us what is Nvidia's definition for "WHQL driver". Double standards."
While I agree that the Chrome TDR issue was a nVIDIA f**kup, the fact is that Microsoft certified those buggy drivers, so they should share the blame.
"Nvidia apologize? REALLY? REALLY? Should I sell you something different from what I advertise you and then try to get away with a simple apology? Again covering Nvidia from your part."
Nothing excuses that nVIDIA c**ked up conveying the specs of the 970's memory partitioning, but it doesn't affect performance for the vast, vast majority of people (including me) who bought that card. Which makes it a nonissue unless you're routinely maxing out the GTX 970's VRAM, in which case you will almost certainly find a 980/Ti or Fury/X to be a better purchase.
"About the pump. AMD fixed that problem instantly. Again you accept a simple apology from Nvidia while accusing AMD. Double standards again.
By the way, it was CoolerMaster's mistake with the pump, but OK let's make it AMD's. Even blaming AMD here, this whole story was about the first cards."
If I buy a Ford vehicle with an nVIDIA Tegra computer in it, and the computer breaks because of a manufacturing defect, do I return the vehicle to nVIDIA? Does nVIDIA get the bad press? No, the blame (rightly or wrongly) falls on Ford. It's all about perception. Similarly, in the case of Fury X, AMD was perceived to have a "faulty" product, even though the fault lies with Coolermaster. Personally I find the pump noise to be a nonissue, but when AMD was selling Fury X as the quietest high-end card in town, and then not delivering the quietest card, it was low-hanging fruit for the tech press to grab. That doesn't make it right, but scandal sells...
yannigr2 - Saturday, August 8, 2015 - link
Don't put words in my mouth. I don't agree. But I like your post. It's not the typical fanboy BS, but a more balanced approach.As you can see, things that you also call "Nvidia's f**kups" don't end up a ten page analysis on tech sites. Things about AMD do.
Kjella - Friday, August 7, 2015 - link
Because real people care about real performance, and none of the benchmark results changed? It was even tested in SLI @ 3840x2160 before the news broke without any obvious signs of trouble: http://www.guru3d.com/articles_pages/geforce_gtx_9...
When you consider that AMD rebrands GCN 1.0 cards as new in 2015, I'd rather take an honest mistake over AMD's intentional dishonesty any day. The GTX 970 is still a kick ass card with corrected specs; why do you think it's the most popular dGPU on the market and sales keep accelerating?
http://store.steampowered.com/hwsurvey/videocard/
As usual, all the AMD fans can do is wank over the PowerPoint presentations that show Zen and the R9 Nano will be great, now that we know Fury is barely 980 Ti-ish despite all the hype. Wake up and smell the coffee: people don't buy spec sheets and PowerPoints, they buy products. And there nVidia has delivered and AMD has been an utter disappointment the last few years.
yannigr2 - Friday, August 7, 2015 - link
The GT 730 DDR3 version is based on Fermi. It is a rebranded GT 430. It was also the GT 630 before becoming the GT 730. The funny thing is that there are multiple versions of low-end Nvidia cards with the same model number that are NOT the same cards. So there are two more versions of the GT 730 with Kepler GPUs. Nvidia has made it a habit to mislead people about its cards' specs. You see three GT 730s and they all perform differently from each other. Until Nvidia provides a DX12 driver for the Fermi cards, one of those GT 730s isn't even DX12 compatible.
But ignore these things. It's Nvidia's rebranding, Nvidia's misleading, Nvidia's lies, so it is perfectly fine. I am pretty sure that you will respond with plenty of excuses.
JDG1980 - Friday, August 7, 2015 - link
The GT 730 is technically available at retail, but essentially no one buys it. It's basically an OEM-only card. And for that reason, the tech specs don't much matter. It's basically just there to provide low-end video output to systems that don't have an iGPU. There's a big difference between rebranding one ultra-low-end product and rebranding basically your entire product stack. No one cares if AMD wants to rebrand the old 5450 chip for $19.99 OEM cards, but when they bring back Pitcairn from 2012 and try to seriously position it as a viable gaming product in 2015, this is when they start to get pushback - and rightly so.
Crunchy005 - Friday, August 7, 2015 - link
Pitcairn was an OEM rebrand, so according to you no one cares (although apparently you do). The Oland chips are part of the very low-powered R7 and R5 OEM lines as well. Also, the only retail Pitcairn card aimed at gaming was the R9 370, and that's a Pitcairn refresh. The R9 370 is in the entry-level realm of GPUs, and it does fine there.
yannigr2 - Friday, August 7, 2015 - link
The GT 730 is a retail card. I get 35 models on a Greek price-comparison site. I bet if I try a US site I will find more. Prices start at over 50 euros and go past 90 euros. Not exactly $19.99.
There are Fermi based models, Kepler based models, DDR3, GDDR5, 64bit, 128bit, with 1GB, 2GB or 4GB.
The difference in performance between the Kepler-based 64-bit DDR3 model (14.4 GB/s) and the Kepler-based 64-bit GDDR5 model (40 GB/s) is just huge.
The third DDR3 model, with the Fermi-based GPU (only 96 CUDA cores compared to 384 on the Kepler models), isn't even DX12 capable today.
The differences between these three models are scandalous, to say the least. You buy one card and, if you don't know better, you get something totally different from what you expected.
http://www.geforce.com/hardware/desktop-gpus/gefor...
Nagorak - Saturday, August 8, 2015 - link
Well, the thing is, Pitcairn is a viable gaming product. That's why the "outrage" over the rebrands is ridiculous. All of the rebrands are still competitive. Why should AMD create a whole new architecture when they can just move the previous one down? They're all on the same process node anyway. Once we finally get a new process node the story may be different, but right now there is not really that much to be gained by pumping out a new architecture for the mid to low end.
Oxford Guy - Sunday, August 9, 2015 - link
Neither company should be permitted to do rebrands. It's not clear/obvious fraud like the 970's false specs, but it's not good business.
Oxford Guy - Friday, August 7, 2015 - link
"The GTX 970 is still a kick ass card with corrected specs"Do you mean Nvidia has now finally changed their website to no longer claim the impossible 224 GB/s 4 GB of VRAM?
Because the last times I looked it was still wrong.
As for being "kick-ass"... The 970 was intentionally hobbled with a partition that operates at a ridiculously slow 28 GB/s -- half the speed of the VRAM of the 8800 GT from 2007. There is no excuse for that in an enthusiast card, period.
Nvidia probably has games fill that partition with cached information to hide the massive performance problem associated with it -- effectively rendering the 970 a 3.5 GB card without people knowing it. Convenient for those who specifically purchased two for SLI on the idea that it wasn't a good value to buy the 980 because it shipped with the same amount of VRAM. Too bad for them!
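The bandwidth figures being argued over here all fall out of one formula: bytes per transfer (bus width / 8) times the effective transfer rate. A quick sketch; note that the 7 GT/s effective GDDR5 rate and the 7+1 memory-controller split are taken from the commonly reported GTX 970 configuration, not stated in this thread:

```python
def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * transfer_rate_gtps

# The advertised GTX 970 figure assumes the full 256-bit bus at 7 GT/s effective:
print(peak_bandwidth_gbps(256, 7.0))  # 224.0
# The slow 0.5 GB partition hangs off a single 32-bit controller:
print(peak_bandwidth_gbps(32, 7.0))   # 28.0
# The 3.5 GB fast partition uses the remaining seven 32-bit controllers:
print(peak_bandwidth_gbps(224, 7.0))  # 196.0
```

The same arithmetic explains the GT 730 spread mentioned earlier: a 64-bit bus gives 14.4 GB/s with 1.8 GT/s DDR3 but 40 GB/s with 5 GT/s GDDR5.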
medi03 - Saturday, August 8, 2015 - link
Did you check that Steam page, dude? The 970 is at 2.49% of Steam users, cough. Is that serious?
AMD's 7900 is 2.43%, for comparison.
Michael Bay - Friday, August 7, 2015 - link
M-muh 970 memory is literally a non-issue. Not so with rebrandeon electric boogaloo.
Oxford Guy - Friday, August 7, 2015 - link
28 GB/s with XOR contention is not a non-issue. It is purposeful sabotage of an enthusiast-level card coupled with fraudulent marketing claims.
Michael Bay - Saturday, August 8, 2015 - link
>970
>enthusiast
I like your desperation.
Oxford Guy - Sunday, August 9, 2015 - link
Go back to 4chan where you belong.
Nagorak - Saturday, August 8, 2015 - link
Actually you have that reversed. One is a real issue. The other is just cards that are still competitive being renamed.
Michael Bay - Saturday, August 8, 2015 - link
Competitive with room heating, sure. For intended use literally everybody sane prefers nV now.silverblue - Sunday, August 9, 2015 - link
I wasn't aware that Pitcairn was that warm; the other products were only really an issue in reference form. How's chizow, by the way?
Gigaplex - Saturday, August 8, 2015 - link
NVIDIA do rebrands too. A crazy example is the GT 640. There were at least 5 different variations, with a minimum of 3 separate chips powering those variations. One of them was a rebranded Fermi. At least with the current AMD lineup, you know you're buying a rebrand. A more well-known example of NVIDIA rebranding is the 8800 GTS; it was rebranded across several generations.
Samus - Friday, August 7, 2015 - link
What stunt? Even with 500MB less VRAM, the benchmark results speak for themselves. The GPU isn't powerful enough for 4K gaming anyway, which is the only area where more than 3.5GB of VRAM is necessary. If NVidia should be punished for something, it should be their back and forth on locking down the overclocking capability of mobile GPUs, but in the end they did listen to the community and re-introduce the functionality.
Overall, NVidia is doing ok, but everyone here is right...it's important they do not become a monopoly and frankly AMD's latest and greatest is competitive on performance and especially price. The only problem is NVidia is substantially more power efficient, which speaks well to many people who consider efficiency a sign of superior engineering.
Crunchy005 - Friday, August 7, 2015 - link
It is an issue with SLI, where it can game at 1440p or 4K. With how the RAM is managed in SLI, it easily reaches the 4GB cap and causes issues.
Samus - Friday, August 7, 2015 - link
That's the problem with SLI: you always increase your GPU power beyond VRAM capacity. SLI 970s are going to be memory limited at 3.5GB or 4GB, no matter how you look at it. Even 6GB isn't enough in most cases to balance the raw GPU resources of 2x 970s, which together are more powerful than a GTX 980 Ti (which comes with...you guessed it, 6GB of VRAM). I ran into this problem on my last two SLI setups, which is why I gave up on SLI. On both my GTX 460 and GTX 570 SLI setups, I hit the (1GB and 1.25GB) memory bottlenecks when enabling higher quality textures. To scale, the GTX 970 with 3.5GB is very balanced, but by no means appropriate for SLI if your goal is to run higher resolutions or higher quality textures (which is usually the ideal goal of SLI).
The counter-argument is that most AAA DX12 games are going to be optimized for more than 4GB of VRAM going forward, due to the PS4 and Xbox having 8GB of RAM, the vast majority of which is used for textures and GPU resources. But that doesn't mean the GTX 970 isn't futureproof for at least another two years.
Oxford Guy - Friday, August 7, 2015 - link
People specifically purchased the 970 in pairs in large numbers because it was supposed to have the same amount of useful VRAM as the 980. Fraud. Period.
Stuffing dummy data into that partition so the game never uses it (to avoid stutter but make it look like all 4 GB is being used), or caching data that rarely gets accessed, are both cute, but they aren't adequate.
Gigaplex - Saturday, August 8, 2015 - link
GTX 970 "stunt"? It was a good product with good performance at a good price. There was an initially undocumented performance limitation with memory, but that generally didn't affect real world performance. It still compared favourably to AMD hardware even when the limitation was announced.GameWorks on the other hand, they need to stop pushing that mess.
Oxford Guy - Saturday, August 8, 2015 - link
Let me know in what universe 28 GB/s + XOR contention in an enthusiast-grade card constitutes "a good product with good performance at a good price".Oh, and the midrange 8800 GT of 2007 had double that performance in its VRAM.
TheJian - Sunday, August 9, 2015 - link
Because most users know it's still 4GB, and even AMD-loving AnandTech can't find where it hurts the 970. Would you have been happy if they had just left off the 512MB? I'd rather have them put it in there and optimize the drivers to only stick stuff in there that isn't speed dependent. They seem to be doing a great job if nobody can find its weak spot, right? At the end of the day, did the card perform ANY differently than reviews said? NOPE. That is WHY customers couldn't care less. Spec sheets don't matter if you win anyway...LOL.
xenol - Wednesday, August 12, 2015 - link
If you buy hardware based on paper specs, you're setting yourself up for disappointment. Sure, it may mean the GTX 970 isn't going to be useful for as long as you'd like, but really, that's more than likely 3+ years down the road, when the 900 series becomes a footnote in the back of everyone's minds for being obsolete. The GTX 970 delivered even without our knowing about this issue. Even if we had known of this issue and seen the results, we'd only add a caveat that when 4K gaming finally takes off, the GPU may not be as well suited as those with more memory. But then again, 4K gaming (the PCMR requirement of 4K@60FPS with maxed-out settings at Crysis 3 level) isn't poised to take off for at least two generations at the rate things are going.
prisonerX - Friday, August 7, 2015 - link
You bought the hype. Nvidia consistently lags in OpenCL performance and tries to push its own proprietary tools. It's already Intel, and it will also suffer the same fate: irrelevancy. What Intel, Nvidia and others don't get is that the computing industry hasn't changed in 50 years: lower cost with greater performance has always driven the industry.
The_Assimilator - Friday, August 7, 2015 - link
Intel, irrelevant? Hahahahahaha. My god, you AMD fanboys are hilarious. Oh man. I haven't laughed this hard in a while.
Samus - Friday, August 7, 2015 - link
Yeah he acts like x86 is dead or something because it's proprietary. Intel sets a lot of industry standards. Ever heard of USB?
Michael Bay - Saturday, August 8, 2015 - link
OpenCL is irrelevant, so nV sees no need to pursue it. Call it lobbying if you want.
medi03 - Saturday, August 8, 2015 - link
Xbox One & PS4, cough.austinsguitar - Friday, August 7, 2015 - link
Is this based on the future or something? Last I checked it was 2015... really, AnandTech?
zepi - Friday, August 7, 2015 - link
Financial/fiscal years of companies are not aligned with the calendar.
"July 21, 2015 Microsoft Fiscal Year 2015 Fourth Quarter Earnings"
"Apple® today announced financial results for its fiscal 2015 third quarter ended June 27, 2015"
So Microsoft's fiscal year 2015 ran from the beginning of Q3/2014 to the end of Q2/2015.
Apple's fiscal year 2015 is still ongoing, from Q4/2014 to the end of Q3/2015.
Nvidia is ahead of both: its fiscal year roughly aligns with the calendar year but is numbered one year ahead, putting it half a year ahead of Microsoft and three quarters of a year ahead of Apple.
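The mapping above can be sketched as a one-liner: label a fiscal year by the calendar year in which it ends. The year-end months (June for Microsoft, September for Apple, January for Nvidia) come from the companies' filings; exact year-end days vary (Apple uses a 52/53-week year), so this month-level version is an approximation:

```python
from datetime import date

def fiscal_year(d: date, fy_end_month: int) -> int:
    """Label a fiscal year by the calendar year in which it ends."""
    return d.year if d.month <= fy_end_month else d.year + 1

report_date = date(2015, 8, 7)  # the date of this earnings report
print(fiscal_year(report_date, 6))  # 2016 -> Microsoft is already in FY2016
print(fiscal_year(report_date, 9))  # 2015 -> Apple is still in FY2015
print(fiscal_year(report_date, 1))  # 2016 -> Nvidia is reporting FY2016 results
```

Nvidia's "one year ahead" numbering falls out automatically: a January year-end means almost the entire fiscal year sits in the preceding calendar year.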
Kjella - Friday, August 7, 2015 - link
Fiscal years are the inmates running the asylum in accounting. Don't try to understand; just accept that companies make this shit up with no relation to the actual calendar.
prisonerX - Friday, August 7, 2015 - link
Their fiscal years are named by the year in which they end. Why is that so hard to understand?lefty2 - Friday, August 7, 2015 - link
One thing that isn't mentioned here is that the Tegra division lost $57 million last quarter - and obviously more this quarter. Tegra has lost money every quarter since its inception (9 years ago?). Nvidia doesn't mention the Tegra operating loss in the press release, because it looks bad, but you can find it in the Form 10-Q.
jwcalla - Friday, August 7, 2015 - link
They need to find some better markets for that thing. They should be pushing Chromebooks and stuff like that. One of the problems I've noticed with the ARM stuff is that it takes way too long between a design being ready for production and it actually shipping in products. In some ways, the Tegra X1 (both CPU and GPU) is "old technology" by now. Compare this to Intel, where they essentially have partners shipping products the day they announce their chips.
But I do think that companies like nvidia are going to have to start pushing non-x86-tethered platforms if they want to open opportunities for Tegra (so Android, ChromeOS, etc.).
dat graph tho... up 10% after hours.
Flunk - Friday, August 7, 2015 - link
When you write these up, you don't really need to mention the "Non-GAAP" results, because they're total hogwash. If you don't follow GAAP you can just make things up or manipulate the results. Our financial system is totally dependent on Generally Accepted Accounting Principles, and if you throw those out the window then your data is meaningless.
takeship - Friday, August 7, 2015 - link
Non-GAAP is at least useful as a "we couldn't even manipulate our way into a profit" data point. Companies frequently use their non-GAAP results to highlight the strengths of their profitable decisions, even if they "officially" have to report their missteps in their GAAP numbers. I.e., posting a profit in both is great, a loss in GAAP but a profit in non-GAAP should trigger some analysis, while a loss in both means that your management, your business model, or both are outright failing.
Peichen - Saturday, August 8, 2015 - link
GAAP is complete hogwash. If you don't report your number under IFRS, might as well not report them. Using GAAP is like using furlongs per fortnight, different across countries and no one knows where you are hiding that shell company.JDG1980 - Friday, August 7, 2015 - link
This has not been a good year for acquisitions. AMD writing off SeaMicro, Microsoft writing off Nokia, and now Nvidia writing off Icera. One might almost think that many corporate acquisitions are about empire-building and self-enrichment by executives, not about what actually makes sense for the corporation's core business plan.
Michael Bay - Saturday, August 8, 2015 - link
Most acquisitions are about using money a corporation can't return to the country without paying tax on it. Then whatever they buy gets gutted into the mothership and the shell is written off. For good or bad, that's an approved strategy with investors. They didn't even flinch when Microsoft had to write down aQuantive, which was deemed to be worth about 2 Nokias.
Shadowmaster625 - Friday, August 7, 2015 - link
Non-GAAP earnings up! Rally time! lol people are so dumb. Where is that rally monkey?
jardows2 - Friday, August 7, 2015 - link
NVIDIA is a GPU company. They try to expand into unrelated markets, and it doesn't work out. About the only place Tegra is being used is in the company's own gaming tablet, which isn't exactly a stellar sales success. Purchasing a modem company? What does that have to do with computing graphics? Absolutely nothing. When will these companies learn that doing one thing, and doing it really well, is the key to success? Trying to do everything, just because it seems like a profitable venture, usually doesn't work out.
NVIDIA, stick to doing what you do best. You do it well, and you won't have to worry about a major division, unrelated to your core business, dragging you down.
jwcalla - Friday, August 7, 2015 - link
I think they rightly see that the rise of IGPs/APUs is directly impacting revenue, and doesn't bode well for the future. So they have to make up that revenue somehow. And that means taking risks on unknown markets.
Gigaplex - Saturday, August 8, 2015 - link
"Purchasing a modem company? What does that have to do with computing graphics? Absolutely nothing."If they want their "computing graphics" products in mobile SoCs, they need a decent modem to support it.
TheJian - Sunday, August 9, 2015 - link
You forgot to mention that NV handily beat the street ($1.01B expected revenue), which is why the stock was up ~10%! How can you leave out what everyone else ran as the headline? Integrated modems won't matter as soon as die shrinks make the power draw on discrete modems tank, just like everything else. Many vendors throw in a Qualcomm/Intel modem even today.
Not so sure Intel's fees end in 2017, as they'll have to come up with something themselves that doesn't trample NV's patents again, or pay up anyway. Considering the Markman hearing showed the judge favoring NV on 6 of 7 patents, I'm not sure NV needs to make any money right now in mobile when they're likely to get billions from WILLFUL infringement, plus long-term license fees from Samsung/Qualcomm/Apple etc. It's just a matter of figuring out who's most responsible and for how much. When you consider MSFT makes ~$7-8B a year now at $4/unit sold with Android, you can easily see how NV has a valid argument for a few bucks themselves. Qualcomm charges a percentage of the ENTIRE device price whenever their modem is inside (they're in court in China now over this, but it's a valid argument until the USA shoots it down too).
Anand himself didn't call it the Wild West of patent infringement for nothing, right? Seeing Intel's $1.5B loss in the same type of suit (over NV's GPUs), I can't see how Samsung (not a US company), with profits dwarfing Intel's ($22-30B vs. Intel's ~$8B now), could get away with less than that, plus license fees for a decade or more. Those patents are from 1999-2001, and they will continue to trample desktop GPU patents for 15 years after those, as NV/AMD hold pretty much every patent that has put a game on your screen for the last 15-20 years. I expect an AMD suit, if they have anything, as soon as NV wins and gives them a leg to stand on [cheaply] in court. I doubt Intel will get out of the patents going forward either. They will either have to pay up again (a new contract) or remove GPUs from their CPUs at some point. Just like with Qualcomm on modems and sending signals etc., there is no radically different way to get a game on your phone/tablet that desktop GPUs haven't already patented over 15 straight years.
My point is, they're already diversified; you're just not getting it. A license fee and a fine are just as good as selling an SoC. Just ask Microsoft about their $7-8B a year (a billion alone over the last 2 years from Samsung...LOL) from 2B units of Android. Also note it won't help Samsung that they were LATE paying MSFT what they agreed to pay, and MSFT added a $6M late fee...ROFL. A jury won't like people making $22-30B a year who don't pay their debts and are NOT a US company pilfering US patents, correct? I can't wait to see that sticker-shock price compared to the $1.5B from a NON-willful infringement case (Intel broke the chipset deal, so it became infringing, not willful, for 2-3 years). NV makes ~$600M a year. Merely $1 per Android device nets you $2B of pure income. MSFT doesn't have to keep pouring R&D into new lines of Android code for that $3.5-4 a pop per device, either. Again, a great reason for a jury to award NV big, when NV has to keep pouring $1.2B a year into GPU tech. Google can't seem to figure out a way out of the code after years of Android.
http://www.dailytech.com/Samsungs+Royalty+Fee+to+M...