Alistair - Thursday, May 14, 2020 - link
I hope this forms the basis for the next Nintendo home console. I'll take that 45W version for the home, and the 15W form for a portable version. :)
Sttm - Thursday, May 14, 2020 - link
Would be very interesting if Nvidia could turn out a high performance ARM/Ampere console chip for Nintendo. RTX Mario Kart!
nandnandnand - Thursday, May 14, 2020 - link
The next Nintendo Switch could use Samsung + AMD graphics.
https://hothardware.com/news/nintendo-samsung-exyn...
Alistair - Thursday, May 14, 2020 - link
That would be cool also. Nintendo was definitely the first to move to modern hardware: nVidia, ARM, solid state storage.
FaaR - Thursday, May 14, 2020 - link
The Switch was anything but modern though, even when it launched; it's based on an outdated SoC (which they further nerfed/disabled, IIRC).
Why would having flash, ARM CPU cores, and an old NV GPU arch make it "modern" anyway? The ARM architecture dates back to the mid-1980s, and phones have used ARM chips and flash since the 1990s. Other consoles also had onboard flash game storage before the Switch launched, including models from Sony, MS and... Nintendo themselves. :P Also, NV's Maxwell GPU was old hat by the time the Switch launched.
Strange reasoning! :)
BenSkywalker - Friday, May 15, 2020 - link
You realize x86 came out in the 70s, right?
Also, what SoC made the X1 outdated in 2016? It throttled the A10, and it wasn't until the Snapdragon 845 in 2018 that Qualcomm was competitive. For a portable system the Switch was quite powerful for when it came out.
Tams80 - Friday, May 15, 2020 - link
I mean, if you're going to be really picky, the Game Boy Advance used an ARM SoC.
Alistair - Thursday, July 30, 2020 - link
ARM enables higher performance at low power. nVidia's GPUs are top. Flash storage is a must and Nintendo was first to switch to it. And Nintendo was first to release a NO CD DRIVE, NO SPINNING RUST HARD DRIVE system. All places that Microsoft and Sony are heading to. Hence, MODERN.
Tams80 - Friday, May 15, 2020 - link
Well, considering what was available and how the prices of many components had fallen at the time of the Switch's development, of course it was going to have modern hardware. It wouldn't have made sense not to.
And of course the fact that the body of the Switch is pretty much an Nvidia Shield tablet... I think it's pretty likely Nvidia were eager to clear some inventory and make something back from their investment into the X1.
Yojimbo - Thursday, May 14, 2020 - link
It could, but would Nintendo really want to give up all the nice NVIDIA support they've gotten, which has brought 3rd party developers to the system? The support and optimizations from Qualcomm for a scaled-down RDNA architecture are just not likely to have anywhere near the same maturity. Besides, the initial plan was for a long-term partnership between NVIDIA and Nintendo. I also doubt that Qualcomm will achieve the same energy efficiency at a similar power threshold using the RDNA architecture, not to mention that AMD is still behind in memory bandwidth efficiency as well. It's just not easy to replace NVIDIA in that space at the moment.
nandnandnand - Thursday, May 14, 2020 - link
You mixed up Qualcomm with Samsung.
Using some version of RDNA could make developing for Nintendo easier, since the XSX/PS5 will be using RDNA 2, maybe with some customizations.
Spunjji - Friday, May 15, 2020 - link
AMD are definitely already ahead of where the current Switch console is with its Maxwell GPU; if RDNA 2 lives up to AMD's goals then it will close the gap further.
Nintendo never go for the latest and greatest hardware in their devices, so it doesn't need to beat whatever Nvidia might offer (if they even offer anything - they haven't made a mobile-focused Tegra platform for a while).
That said, I'd be surprised to see them jump onto a Samsung/AMD platform unless it had already been out for a year or two.
brucethemoose - Thursday, May 14, 2020 - link
15W is pretty toasty for a handheld; 45W would be more like a laptop.
Alistair - Thursday, May 14, 2020 - link
The Switch has a fan and works fine with 15W.
brucethemoose - Thursday, May 14, 2020 - link
Also, Orin is clearly focused on AI performance, which Nintendo doesn't really need.
Alistair - Thursday, May 14, 2020 - link
You just remove the parts you don't need and make a new mask for the Switch without them. I mainly want the CPU, GPU, and memory parts, of course.
Spunjji - Friday, May 15, 2020 - link
Nvidia aren't renowned for being a good company to work with when it comes to that sort of customisation, though. That's why AMD ended up snapping up all the home consoles.
CiccioB - Thursday, May 21, 2020 - link
It's not a question of customization, it's a question of cost.
AMD chose to provide APUs for the consoles, getting only a small slice of the pie for that work.
Nvidia, which had no idle engineers but plenty of plans for the datacenter market, preferred to invest its engineering talent in something that would bring back much more value.
In fact, AMD's OEM segment, which includes EPYC as well as the console SoCs, is roughly equal to Nvidia's OEM segment, which includes the Nintendo SoC, and that segment amounts to less than 10% of Nvidia's total revenue.
eastcoast_pete - Thursday, May 14, 2020 - link
If one or both Ampere GPUs could also be used for gaming, some of us might decamp to the self-driving car for that. Would give "mobile gaming" a new meaning (:
It can drive itself, but can it run Crysis?
Sivar - Thursday, May 14, 2020 - link
How does Orin compare with Tesla's hardware 3.0 FSD chip?
brucethemoose - Thursday, May 14, 2020 - link
Orin has more transistors, at the very least.
cyrusfox - Thursday, May 14, 2020 - link
Do they have any partners using this for autonomous driving? Is Tesla or Mobileye going to be a customer? Or is Nvidia doing their own independent autonomous driving push I haven't heard about?
cyrusfox - Thursday, May 14, 2020 - link
Looks like I am mighty ignorant; it seems they may have the lion's share of the market: https://www.nvidia.com/en-us/self-driving-cars/par...
Yojimbo - Thursday, May 14, 2020 - link
Mobileye is a competitor, not a customer. Tesla plans on using their own. And as you noted, most of the rest of the industry is using NVIDIA's chips for development, although not many have actually brought anything to production yet; instead, they are using Mobileye ADAS. That's probably why NVIDIA is bringing Orin down to 5W. Self-driving cars look like they will take a lot longer than people hoped for 5 years back, so NVIDIA is trying to get the car companies to use NVIDIA chips for ADAS while they develop autonomy algorithms. That could be a cost savings if they can create a smooth transition from one to the other, rather than developing two completely different systems: one for ADAS using Mobileye and one for self-driving using NVIDIA.
I think Mobileye are converging their ADAS and autonomy offerings, which would give them a leg up in the competition to attract autonomy development efforts if NVIDIA doesn't also offer a competitive ADAS solution that can converge with its autonomy solutions.
Santoval - Sunday, May 17, 2020 - link
The two GA100 Ampere GPUs in the last image appear to have 4 stacks of HBM2 rather than 6. So Nvidia also shaved power (and money) by removing 4 HBM stacks (2 per GPU), besides downclocking the GPUs a bit. In any case, can they actually do L5 self-driving, assuming there is a practical way to supply the beast with 800W of power in a *car*?
Have they tested that, or did they just model it - or, even worse, just calculate that "this is the computing power required for L5 self-driving"? I would assume it's one of the latter two, since I don't think even the most advanced test / prototype cars have the sensors and other required peripheral features for L5 self-driving. Maybe L4 tops.
Models and calculations never tell the full picture, and that must particularly be the case for self-driving, which has a myriad of variables and surprises. Until something claiming to support "ability X" is tested, we cannot know it actually supports it; until then, the claim is baseless.
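[For a rough sense of what dropping from 6 to 4 HBM2 stacks per GPU could cost in bandwidth, here is a back-of-the-envelope sketch. The 1024-bit stack interface and the per-pin data rate are assumed typical HBM2 figures, not numbers from the article or the comment above.]

```python
# Rough HBM2 bandwidth estimate per GPU (assumed figures, not from the article).
STACK_BUS_WIDTH_BITS = 1024   # HBM2 uses a 1024-bit interface per stack
DATA_RATE_GBPS = 2.4          # assumed per-pin data rate in Gbit/s

def hbm2_bandwidth_gbs(stacks: int) -> float:
    """Peak bandwidth in GB/s for a given number of active HBM2 stacks."""
    return stacks * STACK_BUS_WIDTH_BITS * DATA_RATE_GBPS / 8

full = hbm2_bandwidth_gbs(6)   # ~1843 GB/s with all 6 stacks
cut = hbm2_bandwidth_gbs(4)    # ~1229 GB/s with 4 stacks
print(f"6 stacks: {full:.0f} GB/s, 4 stacks: {cut:.0f} GB/s "
      f"({cut / full:.0%} of full bandwidth)")
```

[Under those assumptions the 4-stack cut keeps roughly two thirds of the full-configuration bandwidth, which is consistent with the commenter's point that the cut saves both power and cost.]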
CiccioB - Thursday, May 21, 2020 - link
What's the problem with 800W of power consumption? Do you know what the power draw of a car's air conditioner is?
BTW, 800W is the maximum consumption. I do not think it will be running at 100% of its computing capacity 100% of the time.
It seems Nvidia is using salvaged (partially defective) GA100 GPUs in these solutions. It is good for them to have more markets besides gaming where they can sell different cuts of their products in order to earn as much as possible.
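[To put the 800W figure discussed above in context, a quick back-of-the-envelope estimate of what a continuous 800W draw would cost an electric robotaxi in range. The EV efficiency figure is an assumption chosen for illustration, not data from the article.]

```python
# Rough impact of an 800 W compute load on an EV's range (assumed figures).
COMPUTE_POWER_KW = 0.8          # 800 W maximum draw of the compute platform
EV_EFFICIENCY_MI_PER_KWH = 3.5  # assumed typical EV efficiency

def range_lost_per_hour(load_kw: float, mi_per_kwh: float) -> float:
    """Miles of range consumed per hour of driving by an auxiliary load."""
    return load_kw * mi_per_kwh

print(f"~{range_lost_per_hour(COMPUTE_POWER_KW, EV_EFFICIENCY_MI_PER_KWH):.1f} "
      "miles of range lost per hour at full compute load")
# For comparison, automotive air-conditioning compressors commonly draw 1-3 kW.
```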
phoenix_rizzen - Monday, May 25, 2020 - link
The article and chart list 200 TOPS multiple times, but each of the Nvidia graphics shows 2000 TOPS.
Which is it? 200 or 2000 TOPS?
cyrand - Tuesday, May 26, 2020 - link
One Orin chip is 200 TOPS. The DRIVE robotaxi configuration has 2 Orin chips and 2 discrete Ampere GPUs, which gives it 2000 TOPS.
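[A quick sanity check of those numbers as arithmetic. The per-GPU figure is derived by subtraction from the totals quoted in the thread, not taken from an official spec sheet.]

```python
# Sanity check of the TOPS figures quoted above (derived, not official specs).
ORIN_TOPS = 200          # per Orin SoC, as stated in the article
PLATFORM_TOPS = 2000     # robotaxi configuration total
NUM_ORIN = 2
NUM_AMPERE_GPUS = 2

# Whatever the two Orin SoCs don't supply must come from the two discrete GPUs.
ampere_tops_each = (PLATFORM_TOPS - NUM_ORIN * ORIN_TOPS) / NUM_AMPERE_GPUS
print(f"Implied throughput per Ampere GPU: {ampere_tops_each:.0f} TOPS")  # 800
```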