An unrelated side note: the SHIELD controller is still not supported on non-Nvidia-powered PCs, even in wired mode. Perhaps they just lack the manpower to develop a proper Windows driver for it.
It's not that they disable PhysX on AMD GPUs. It's worse than that. They disable PhysX on NVIDIA GPUs if an AMD GPU is detected in the system (whether you're actively using it or not).
It's even worse than that. PhysX and CUDA are disabled even if you are using a secondary USB monitor. The USB monitor's driver is treated as a graphics driver, and because it doesn't have Nvidia's name on it, it is considered competitor hardware. Nice, isn't it?
Lemme get this straight: you guys have AMD GPUs, and you want Nvidia to give the CUDA technology to AMD, so AMD can make their GPUs have CUDA cores in them, so they can process CUDA physics, a.k.a. PhysX, so that NEXT time you buy an AMD GPU, you can have PhysX in your games? Because if the architecture isn't native for the CUDA code, it would run in emulation mode, with significant overhead. I would say you may have better luck asking AMD to implement FreeSync on all their new 3xx GPUs. Although frankly I wouldn't hold my breath even for that.
It's a "you can use what you paid for ONLY if we're exclusive" kind of deal. Imagine Mercedes turning off your navigation or disabling 2 cylinders because you also own a BMW.
I am sorry, but your analogy is far from apt. Mercedes (since you brought them up) has its own IP. One such piece is their "4Matic" all-wheel-drive system...

... which, according to them, provides optimal traction, plus a number of other benefits, ultimately making driving safer. Now, if only Merc made that particular IP available to smaller companies, a lot more lives could potentially be saved with safer cars. Why wouldn't the govt. intervene and make Merc give that "value-added feature" away?

That's the thing with humanity, you see. The govt. essentially lets those people die because it knows that without the incentive of a big payoff, big investment in R&D would plummet, which means that technological development would slow to a crawl and the country would be left behind in the tech race. Like it or not, IP laws are a byproduct of capitalism, they aren't going anywhere anytime soon, and companies are going to keep fighting tooth and nail over them.
Your car analogy suggests that if you place a system with an Nvidia GPU next to another system with an AMD GPU, then the Nvidia GPU would turn off some cores, which is not the case. Vehicles are complete systems, not components like GPUs.
The proper comparison would be someone wanting to run a BMW's system software on a Merc, and the Merc's system responding with "unknown software, cannot execute". Go ahead and ask BMW to do that for you, because you prefer their software over Merc's. I wager their response would boil down to something like, "please buy a BMW vehicle, if you want the BMW software."
No, the entire point is that you have Nvidia code and an Nvidia card to execute it on, but you can't, because there's also an AMD card in the system, even though the AMD card is totally irrelevant to the execution of the Nvidia code. You can do this on a system with an Nvidia card and an older Nvidia card just fine, where you just use the weaker card for PhysX, no problem. But if you swap the main card for an AMD card, without changing the card the PhysX code was executed on, you suddenly can't execute the PhysX code.
Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing.
"Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing."
Good or bad, PC gaming appears to be changing into "Nvidia gaming" because instead of fighting for their primary products, AMD chose to invest in consoles, and got left behind where it really mattered.
Why do you have a problem with: "Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing."?
Do you think a bunch of verbatim console ports with no PC-specific features or settings options is a good thing? It's just incredibly hypocritical and ironic when AMD fans, supporters, sympathizers and supposed PC enthusiasts constantly say things like:

1) XYZ game is just another crappy console port. Yet when a company like Nvidia tries to work with devs to bring additional features, even some that work on competitor hardware, it's bad for PC gaming? lol.

2) We need AMD for competition! Competition is always good! But when Intel and Nvidia compete and establish dominance, competition is bad and they are competing too hard.

3) Closed and proprietary are evil, even when they drive innovation in the marketplace and produce results (G-Sync, CUDA). But when AMD does proprietary for their own gains and product differentiation, it's A-OK!

Just some clear examples of the kind of double standards and hypocrisy AMD and their fans exhibit regularly.

Bottom line is this: you say Nvidia is trying to make PC gaming Nvidia gaming and it's a bad thing, but what's to stop you from simply not using said features? And do you think a game is better or worse as a result of having Nvidia features? Just curious.
@chizow: Seeing you all over the internet defending/supporting Nvidia, it is really fun watching you talk about hypocrisy.

1) Additional features are good. But games that are sponsored by Nvidia don't come with just additional features. They usually come with problems for the competitor's hardware, and sometimes with the removal of the competitor's features, like the DirectX 10.1 support that was removed from Assassin's Creed because Nvidia cards were not supporting it at the time.

2) You can't have competition with Intel and Nvidia controlling the market through financial strength. Nvidia goes even further by trying to lock everybody into their proprietary ecosystem. They try to guarantee that in the future there will be NO competition.

3) Closed and proprietary are great for driving innovation, but after some time in the market, once the investment has been returned to the company that created them, it is in everybody's best interest for them to be replaced by open standards. While proprietary tech can drive innovation in the beginning, it can add obstacles later. AMD created a proprietary tech, Mantle, that drove innovation in the right direction and then stepped aside for DX12 and Vulkan. So yes, when AMD does it, it is A-OK, because they do it the right way. They did the same with AMD64.

The only hypocrite here is you, fortunately. And yes, a game with a good physics engine looks much better than one without. Unfortunately, games that use PhysX usually underperform in those kinds of visuals when PhysX is turned off, not taking advantage of the rest of the system's resources to create an equal experience. Just a coincidence.
@yannigr2: I defend innovation, features, benefits of certain products, you defend garbage and half-assery, see the difference? :D
1) Any GameWorks game that has additional features implemented by Nvidia is BETTER than the console version, period; if there's a problem, AMD should work on sorting it out. But that's not their MO. Their MO is to half-ass some tech out there, half-support it, and then when there's a problem, claim it's open and thus not their issue! We've seen this dozens of times: FreeSync, HD3D, Mantle, and even the DX10.1 bug you are going waaaay back on. As soon as there are any problems with these AMD solutions, AMD throws it back on the vendor to fix lolol.
2) No, I don't think you and various other socialist hippies from non-capitalist countries even understand what competition means. You simply want tech sharing in some happy global co-op. Except that's not how it works. Nvidia has every right to invest in tech and value-add features that benefit themselves, their users, and their shareholders. They have no obligation to help otherwise, but they still do when it makes sense. That's true competition and the fact of the matter is, Nvidia's competition and innovation has pushed AMD to the brink. You bet on the loser. The sooner you and the rest of the AMD fanboy hippies get this, the better, but I know you understand this, because you were hypocritically espousing the benefits of the closed and proprietary Mantle for months until it failed and died a few months ago.
3) Except Nvidia and any other innovator has no incentive to do this. They are the market leader; they have no obligation to do the work and give it to everyone, especially when all they did was "compete", as you claimed was necessary. So again, stop being hypocritical and acknowledge the fact that Nvidia was simply better at competing, because as we have seen with Mantle, AMD attempted to do the same, much to the delight of fanboys like you, and they just FAILED at making it stick. Of course, to any non-fanboy, this was the only possible outcome, because AMD simply did not have the market position, funds, or clout to drive a proprietary API in the marketplace. Lesson learned; hundreds of millions in resources direly needed elsewhere, wasted. And what do you get some 18-24 months later? A late product to combat Maxwell, a nearly full stack of rebrands, and complete slaughter in the marketplace, nearing historical highs in the 75-80% range in favor of Nvidia.
So yes, if you have a problem with Nvidia's features, simply turn them off! Enjoy the AMD Radeon experience of dumbed-down console ports, that is what YOU CHOSE when you stupidly voted with your wallet. And now you want to cry about it. lol. GLHF, it will all be over soon.
@yannigr2's usual backpedaling, deflecting, stupidity when called on his BS:
1) Yes it was a bug, but given AMD fanboys' low standards, they would rather have a buggy, faster solution that skipped an entire lighting pass! LOL. BF4 Mantle was another great example of this; I guess it should be fast if it's not rendering everything it should. Remember BF4 Fanboy Fog TM? :D
http://techreport.com/news/14707/ubisoft-comments-... "In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly. "
2) Yes, you obviously are, because you have no idea what competition actually means, and when a company competes your fanboy favorite into the ground, suddenly competition is bad and they are competing too hard. Ouch! Stop it! Imma cry. Competition hurts! :'(
3) Mantle was a failure, to any non-Fanboy. Complete disaster for AMD. And for what? You're Greek, you should know damn well what a Pyrrhic Victory is. So AMD fanboys can claim dead Mantle lives on in spiritual successor Vulkan (hey look, another Greek reference!), but who gives a crap when Vulkan will be irrelevant as well and AMD pumped hundreds of millions into a dead-end API. Funds that would have been better spent elsewhere!!!!!
Pay for what? LOL. Like I've paid for super awesome monopoly-approved Intel processors for the last 9 years since AMD got Conroe'd!!! Let go of the fanboy reins and enjoy what the tech world has to offer! Free yourself from the bondage of the dying tech bottom-feeders collectively known as AMD fanboys and enjoy!
Oh my. Chizow the Nvidia fanboy has just gone full overclock. So much BS in your comment, so much admiration for Nvidia, so much hate for AMD, so many favorable conclusions, so much one-sided (para)logic. DX10.1 was a bug. Still hilarious. DX10.1 stopped being a bug after Nvidia supported it, of course.
Oh my, yannigr2, ignorantly commenting as usual, ignoring relevant links with the entire back story, with comments from both the vendors and the game developers. But this is the usual MO for AMD and their fanboys. Launch a bunch of promises on a slide deck, let misinformation/FUD grow and bloom, then ignore relevant actual proof to the contrary.

I am sure you will tell me how much you are enjoying how Free FreeSync is flashing all your monitors' firmware, giving you 9-240Hz refresh rates on every cheap monitor on the market, with no additional hardware or cost, huh?
LMAO, idiots.
PS: Nvidia never supported DX10.1; like Mantle, it was another irrelevant early-adoption effort from AMD. After its features rolled into DX11, however, Nvidia did as they always do: they did DX11 right and, of course, killed AMD in one of the main DX10.1 features AMD was trumpeting the whole time: tessellation. Then of course, suddenly tessellation isn't important to AMD, Nvidia is competing too hard, devs are using too much tessellation, etc. etc. lol
Yeah right. Game developers. Ubisoft. LOL LOL LOL and more LOLs. Everyone knows Ubisoft and everyone knows their relationship with Nvidia.
"PS. Nvidia never supported DX10.1" Low end Nvidia 200 series (205, 210, 220 and 240) and OEM 300 series is DX10.1 moron. What? The technical department failed again to inform you of the marketing department?
Except this happened long before GameWorks, and it is a DIRECT quote from the developer, with independently verified links showing graphical anomalies, so yes, keep burying your fanboy head in the sand, as I am sure you will once again stupidly claim AMD's (literally) buggy solutions are better lol.

PS: Clearly I don't care about low-end tech, so yeah, great, low-end junk OEM parts supported DX10.1, but that doesn't change the fact Nvidia did not care to support it, and it became irrelevant until it rolled into DX11, at which point DX10.1 features suddenly became bad for AMD fanboys because Nvidia did them better. :)
An analogy isn't meant to be perfect, just similar enough. But still, I don't think you understood the main issue and what people are complaining about. Nobody said anything about running Nvidia software on an AMD GPU. It's about being able to run Nvidia software on an Nvidia GPU while also having an AMD GPU in your system. Disabling Nvidia components that I paid for just because it detects that I also have an AMD card is plain wrong. It's a crappy way of fighting the competition, forcing me to remove an AMD card from my system just so I can use something that I have already paid for. And no, you don't see this on the box.

What if Intel decides to disable some feature on their CPUs, or in their software support, as soon as it detects an AMD card? How can that be considered fair practice?

So I'm not hoping to run BMW software on a Merc. I am demanding, however, to be allowed to run the BMW as I bought it, regardless of the fact that I also want to own another, different brand. I bought a complete set of components + features, and I want to use them all, as promised.
"What if Intel decides to disable some feature on their CPUs, or in their software support, as soon as it detects an AMD card? How can that be considered fair practice?"

An Intel CPU with an AMD GPU isn't the same as two different GPUs in the same system, sharing the same OS, with their individual drivers competing for the same resources. Would Intel provide support for a third-party dual-CPU board which had one socket for Intel and the other for AMD? Now, if an Nvidia GPU were refusing to do CUDA with an AMD CPU, that would be a different matter altogether.

"I am demanding, however, to be allowed to run the BMW as I bought it, regardless of the fact that I also want to own another, different brand. I bought a complete set of components + features, and I want to use them all, as promised."

Let me reiterate: a car is a standalone system, while a GPU is a component of a system; you can't compare apples with oranges.

It seems like you don't want to own two systems (two cars) and run them side by side. What your analogy suggests you really want to do is take a car, mod its engine with unauthorized parts, and then expect the OEM to let their engine-timing software run in such a system, and your argument is that since some people managed to do that without the car catching fire, the software shouldn't be locking anyone out. From an individual's point of view, that seems to make sense, since it is your money and you should be able to do whatever you please with it. From a business' point of view, though, things get a lot more complicated with liability thrown into the mix.

Also, if it was really that risk-free for GPUs from the two companies to work in a CUDA environment, you can bet that AMD would have sued Nvidia's bollocks off for anti-competitive business practices and won. If they still haven't, then it means that Nvidia may just have a legally sound excuse for doing what they're doing.
@D. Lister: You forget that Intel also has the GPU on die. By that logic, this is a good enough reason to disable stuff on that die when an AMD or Nvidia GPU is detected, or to disable support for QuickSync, or Optimus, or whatever. Because. This should be a good enough analogy even for a completely non-IT person. Like someone who thinks the graphics subSYSTEM (sub because it's part of an even larger system) is not actually a SYSTEM, like a car... you know... ;) They just decided to call it that for fun.

Regardless, I think you're just trolling, because it's impossible to "not get it" after so many tries. I paid for a card with certain features. As long as there's no clear statement on the box that those features will be arbitrarily disabled if any competing product is found in the system, then this is just crappy conduct, put in place and supported by people with little to no respect for their customers. Just like the 970 4GB RAM issue, which was neither communicated to the buyers nor fully owned up to by the company. There is no excuse for crippling a product that I paid for in full just because I decided to also use a competing product, or a product that merely looks like a competing one. AMD has no reason to sue Nvidia, because they are not entitled to use that proprietary tech. Also, I don't care about AMD; you can replace this with any other driver, as other guys commented (like a USB monitor). The customers have a right to sue, because they paid for something and it's been disabled. I paid for a card and it's not behaving as advertised. It's as simple as that. The problem is that it's too much of a hassle for a consumer to sue a company like Nvidia over this. I can't use my card's features because it finds traces of other drivers in my system, so it artificially shuts the whole thing down.

And related to the cars vs. GPUs point, yet again you fail to realize that I AM talking about a system. It's composed of a GPU, graphics RAM, drivers and supporting software. It's called a graphics subsystem, for God's sake; isn't that a good enough clue that it's a system? And in that GPU you have multiple blocks doing different things. One talks to the memory. One schedules instructions. Etc. It's a system, but since it's integrated (hence integrated circuits), people unrelated to the domain tend to think it's "just there". They forget about all the parts that make it work. The software just arbitrarily disables a feature present in the hardware after a simple system check: "IF non-Nvidia driver detected THEN disable X".
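To make that policy concrete, here is a minimal sketch of the check being described; the adapter names and the string test are hypothetical stand-ins for whatever the driver actually enumerates, not Nvidia's real code.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical adapter names standing in for whatever the driver enumerates;
// the logic mirrors the policy described above: if any display driver that
// isn't Nvidia's shows up, GPU PhysX/CUDA gets switched off.
static bool nonNvidiaAdapterPresent(const std::vector<std::string>& adapters) {
    for (const std::string& name : adapters) {
        if (name.find("NVIDIA") == std::string::npos) {
            return true;  // an AMD card, or even a USB display adapter driver
        }
    }
    return false;
}

int main() {
    std::vector<std::string> adapters = {
        "NVIDIA GeForce GTX 750 Ti",  // the card whose PhysX the user paid for
        "AMD Radeon R9 290X"          // primary card; its presence alone trips the check
    };
    const bool gpuPhysXEnabled = !nonNvidiaAdapterPresent(adapters);
    std::printf("GPU PhysX/CUDA: %s\n", gpuPhysXEnabled ? "enabled" : "disabled");
    return 0;
}
```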
Hmmm, Nvidia hacking other computers on the LAN to see if any non-Nvidia cards are in use, and turning off PhysX and CUDA on the Nvidia computer. I can see that happening.

No one here is asking for Nvidia tech to run on AMD hardware; the complaint is just that Nvidia disables their proprietary stuff if it detects anything other than Nvidia in the system. I wonder if this happens if you're running an AMD APU with an Nvidia card, where the graphics on the APU aren't being used at all but are detected. Bye bye PhysX and CUDA, because that makes sense.
You did NOT get it straight. Let me enlighten you.

We have Nvidia GPUs AND AMD GPUs. The box of our Nvidia GPUs said "PhysX support, CUDA support". Nvidia is DENYING a feature that it sells on the box because we made a huge mistake: we put the Nvidia GPU as the secondary card. In all of Nvidia's arrogance, having an AMD GPU as primary and an Nvidia GPU as secondary is a big insult. So, as Nvidia customers, we have to be punished by having support for PhysX and CUDA removed.

In case you start talking about compatibility and stability problems, let me enlighten you some more. Nvidia made a mistake in the past, and the 258 beta driver went out to the public WITHOUT a lock. So they added the lock later, not while building the driver. A couple of patches out there gave the option to use a GeForce card as a PhysX card with an AMD GPU as primary. I never got a BSOD with that patch in Batman, Alice, Borderlands, or other games. Even if there were problems, Nvidia could offer the option with a big, huge pop-up window in the face of the user saying that they cannot guarantee performance or stability with AMD cards. They don't. They LOCK IT away from their OWN customers, even while advertising it as a feature on the box.

One last question: what part of "USB monitor driver" didn't you understand?
What? "Cuda Cores" is just marketting speak for their Cuda-capable graphics micro-architectures, of which they have several -- Cuda isn't a hardware instruction set, its a high-level language. Cuda source code could be compiled for AMD's GPU microarchitectures just as easily, or even your CPU (it would just run a lot more slowly). Remember, nVidia didn't invent PhysX either -- that started off as a CPU library and PhysX's own physics accelerator cards.
That's not what they are saying at all. What they are saying is that nVidia disables CUDA and PhysX whenever there is another video driver functioning in the system. Theoretically you could get a Radeon 290X for the video chores and dedicate a lower-end nVidia-based card to processing PhysX. You can do this if you have two nVidia cards, but not if you mix flavors. If you remember correctly, PhysX existed way before CUDA. If nVidia wanted, they could make it an open standard and allow AMD, Intel, et cetera to create their own processing engines for PhysX, or even CUDA for that matter. They are using shader engines to do floating-point calculations for things other than pixels, vertices or geometry. Personally I got an nVidia card because my last card was AMD/ATI. I try to rotate brands every other generation or so if the value is similar. I personally haven't had any driver-related issues with either vendor in the past 5+ years.
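On the idea of dedicating a lower-end card to PhysX: the dedicated-PhysX-GPU choice itself is a driver/control-panel setting, but purely as an illustration of how software can enumerate CUDA devices and steer compute work to a specific (say, weaker secondary) card when two NVIDIA GPUs are installed, a sketch like this works; the pick-the-fewest-SMs heuristic is just for the example.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);

    // List the CUDA-capable devices and pick the weakest one (fewest SMs)
    // as a stand-in for "use the lesser card for physics-style compute work".
    int pick = 0, fewestSMs = 1 << 30;
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        std::printf("device %d: %s, %d SMs\n", d, prop.name, prop.multiProcessorCount);
        if (prop.multiProcessorCount < fewestSMs) {
            fewestSMs = prop.multiProcessorCount;
            pick = d;
        }
    }

    if (count > 0) {
        cudaSetDevice(pick);  // subsequent kernels in this thread run on that device
        std::printf("routing compute work to device %d\n", pick);
    }
    return 0;
}
```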
If the IGP is not powering the primary screen, then I THINK yes, PhysX and CUDA will still work. The primary monitor would have to be on the Nvidia card, so your IGP would be used only to show a picture on the secondary monitor, nothing else.
Yep, Nvidia has said all along they don't want to support this; ultimately PhysX is their product and part of the Nvidia brand, so problems will reflect poorly upon them (it's funny because obviously AMD does not feel this way re: FreeSync).

This may change in the future with DX12, as Microsoft claims they are going to take on this burden within the API to address multiple vendors' hardware, but I guarantee you, at the first sign of trouble Microsoft is going to run for the hills and leave the end users to fend for themselves.
No, MS will just take a year to patch the issue which will cause another issue that will later need to be patched and so on and so forth. If you had the displeasure of using DirectX in the early days you know exactly what I'm talking about. OpenGL was king for a long time for a reason...
Uh, OpenGL was king because it was fast, and you had brilliant coders like Carmack and Sweeney pushing its limits at breakneck speed. And sure DX was a mess to start, but ultimately, it was the driving force that galvanized and unified the PC to create the strong platform it is today.
@yannigr2 - I know you're a proven AMD fanboy, but set that fact aside for a moment to see what you've said is not uniformly true. Nvidia dGPU works just fine with Intel IGP, for both CUDA and PhysX, so it is not a case of Nvidia disabling it for competitor hardware.
It is simply a matter of what they choose to support, as they've said all along. Nvidia supports Optimus as their IGP and dGPU solution, so those configurations work just fine.
They make an exception for Intel because if they don't, PhysX and CUDA are dead. It's simple logic, nothing strange here. It's completely obvious. If AMD were controlling 80% of the CPU business, they could introduce compatibility problems with Intel hardware, hoping to push Intel to bankruptcy and get the x86 license. It's just how Nvidia thinks and works. Nothing new or strange really.

The Intel IGP case proves that the PhysX lock is a deliberate move from Nvidia. But even without the Intel IGP example, the fact that a simple patch could enable PhysX with an AMD primary GPU and work just fine is, I think, proof enough. Why are we still talking about this? I was using Nvidia's 258 UNlocked driver a few years ago with an HD 4890 and a 9600GT for PhysX and had no problems. That setup was really nice, and I never understood why Nvidia didn't choose to push its lower-end cards as PhysX cards. Especially today, with all those IGPs, it could make sense. The only explanation is arrogance. They cannot accept a setup where an Nvidia card is secondary to an AMD card. You know, a proprietary standard is not bad when it is accessible to everybody, even if you have to pay for an Nvidia card. But when you lock it away the way Nvidia does, it is BAD. Really BAD.

And while I am a fan of AMD, when I post something I base it on facts and logic. It's one thing to be a fan of a company and another thing to be a brainless fanboy, and I hate the second. Especially when I have to talk logic with brainless fanboys. I was a fan of Nvidia when they were not trying to lock down the market under their own arrogance. I still own 3 (low-end) Nvidia cards. I don't mind owning Nvidia cards, but I cannot support their business practices.
So you were wrong to claim Nvidia does this unilaterally, correct? By the same token, it is plausible that Nvidia simply does not want to support multiple IHVs for their own proprietary solution, correct? Given you don't even use their products as your primary driver, what makes you feel like you can dictate the terms of their usage?

To anyone other than a brainless fanboy this all actually makes sense, but to you, obviously, hacky workarounds and half-assed, half-broken products are acceptable, given you are an AMD fanboy, which necessarily means you are a fan of half-assed solutions.
As I've said many times, I'm a fan of the best and Nvidia continually comes up with solutions that find new ways to improve and maximize my enjoyment of the products I buy from them. PhysX, G-Sync, and now GameStream are all examples of this. To any non-fanboy that wants these features, the solution is simple. Buy into the ecosystem or STFD.
@chizow: You never fail to show how much of a fanboy you are. A small part of Nvidia's marketing department on the internet. Being wrong? How convenient. Dictate terms? Nvidia dictates terms in my own PC; I just protest about that. I didn't know that I don't get to have an opinion when using an Nvidia card as secondary. I guess arrogance is something that has passed from the company to its loyal fanboys.

I don't have to comment on the last two paragraphs. As I said, you are just a small part of Nvidia's marketing department on the internet. And those two paragraphs show exactly that, and YOUR FANATICISM.
Cool so you were wrong, just wanted to clarify that. And you chose poorly, so enjoy your decisions! You certainly deserve all AMD has to offer for as long as they offer it! :)
Yes, I guess the only thing left for you to do is to troll. But wait. Ah yes. Never forget, in every phrase, to advertise Nvidia. Typical. Thank you for showing us who you are.
Yes, I've once again cleared up egregious misinformation from a proven AMD fanboy; once again I've done the internet a solid.

No one needs to read your misleading statements about Nvidia unilaterally disabling this feature when a competitor's part is detected; they are simply choosing what they want to support with their own proprietary solution, as they have every right to do.
We already know that your love for Nvidia is huge, that you will justify everything they do, and that you will attack anyone not showing enough love for them.
Yes, it's important to repeat the fact that you, like any good AMD fanboy, continue to spread BS misinformation as if it were fact, in an attempt to disparage Nvidia features that the products you love do not have.

And talk about love for products being huge, LOLOL, from one of the last remaining AMD *CPU* fanboys on the planet; that is rich. That is deep-seated love that dies hard. :D
If only this would work with all brands and not just Shield. Then it would be awesome. 30 Mbit is a pretty darn high bitrate, but movies are only 24 fps, so 30 Mbit isn't as insane as it sounds since this is pushing more than twice as many frames; per frame it's the same as 12 Mbit at 24 fps (about 0.5 Mbit per frame either way).
Only a few European countries have average speeds that high, and the US is ahead of a number of them, which is actually impressive given their population density differences.
I think the median is more important than the average. In the US, something like 50% of people meet the new FCC definition of broadband (25/3 Mbps), so getting to 30 Mbps, especially with upcoming DOCSIS 3.1, shouldn't be hard.
Indeed, and I think it's more likely that people who would be interested in this type of service would also be far more likely to be in the upper half.
Although I'm not interested in the Shield Android TV for myself, it does offer an interesting alternative that is aimed directly at current-gen consoles with the bump up to 1080p/60.
Theoretically, you can now get a better-than-console experience in terms of quality (settings, 1080p, framerate) for $200 + the subscription price ($10/mo???). It's an interesting angle that Nvidia is pushing; we'll see if it works.
Beyond that, I think they need to do a much better job of communicating and marketing the Shield's capabilities. They are just trying to do so much with it, but none of it is that interesting to me as a PC gamer, I guess. Maybe this is for people with a lower hardware budget who want to enjoy PC gaming, i.e. console gamers, but even then, console gamers have a lot of franchise loyalty that Nvidia won't easily overcome.
This platform also gets an interesting new usage scenario now that Win10 officially drops MCE: Silicon Dust is planning to launch a new DVR/tuner client, and Android is a leading platform for it. For $200, along with everything else it does, this Android TV box should be a popular choice in the HTPC market.
My guess is their goal is to not cannibalize GeForce GTX sales. The Shield Android TV is $200, and so is the GTX 960, so I guess that is the convergence point: they figure that if you have a GRID-capable PC, you may not need GRID as a service, since you'd just run the games locally on your PC. Nvidia is really looking at that budget gaming market in the $100-300 range, right in the console wheelhouse, by reducing total cost of ownership and introducing a type of library/subscription service.

There is some crossover though: you can buy a game with GRID, for example, and also get a valid PC key on Steam, so it is like a virtual-key system. They may also have agreements in place with Valve for a non-compete on the PC platform.
eanazag - Tuesday, May 12, 2015 - link
Good info on this Nvidia only service.hyno111 - Tuesday, May 12, 2015 - link
An unrelated sidenote. The SHIELD controller is still not supported on non-Nvidia-powered PC, even in wired mode. Perhaps they just lack the manpower to develop a proper windows driver for it.IlllI - Tuesday, May 12, 2015 - link
or maybe it is done on purpose like the bullcrap they pull with disabling physx on amd gpusGigaplex - Wednesday, May 13, 2015 - link
It's not really disabling PhysX on AMD GPUs. It's worse than that. They disable PhysX on NVIDIA GPUs if an AMD GPU is detected in the system (whether you're actively using it or not).Morawka - Wednesday, May 13, 2015 - link
you sure thats still true?yannigr2 - Wednesday, May 13, 2015 - link
That's true.yannigr2 - Wednesday, May 13, 2015 - link
It's even worst than that. PhysX and CUDA are disabled even if you are using a secondary USB monitor. The USB monitor's driver is treated as a graphics driver and because it doesn't have Nvidia's name on it, it is considered as competitor's hardware. Nice isn't it?D. Lister - Wednesday, May 13, 2015 - link
Lemme get this straight, you guys have AMD gpus, and you want Nvidia to give the CUDA technology to AMD, so AMD can make their GPUs have CUDA cores in them, so they can process CUDA physics, a.k.a. PhysX, so that NEXT time when you buy an AMD GPU, you can have PhysX in your games? Because unless the architecture isn't native for the CUDA code, it would run in emulation mode, with a significant overhead. I would say you may have better luck asking AMD to implement FreeSync on all their new 3xx GPUs. Although frankly I wouldn't hold my breath even for that.D. Lister - Wednesday, May 13, 2015 - link
Correction: "Because [s]unless[/s] if the architecture isn't native for the CUDA code".close - Wednesday, May 13, 2015 - link
It's a "you can use what you paid for ONLY if we're exclusive" kind of deal. Imagine Mercedes turning off your navigation or disabling 2 cylinders because you also own a BMW.D. Lister - Wednesday, May 13, 2015 - link
I am sorry but your analogy is far from apt. Mercedes (since you brought them up) has its own IPs. One such is called "4Matic all-wheel-drive system"...http://www.mercedes-benz.ca/content/canada/mpc/mpc...
... which according to them, provides optimal traction, plus a number of other benefits, ultimately making driving safer. Now if only Merc made that particular IP available to smaller companies, a lot more lives could potentially be saved, with safer cars. Why wouldn't the govt. interfere and make Merc give that "value added feature" away?
That's the thing with humanity you see. The govt. essentially lets those people die because it knows that without the incentive of a big payoff, big investment in R&D would plummet, which means that technological development would come to a crawl and the country would be left behind in the tech race. Like it or not, IP laws are a byproduct of capitalism, and aren't going anywhere anytime soon, and companies are going to continue to be willing to fight tooth and nail over them.
yannigr2 - Wednesday, May 13, 2015 - link
No one asks Nvidia to make PhysX run on AMD GPUs or give PhysX on AMD. What part of the phrase you don't understand?D. Lister - Wednesday, May 13, 2015 - link
@closeYou car analogy suggests that if you place a system with an Nvidia gpu next to another system with an AMD gpu, then the Nvidia gpu would turn off some cores. Which is not the case. Vehicles are complete systems, not componants like GPUs.
The proper comparison would be someone wanting to run a BMW's system software on a Merc, and the Merc's system responding with "unknown software, cannot execute". Go ahead and ask BMW to do that for you, because you prefer their software over Merc's. I wager their response would boil down to something like, "please buy a BMW vehicle, if you want the BMW software."
xthetenth - Wednesday, May 13, 2015 - link
No, the entire point is that you have nvidia code and an nvidia card to execute it on but you can't because there's also an AMD card in there even though the AMD card is totally irrelevant to the execution of the nvidia code. You can do this on a system with an nvidia card and an older nvidia card just fine, where you just use the weaker card for physx no problem. But suddenly if you change the main card to an AMD card without changing the card the physx code was executed on, you can't execute the physx code.Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing.
D. Lister - Wednesday, May 13, 2015 - link
@xthetenthNvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing.
Good or bad, PC gaming appears to be changing into "Nvidia gaming" because instead of fighting for their primary products, AMD chose to invest on consoles, and got left behind where it really mattered.
xthetenth - Wednesday, May 13, 2015 - link
I agree, AMD really needed to put more effort into marketing.chizow - Wednesday, May 13, 2015 - link
@xthetenth:Why do you have a problem with: "Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing."?
Do you think a bunch of verbatim console ports with no PC-specific features or settings options is a good thing? Its just incredibly hypocritical and ironic when AMD fans, supporters, sympathizers and supposed PC enthusiasts constantly say things like:
1) XYZ game is just another crappy console port. Yet when a company like Nvidia tries to work with devs to bring additional features, even some that work on competitor hardware, its bad for PC gaming? lol.
2) We need AMD for competition! Competition is always good! But when Intel and Nvidia compete and establish dominance, competition is bad and they are competing too hard.
3) Close and proprietary are evil, even when they drive innovation in the marketplace and produce results (G-Sync, CUDA). But when AMD does proprietary for their own gains and product differentation, it's A-OK!
Just some clear examples of the kind of double-standards and hypocrisy AMD and their fans exhibit, regularly.
Bottom line is this, you say Nvidia trying to make PC gaming Nvidia gaming and its a bad thing, but what's to stop you from simply not using said features. And, do you think a game without Nvidia features is better or worst as a result of them? Just curious.
yannigr2 - Thursday, May 14, 2015 - link
chizowSeeing you all over the internet defending/supporting Nvidia, it is really fun watching you talking about hypocrisy.
1) Additional features are good. But games that are sponsored from Nvidia don't come with just addition features. Usually come with problems for the competitor's hardware. Sometimes means also the removal of competitor's features like the DirectX 10.1 that was removed from Assassin's Creed because Nvidia cards where not supporting it at the time.
2)You can't have competition with Intel and Nvidia controlling the market throw financial strength. Nvidia goes even further by trying to lock everybody in their proprietary ecosystem. They try to guaranty that in the future their will be NO competition.
3) Close and proprietary are great for driving innovation, but after some time in the market returning the investment to the company created them, it is in everybody's best interest to be replaced by open standards. Because while proprietary stuff can drive innovation in the beginning, it can add obstacles latter. AMD created a proprietary tech like Mantle, the drived innovation in the correct direction and then stepped down for DX12 and Vulkan. So yes, when AMD does it it is A-OK because they do it the right way. They did the same with AMD64.
The only hypocrite here is you, fortunately. And yes a game with a good physics engine looks much better that one without one. And unfortunately games that use PhysX usually are under performing in that kind of visuals when turning PhysX off, not taking advantage the rest of the system's resources to create an equal experience. Just a coincidence.
chizow - Thursday, May 14, 2015 - link
@yannigr2: I defend innovation, features, benefits of certain products, you defend garbage and half-assery, see the difference? :D1) Any GameWorks game that has additional features implemented by Nvidia are BETTER than the console version period, if there's a problem AMD should work on sorting them out. But that's not their MO. Their MO is to half-ass some tech out there, half-support it, and then when there's a problem, claim its Open and thus, not their issue! We've seen this dozens of times, FreeSync, HD3D, Mantle, and even the DX10.1 bug you are going waaaay back on. As soon as there are any problems with these AMD solutions, AMD throws it back on the vendor to fix lolol.
2) No, I don't think you and various other socialist hippies from non-capitalist countries even understand what competition means. You simply want tech sharing in some happy global co-op. Except that's not how it works. Nvidia has every right to invest in tech and value-add features that benefit themselves, their users, and their shareholders. They have no obligation to help otherwise, but they still do when it makes sense. That's true competition and the fact of the matter is, Nvidia's competition and innovation has pushed AMD to the brink. You bet on the loser. The sooner you and the rest of the AMD fanboy hippies get this, the better, but I know you understand this, because you were hypocritically espousing the benefits of the closed and proprietary Mantle for months until it failed and died a few months ago.
3) Except Nvidia and any other innovator has no incentive to do this. They are their market leader, they have no obligation to do the work and give it to everyone, especially when all they did was "Compete" as you claimed was necessary. So again, stop being hypocritical and acknowledge the fact Nvidia was simply better at competing, because as we have seen with Mantle, AMD attempted to do the same much to the delight of their fanboys like you, they just FAILED at making it stick. Of course, to any non-fanboy, this was the only possible outcome because AMD simply did not have the market position, funds, or clout to drive a proprietary API in the marketplace. Lesson learned, only hundreds of millions of resources direly needed elsewhere wasted. And what do you get some 18-24 months later? A late product to combat Maxwell, a nearly full stack of rebrands, and complete slaughter in the marketplace nearing historical highs in the 75-80% range in favor of Nvidia.
So yes, if you have a problem with Nvidia's features, simply turn them off! Enjoy the AMD Radeon experience of dumbed-down console ports, that is what YOU CHOSE when you stupidly voted with your wallet. And now you want to cry about it. lol. GLHF, it will all be over soon.
yannigr2 - Thursday, May 14, 2015 - link
@chisowfirst paragraph Nvidia's advertisement
1) Nvidia's marketing department makes a speech. What we learn here? DX10.1 was a bug. LOL! Nice one.
2) Continues. What we learn here? We are "socialist hippies from non-capitalist countries". Damn. Busted! LOL!
3) And... continues. Nvidia market leaders. AMD failure. Got that. Thanks for the info.
I hope they pay you for this.
chizow - Thursday, May 14, 2015 - link
@yannigr2's usual backpedaling, deflecting, stupidity when called on his BS:1) Yes it was a bug, but given AMD fanboys' low standards, they would rather have a buggy, faster solution that skipped an entire lighting pass! LOL. BF4 Mantle was another great example of this, I guess it should be fast if its not rendering everything it should. Remember BF4 Fanboy Fog TM? :D
http://techreport.com/news/14707/ubisoft-comments-...
"In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly. "
2. Yes, you obviously are, because you have no idea what competition actually means, and when a company competes your fanboy favorite into the ground, suddenly competition is bad and they are competing too hard. Ouch! Stop it! Imma cry. Competition hurts! :'(
3) Mantle was a failure, to any non-Fanboy. Complete disaster for AMD. And for what? You're Greek, you should know damn well what a Pyrrhic Victory is. So AMD fanboys can claim dead Mantle lives on in spiritual successor Vulkan (hey look, another Greek reference!), but who gives a crap when Vulkan will be irrelevant as well and AMD pumped hundreds of millions into a dead-end API. Funds that would have been better spent elsewhere!!!!!
Pay for what? LOL. Like I pay for super awesome monopoly approved Intel processors for the last 9 years since AMD got Conroe'd!!! Let go of the fanboy reigns and enjoy what the tech world has to offer! Free yourself from the bondage of the dying techbottom feeders known collectively known as AMD fanboy and enjoy!
yannigr2 - Thursday, May 14, 2015 - link
Oh my. Chizow the Nvidia Fanboy just gone full overclock. So much BS in your comment, so much admiration for Nvidia, so much hate for AMD, so many favorable conclusions, so much one sided (para)logic. DX 10.1 was a bug. Still hilarious. DX10.1 stopped being a bug after Nvidia supported it of course.chizow - Thursday, May 14, 2015 - link
Oh my yannigr2, ignorantly commenting as usual, ignoring relevant links with the entire back story with comments from both the vendors and the game developers. But this is the usual MO for AMD and their fanboy. Launch a bunch of promises on a slide deck, let misinformation/FUD grow and bloom, then ignore relevant actual proof to the contrary.I am sure you will tell me how you are enjoying how Free FreeSync is flashing all your monitors firmware enjoying 9-240Hz refresh rates on every cheap monitor on the market that has no additional hardware or cost huh?
LMAO, idiots.
PS. Nvidia never supported DX10.1, like Mantle, it was another irrelevant early-adoption effort from AMD. After it rolled its features into DX11 however, Nvidia did as they always do, they did DX11 done right and of course, killed AMD in one of the main DX10.1 features AMD was trumpeting the whole time: tesselation. Then of course, suddenly tesselation isn't important to AMD and Nvidia is competing too hard, devs are using too much tesselation etc. etc. lol
QQ more AMD fanboy.
yannigr2 - Friday, May 15, 2015 - link
Yeah right. Game developers. Ubisoft. LOL LOL LOL and more LOLs. Everyone knows Ubisoft and everyone knows their relationship with Nvidia."PS. Nvidia never supported DX10.1"
Low end Nvidia 200 series (205, 210, 220 and 240) and OEM 300 series is DX10.1 moron. What? The technical department failed again to inform you of the marketing department?
chizow - Friday, May 15, 2015 - link
Except this happened long before GameWorks, and it is a DIRECT quote from the developer with independently verified links showing graphical anomalies, so yes, keep burying your fanboy head in the sand as I am sure you will once again stupidly claim AMD's buggy (literal) solutions are better lol.PS: Clearly I don't care about low-end tech, so yeah great, low-end junk OEM parts supported DX10.1 but that doesn't change the fact Nvidia did not care to support it and it became irrelevant until it rolled into DX11, at which suddenly DX10.1 features were bad for AMD fanboys because Nvidia did it better. :)
close - Wednesday, May 13, 2015 - link
An analogy isn't meant to be perfect, just similar enough. But still I don't think you understood the main issue and what people are complaining about. Nobody said anything about running Nvidia software on AMD GPU. It's about being able to run Nvidia software on Nvidia GPU while also having an AMD GPU in your system. Disabling Nvidia components that I paid for just because it detects that I also have an AMD card is plain wrong. It's a crappy way of fighting competition forcing me to remove an AMD card from my system just so I can use something that I have already paid for. And no, you don't see this on the box.What if Intel decides to disable some feature on their CPUs or the software support as soon as it detects an AMD card? How can that be considered fair practice?
So I'm not hoping to run BMW software on a merc. I am demanding however to be allowed to run the BMW as I bought it regardless of the fact that I also want to own another different brand. I bought a complete set of components + features and I wand to use them all as long as promised.
D. Lister - Wednesday, May 13, 2015 - link
@closeWhat if Intel decides to disable some feature on their CPUs or the software support as soon as it detects an AMD card? How can that be considered fair practice?
An intel CPU with an AMD GPU isn't the same as two different GPUs in the same system, sharing the same OS, with their individual drivers competing for the same resources. Would Intel provide support for a 3rd party dual CPU board, which had one socket for Intel and the other for AMD? Now if Nvidia GPU was not doing CUDA with an AMD CPU, that would be a different matter altogether.
I am demanding however to be allowed to run the BMW as I bought it regardless of the fact that I also want to own another different brand. I bought a complete set of components + features and I wand to use them all as long as promised.
Let me reiterate, a car is a standalone system. A GPU is a component of a system, you can't compare apples with oranges.
It seems like you don't want to own two systems (two cars) and run them side-by-side, what your analogy suggests you really want to do is take a car, mod it's engine with unauthorized parts, and then expect the OEM to let their engine timing software be run in such a system, and your argument is that since some people managed to do that without the car catching on fire so the software shouldn't be locking anyone out. From an individual's point of view, that seems to make sense, since it is your money spent and you should be able to do whatever you please with it. From a business' point of view though, things get a lot more complicated with liability thrown into the mix.
Also, if it was really that risk free for GPUs from the two companies to work in a CUDA environment, you can bet that AMD would have sued Nvidia's bollocks off for anti-competitive business practices and won. If they haven't still, then it means that Nvidia may just have a legally sound excuse for doing what they're doing.
close - Wednesday, May 13, 2015 - link
@D. Lister: You forget that Intel also has the GPU on die. This is good enough reason to disable stuff on that die when detecting an AMD or Nvidia GPU or disable support for QuickSync, or Optimus, or whatever. Because. This should be a good enough analogy even for a completely non-IT person. Like someone who thinks the graphic subSYSTEM (sub because it's part of an even larger system) is not actually a SYSTEM, like a car... you know... ;) They just decided to call it that for fun.Regardless, I think you're just trolling because it's impossible to "not get it" after so many tries. I paid for a card with certain features. As long as there's no clear statement on the box that those features will be arbitrarily disabled in case any competing product is found in the system then this is just crappy conduct put in place and supported by people with little to no respect for their customers. Just like the 970 4GB RAM issue, which was neither transmitted to the buyers, nor fully assumed by the company.
There is no excuse for crippling a product that I paid in full just because I decided to also use a competing product or a product that looks like it's a competing one.
AMD has no reason to sue Nvidia because they are not entitled to use that proprietary tech. Also, I don't care about AMD, you can replace this with any other driver as other guys commented (like an USB monitor). The customers have a right to sue because they paid for something and it's been disabled. I paid for a card and it's not behaving as advertised. It's as simple as that. The problem is it's too much of a hustle for a consumer to sue a company like Nvidia for this. I can't use my cards features because it finds traces of other drivers in my system so it artificially shuts down the whole thing.
And related to the cars vs. GPUs, yet again you fail to realize that I AM talking about a system. It's composed out of GPU, graphic RAM, drivers and supporting software. It's called a graphic subsystem for God's sake, isn't that a good enough clue that it's a system? And in that GPU you have multiple blocks doing different things. One talks to the memory. One schedules instructions. Etc. It's a system but since it's integrated (hence integrated circuits) people unrelated to the domain tend to think it's "just there". They forget about all the parts that make it work. The software just arbitrarily disables a feature present in the hardware after a simple system check. "IF non-Nvidia driver detected THEN disable X".
yannigr2 - Wednesday, May 13, 2015 - link
No no no. Either YOU DON'T GET IT, or you deliberately try to make it look completely different. In any case you are wrong.Crunchy005 - Wednesday, May 13, 2015 - link
Hmmm, Nvidia hacking other computers on the LAN to see if any non Nvidia cards are in use and turning off PhysX and CUDA on the Nvidia computer. I can see that happening.No one here is asking Nvidia tech to run on AMD hardware, just that Nvidia disables their proprietary stuff if it detects anything other than Nvidia in the system. I wonder if this happens if your running an AMD APU with an nvidia card but the graphics on the APU aren't being used at all but are detected. Bye bye PhysX and CUDA, because that makes sense.
yannigr2 - Wednesday, May 13, 2015 - link
You did NOT got it straight. Let me enlighten you.We guys have Nvidia GPUs AND AMD GPUs. On the box on our Nvidia GPUs it was saying "PhysX support, CUDA support". Nvidia is DENYING a feature that it is selling on the box because we did a huge mistake. We put the Nvidia GPU as secondary. In all Nvidia's arrogance having an AMD GPU as primary and an Nvidia GPU as secondary is a big insult. So as Nvidia customers we have to be punished, by removing support for PhysX and CUDA.
In case you start talking about compatibility and stability problems let me enlighten you some more. Nvidia did a mistake in the past and 258 beta driver, go out on public WITHOUT a lock. So they add the lock latter, not while building the driver. A couple of patches out there where giving the option to use a GeForce card as a PhysX card having an AMD GPU as primary. Never got a BSOD with that patch in Batman, Alice, Borderland, or other games. Even if there where problems Nvidia could offer that option with a big, huge pop up window in the face of the user saying that they can not guaranty any performance or stability problems with AMD cards. They don't. They LOCK IT away from their OWN customers, even while advertising as a feature on the box.
One last question. What part of the "USB monitor driver" didn't you understood?
ravyne - Wednesday, May 13, 2015 - link
What? "Cuda Cores" is just marketting speak for their Cuda-capable graphics micro-architectures, of which they have several -- Cuda isn't a hardware instruction set, its a high-level language. Cuda source code could be compiled for AMD's GPU microarchitectures just as easily, or even your CPU (it would just run a lot more slowly). Remember, nVidia didn't invent PhysX either -- that started off as a CPU library and PhysX's own physics accelerator cards.Einy0 - Wednesday, May 13, 2015 - link
That's not what they are saying at all. What they are saying is the nVidia disables CUDA and PhysX when ever there is another video driver functioning in the system. Theoretically you could get a Radeon 290X for the video chores and dedicate a lower end nVidia based card to processing PhysX. You can do this if you have two nVidia cards but not if you mix flavors. If you remember correctly PhysX existed way before CUDA. If nVidia wanted, they could make it an open standard and allow AMD, Intel, eccetera to create their own processing engine for Phsyx or even CUDA for that matter. They are using shader engines to do floating point calculations for things other than pixels, vertexes or geometry. Personally I got an nVidia card because my last card was AMD/ATI. I try to rotate brands every other generation or so if the value is similar. I personally haven't had any driver related issues with either vendor in the past 5+ years.rtho782 - Wednesday, May 13, 2015 - link
PhysX and CUDA work for me with a 2nd monitor on my IGP?yannigr2 - Wednesday, May 13, 2015 - link
If the IGP is not powering the primary screen I THINK yes. The primary monitor would have to be on the Nvidia card so your IGP will be used only to show a picture on the secondary monitor, nothing else.extide - Wednesday, May 13, 2015 - link
They make an exception for IGP's because otherwise Optimus wouldn't work at all.chizow - Wednesday, May 13, 2015 - link
Yep, Nvidia has said all along they don't want to support this, ultimately PhysX is their product and part of the Nvidia brand, so problems will reflect poorly upon them (its funny because obviously AMD does not feel this way re: FreeSync).This may change in the future with DX12 as Microsoft claims they are going to take on this burden within the API to address multiple vendor's hardware, but I guarantee you, first sign of trouble and Microsoft is going to run for the hills and leave the end-users to fend for themselves.
Einy0 - Wednesday, May 13, 2015 - link
No, MS will just take a year to patch the issue, which will cause another issue that will later need to be patched, and so on and so forth. If you had the displeasure of using DirectX in the early days, you know exactly what I'm talking about. OpenGL was king for a long time for a reason...
chizow - Thursday, May 14, 2015 - link
Uh, OpenGL was king because it was fast, and you had brilliant coders like Carmack and Sweeney pushing its limits at breakneck speed. And sure, DX was a mess to start, but ultimately it was the driving force that galvanized and unified the PC to create the strong platform it is today.
chizow - Wednesday, May 13, 2015 - link
@yannigr2 - I know you're a proven AMD fanboy, but set that fact aside for a moment to see that what you've said is not uniformly true. An Nvidia dGPU works just fine with an Intel IGP, for both CUDA and PhysX, so it is not a case of Nvidia disabling it for competitor hardware. It is simply a choice of what they choose to support, as they've said all along. Nvidia supports Optimus as their IGP and dGPU solution, so those configurations work just fine.
yannigr2 - Thursday, May 14, 2015 - link
They make an exception for Intel because if they didn't, PhysX and CUDA would be dead. It's simple logic, nothing strange here. It's completely obvious. If AMD controlled 80% of the CPU business, they could have introduced compatibility problems with Intel hardware, hoping to push Intel into bankruptcy and get the x86 license. It's just how Nvidia thinks and works. Nothing new or strange, really. The Intel IGP case proves that the PhysX lock is a deliberate move by Nvidia. But even without the Intel IGP example, the fact that a simple patch could enable PhysX with an AMD primary GPU and work just fine is, I think, enough proof here. Why are we still talking about this? I was using Nvidia's 258 UNlocked driver a few years ago with an HD 4890 and a 9600GT for PhysX and had no problems. That setup was really nice, and I never understood why Nvidia didn't choose to push its lower-end cards as PhysX cards. Especially today, with all those IGPs around, it could make sense. The only explanation is arrogance. They cannot accept a setup where an Nvidia card is secondary to an AMD card. You know, a proprietary standard is not bad when it is accessible to everybody, even if you have to pay for an Nvidia card. But when you lock it down the way Nvidia does, it is BAD. Really BAD.
And while I am a fan of AMD, when I post something I base it on facts and logic. It's one thing to be a fan of a company and another to be a brainless fanboy, and I hate the second, especially when I have to talk logic with brainless fanboys. I was a fan of Nvidia when they were not trying to lock down the market out of their own arrogance. I still own three (low-end) Nvidia cards. I don't mind owning Nvidia cards, but I cannot support their business practices.
chizow - Thursday, May 14, 2015 - link
So you were wrong to claim Nvidia does this unilaterally, correct? By the same token, it is plausible Nvidia simply does not want to support multiple IHVs for their own proprietary solution, correct? Given you don't even use their products as your primary driver, what makes you feel you can dictate the terms of their usage? To anyone other than a brainless fanboy this all actually makes sense, but to you, obviously, hacky workarounds and half-assed, half-broken products are acceptable, given you are an AMD fanboy, which necessarily means you are a fan of half-assed solutions.
As I've said many times, I'm a fan of the best, and Nvidia continually comes up with solutions that find new ways to improve and maximize my enjoyment of the products I buy from them. PhysX, G-Sync, and now GameStream are all examples of this. To any non-fanboy who wants these features, the solution is simple: buy into the ecosystem or STFD.
yannigr2 - Thursday, May 14, 2015 - link
@chizow You never fail to show how much of a fanboy you are. A small part of Nvidia's marketing department on the internet.
Being wrong? How convenient. Dictate terms? Nvidia dictates terms in my own PC; I just protest about that. I didn't know I wasn't allowed to have an opinion when using an Nvidia card as a secondary. I guess arrogance is something that has passed from the company to its loyal fanboys.
I don't have to comment on the last two paragraphs. As I said, you are just a small part of Nvidia's marketing department on the internet. And those two paragraphs show exactly that, and YOUR FANATICISM.
chizow - Thursday, May 14, 2015 - link
Cool, so you were wrong, just wanted to clarify that. And you chose poorly, so enjoy your decisions! You certainly deserve all AMD has to offer for as long as they offer it! :)
yannigr2 - Friday, May 15, 2015 - link
Yes, I guess the only thing left for you to do is troll. But wait. Ah yes. Never forget, in every phrase, to advertise Nvidia. Typical. Thank you for showing us who you are.
chizow - Friday, May 15, 2015 - link
Yes, I've once again cleared up egregious misinformation from a proven AMD fanboy; once again I've done the internet a solid. No one needs to read your misleading statements about Nvidia unilaterally disabling this feature when a competitor's part is detected; they are simply choosing what they want to support with their own proprietary solution, as they have every right to do.
yannigr2 - Saturday, May 16, 2015 - link
You keep repeating yourself. We already know that your love for Nvidia is huge, that you will justify everything they do, and that you will attack anyone not showing enough love for them.
We already know that.
chizow - Saturday, May 16, 2015 - link
Yes, it's important to repeat the fact that you, like any good AMD fanboy, continue to spread BS misinformation as if it were fact, in an attempt to disparage Nvidia features that the products you love do not have. And talk about love for products being huge, LOLOL, coming from one of the last remaining AMD *CPU* fanboys on the planet, that is rich. That is deep-seated love that dies hard. :D
Laststop311 - Wednesday, May 13, 2015 - link
If only this worked with all brands and not just Shield, then it would be awesome. 30 Mbit is a pretty darn high bitrate, but movies are only 24 fps, so 30 Mbit isn't as insane as it sounds since the stream is pushing more than twice as many frames -- roughly the same per-frame budget as 12 Mbit at 24 fps.
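The rough math, assuming the stream runs at the 1080p60 target:

30 Mbit/s / 60 frames/s = 0.5 Mbit per frame
12 Mbit/s / 24 frames/s = 0.5 Mbit per frame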
jamesbond2015 - Wednesday, May 13, 2015 - link
Maybe the plan is to start rolling it out in some developed industrial countries (EU, South Korea, Japan) where 50 Mbit is standard, and not in the USA?
Salvor - Wednesday, May 13, 2015 - link
If you think 50 Mbit+ is standard in Europe, I don't know what you're smoking. Here's some data for you: http://www.netindex.com/download/map
Only a few European countries have average speeds that high, and the US is ahead of a number of them, which is actually impressive given the differences in population density.
sonicmerlin - Wednesday, May 13, 2015 - link
I think the median is more important than the average. In the US something like 50% of people meet the new FCC definition of broadband (25/3), so getting to 30 Mbps, especially with the upcoming DOCSIS 3.1, shouldn't be hard.
Salvor - Wednesday, May 13, 2015 - link
Indeed, and I think the people who would be interested in this type of service are also far more likely to be in the upper half.
jamesbond2015 - Wednesday, May 13, 2015 - link
Dream on :) http://en.wikipedia.org/wiki/List_of_countries_by_...
Salvor - Thursday, May 14, 2015 - link
What exactly is this supposed to show? The map clearly does not agree with the charts, and both of them just prove my point.
chizow - Wednesday, May 13, 2015 - link
Although I'm not interested in the Shield Android TV for myself, it does offer an interesting alternative that is aimed directly at current-gen consoles with the bump up to 1080p/60. Theoretically, you can now get a better-than-console experience in terms of quality (settings, 1080p, framerate) for $200 + the subscription price ($10/mo???). It's an interesting angle that Nvidia is pushing; we'll see if it works.
Beyond that, I think they need to do a much better job of communicating and marketing the Shield's capabilities. They are trying to do so much with it, but none of it is that interesting to me as a PC gamer, I guess. Maybe this is for people with a lower hardware budget who want to enjoy PC gaming, i.e. console gamers, but even then, console gamers have a lot of franchise loyalty that Nvidia won't easily overcome.
This platform also gains an interesting new usage scenario now that Win10 officially drops MCE: Silicon Dust is planning to launch a new DVR/tuner client, and Android is a leading platform for it. At $200, along with everything else it does, this Android TV box should be a popular choice in the HTPC market.
Guspaz - Wednesday, May 13, 2015 - link
So why is GRID exclusive to nVidia handhelds? Why isn't it available to PC users? It's a subscription service; they're leaving money on the table.
chizow - Wednesday, May 13, 2015 - link
My guess is their goal is not to cannibalize GeForce GTX sales. The Shield Android TV is $200 and so is the GTX 960, so I guess that is the convergence point: they figure that if you have a GRID-capable PC you may not need GRID as a service, since you'd just run the games locally. Nvidia is really looking at that budget gaming market in the $100-300 range, right in the console wheelhouse, by reducing total cost of ownership and introducing a type of library/subscription service. There is some cross-over though: you can buy a game through GRID, for example, and also get a valid PC key on Steam, so it is like a virtual-key system. They may also have non-compete agreements in place with Valve for the PC platform.