17 Comments
watersb - Sunday, January 6, 2019 - link
Your photos of the event are fantastic. Live blogging this. Wow. Thanks!
mode_13h - Monday, January 7, 2019 - link
I second this. Very good pics. Those + the (searchable) summary are much appreciated.
Smell This - Monday, January 7, 2019 - link
Holy Wall of Graphics, Batman ...
Chaitanya - Monday, January 7, 2019 - link
As far as the 2060 is concerned, every detail was leaked. Also, the 1440p-with-RTX-on claim is very dubious, as even more expensive GPUs struggle with RTX enabled at that resolution.
bubblyboo - Monday, January 7, 2019 - link
We need clarification from Nvidia on the FreeSync thing. It's the only interesting thing in this whole presentation.
limitedaccess - Monday, January 7, 2019 - link
https://blogs.nvidia.com/blog/2019/01/06/g-sync-di...
"For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."
webdoctors - Monday, January 7, 2019 - link
Nice!! Will definitely have to try this on my ultrawide WQHD monitor.
xchaotic - Monday, January 7, 2019 - link
If this happens, I can finally upgrade from my relatively tiny 27" screen. 65" is definitely too big and too expensive for my desk, but a 43"-ish screen + FreeSync HDR that works with nVidia = heaven ;)
Mr Perfect - Monday, January 7, 2019 - link
Also from that blog post: "G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically."
I was really dreading seeing a little "Available on 20 series" tag somewhere, but with 10-series support too this actually means something.
iwod - Monday, January 7, 2019 - link
Sometimes I wonder why people say GPUs are expensive. 450mm² of die on a 12nm process with 6GB of GDDR6 costs only $349, board included and bundled with games.
With a CPU you'd be lucky to get half that die size for double the cost, and without memory or a board.
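As a rough sanity check on that comparison, here is a minimal back-of-the-envelope sketch; the $349 price and 450mm² figure come from the comment above, while the CPU side simply assumes "half the die size for double the cost" rather than any actual product's numbers:

    # Rough $/mm^2 comparison (illustrative, not actual BOM figures).
    gpu_price, gpu_area = 349.0, 450.0          # USD, mm^2 (board + 6GB GDDR6 included)
    cpu_price, cpu_area = 2 * 349.0, 450.0 / 2  # "double the cost, half the die size"

    gpu_rate = gpu_price / gpu_area   # ~$0.78 per mm^2
    cpu_rate = cpu_price / cpu_area   # ~$3.10 per mm^2
    print(f"GPU: ${gpu_rate:.2f}/mm^2, CPU: ${cpu_rate:.2f}/mm^2 "
          f"({cpu_rate / gpu_rate:.0f}x more per mm^2)")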
I had doubts ray tracing would ever be good enough, but judging from this, and from developers saying they still have lots more optimisation to come, I have changed my mind and have high hopes for it. Once those software optimisations land in both engines and drivers, and GPUs move to 7nm, the gaming industry will continue to expand and boom.
Exciting times.
Note: Where are you, Navi / AMD?
Exodite - Monday, January 7, 2019 - link
The size of the die and the manufacturing cost of the board are completely irrelevant to the consumer.
$349 is enthusiast pricing, and represents a price/performance level that you could already get - from Nvidia themselves - in 2016.
We, meaning gamers overall, would be far better served by a new generation of midrange ($150 to $250) cards with a strong performance uplift (and a new generation of consoles) than by more pushing in the enthusiast space. Ultimately it's the adoption of a better midrange that will allow new games to target higher-fidelity graphics.
mode_13h - Monday, January 7, 2019 - link
@iwod isn't talking about what consumers care about, but rather the underlying costs. As for the chips, both Nvidia and Intel have pretty hefty margins (over 50%, IIRC). The board makers and other component manufacturers don't do nearly as well.
The unit cost of software is virtually zero - they only have to factor in the lost sales opportunity for customers who'd have bought it anyway.
Exodite - Monday, January 7, 2019 - link
I understand that; I'm not saying that Nvidia is overcharging (more than they usually are).
I'm just saying that the 'why' is irrelevant.
Nvidia is releasing their new "midrange" card with roughly the same pricing as the 1070 in 2016 - and roughly the same performance. From the perspective of those looking for an actual midrange upgrade, that's too expensive, regardless of what it costs Nvidia to make the boards.
mode_13h - Monday, January 7, 2019 - link
Sure. That's why the RTX series has met with such a poor reception from the gaming community. The performance uplift is smaller than Maxwell -> Pascal (which jumped about 2 tiers), yet pricing is significantly inflated.
Now, let's see what this rumored GTX 11 series is all about...
mode_13h - Monday, January 7, 2019 - link
For one thing, GPUs have a lot more units that can be disabled at a finer granularity, to improve yield. Perhaps it also helps that they run at lower clock speeds?
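To illustrate the yield point, here is a minimal sketch using a simple Poisson defect model; the defect density, unit count, and areas below are illustrative assumptions, not actual foundry or Nvidia figures:

    from math import comb, exp

    def block_yield(area_mm2, d0_per_mm2):
        # Poisson model: probability a block of silicon has zero defects.
        return exp(-area_mm2 * d0_per_mm2)

    def salvage_yield(n_units, unit_area_mm2, d0_per_mm2, max_disabled):
        # Probability that at least (n_units - max_disabled) units are
        # defect-free, i.e. the die is sellable after fusing off bad units.
        y = block_yield(unit_area_mm2, d0_per_mm2)
        return sum(comb(n_units, k) * (1 - y) ** k * y ** (n_units - k)
                   for k in range(max_disabled + 1))

    D0 = 0.002  # defects per mm^2 (illustrative)
    print(f"450 mm^2, all-or-nothing: {block_yield(450, D0):.0%}")                  # ~41%
    print(f"30 x 15 mm^2 units, up to 4 fused off: {salvage_yield(30, 15, D0, 4):.0%}")  # ~100%

Under these assumptions a monolithic die that must be fully functional yields around 41%, while the same silicon built from 30 disable-able units yields nearly 100%, which is why fine-grained redundancy matters so much for big GPU dies.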
halcyon - Monday, January 7, 2019 - link
So, for Adaptive Sync technologies:
1) GTX 10 and RTX 20 (but not GTX 20?) series get G-Sync-by-drivers technology on nVidia-certified Adaptive Sync monitors (a whole whopping 14 of them), or it can be manually enabled on all of them. This means that all FreeSync monitors can now be used with modern nVidia cards.
2) If you want the full G-Sync Ultimate (HDR, color calibration, backlight control, etc.) experience, you will still have to shell out money for a G-Sync Ultimate certified card (which generations??) AND a G-Sync Ultimate certified monitor, with the G-Sync tax. I.e. a generic Adaptive Sync (non-G-Sync) monitor will not give the full G-Sync Ultimate experience?
Nice embrace and extend there, nVidia. A page from the Microsoft playbook.
Jake13942 - Tuesday, January 8, 2019 - link
No HDMI 2.1
No VRR support
Too expensive