blanarahul - Friday, July 30, 2021 - link
I have a question. Why/how does shedding 1/3 of the VRAM bandwidth and 2/3 of the cache size (vs Navi 22) make for a better product than shedding all the cache but keeping all 12 GB of VRAM and the full bandwidth?
I am asking this because the consoles don't have any such cache and they seem to do just fine.
Zyhawk42 - Friday, July 30, 2021 - link
It's more of a power-efficiency tradeoff. The cache pulls less power than a wider memory bus (according to AMD), and it seems to be such an integral part of RDNA2 that they can't drop it without losing performance on PC. The consoles use a 128-bit (XSS), 256-bit (PS5) and 320-bit (XSX) memory bus, but they can also use the internal SSD as temporary memory, so maybe that's why. They also have built-in hardware specifically designed to speed up I/O, and add to that the more unified architecture of a console, which can help to optimize VRAM usage.
mode_13h - Monday, August 2, 2021 - link
> they also can use the internal SSD as temporary memory, so maybe that's why.
GDDR6 is like 2 orders of magnitude faster than NVMe SSDs. And you don't want to treat them like DRAM, because that would burn out the SSDs prematurely.
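[Editor's note: a minimal back-of-the-envelope sketch to put rough numbers on the bandwidth side of this exchange. The 128-bit bus with 16 Gbps GDDR6 is the 6600 XT's configuration; the ~55% Infinity Cache hit rate and the 7 GB/s NVMe figure are assumptions for illustration only, not numbers taken from these comments.]

```python
# Rough effective-bandwidth estimate for a cache sitting in front of GDDR6.
bus_width_bits = 128
gddr6_gbps_per_pin = 16
raw_bw = bus_width_bits * gddr6_gbps_per_pin / 8   # 256 GB/s

hit_rate = 0.55                                     # assumed, illustrative only
# If hits are served from on-die SRAM, the DRAM only sees (1 - hit_rate) of the
# traffic, so the card behaves roughly as if it had raw_bw / (1 - hit_rate).
effective_bw = raw_bw / (1 - hit_rate)              # ~569 GB/s

nvme_bw = 7                                         # GB/s, assumed fast PCIe 4.0 SSD
print(f"raw {raw_bw:.0f} GB/s, effective ~{effective_bw:.0f} GB/s, "
      f"GDDR6 vs NVMe ratio ~{raw_bw / nvme_bw:.0f}x")
```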
Spunjji - Friday, July 30, 2021 - link
Depends what you mean by "better product". The cache is a key power-saving feature, but it's also allowing for a simpler PCB design, so margins here are likely to be better than what was possible with their older designs that relied on larger buses (RX580, 5700, etc.). That makes it a better product for AMD and its partners, while seemingly remaining competitive for consumers... if they can find any at a reasonable price.
Flunk - Friday, July 30, 2021 - link
8GB is enough VRAM for this low- to mid-range card; putting too much RAM on video cards only increases the bill of materials.
3060 says hihaukionkannel - Friday, July 30, 2021 - link
Well nivisia did have to chose between 6gb and 12gb and the pressure to offer more than 6gb was too strong. And yeah… for 1080p 8gb is fine… for 1440p… 12gb is not owerkill, but 8gb is still good. 6gb for 1440p ok the other hand… would not be so great. 12gb is even good for 4K! But 3060 is not 4K gpu in speed wice…Rudde - Friday, July 30, 2021 - link
The 12 GB of the 3060 is an oddball. The Nvidia cards from 3060 ti to 3080 all have less VRAM. It does show that those better models could have double the amount they have (GDDR6X limitations notwithstanding). My guess is that Nvidia didn't want to canibalise their more expensive workstation GPUs with larger VRAMs.blanarahul - Friday, July 30, 2021 - link
"The net result is that Navi 23 sheds over 7 billion transistors versus Navi 22 while only giving up a small bit of actual compute hardware."According it the table it sheds over 6 billion transistors, not 7.
Ryan Smith - Friday, July 30, 2021 - link
Apparently I shed a brain cell or two in the process... Thanks!cigar3tte - Friday, July 30, 2021 - link
$379, that's only $20 cheaper than the 3060 Ti, and I highly doubt it'll come close to it in performance with only 128-bit memory bus.blanarahul - Friday, July 30, 2021 - link
Doom Eternal with Ray Tracing, Max Settings eats over 8 gigs of VRAM even at 1080p afaik. If this becomes common with AAA games 4-5 years later than the RTX 3060 with it's 50% greater VRAM size and RT cores makes for a better long term value.duploxxx - Friday, July 30, 2021 - link
so now, since for once in a decade of all the midranges the memory size is in favor of a single nvidia card it makes it a better card for the future :). It is by design what AMD did with the infinity cache to need less memory and buswidth.and doom only uses 9GB for 4K gaming in "nightmare" settings so i doubt it requires 8GB in 1080p :)
not to mention the RT hype... on a 1080p 24" ish inch screen range yeah right so much added value and than it kills your FPS but oh wait we will get some games with DLSS to up that again since at the same time they reduce pixel density again... people follow the marketing hype :)
Wereweeb - Friday, July 30, 2021 - link
Infinity Cache, as all Caches, help with bandwidth - not the amount of memory necessary. The only thing that helps with the amount of memory necessary is reducing the amount of graphical data.Spunjji - Friday, July 30, 2021 - link
Hard disagree. This logic would be true for the 3070 and 3080, but the 3060 barely has the grunt to be running those settings with RT even today, let alone 4-5 years from now.I genuinely think AMD has done a better job of balancing VRAM and processing power this generation. Nvidia are off the pace.
Kangal - Friday, July 30, 2021 - link
Oh, 100%.Nvidia really stuffed us up this gen and then got confused halfway when it comes to VRAM. AMD is bang on. The way I see it scaling in 2020-2025 is:
16GB = 2160p = Ultra
8GB = 1440p = High
4GB = 1080p = Medium
2GB = 720p = Low Settings
blanarahul - Friday, July 30, 2021 - link
https://www.hardwaretimes.com/amds-radeon-rx-6800-...because of the VRAM limitation. I know it's just one game but who's to say increased VRAM usage won't be increasingly common in 3-4 years?
Wereweeb - Friday, July 30, 2021 - link
Because the consoles designed for "4K"/1440p gaming have 10GB of RAM for the GPU. Translating to PC's, it means that companies will likely target 12GB for 1440p, which means 8GB should be enough for RT-disabled 1080p.Spunjji - Monday, August 2, 2021 - link
Also worth pointing out that most folks won't notice the difference between "Ultra" or equivalent settings and a single step below, so even in the unlikely event that games start pushing over 8GB of VRAM at 1080p Ultra, PC users always have the option to finesse VRAM usage just below their available frame-buffer. 6GB, on the other hand, will probably warrant a more serious drop in visuals in that 3-5 year time frame.Of course, if you're buying a 1080p gaming card and intending to keep it for years, absolute peak visual quality over the life of the card isn't likely to be the primary concern and cost/performance is more important. I'm not sure that either Nvidia or AMD have made a good job of this generation in that regard - they seem to have optimised for the ~$5-600 price point.
Wereweeb - Friday, July 30, 2021 - link
To be fair RT eats more DRAM than americans eat burgers. These GPU's are clearly not meant for Ray-Tracing, regardless of AMD's theoretical support for it.Samus - Friday, July 30, 2021 - link
I agree this card is really misguided and not worth the nearly $400 asking price.Spunjji - Friday, July 30, 2021 - link
It must come *somewhere* near it if it's beating out the 3060, but I'm doubtful it's close enough to justify that price.I'm wondering if we'll ever return to the days where GPU manufacturers would release a GPU that outperformed an existing product at the same price as the existing product...
mode_13h - Monday, August 2, 2021 - link
Give it time. GPUs cannot stay on their current generational-price curve.mode_13h - Monday, August 2, 2021 - link
A few things could go a long way towards dropping GPU prices:* deflation of the cryptocurrency bubble
* custom AI chips significantly surpassing GPUs at AI
* Intel entering the gaming GPU maket
* increase in the supply/demand ratio
I wouldn't buy a dGPU quite yet, unless *really* needed/wanted one.
Oxford Guy - Tuesday, August 3, 2021 - link
Adequate competition... not our fake capitalism.Oxford Guy - Tuesday, August 3, 2021 - link
Good ole Intel the savior. Brand new company after all... not one that has had eons to save PC gamers.haukionkannel - Friday, July 30, 2021 - link
Well the 3060ti stree price is about $800 to $1000… so i expect this become, much much cheaper than 3060ti…The aib seems to over price the gpu the more the better original msrp was… because 6600xt msrp is not great, i expect little bit less AIB tax above msrp… that 3060ti has. Even 3060 has less aib extra compared to 3060ti…
bridgmanAMD - Friday, July 30, 2021 - link
re: " Navi 23 comes with 32 CUs – just 8 fewer than Navi 22/6700XT – and all of them come enabled on the RX 6600 XT. This gives the card 2080 stream processors’ worth of ALUs to work with"Should read "2048 stream processors" rather than 2080.
yetanotherhuman - Friday, July 30, 2021 - link
Coming to ebay near you for only $700!Samus - Friday, July 30, 2021 - link
For a card that is only mildly improved over its predecessor (5600XT) and has almost no future proofing with 8GB VRAM, it’s pretty bold to charge $100 more, or 25% more than the previous gen card, when basic math says it won’t even have a 25% performance delta.duploxxx - Friday, July 30, 2021 - link
It is a 1080p card so 8GB is more than enough, why do you think the TI also has 8GB?Because nvidia and AMD don't know what they need in the market???? As usual the internet posters have much better knowledge and technology awareness...
ArcadeEngineer - Friday, July 30, 2021 - link
Have you not been around long enough to see a company launch a clearly bad product?Spunjji - Friday, July 30, 2021 - link
"almost no future proofing with 8GB VRAM"At this performance category, 8GB is plenty for the future. It's been strange seeing people take opposite sides on the VRAM debate and be equally wrong.
Agreed entirely on the price, though.
isthisavailable - Friday, July 30, 2021 - link
Not only these "1080p mainstream" cards are $379 compared to rx580's launch price of $229, but you won't be able to buy them anywhere close to MSRP anyway. Thanks no thanks.TheinsanegamerN - Friday, July 30, 2021 - link
So a card is launching for $379 and offers performance similar to the $350 5700 two years later? Am I supposed to be impressed?Hard pass, until this card is $300 or lower nobody should consider it.
Spunjji - Friday, July 30, 2021 - link
Agreed. Even acknowledging that AMD aren't benefiting from a die shrink over the 5700, the overall design should be - at most - roughly the same cost, if not a little cheaper.I'd consider stretching to whatever the UK equivalent of $300 is, but not more.
Kangal - Friday, July 30, 2021 - link
AMD would be charging $299 for this... but they're doing good business and taking advantage of the current market conditions. I'm not okay with this, but AMD clearly need the money, so I'll let it slide. But they won't get too many passes of these, that's what happened with Intel. So I'm eager to see how AMD's ramping up of R&D will yield better CPUs, GPUs, and SoCs in the near future.TheinsanegamerN - Friday, July 30, 2021 - link
"AMD clearly needs the money"LOLno. AMD has already shoveled out much of their old debt and has been raking in cash the last year. Even if thye need money that doesnt mean overpricing their hardware is an acceptable practice.
Wereweeb - Friday, July 30, 2021 - link
From their point of view, either they take it, or the scalpers do.RaV[666] - Sunday, August 1, 2021 - link
Yes, because they couldnt make specific request from their aibs /distributers to sell at leats part of the cards at msrp.And even if you think they really couldnt, they could just sell a lot more through their store,and open up the store to more countries.
But instead, they decided they wont be releasing this card on amd shop at all! Wonder why.
WaltC - Friday, July 30, 2021 - link
The 128-bit onboard ram bus is what effectively reigns this particular card in--it's the performance governor of the entire system, imo. It's also cheaper to manufacture than a 192-bit or 256-bit onboard ram bus GPU. However, I think the GPU is priced about $100 too high. Just my opinion on this issue. RDNA2 must be a beast of a performer, considering everything AMD has had to do to pare down the performance to 1080P for this product. Not quite like trying to pull the Mississippi through a straw, but you get the picture...;)Questor - Friday, July 30, 2021 - link
Mainstream starting at $379.00? No wonder my interest in anything PC is waning.benedict - Friday, July 30, 2021 - link
So twice the performance of RX 580 at twice the price. What a huge improvement and it took only 4 years.RaV[666] - Friday, July 30, 2021 - link
Not to be a party pooper.But thats 2x rx 480 at twice the price after 5 years.
The best one is gonna be 6500XT, rx 480 performance for rx 480 price! After just 5.5 years.
Its gonna have less vram too!
Spunjji - Monday, August 2, 2021 - link
Yeah, this is the equation that looks really miserable - until you account for the fact that RX580s have been selling for $3-400 this year, so by comparison to *that*, this looks like a bargain 😂There's a bunch of problems here; the current marketplace doesn't readily bear comparison to what we're used to.
Alaa - Friday, July 30, 2021 - link
I remember when I bought NV 9600GT for $150. The current prices are insane for the 600 series.Spunjji - Monday, August 2, 2021 - link
Back when the 6 series cards were straightforwardly 50% of the high-end hardware, and the high-end wasn't over-engineered workstation hardware...Oxford Guy - Tuesday, August 3, 2021 - link
You mean GPU companies didn’t design their products for mining and pretend otherwise?(While also simultaneously competing directly against the PC gaming platform via the console scam?)
mode_13h - Wednesday, August 4, 2021 - link
No, you've got it backwards. GPUs weren't designed for mining. It's certain cryptocurrencies that were designed to run well on GPUs. Given that, you can't really make a GPU that is *bad* at mining them.Sometimes, an undesirable situation is not the fault of any one actor. For example: https://en.wikipedia.org/wiki/Tragedy_of_the_commo...
And consoles are only a scam in your imagination. As an iPhone user, complaining so loudly & frequently about walled gardens is basically makes you a pot calling the kettle black.
Oxford Guy - Tuesday, August 17, 2021 - link
'GPUs weren't designed for mining.'Citation needed.
'And consoles are only a scam in your imagination.'
Ad hom, not a rebuttal. As usual.
'As an iPhone user, complaining so loudly & frequently about walled gardens is basically makes you a pot calling the kettle black.'
Another failed ad hom. Google is no better than Apple and the phone situation is not relevant to the console scam.
If your posts in response to mind were to become fallacy-free it would be...
Hrel - Saturday, August 7, 2021 - link
It's truly amazing how fast the gaming industry died, if you're not a millionaire you need not apply. You literally can't even buy a worthwhile GPU for $150 anymore, even used. Guess it's a good thing it's been almost 10 years since a decent game was released :/Oxford Guy - Tuesday, August 17, 2021 - link
One and a half companies isn't enough to sustain the competition needed to have a healthy PC gaming ecosystem.Instead of that, there are a bunch of parasites ('consoles', AMD). AMD is both half a competitor and one of the parasites.
zodiacfml - Saturday, July 31, 2021 - link
I don't follow GPU cards much but a 1060 or rx480/580 at double the price in just over 4 years? Also consider the RX has high-end memory in its time, relatively cheaper than implementing the huge cache. Good thing I bought plenty of the 580.😅✌Oxford Guy - Tuesday, August 3, 2021 - link
1080 is ‘mainstream’ only due to lack of adequate competition, including the console scam’s effect on the PC gaming market.1440 should be where ‘mainstream’ lies.
mode_13h - Wednesday, August 4, 2021 - link
> 1080 is ‘mainstream’ only due to lack of adequate competitionNo, 1080 first became the standard like 10 years ago. dGPUs have gotten about an order of magnitude faster, since then. Even for the same price (if you compare to pre-pandemic pricing, at least).
The reason 1080p is still mainstream is probably two-fold. First, 1920x1080 is a television resolution, which made LCD panels at that resolution very cost-effective. And because most gamers had screens that size (or connected to TVs), developers targeted their games to it. Games can improve graphical sophistication and quality by more than simply increasing the resolution.
What's really pushing the mainstream above 1080p isn't even primarily faster GPUs. It's actually monitors. As higher-res gaming monitors and 4k TVs get cheaper, it's creating pressure for game developers to support them at playable framerates. That's actually where DLSS and FSR come in and help ease the burden.
Oxford Guy - Tuesday, August 17, 2021 - link
I said: 1080 is ‘mainstream’ only due to lack of adequate competition.Your response: No, 1080 first became the standard like 10 years ago.
Does not compute.
'Also, competition can only do so much.'
Citing a situation of grossly inadequate competition isn't very useful for actually proving your attempted point.
Where are you trying to go with this line of reasoning? What's supposed to substitute for competition? Deus ex machina?
mode_13h - Wednesday, August 4, 2021 - link
Also, competition can only do so much. For instance, when Intel wasn't under any competitive pressure, they seemed quite comfortable keeping mainstream desktops at <= 4 cores. When Zen and its descendants started to gain market share, we learned how much Intel had been holding back on both cores and clock speed. However, once they reached Comet Lake, it was clear they had run out of gas. Further competition really wouldn't have helped.
As for their manufacturing woes, it's unclear what impact stiffer competition might've had on their 10 nm node. If the issue was indeed that they were too ambitious with their density targets, more competitive pressure mightn't have made much difference. And if their density was primarily limited by lack of mature EUV machines from ASML, then more competition wouldn't have had any bearing on that.
ProxyMoxy - Monday, August 9, 2021 - link
So what prevents the crypto miners from buying up all of these, leaving us having to pay 2 or 3 times MSRP to some shady scalpers on eBay, etc.?
mode_13h - Tuesday, August 10, 2021 - link
Nothing. We're at the whim of the crypto market (and the cost of energy used to power it).
China is allegedly cracking down on mining, and that did lead to a real drop in crypto and hardware prices, but crypto prices seem to have since recovered. Hard to know what's behind that, but it could have to do with investors flocking to buy up "under-valued" crypto, in the face of new Covid lockdowns hurting the recovery of traditional investments.
At some point, this stuff will rationalize. No one can predict exactly how or when. I'd be very wealthy, if I had any clue.
mode_13h - Thursday, August 12, 2021 - link
So, I couldn't help but notice that August 11th came and went, without any reviews or launch coverage on this site.