43 Comments
Kurosaki - Friday, August 25, 2023 - link
Feels like GPU manufacturers don't even care any more. Like, screw IQ, we're upscaling and interpolating to reduce your IQ to new levels, and for what? 50% performance gains!
Soon, they'll just post gray pictures at 599 fps. Calling it the ultimate performance card, for only $1299.
nandnandnand - Friday, August 25, 2023 - link
Don't use those features, then.
TheinsanegamerN - Tuesday, August 29, 2023 - link
Then you are left with terrible performance, because devs no longer optimize thanks to said "features".
Almost like said features have an adverse effect that people don't like or something, and "just ignore it" is pathetic cope.
mukiex - Wednesday, August 30, 2023 - link
Pathetic cope is just assuming every dev is releasing poor software because a handful of console ports land that way.
People are dropping 60fps videos with frame time graphs for just about anything even vaguely popular, so finding the games that run well is way less difficult than it was a decade ago.
Plenty of studios are making well-optimized games. Buy those.
Dante Verizon - Friday, August 25, 2023 - link
The main problem for me is the quality of AAA games deteriorating with the appearance of these magical technologies...
meacupla - Friday, August 25, 2023 - link
AMD/Nvidia/Intel: Adds features to improve frame rates on lower-end hardware.
Game devs: Doesn't bother optimizing their game and uses FSR as a crutch.
Yojimbo - Friday, August 25, 2023 - link
These features improve image quality. If you bought a card at the same price that focused entirely on maximizing raster performance, with no upscaling, frame generation, or ray tracing, and you had a game developed to cater specifically to that, you'd get something that looks a whole lot worse than if you bought a card designed for these technologies and a game developed to take advantage of them. It's a generational difference. Just compare the Alan Wake II screenshots.
StevoLincolnite - Friday, August 25, 2023 - link
Except companies like Nvidia are releasing garbage GPUs like the GeForce 4060, which isn't improving performance over the previous generation... but instead uses DLSS as a "crutch" to advertise higher performance.
Yojimbo - Saturday, August 26, 2023 - link
The 4060 isn't garbage. It costs less, considering inflation, and uses less power than the 3060 did at launch. And it has the capability of DLSS 3.
DLSS isn't a crutch. It's a smarter way to apply compute resources. If you made a GPU with the same number of transistors as the 4060 but with only raster capabilities, no DLSS, and sold it for $250, it would be far worse value than the $300 4060. Sure, it would be faster with "real" (I'm using that word mockingly at people who call it that) rasterized pixels, but the image quality would be worse and the fps lower.
meacupla - Saturday, August 26, 2023 - link
4060 is garbage. 8GB VRAM in 2023 isn't cutting it. It's why the 3060 with 12GB is dominating sales.
nandnandnand - Saturday, August 26, 2023 - link
3060's 12 GB of VRAM was an anomaly in the broken Ampere lineup (more than the 3080 with 10 GB), and it's a popular choice for babby's first AI GPU. The 4060 is still faster than the 3060 at 1080p/1440p, even if the 8 GB of VRAM isn't going to age very well or already falls flat in some games.
Qasar - Sunday, August 27, 2023 - link
" DLSS isn't a crutch" go look at reviews of the 4060 vs the 3060. with out DLSS the 4060 is slower. the reality is, the 4060, really should be labeled as a 4050. nvidia castrated the cards way too much vs previous gen. the 3060 even has a faster memory bus then the 4060Yojimbo - Tuesday, August 29, 2023 - link
What does the market position of the 4060 have to do with DLSS? Both the 30 series and the 40 series of cards are pursuing the same resource paradigm, anyway. You need to compare it with a card, real or hypothetical, that is pursuing a paradigm of purely traditional rasterization.
The way a comparison should be made is with the resources to produce the card versus the quality of the output, which includes image quality and performance. Power is also a relevant factor, but I'm pretty sure that would come down on the side of the DLSS/ray tracing paradigm, and, anyway, the image quality differences would be so great that it would be like comparing apples to oranges, power-wise.
As far as the 4060, it's not a direct replacement for the 3060. It's far cheaper, adjusted for inflation (the CPI-adjusted price of the Feb 2021 $329 3060 would be $380 now; the 4060 was released at $299), and it uses far less power.
Nvidia's product stack was compressed in the Turing generation (20 series), and since then they have been stretching it back out. The 16 series was a one-off dead end because Nvidia could not cover the entire stack with the new paradigm in the first generation. So the 2060 was actually large and power-hungry (440 mm^2 and 160 W) compared to the 1060 (200 mm^2 and 120 W). The difference in size is somewhat inflated by the fact that they are on same-generation process technology (16 nm vs 12 nm) and cost per transistor had gone down in the ensuing 2 years, but even adjusted for that, the 2060 is a much larger chip. The 4060 is now back down to 146 mm^2 and 115 W. For comparison, the 3060 is 276 mm^2 and 170 W, so you can see the trend of reduced die area and/or power after the jump from the 1060 to the 2060, down to the 4060.
The $249 price of the 1060 at launch in 2016 is an inflation-adjusted $316 today, whereas the 4060 launched for $299. In fact, the die size suggests that the 4060 occupies a position in the stack somewhere between a 1050 Ti and a 1060, but is slightly more expensive. All the narratives about the 4060 and 4060 Ti being bad values and out of line with prior generations are the result of flawed analysis.
Nvidia is likely done re-stretching their product stack now. Also, although inflation is likely to remain higher over the next 2 years than it was prior to 2021, the big hump of inflation is behind us. When the 50 series comes out, these hardware sites will be celebrating the miraculous return of gen-on-gen value increases, probably attributing it to competition with Intel or something, when in fact it will be pretty much in line with previous generational changes, including the change from the 30 series to the 40 series.
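For anyone who wants to check the inflation arithmetic above, here is a minimal sketch in Python; the adjustment factors are simply backed out of the figures quoted in the comment ($329 to roughly $380, and $249 to roughly $316), not taken from official CPI tables.

```python
# Inflation adjustment as described above. The factors are backed out of the
# quoted figures ($329 -> ~$380 and $249 -> ~$316), not official CPI data.
factor_since_feb_2021 = 380 / 329   # ~1.16
factor_since_mid_2016 = 316 / 249   # ~1.27

def in_todays_dollars(launch_price: float, factor: float) -> float:
    """Convert a historical launch price into 2023 dollars."""
    return launch_price * factor

print(f"RTX 3060 ($329, Feb 2021) today: ${in_todays_dollars(329, factor_since_feb_2021):.0f}")
print(f"GTX 1060 ($249, mid 2016) today: ${in_todays_dollars(249, factor_since_mid_2016):.0f}")
print("RTX 4060 launch price: $299")
```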
Kurosaki - Tuesday, August 29, 2023 - link
DLSS is just a way to run games at 1080p but claim you can run them at 4K. It's a sham.
Dante Verizon - Friday, August 25, 2023 - link
The advancement in graphics is precarious compared to the requirements. I only see signs of poorly optimized games, tbh.
Yojimbo - Saturday, August 26, 2023 - link
Advancement naturally slows down as the technology matures; it's been slowing ever since the beginning. The compute and memory bandwidth improvement per year is getting smaller and smaller, the easy algorithms for raster tricks were found years ago, and it's getting harder and harder to apply the increases in compute resources that do occur to significantly improve image quality. That's why technologies like ray tracing, upscaling, and frame generation are important to squeeze out image quality improvements more efficiently.
Zoolook - Saturday, August 26, 2023 - link
You are mistaking erroneous frames being created faster for increased quality, when you are actually sacrificing quality for improved framerates, which to visually challenged people can appear to look good.
In reality, the number of artifacts per frame is rising quickly.
Yojimbo - Saturday, August 26, 2023 - link
All computer graphics consist of erroneous frames. Rasterized graphics consist of grossly erroneous frames. There's no purity in your "real pixels". "Real pixels" aren't real. What matters is what the brain perceives. You don't call black frame insertion "erroneous frames", do you?
Motion video itself is a trick. You are missing over 99% of the data, have you ever thought about that? You get one still image and then a jump to another still image some time later that is related spatially and temporally, and your brain interpolates what must have happened in between. It's not real. A computer can also interpolate what happens in between. It's not worse than your mind at doing so, and it helps your mind along with the process. Your brain just needs to understand what is happening with the underlying motion these frames are trying to approximate. The generated frames help with that just as the calculated frames do. Looking at a still of a generated frame and seeing artifacts means nothing. If you rarely see the artifacts in full motion, yet the image is less blurry and smoother, then you have increased the image quality, because you have created a more accurate and more comfortable perception for the brain.
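To make the interpolation argument concrete, the crudest possible "generated frame" is a per-pixel blend of the two rendered frames around it. The sketch below only illustrates the idea of synthesizing an in-between image from two known ones; actual frame generation (DLSS 3 and similar) relies on motion vectors and optical flow rather than blending, precisely to avoid the ghosting this naive version produces.

```python
import numpy as np

def naive_generated_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Crude 'generated frame': blend two rendered frames per pixel.
    Real frame generation uses motion vectors / optical flow instead,
    which avoids the ghosting this blend produces on fast motion."""
    mix = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mix.astype(frame_a.dtype)

# Two random 1080p RGB images standing in for consecutive rendered frames.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
frame_b = rng.integers(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

in_between = naive_generated_frame(frame_a, frame_b)  # displayed between a and b
print(in_between.shape, in_between.dtype)
```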
CiccioB - Tuesday, August 29, 2023 - link
No. You are mistaking resolution for absolute image quality.
Image quality is not only about resolution, as most gamers think. "I play at 4K, so I have the best image quality possible".
Actually, image quality depends mostly on many other things, like texture quality, polygon density, and lighting quality (and how it's represented). Moreover, you also have to take care of the frame rate, as a super-realistic still image doesn't let you actually play anything, even if it's the best thing your eyes have ever seen.
Having limited resources, you have to compromise between all those features to get a good enough image.
Now, if you take scalers out of the equation, and taking for granted that you need fast enough frame generation for smoothness, you end up having to make heavy compromises on everything, especially light rendering, to get an image at a good native resolution and FPS.
If you can instead increase rendering quality at a lower resolution and have it scaled up, you can end up with better image quality than you could have without scaling.
Of course it is still a compromise, as the more you scale, the worse it will look nonetheless.
And that's why Nvidia's DLSS is better than FSR from the quality point of view: because, using inference, DLSS can scale the image less (that is, preserve more detail) than FSR needs to in order to reach the same compromises.
Now, everyone would like to have small 4090s at $300, capable of implementing all the new features that go beyond the now-outdated raster tricks while rendering images at decent frequencies. Going beyond raster tricks allows engines to move past 2006-era image quality, with its big flat surfaces, repeated geometric models and boring illumination. Unfortunately, desire is not reality, and reaching that kind of performance with new, modern techniques requires a big quantity of silicon. Hence the fact that in 2023 there still exists a difference between low-tier GPUs and enthusiast ones, and we do not all have 4090-class GPUs in our hands for a few bucks.
If you don't like the 4060, you are not obliged to buy it. Find a better board for the same money. But be sure that better is really better, and not just because you have decided that flat surfaces and boring illumination are better than full-featured rendering that is only slightly scaled (while also using less energy).
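To put rough numbers on that scaling trade-off: the commonly cited per-axis render scales for the Quality, Balanced, and Performance presets of DLSS and FSR 2 are roughly 0.67, 0.58, and 0.50 (approximate values that vary by version and mode), which means the GPU shades well under half of the output pixels before the upscaler reconstructs the rest. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope pixel budget for upscaling at a 4K output target.
# Per-axis render scales are the commonly cited approximate preset values;
# exact factors differ between DLSS/FSR versions and modes.
presets = {"Native": 1.00, "Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}
target_w, target_h = 3840, 2160

for name, scale in presets.items():
    render_w, render_h = round(target_w * scale), round(target_h * scale)
    shaded_fraction = (render_w * render_h) / (target_w * target_h)
    print(f"{name:12s} renders {render_w}x{render_h}  "
          f"(~{shaded_fraction:.0%} of native pixels shaded)")
```

At the Quality preset, for example, only about 45% of the output pixels are shaded each frame; that is the headroom the comment is talking about spending on heavier lighting instead.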
Dante Verizon - Tuesday, August 29, 2023 - link
Sorry, but RT is the least efficient way possible in terms of spending transistors vs return on performance. I would rather have 30% more shaders and raw rasterization power than RT at 30fps.
Yojimbo - Tuesday, August 29, 2023 - link
No it's not. Just look at the Alan Wake II screenshots above. Going from the rasterized to the path-traced images represents a generational shift in image quality (like going from PS4 to PS5). There's no way to get that path-traced image quality and performance by devoting all the transistors to rasterized graphics. You couldn't even come close.
CiccioB - Tuesday, August 29, 2023 - link
It depends on what you mean by "efficiency". RT is inefficient with respect to rasterization up to a certain quality level.
Beyond that level, ray tracing is the only way to get an image with a quality that classic rasterization cannot achieve.
Moreover, ray tracing, from the point of view of the developer, is a blessing, for many reasons.
First, because it removes the need to "optimize", that is, to continuously find tricks to work around performance issues for certain effects.
Second, a pure RT engine can evolve over time together with the hardware, so it would be possible to keep old games at top image quality without having to create "remastered" versions of them.
Pure RT rendering engines are still far away in time, for sure. But we have reached a point where pure rasterization, and all the tricks it requires, is no longer better in quality, development time, or computation time than the partially ray-traced parts of the image.
I know that still having GPUs that can't keep pace with the leader in ray tracing power makes some think that RT is just an unneeded extra. But I'd bet my money that, as always in the past, as soon as the underdog becomes capable of the same performance (or near it) as the competitor, its fans will call RT the best thing since sliced bread and speak of rasterization as the "obsolete way of getting things done" (if you know what it takes to render a complete scene without glitches in a free-roaming world where the user can look at it in many different ways). Of course, limited games like COD can still be done with pure rasterization and no one will ever notice a problem (nor an improvement, either).
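For readers who haven't looked at what the hardware actually does differently: the core primitive of ray tracing is a visibility query (fire a ray, find the nearest surface it hits), and shadows, reflections, and global illumination are all built by repeating that query. Below is a toy sketch of that single query against one sphere; a real path tracer performs it millions of times per frame against a full acceleration structure.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """The basic ray-tracing query: distance along the ray to the nearest
    intersection with a sphere, or None if the ray misses it."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # nearer of the two intersections
    return t if t > 0.0 else None

# A camera ray looking straight down -z at a unit sphere 5 units away.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```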
Zoolook - Saturday, August 26, 2023 - link
Blame Nvidia; they are introducing these stupid quantity-over-quality "features", and unfortunately AMD has to follow because of the average GPU buyer's indifference to graphical fidelity.
I wish they could spend all those resources on actual improvements instead.
Yojimbo - Saturday, August 26, 2023 - link
AMD has to follow because GPU buyers want greater image quality.
Qasar - Sunday, August 27, 2023 - link
" AMD has to follow because GPU buyers want greater image quality."sorry but dlss/fsr, doesnt do that, it IS a crutch, as stated above, the 4060 with dlss gives the performance the 4060 SHOULD of had with out it.. the ONLY 40 series card that is actually worth the price for the performance it gives, is the 4090, the rest are crap.
CiccioB - Tuesday, August 29, 2023 - link
"the 4060 with dlss gives the performance the 4060 SHOULD of had with out it"Yes, that board it is called 4070Ti, which has a different price and power consumption.
As 4060 has NOT 4070Ti rendering capabilities in order to gain the same FPS with same image quality without DLSS, then DLSS is needed.
Or you have to cut on rendering features, like illumination, light effects, rendering distance and so on.
If you think 4060 is really that bad, but the 4070/ti or a equivalent priced AMD and use the limited raster only rendering engine with them (and of course be prod to do it in native resolution, not scaled). Twice the power consumption.
JasonMZW20 - Monday, August 28, 2023 - link
The only things that provide better image quality are: DLAA and now FSR3 Native AA.
Upscaling never looks better than native. The two solutions above clean up potato-tier TAA to make the final image much closer to what native should be at all times.
Unfortunately, people really do believe the "better-than-native" image quality narrative. No, DLSS and FSR are just better at TAA than the games' own implementations and remove the potato TAA from the equation entirely. It's a false equivalence, as an upscaled image is always softer/blurrier than a natively rendered one, especially one without TAA. DLSS tries its best, but there's still missing information per pixel. It's close enough that people will take the extra framerate.
Yojimbo - Tuesday, August 29, 2023 - link
No, all resources are scarce. You are operating under a false assumption that somehow you could have the same features and just cut out the upscaling and result in something that looks better. While it is true that an image rendered with DLAA in native resolution will look better than one that is upscaled, such a solution would result in much lower performance. Once a performance target is set, the best image quality is achieved by increasing quality-improving graphics settings (such as path tracing, but it also holds true for entirely rasterized graphics) and using DLSS upscaling rather than having lower-quality graphics settings run entirely in native resolution.
PeachNCream - Friday, August 25, 2023 - link
I feel bad for people that stick the letter "R" at the end of the word "game" and fancy themselves chasing after computer hardware at great personal cost. It hasn't been such a good time for people that have difficulty triggering the reward response in their brain other ways. Addiction is a terrible thing.
Yojimbo - Friday, August 25, 2023 - link
Think of all the greens fees people pay. Hi-Fi stereos. Quadcopters and GoPros. Skateboards with fancy wheels. Land for a garden. Espresso machines. Hobbies are the scourge of man.
schlock - Friday, August 25, 2023 - link
Repeat after me. Generated frames are not FPS.
Yojimbo - Saturday, August 26, 2023 - link
Except they are. They don't reduce latency, but they reduce blur and increase smoothness.
boozed - Sunday, August 27, 2023 - link
"they reduce blur"...huh?
Yojimbo - Tuesday, August 29, 2023 - link
Here's an article that will explain it better than I can: https://blurbusters.com/blur-busters-law-amazing-j...
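The linked article's rule of thumb (the "Blur Busters Law") is that on a sample-and-hold display, perceived motion blur scales with how long each frame stays on screen: roughly 1 pixel of blur per 1 ms of persistence per 1000 px/s of motion. The sketch below is an approximation of that rule, and it is why raising the displayed frame rate, even with generated frames, reduces blur.

```python
# Sample-and-hold motion blur estimate per the Blur Busters rule of thumb:
# blur (px) ~= object speed (px/s) * frame persistence (s).
# This is an approximation, not a measurement.

def motion_blur_px(speed_px_per_s: float, displayed_fps: float) -> float:
    persistence_s = 1.0 / displayed_fps   # full-persistence sample-and-hold display
    return speed_px_per_s * persistence_s

speed = 2000.0  # px/s, e.g. a fast camera pan
for fps in (60, 120, 240):
    print(f"{fps:3d} fps displayed -> ~{motion_blur_px(speed, fps):.1f} px of blur")
```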
ratbert1 - Friday, August 25, 2023 - link
Does anyone review how your articles appear on mobile devices? You cannot even read the articles due to ads all over the top of the article. This has been going on for a while. Even trying to leave a comment is impossible, because an ad appears over the area to leave a comment. I am using Chrome and this is on more than one device. Come on, guys. This is ridiculous.
ViRGE - Friday, August 25, 2023 - link
The issue with this kind of complaint is that the AnandTech staff can't exactly bite the hand that feeds them and say "Yeah, our publisher has done us dirty on ads and we hate it."
The ads are terrible, and I'm sure that they know it. But if you've ever seen another Future site, they're all like this. So it's clearly being driven by decisions made at the top. Future really seems to be trying to squeeze mobile viewers right now.
Go ahead and install an ad blocker and call it a day. It's better to be able to read an article than not.
PeachNCream - Saturday, August 26, 2023 - link
There is very little content in these articles anyway, and most of them are essentially a misuse of what was previously an independent hardware reviewer to post company product announcements, so if one were to eliminate advertising, there would be VERY little here to read. Don't be upset, just vacate the area and read about tech things from a site that hasn't sold out to the machine (but even then, keep your salt shaker within reach, since companies that produce hardware are empowered to force reviewer action by withholding hardware to prevent those seemingly all-important moment-of-NDA-lift-gives-us-all-our-revenue reviews).
29a - Monday, August 28, 2023 - link
This makes me sad. I started reading this site in 1998 and it occupied my home page for probably 4-5 years back then. For the most part I rarely come here anymore.
BigDH01 - Wednesday, August 30, 2023 - link
Several years ago when I was moving I stumbled upon an old *printed* review of the Stealth II S220 done by Anandtech. I'm guessing I was looking at it on dialup and printed it so I could read it without tying up the phone line.
erotomania - Wednesday, August 30, 2023 - link
I bought a Stealth II S220 at Frys.
29a - Monday, August 28, 2023 - link
Easy fix: use an ad blocker. If they want to make the ads so ridiculous that they cover half the page, then they don't deserve ad revenue.
Shlong - Thursday, August 31, 2023 - link
If people do as 29a did, then get ready to pay for more paywalled content. Netflix/Disney+ subscriptions, and now let's do that with websites like AnandTech, Engadget, Ars Technica, etc.
Makaveli - Wednesday, September 6, 2023 - link
A paywall may have worked when Anand was still here. I've also been on this site since 1999. Now, however, that would not work; you would just bleed the site of those of us who are still here.