52 Comments
edzieba - Thursday, February 13, 2014 - link
Are Anandtech considering a switch from average framerates to latency/frame-rating (either with Fraps or FCAT)?
Ryan Smith - Thursday, February 13, 2014 - link
Frame pacing is an additional tool we run from time to time as appropriate, but it's not something we'll use for every review. Frame pacing is largely influenced by drivers and hardware, neither of which shifts much on a review-by-review basis. So it's primarily reserved for multi-GPU articles and new architectures.
And especially in the case of single-GPU setups, there's not much to look at. None of these cards has trouble delivering frames at a reasonably smooth pace.
http://www.anandtech.com/bench/GPU14/873
HisDivineOrder - Thursday, February 13, 2014 - link
Yeah, that's what you guys said before the whole frame latency thing broke, too. It's a shame you aren't doing proper monitoring to catch it the first time and are setting up a scenario where it flies under the radar yet again the next time AMD decides to get lax on making drivers.
Then again, this article is in red, right? AMD News is right next to it. Hell, even the comment button is red. I'm guessing the AMD overlords wouldn't like it very much if you were constantly harping on something they dropped the ball on so completely that their competitor had to slowly explain to them how to even see the problem and then how to fix it.
gdansk - Thursday, February 13, 2014 - link
It's a shame. I'm with your argument. AnandTech should try to include as many indicative benchmarks as possible. At times FCAT is indicative.
But sadly, calling someone a shill with only coincidence is no better than libel. You have made an unsubstantiated allegation. It is decidedly unscientific to insult one's professional integrity with mere coincidental insinuations and no evidence. Why would you do that?
Death666Angel - Thursday, February 13, 2014 - link
So they are in the pocket of nVidia, Intel, AMD, Android AND Apple? Wow, those companies must really be idiots then.
Gigaplex - Thursday, February 13, 2014 - link
I don't know where you got all the other brands from, but technically yes Ars is in the pockets of AMD. See http://www.anandtech.com/portal/amd - this is sponsored by AMD.
Gigaplex - Thursday, February 13, 2014 - link
Bah, AnandTech, not Ars
Death666Angel - Thursday, February 13, 2014 - link
I know _that_. But he is clearly insinuating that their opinions are bought by AMD. And since products from all those companies I listed (who are all competitors) regularly get recommendations, and Anandtech then gets accused of being paid shills, I find it funny that anyone thinks that is true. If they are bought by AMD as suggested, how come they don't come up with a benchmark track that makes AMD CPUs shine? Or how come they slammed the R9 so much for the noise? It's all pretty silly.
nader_21007 - Friday, February 14, 2014 - link
It seems it hurts you that this site is not biased and doesn't admire everything Nvidia does, like other sites. Well, you can go read Tom's Hardware, WCCFtech and every other hardware site, and be sure they will satisfy your needs.
zodiacsoulmate - Friday, February 14, 2014 - link
Yea, it's like trying to compare Samsung to Apple again; sure, you can say there is no way to compare which one is better hardware-wise, but the user experience is just not on par...
edzieba - Thursday, February 13, 2014 - link
It's not just a tool for flagging up multi-screen/multi-GPU stutter issues. By showing the distribution of frame times, you can tell the difference between two cards that both average 50fps, but where one delivers every frame in 20 +/- 1 ms, and the other at +/- 5. The latter will deliver a much smoother output, which is not apparent from a single-number metric.
Anandtech readers are a pretty smart bunch. We'd much prefer Graph Overload to too little information, particularly when other sites provide the additional information as standard.
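The point about averages hiding jitter can be sketched numerically. This is a minimal illustration with synthetic frame-time traces (the numbers are made up, not from any review): both "cards" average 50 fps, but summary statistics beyond the mean tell them apart.

```python
# Two hypothetical frame-time traces (ms), both averaging ~20 ms/frame (50 fps):
# one with +/- 1 ms of jitter, one with +/- 5 ms.
import random
import statistics

random.seed(0)
steady = [random.uniform(19, 21) for _ in range(1000)]   # 20 +/- 1 ms
jittery = [random.uniform(15, 25) for _ in range(1000)]  # 20 +/- 5 ms

def report(name, frame_times_ms):
    # Average fps is identical, but stdev and 99th-percentile frame time
    # expose the difference in smoothness.
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]
    print(f"{name}: {avg_fps:.1f} fps avg, "
          f"stdev {statistics.stdev(frame_times_ms):.2f} ms, "
          f"99th percentile {p99:.1f} ms")

report("steady", steady)
report("jittery", jittery)
```

Both traces report roughly the same average fps, which is exactly why a single-number metric can't distinguish them.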
Death666Angel - Thursday, February 13, 2014 - link
Sounds like the first (+/- 1 ms) would deliver a smoother experience than the latter (+/- 5 ms). :)
edzieba - Monday, February 17, 2014 - link
Urp, that's what I /meant/ to write...
Cellar Door - Thursday, February 13, 2014 - link
This rebadge game is terrible. Both parties are guilty here, but the fact that these products get reviewed on a regular basis just makes no sense.
Here is a proper review of the 265 and 260: let's confuse everyone with new nomenclature and what is basically a 7850 from close to 2 years ago. These cards are nothing but cash cows for the mainstream.
This card is being launched to draw attention to AMD just before the GTX 750 Ti comes to market; it's nothing but emptying the stocks of poor-quality chips.
MrSpadge - Thursday, February 13, 2014 - link
Guess what: they wouldn't build GPUs if they didn't expect them to be cash cows!
EnzoFX - Thursday, February 13, 2014 - link
This has always been the case in the not-high-tier cards. Where have you been? I love these cards. What's wrong with an "update" to tried and true cards? I love these single-PCIe-power cards. Besides, most games will probably still be targeting this level of power considering it's comparable to the new consoles.
Death666Angel - Thursday, February 13, 2014 - link
The first time I can remember rebadging happening was with the 8xxx / 9xxx series from nVidia; before that I don't remember it.
LordOfTheBoired - Friday, February 14, 2014 - link
GeForce4 MX was a modified GeForce2. While not a straight same-component relabel, it WAS intentionally-misleading branding meant to make people think it was an upgrade from the GeForce 3 rather than a downgrade, and was very much in the spirit of the modern rebadge.
That's the earliest example that springs to mind here.
rallyhard - Friday, February 14, 2014 - link
I agree that your example, the GeForce4 MX, is one of the earliest, and probably one of the most misleading rebrands ever.
The first video card I ever purchased was a rebrand that occurred a year earlier, though: the lowly Radeon 7000. It was the exact same card as the previous RV100-based Radeon LE, but they gave it a flashy new name when they introduced the new Radeon 7500 with the RV200 chip.
silverblue - Friday, February 14, 2014 - link
The GeForce 4 MX was probably a worse release than the 2 MX in that the latter was indeed based on the GeForce 2, but lacked the hardware transform and lighting capabilities of the GeForce 2 GTS and the earlier GeForce 256. The GeForce 2 MX 400 was the only model that had a chance of beating the 256 DDR.
The Radeon 9000 was a hacked down 8500 LE, but I suppose considering the low number in the 9xxx series, there had to be a low-end part. Besides which, it did still have T&L.
I can't think of any straight rebrands from back then apart from the 7000.
just4U - Thursday, February 13, 2014 - link
While you may be right... AMD/Ati does like throwing popular configurations into the mix. The 265 reminds me a lot of the 4830, and while that card was fairly short-lived it was a hot seller for them, as it straddled two performance areas but came in at a nicer price point.
jabber - Friday, February 14, 2014 - link
Indeed, I swapped from being a longtime Nvidia user to AMD back in 2009, as I got fed up with Nvidia regurgitating the old 8800 chips three times in a row for the mid level.
Stuff doesn't have to change radically performance-wise, but it's nice to know new features are added and other things get revised and tweaked. A simple name change isn't enough, really.
MrSpadge - Thursday, February 13, 2014 - link
I'm actually happy they're finally making use of that last digit in their 3-number scheme. From my point of view they could have ditched the X altogether and made the R9-270X an R9-275 (or whatever is appropriate). And speaking of R9: they could have given the R7 265 the rating R9 265 to more closely connect it with the R9 270. Or just drop that prefix as well, if the numbers don't overlap anyway and the R9/7/3 is not related to features either!
Speaking about the cards:
- boost clocks an additional 25 MHz again? I have no idea why these are there. Make it 100+ MHz or leave it.
- 1.175 V for a mere 925 MHz? The chip should be able to do 1.0 GHz at ~1.0 V, maybe 1.10 V for guaranteed clocks
- same for R7 260 - that voltage is ridiculously high
Anyway, the cards themselves are fine (just like the 7000 series) and the coolers really fit them.
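To put rough numbers on the voltage complaint above: with the standard dynamic-power approximation P ∝ f·V² (switching power only, ignoring static leakage), the commenter's hypothetical 1.0 GHz @ 1.10 V operating point can be compared against the stock 925 MHz @ 1.175 V. These figures are illustrative, not measured:

```python
# Rough dynamic-power model: P ~ C * V^2 * f (capacitance C cancels in a ratio).
def relative_dynamic_power(f_mhz, v, f_ref_mhz, v_ref):
    """Dynamic switching power relative to a reference operating point."""
    return (f_mhz / f_ref_mhz) * (v / v_ref) ** 2

# Hypothetical undervolted/overclocked point vs. the stock R7 265 point.
ratio = relative_dynamic_power(1000, 1.10, 925, 1.175)
print(f"1.0 GHz @ 1.10 V would draw ~{ratio:.0%} of stock dynamic power")
```

Under this simple model, 8% more clock at the lower voltage still lands around 95% of the stock switching power, which is why the 1.175 V figure looks conservative. Real chips bin differently, so AMD's guard-banded voltage may simply cover the worst dies.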
silverblue - Thursday, February 13, 2014 - link
The single-GPU frame latency issue has been fixed for more than six months. I doubt it's going to become a problem again, like with AMD's handling of 2D a while back.
There are remarks concerning the availability of the R9 270 series and the inability of these parts to keep to their RRP, neither of which would be present if this were some sort of fanboy review.
Spuke - Thursday, February 13, 2014 - link
Has it been 6 months? I thought they recently fixed that problem.
silverblue - Thursday, February 13, 2014 - link
It was fixed in Cat 13.8 Beta 1, dated 1st August.
silverblue - Thursday, February 13, 2014 - link
My bad - that's when CrossFire had its first fix. Apparently, single-GPU was fixed beforehand, though I can't find which driver version it was.
Solid State Brain - Thursday, February 13, 2014 - link
Anandtech: it would be interesting if you tested idle power consumption in multi-monitor scenarios. I think you'll find some surprises.
creed3020 - Thursday, February 13, 2014 - link
Excellent point!
I had a friend with a 6950 and he was furious that his video card would never idle down in GPU/memory frequencies when he had a second monitor connected.
I personally have a 6850 and two 20" LCDs connected over DVI. I have not looked for the same behaviour but would not be surprised if it were the same.
Power efficiencies are out the window once the user chooses to go multi-monitor to be more productive.
Solid State Brain - Thursday, February 13, 2014 - link
I have the same issue with my HD7770 to a lesser extent, and my workaround for that is connecting my two secondary displays to the integrated Intel GPU. This saves a significant amount of power.
Death666Angel - Thursday, February 13, 2014 - link
Who is surprised by that? No one who has been following GPU reviews since multi-monitor became a thing for the consumer crowd. The first few generations had issues with monitors flickering in multi-monitor setups because of overly aggressive downclocking, so now they are being very conservative there and increase the clocks quite a bit.
Solid State Brain - Thursday, February 13, 2014 - link
I don't think it's acceptable, though. AMD might have reduced idle consumption when a single monitor is being driven, but is still neglecting other usage scenarios that are becoming increasingly common. It's not even just a small power difference, especially with medium to high-end video cards.
Da W - Thursday, February 13, 2014 - link
STFU.
Anandtech had no problem calling the 290/290X a terrible card because of its blower and the NOISE, and crowning Nvidia once again. Even more so now with the AMD price hike.
The only biased guy here is YOU Nvidia fanboy.
silverblue - Thursday, February 13, 2014 - link
I don't think the price hikes are AMD's fault per se; however, considering the inflated prices due to bitmining, you'd definitely want a better cooler than the stock one. The third-party cards handle this nicely.
Da W - Thursday, February 13, 2014 - link
My last post was directed to HisDivineOrder but the reply button doesn't seem to place my reply below his post.
formulav8 - Thursday, February 13, 2014 - link
They always complain when something good is said about AMD, whether the good is justified or not. They don't care either way.
SolMiester - Thursday, February 13, 2014 - link
If this is just a 7xxx rebadge, then I guess it has the same multi-GPU frame pacing issues... LOL, why don't they fix the damn thing FFS!... no CF Eyefinity, no DX9 pacing... still shit!
fiasse - Thursday, February 13, 2014 - link
Typo: 'and R7 270 holding at $179 (MSRP)'
Mr Perfect - Thursday, February 13, 2014 - link
With the naming of the card being so close to the 260X, I was really hoping this would be a faster GCN 1.1 part. How does AMD expect TrueAudio to catch on if they keep releasing cards that don't support it? Hopefully the 300 series will sort this out and I can grab one to play Thief on.
fiasse - Thursday, February 13, 2014 - link
Looks like another typo on the Asus R7 260 page: 'but this is an especially treacherous position if R7 260X prices quickly come down to $199.'
AlucardX - Thursday, February 13, 2014 - link
Since the 7850 was a great overclocker, I wonder how this rebadged product is. Any plans for overclocking with an increased Vcore?
Ryan Smith - Thursday, February 13, 2014 - link
Not with this one. I only had 2 days to put this review together, so unfortunately there wasn't time for overclocking.
krumme - Thursday, February 13, 2014 - link
Intel Core i7-4960X at 4.2GHz with a 260 playing BF4 single player.
Perhaps not the most realistic scenario in this world.
MrSpadge - Thursday, February 13, 2014 - link
It shows you what the card can do. If you're concerned about your CPU limiting you in BF4 multiplayer... well, better read a CPU review.
krumme - Friday, February 14, 2014 - link
A user playing with the 260 will typically have a dual-core i3. That's reality. Try that with or without Mantle in 64-man multiplayer on the big maps. It's the difference between playable and not playable. Probably more than 50% difference in favor of Mantle. Instead we get this useless talk.
Rebel1080 - Friday, February 14, 2014 - link
What you're getting here is the equivalent of an Xbox One for $119 or a PS4 for $149. It took Nvidia and ATI about 12-18 months just to release a video card of equal or better performance for under $199 after the seventh generation's (Xbox 360/PS3) debut. The fact that it only took 3 months to get to this level for under $150 during this generation only shows just how much $ony and M$FT lowballed their customers on specs.
silverblue - Friday, February 14, 2014 - link
Except you then have to factor in the rest of the hardware to that price. Think about it - CPU, cooling, motherboard, memory (I don't think 8GB of GDDR5 is cheap), storage, case, power supply, software and the all-important input devices. Add in the fact that developers will get more out of the console GPUs than with the PC and I think you're ragging on them a bit too much.
Antronman - Wednesday, February 19, 2014 - link
You mean 4GB of DDR3. And likely high CAS latency too. Low-watt GPUs. Software only costs how much you pay the employees. Input ports are part of the mobo. Devs do not get more out of the console GPUs. They are actually underclocked so that you don't need desktop-grade cooling. Consoles will never be serious gaming machines. People who buy consoles either won't spend the money on a good PC, can't spend the money, or would rather spend the money on dozens of games that they'll only play a couple of hours of and then just stick to one game.
golemite - Saturday, February 15, 2014 - link
The 270's inflated prices are directly the result of cryptocoin mining, as it has been found to offer an advantageous kilohash-to-watt ratio. It would be interesting and helpful to many out there if Anandtech started publishing KH/sec and KH/watt metrics in its reviews for Scrypt mining.
Will Robinson - Monday, February 17, 2014 - link
Nice addition to the AMD lineup... and a pretty convincing demolition of NVDA's competing cards.
Thanx for the review!
TheJian - Wednesday, February 19, 2014 - link
How is that possible when all of the pricing is fake? You are ignoring REAL pricing, much like Anandtech. They should draw conclusions based on REAL pricing, and ignore ALL companies' MSRPs. If I can't buy it, it's still fake until I can for MSRP. E.g., the 290X is $700 right now (actually $709 cheapest on Amazon - 3 in stock), NOT $550. So reviews based on $550 pricing are not real. Anandtech continues to give the benefit of the doubt: 'one day it might be MSRP and a good deal if they can get to MSRP quickly'... LOL. Is a $709 290X a good deal vs. the 780 Ti? NOPE.
thejoelhansen - Saturday, April 26, 2014 - link
I hope I didn't overlook something obvious, but are the GTX 760 results missing?