
  • Elixer - Thursday, October 17, 2013 - link

    Was this on a win 7, 8, or 8.1 system?

    Also, did you ever get a response from AMD about dropping Vista driver support?
  • Ryan Smith - Thursday, October 17, 2013 - link

    Windows 8.1

    http://www.anandtech.com/show/7400/the-radeon-r9-2...
  • Flunk - Thursday, October 17, 2013 - link

    So it's slower than an Nvidia GTX Titan then. Interesting, but it gives us an idea of what it will cost.
  • marc1000 - Thursday, October 17, 2013 - link

    kinda disappointed too. not that the card is in my budget anyway, but I was hoping for a little more...
  • Brutalizer - Friday, October 18, 2013 - link

    I am curious to see how much improvement the Mantle API will give. Will Mantle give 25% more performance? Or 50%? But since one AMD rep said in an official interview "we will ridicule Titan when using Mantle", it seems that Mantle will give far more than a 25% improvement, because a 25% improvement is not "ridiculous". How much is "ridiculous", do you think? 100%? 200%? 1000% would certainly be ridiculous. But 25% is not.
  • Frallan - Tuesday, October 22, 2013 - link

    In marketing speak it means anything above a tie, let's say around 2%...

    In engineering speak I'd say it starts at 50+%.

    So just have a look at who said it: is he marketing or engineering?
  • tuklap - Wednesday, October 23, 2013 - link

    They sometimes exaggerate things. For us, a 10-15% improvement is not that good, but for the designers it's excellent. We don't know what lies behind those numbers, so better to just wait and appreciate.
  • Da W - Thursday, October 17, 2013 - link

    So it's effectively tying the 780 at max detail while the R9 280X was lagging by a wide margin (in fact Bioshock seems to favor Nvidia cards).
    Not bad. I still need to see the price.

    Hey Ryan, are you able to include the R9 270X in Crossfire in your next benchmarks? Seems to me that Pitcairn is the most power efficient chip, and with the right price it could be a nice option for Eyefinity rigs (with GPU #2 shut down while in simple desktop mode). I know, I know about the Crossfire issues.
  • nathanddrews - Thursday, October 17, 2013 - link

    Does it favor NVIDIA? I dunno...
    http://www.anandtech.com/bench/GPU13/702

    I can't wait to see some objective reviews. Didn't Roy just call out the Titan the other day? If you're not royally handing the 780 its ass, you probably shouldn't taunt ye olde Titan...
  • Spunjji - Friday, October 18, 2013 - link

    Graphs are funny things - different ones say different things:
    http://www.anandtech.com/bench/GPU13/704

    To be honest I'd say that the game doesn't especially favour either side strongly, but there seems to be a marginal lean towards the nVidia architecture.
  • nathanddrews - Friday, October 18, 2013 - link

    There's a huge difference between pushing 2MP (1080p) and 8MP (4K). That's why I linked to a 6MP (triple screen) example.
  • pattycake0147 - Thursday, October 17, 2013 - link

    Maybe that's planned for the 270X and 260X article that was promised in the 280X launch piece. I hope it comes out soon as that's the article I'm most interested in.
  • Hrel - Thursday, October 17, 2013 - link

    Bioshock favors AMD.
  • tviceman - Thursday, October 17, 2013 - link

    It's a GE title, if anything it should "favor" AMD.
  • Ryan Smith - Thursday, October 17, 2013 - link

    I only have 1 270X, so there won't be any CF results at this time.
  • MoreDinosaurs - Thursday, October 17, 2013 - link

    Outside of the realm of benchmarks, how useful/enjoyable/different is running Bioshock at 4k/60fps? Curious about this strange 4k future...
  • inighthawki - Thursday, October 17, 2013 - link

    I'd like to know too, but it'll probably be a lot more beneficial on large displays (30" or greater), where pixel density is usually incredibly poor. On a smaller display such as 22-24" I'd be surprised if it looks much different than 1080p with good AA and AF. We'll have to wait and see :)
  • A5 - Thursday, October 17, 2013 - link

    Yeah. Not that I have the budget for a 4K monitor anyway, but I'd like to see a subjective comparison of how it feels compared to 1080p or 1440p, especially with the quality tradeoff.
  • Sancus - Thursday, October 17, 2013 - link

    It's pretty cool. You can certainly see more detail than at 1080p in scenes with little or no movement. When there's a lot of movement (turning around, panning, etc.), 120Hz/LightBoost lets you see more detail. For FPSes specifically I would say that higher refresh rates/LightBoost add more to the experience than higher resolutions do.

    However, if you like exploring in games like Skyrim, or you enjoy tile-based RTSes/rpgs/strategy games, 4k is pretty cool because you definitely see a lot more detail when you're not jumping/turning around like mad, scenes are sharper, and in the latter case with tile-based games, you get to see a lot more of the map at once without sacrificing the ability to read text. Civ5 is pretty awesome, for example.

    The one big problem with 4K and other high resolutions is that at much smaller monitor sizes (say, 3200x1800 13.3 inch panels or a hypothetical 3840x2160 24 inch panel) scaling becomes a big issue, and even in Windows 8.1 the majority of applications you are going to use have messed-up UIs and problems with scaling. Web browsers and built-in Windows apps tend to be okay, but as soon as you get outside that box, the higher you set the scaling the weirder things are going to get, and some apps like Skype will just be completely unusable because they don't support scaling at all. Fortunately, my monitor is 31.5 inches so it's only 140 dpi, which means that I can get away with 125% scaling, and apps that don't support it at all (Steam, Skype) are still completely usable, if a little uncomfortable at times.

    reference: I own an Asus PQ321.
  • Spunjji - Friday, October 18, 2013 - link

    Cheers for the evaluation!
  • nathanddrews - Friday, October 18, 2013 - link

    If you haven't tried it already, you can copy the existing Steam skin and rename it. Open the skin file and replace all font sizes with a size of 18. It works beautifully. If all else fails, use BPM.
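
    If you'd rather script that recipe, something like this works (a rough sketch; the skins path, skin name, and the "font-size" property name are assumptions about how your install lays out skin files, so adjust to match yours):

        import re, shutil
        from pathlib import Path

        # Assumed locations and skin format -- verify against your own install.
        skins = Path(r"C:\Program Files (x86)\Steam\skins")
        src, dst = skins / "YourSkin", skins / "YourSkin-bigfonts"

        shutil.copytree(src, dst)  # copy the existing skin under a new name

        for f in dst.rglob("*.styles"):  # the skin's style sheets
            text = f.read_text(encoding="utf-8", errors="ignore")
            # Replace every font-size value with 18, per the recipe above.
            f.write_text(re.sub(r"(font-size\s*=\s*)\d+", r"\g<1>18", text),
                         encoding="utf-8")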
  • Shadowmaster625 - Thursday, October 17, 2013 - link

    Actually it might match Titan. I don't know why the chart only compares against a 780, but this one shows Titan being faster than a 780 by about the same amount:

    http://images.anandtech.com/graphs/graph6973/54886...
  • cmikeh2 - Thursday, October 17, 2013 - link

    They should be close at that setting. Looks like the 290X will really excel in high resolution gaming thanks to its greater memory bandwidth, but NVIDIA will have better shader performance in that comparison.
  • tipoo - Thursday, October 17, 2013 - link

    This is with "quiet mode" enabled, whatever that is. I wonder how much more performance there is in noisy mode.

    Not bad for something still on 28nm and with a smaller/cheaper die.
  • dragonsqrrl - Thursday, October 17, 2013 - link

    I suppose you could say the same for gk104 vs. Tahiti
  • Spunjji - Friday, October 18, 2013 - link

    Indeed!
  • 200380051 - Friday, October 18, 2013 - link

    Indeed not! Look at the compute performance on gk104... nVidia removed most of the transistor budget for compute, for the sake of a more "efficient" gaming chip. Tahiti is roughly equal in gaming but has 3x the compute performance.
  • dragonsqrrl - Friday, October 18, 2013 - link

    I suppose you could say the same for gf110 vs Cayman... oh, except that in addition to having superior compute performance the GTX580 also consistently outperformed the HD6970 in games. Tell me, since you're obviously so keen on judging the value of a gaming card by its compute performance, were you an Nvidia fanboy before the HD7900 series?

    Oh, and here are another two points AMD fanboys don't seem to consider, and they're big ones: Nvidia has had gk110-based Geforce cards on the market for 8 months now (let that sink in for a sec), and they had gk110-based Teslas available long before that. And then consider that the gk110-based Geforce card in the benchmark above uses a heavily disabled gk110. And then consider that the HD7970 GHz already consumes a similar amount of power under load as the GTX780/Titan. And then, well... there go all your market strategy, performance, and efficiency misconceptions. It's hard to interpret things any other way from an objective, informed standpoint.

    But, but, AMD will likely bring greater value to the high-end...
    Yes, they probably will, and that's fantastic for consumers, but that has nothing to do with your argument.
  • jjj - Thursday, October 17, 2013 - link

    And you guys thought this was ethical?
  • sorten - Thursday, October 17, 2013 - link

    Which guys? Thought what was ethical?
  • A5 - Thursday, October 17, 2013 - link

    If you think a controlled benchmark release is unethical, you should probably just quit following pretty much all product coverage of anything ever.
  • brucek2 - Thursday, October 17, 2013 - link

    A5 - Name one other video card review in recent memory where the manufacturer was allowed to pick the single game that would be benchmarked. As far as I can recall this is unprecedented.

    It's unclear who picked the single game. If it was AnandTech, fine, I guess. But if it was AMD who told AnandTech the one game and specific setting(s) they were allowed to cover, the only responses with journalistic integrity would have been either to tell them "no thanks" or to run it as a paid ad and label it as such.

    Controlling the release date via embargoes is routine and works out in everyone's best interest. Controlling the content of supposed pieces of journalism is not at all routine and in no one's interest but the manufacturer's.
  • DigitalFreak - Thursday, October 17, 2013 - link

    Dunno. Tom's Hardware has both Bioshock and Tomb Raider.
  • cmikeh2 - Thursday, October 17, 2013 - link

    They did say: "And as with all controlled benchmark releases we’d advise not reading too much into any single benchmark here, as the relative performance of NVIDIA and AMD cards changes with the game being tested, at times rather wildly." That's a pretty straightforward disclaimer if you ask me. They don't make any recommendations off of this single benchmark and just use it as a starting point and a general realm of performance.
  • brucek2 - Thursday, October 17, 2013 - link

    Yes, they adequately disclosed what happened. But why did they accept the situation at all? To most mainstream journalists, these conditions would be unacceptable and would simply result in no story until it was possible to adequately prepare one.

    What will they agree to next? They can publish the review 48 hours early, as long as AMD gets to strike any sentences it doesn't approve of? Or 96 hours early, if they agree to include verbatim several paragraphs provided by AMD's PR department? Or a full week early, if they do the above and agree not to disclose any of the conditions?

    Maybe AnandTech felt that, as a one-off, they were better off getting some early coverage in. But since they and others went for it, they can assume more of these trades of journalistic integrity for an early publish date will be coming their way in the future. It may be a short-term gain, but in the end they are selling off the things that matter most: their reputation and credibility.
  • Drumsticks - Thursday, October 17, 2013 - link

    How is that at all related? They aren't striking out any part of the review and are in no way allowing AMD to "approve" what they're writing. AMD said they could release a benchmark, which they did, along with a hefty disclaimer. That's no reason for alarm at all. They aren't sacrificing the integrity of the benchmark or preview. It's just that - a preview.

    There will still be a full 290X review. Every company is going to want to portray its products in a positive light, and the same is true for AMD. That doesn't make it unethical. It would be unethical if they fudged the numbers, which didn't happen. It's equally possible that AMD wanted a limited preview so they could *provide* a preview while giving themselves time to actually prepare the other games, for which drivers or other factors might not be ready.

    Also, in a preview that AMD allowed where one of the data points is virtually even, I don't think they were too focused on cheating us out of "ethical" journalism.
  • brucek2 - Thursday, October 17, 2013 - link

    AMD's unwarranted approval power was preemptive, in that they approved one (or two?) specific games, potentially at specific settings, while disapproving all other games/settings. That is way over the line: an unacceptable amount of journalistic control to give up in exchange for access to materials or sources.

    If AMD wants to issue a press release touting overly specific performance numbers / controlled cases, then just forward the release as is and label it accordingly (if judged newsworthy), or don't. But don't attach your journalistic brand to something so obviously unbefitting of an actual independent act of reporting.
  • HisDivineOrder - Thursday, October 17, 2013 - link

    It's very fugly when a company chooses the game, the scenario, and the system that plays the game for the "limited comparison" published by a previously unbiased, neutral third party. People expect Anandtech to be above the fray, unbiased, and certainly not playing favorites.

    Here's something obvious. nVidia has a show today, but this site said virtually nothing about it. Meanwhile, other sites don't have a problem mentioning it. They mentioned it, but only in the AMD article and only briefly, with next to no description of said event.

    Meanwhile, they spent this entire article talking about AMD's limited comparison point made only under AMD's exacting requirements that ensured they looked great during the comparison.

    This reminds me of what AMD did last year with Piledriver's launch. Allowing limited "reviews" of certain, favorable parts. Imagine how loud and constant the howls of fury would be if nVidia launched a new product with "limited reviews" or "comparisons" based on nVidia-selected games and/or benchmarks.

    Say, nVidia is launching the new nVidia Geforce Titanic. They make a whole show at the Caribbean Islands. They have a special Green section at Anandtech that does nothing but report on news favorable to nVidia. Anandtech suddenly is virtually ignoring or going completely dark on the fact that nVidia's prior cards (686 series) are having massive problems with 4K in Crossfire despite being cards "designed for 4K." So every article in this "Green Section" is incredibly positive.

    Then the new Titanic is coming. Journalists all in the Caribbean are drinking pina coladas and are fed new proprietary technologies like nVidia PureAudio DSP's limited only to nVidia cards and only a select group of the new cards. Plus, nVidia Glide 2.0, an initiative to lock in developers to only CUDA-based video cards built in the last year. Imagine then that said "Green Section" clinked glasses and danced in a loop, making a much-read, much-spread article about how the new Glide 2.0 is in fact a low level API from one of the next generation consoles that is about to hit without a shred of proof or much more than a hope and a dream.

    Now imagine nVidia implies to all those reviewers that they'll open up more information about said upcoming Titanic in... a week, maybe two. Then three weeks pass and no one says anything. Waiting longer, they refresh and re-release the same products they previously released almost two years ago, but with completely new names and similar pricing to what they'd already dropped to. Yet no one says anything but "Wow, the value is great", despite the bundle having been axed.

    Now imagine they offer said "Green Section" a new benchmark for a very specific game guaranteed to give them a "win." Everyone is hungry for benchmarks, for the final answer to whether the card is better, and you have a seemingly unbiased review site popping up with an "early benchmark" that shows the new card in that "Green Section" winning at 4K.

    Man, it looks like the "Green Section" has the goods and the Green Team has the cards, right? Except if they had the goods, then why are they only showing a very limited review instead of being brave and showing everything they got right now when they very clearly can?

    Because showing everything would show everything, warts and all. AMD wants to show you only one specific thing, to choose the battleground, the benchmark, and win out completely in that one place. They're aiming to mislead because if they weren't, they wouldn't need to choose the game, choose the benchmark, and stack the deck.

    That's just another part of how far Anandtech has drifted from ethical. That's what's not ethical. How can anyone NOT question their "unbiased" status now?

    If I buy you a trip to a tropical island and I buy a section in your site and I get preferential treatment that seems suddenly to forget or just not mention any problems I have when they are rather widespread in being discussed and then I buy a free early benchmark of my choosing and design and somehow manage to get you not to talk about what my chief opponent is doing even when what they're doing is so much bigger than that benchmark for that day...

    ...then I have to question whether I wouldn't have you in my back pocket, right?

    If it were nVidia or Intel doing all this and not AMD, you'd see a howling fit the likes of which the intarwebz would never cool.
  • mycardbrokedown - Thursday, October 17, 2013 - link

    I get your point, but there is a fallacy in it... this product is not yet released, so any and all info is given out under NDA. If the product were already launched and sold you would be right; in this case, however, it isn't. Plus, all sites may now publish their results for the given games. On the other hand, I get why they did it... the drivers aren't there yet, and I think they want to launch with fairly good drivers this time around.
  • Principle - Friday, October 18, 2013 - link

    You guys are ridiculous. You don't want any info? Fine, don't read it, now shut up. There is nothing unethical about what happened, nothing. You're making a mountain out of a non-issue in the name of Nvidia fanaticism.
  • brucek2 - Friday, October 18, 2013 - link

    Way to miss the big picture. The combined market share of the R290, and the 780 and/or Titan, is a rounding error away from zero. The competitive differences between them will affect at most a tiny slice of the PC user base, and any competitive rankings today will be obsolete within 6 months if not 3. All of that is small beans.

    The larger issue, which will endure much longer and applies to far more mainstream products affecting much greater numbers of users, is the rules of engagement between the tech press and manufacturers. No one has yet responded to my original question, which was to name another example where the tech press accepted such one-sided rules. I believe AMD's PR operation broke new ground today. They got away with it first, but now that the precedent has been set they surely won't be the last. Nvidia will of course respond in kind, but the bigger picture is that so will other manufacturers in other product categories.

    Tech press is relatively new because tech itself is relatively new. Broader journalism has been around a lot longer, has covered issues that have much more far reaching implications, and has arrived at rules that make sense in protecting the freedom of the press and therefore its readers. Not accepting undue editorial control in exchange for materials, access, favors, etc is an important part of that. The tech press blew it today and we will all eventually be worse off for their mistake.
  • WritersBlock - Monday, October 21, 2013 - link

    I have to say that brucek2 makes a good point. But in fairness to AnandTech, their disclaimer was hard to miss.
  • takeship - Friday, October 18, 2013 - link

    We don't know quite yet whether or not these numbers have been fudged, but we will in a few more days' time. No doubt EVERY site will now run a Bioshock Infinite bench at this res to double-check AMD's claim, and yell to the heavens if AMD was dumb enough to fudge something so easily verifiable. I would like a comment of some sort as to why the 290X score is listed as "Quiet Mode". That part seems marketing-driven to me. (i.e. low PowerTune matches a 780, balanced matches a Titan, performance takes them out back)
  • TheJian - Thursday, October 17, 2013 - link

    http://www.tomshardware.com/news/amd-radeon-r9-290...
    Tom's was told which games to run and how to run them... This type of crap would be pounced on here by Ryan if NV did it. Note they are completely ignoring NV's show in Montreal... ROFL. Still don't see an NV portal page either to complement the AMD portal they have here. How much does AMD pay Anandtech these days?
  • nsiboro - Friday, October 18, 2013 - link

    @brucek2, @HisDivineOrder:

    TL;DR:

    You guys going hypothetical and fantasizing "what-ifs" = trying so very hard to make something out of nothing.

    It's a preview result. The end.
  • brucek2 - Friday, October 18, 2013 - link

    @nsiboro - the admission that AMD "named the game, the cards, and the resolution" is in the 3rd paragraph. The fact that AnandTech accepted those conditions and ran the story anyway is evident from the existence of this article. There is nothing hypothetical or what-iffed about it.
  • nsiboro - Friday, October 18, 2013 - link

    @brucek2

    Lots of question marks, "Maybe", "Assume", "Say...", "Titanic" (so sad, it sunk), "Imagine", "Implies", etc.

    *These words. *sigh*
    *According to you, the publishing of this one preview = the doom of all journalistic integrity at this site and all over the Internet.

    "The combined market share of the R290, "

    *The R9 290/X is not out yet; what are your authoritative sources for its market share?*

    "All of that is small beans."

    *I dunno, 780/Titan are priced very high*

    So, you are not OK with this site and many others (as they also published the same preview).
    I can respect that.
  • brucek2 - Friday, October 18, 2013 - link

    I could answer those nits, but I don't want to distract from my main contention, still unanswered. I believe that AMD broke new ground by placing such tight conditions on what could be discussed in the preview, and the press erred in accepting them.

    To all those claiming this is not a big deal and simply business as usual, my question stands -- point me to any previous AnandTech (or other similarly respectable site) article that allowed the manufacturer to dictate the content to this extent. I'm aware that sometimes there's just very little data to report, period, and I see that as a different case; but here, the team had the ability to generate as many data points as desired yet accepted the restriction to use only those of the manufacturer's choosing.

    You may call it a hypothetical, but allowing control over the one and only game allowed in the story is not so very far from allowing, say, only positive versus negative comments in a story. The latter demand would hopefully have been laughed out of the room; I am not understanding why the former should not also have been.
  • nathanddrews - Friday, October 18, 2013 - link

    If AMD invited me to their party and told me that I could run benchmarks for just one game on their brand new super double top secret GPU and share those results with the world, I would do exactly that. I would also include the disclaimer that AT did. I don't see the problem. It's labeled as a preview. No different than press events hosted by Intel or NVIDIA, Samsung or Apple.

    Put down the Haterade.
  • brucek2 - Friday, October 18, 2013 - link

    The difference is between not having more information and accepting restrictions on information you do have.

    Every reporter faces the challenge of receiving biased, partial information from one sided sources (i.e., a press conference.) Choosing how to evaluate and present that information, and how to complement it with information from other sources, and whether it is ready to run at all, are all basic tasks of journalism.

    That's not what's happening here though. This and other blogs have more information, but they agreed to terms that they would first report only the specific portion of the information that the manufacturer requested. The response with journalistic integrity would have been to decline those terms.
  • Principle - Friday, October 18, 2013 - link

    OMG, get over it. They were allowed to present their data any way they wanted; no one told them what to write, only the scope. The full set of data will come out in a couple of weeks at the latest. THEY ARE OPERATING UNDER A NON-DISCLOSURE AGREEMENT. They have potentially had results to report for weeks, but were not allowed to release them. Does that make Anandtech evil, because they may have information they did not give to you??? But now that they gave you a little, they are bad people?? The terms of an NDA are clear: the people who pay millions of dollars to bring a product to market get to decide when information is released. The NDA ends when the product is publicly released, and then all the data will flow. Why does this not make sense???
  • sorten - Thursday, October 17, 2013 - link

    I'm not excited about the current generation of cards. It seems like NVidia and ATI both got caught napping when 4K monitors started to hit the scene and they've both been busy rebranding and overclocking existing hardware on the same manufacturing process. I'll wait and see what happens with the next generation.
  • MrSpadge - Thursday, October 17, 2013 - link

    What did you expect? A unicorn spraying fairy dust?
    The improvements over the 40 nm generation(s) are starting to become very significant. Can't ask for much more on the hardware front, unless the software changes as well.
  • sorten - Friday, October 18, 2013 - link

    @MrSpadge : I wasn't expecting anything. I'm just saying that I'm keeping my money in my wallet. The advantages of 28nm over 40nm were very significant two years ago when that process change occurred. Since then there has been relatively little movement unless you're spending $600+ for a card, and even then you might be better off with two lower end cards.
  • A5 - Thursday, October 17, 2013 - link

    4K is still a tiny, tiny niche. They have SLI/XFire for the people who need to hit that resolution right now. By the time 4K is mainstream (3-5 years?) they'll be able to handle it on one high-end card.
  • ddriver - Thursday, October 17, 2013 - link

    Hopefully the madness will end at 4K. Anything above is literally overkill, considering most people cannot distinguish between a 150 and 300 dpi image from normal viewing distance. A 4K monitor at 31.5" is about 140 ppi (quick check below)... There is really no point in rendering pixels that the human eye cannot distinguish.
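
    For the curious, the ppi figure falls straight out of the usual calculation, diagonal pixel count over diagonal inches:

        import math

        def ppi(w_px, h_px, diagonal_in):
            # pixels along the diagonal divided by the diagonal length in inches
            return math.hypot(w_px, h_px) / diagonal_in

        print(round(ppi(3840, 2160, 31.5)))  # ~140 ppi for a 31.5" 4K panel
        print(round(ppi(1920, 1080, 24.0)))  # ~92 ppi for a common 24" 1080p panel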
  • mcnabney - Thursday, October 17, 2013 - link

    That is why I question the need for some Ultra settings. When you can't really see the pixels, the need for massive AA goes away. UHD resolutions need to be reviewed for appropriate detail levels.
  • Nagorak - Thursday, October 17, 2013 - link

    These ultra high res monitors just seem idiotic to me. When it comes to games you're just shooting yourself in the foot trying to run at such high resolution. It's going to look worse on medium settings than it would at high settings at a lower resolution.

    I bet most of the time people are going to end up running at half the monitor's res anyway if they want decent performance (basically equivalent to non-4K monitors).
  • jasonelmore - Thursday, October 17, 2013 - link

    It won't. The higher the resolution we can get, the closer we get to holo projection.
  • Gigaplex - Thursday, October 17, 2013 - link

    I can easily distinguish between 150 and 300 dpi at normal viewing distance... when it's static. It makes a massive difference for font rendering. I can't say I've experienced it in a game however so you may have a point for games.
  • haukionkannel - Friday, October 18, 2013 - link

    Yep! I agree. When you read anything from a monitor, 4K and even 8K are just fine! When gaming, it all depends on how much GPU power you have. I am quite sure that in the beginning, running the game in 1080p mode and upscaling it to 4K will be enough! I have read several test reports on 4K TV sets. The reports say that native 4K material is indeed very good, but upscaled 1080p material looks good too, better than on normal 1080p TV sets (just like upscaled DVD movies in the distant past). So there definitely is a need for 4K and even 8K monitors. Just run games at a resolution that can give you enough fps and upscale from that!
  • c4toast - Thursday, October 17, 2013 - link

    Each of the 8 ACEs can manage up to 8 compute queues for a total of 64 compute commands; in comparison, the HD 7970 only has 2 ACEs that could each only queue 2 compute commands, for a total of 4. Is that true? If so, what kind of performance will we get when devs fully optimize?
  • psuedonymous - Thursday, October 17, 2013 - link

    With AMD's current issues (http://www.pcper.com/reviews/Graphics-Cards/Frame-... regarding frame rating with multiple monitors (and the fact that all current UHD 60fps displays are driven by two separate 1920x2160 streams), are you allowed by AMD to publish the frame timings in addition to the raw framerate? AMD promised a 'fix' with the R9 290X, but it would be interesting to see how much things have actually improved.
  • Ryan Smith - Thursday, October 17, 2013 - link

    Frame pacing with multiple monitors over a single GPU has never been an issue. It's only a remaining issue for multiple GPUs in conjunction with multiple monitors. So the 290X would not be affected here.
  • SeeManRun - Thursday, October 17, 2013 - link

    One thing of interest is that ultra quality includes AA. What happens if you turn off AA since it shouldn't be needed at UltraHD?
  • Principle - Friday, October 18, 2013 - link

    You can see it at Toms, where they did something more reasonable and just put it on very high settings instead of medium and could manage nearly the same frame rates.
  • Wreckage - Thursday, October 17, 2013 - link

    So any old factory overclocked off the shelf 780 will be better. Sigh.
  • TwiSparkle - Thursday, October 17, 2013 - link

    You mean I can't overclock the 290X? :( /sarcasm. And really, it will come down to price. If this is $550-600 it is a LOT better than a 780.
  • TheJian - Thursday, October 17, 2013 - link

    Umm, did you look at the watts and heat it puts off already at stock?
    http://wccftech.com/amd-radeon-r9-290x-radeon-r9-2...
    94C temps (Titan had 84, the 780 82), and 404 watts vs. Titan's 342. For that extra 62 watts this thing should be blowing away Titan. It's not.

    You're under the mistaken impression that NV's pricing is set in stone. They now offer 3 free games and a $100-off Shield coupon. Also, considering all the driver problems this year and last, and still ongoing, NV is reasonable in asking a premium for stuff that "just works", right? ;) You buy AMD, you MAYBE get a working driver at some point. Or buy NV and forget about it. I'm sure the shoe will be on the other foot eventually, but right now they haven't fixed everything with the last gen. To me, this kind of crap is why AMD usually has to give away more free games and hasn't been able to get NV's 65% discrete share down one bit in years.
  • just4U - Thursday, October 17, 2013 - link

    Or just throw in an NV card and it works? lol.. Driver-related problems across both companies are fairly equal from what I can tell, and have been for a long time. I am surprised people still tout this argument after 12 years...
  • Principle - Friday, October 18, 2013 - link

    You mean in that non-official release with no credibility? The extra 62 watts was actually the space heater running off the same socket blowing into the PC intake of the AMD machine.
  • Gunbuster - Friday, October 18, 2013 - link

    I'd love to see how warm you get from a space heater that only consumes 62 watts...
  • Principle - Friday, October 18, 2013 - link

    wow, you took that literally.....
  • zeock9 - Thursday, October 17, 2013 - link

    Thoroughly unimpressive, considering the rumor that they are upping the MSRP of these cards from $600 to something close to $700.

    This kind of cliff-hanger PR marketing ploy, clearly aimed at hurting its competitor's market more than boosting its own, has always disgusted me.

    If the 290X does indeed end up barely competing with Titan at a higher price point than the 780, then I will just go with the green camp, out of spite for AMD's dubious business practices.
  • Amoro - Thursday, October 17, 2013 - link

    What about the fact that the GTX 780 costs more than twice as much as a 7970 but isn't twice as fast? I mean it's all relative to how much performance you are willing to pay for. The Titan is also ridiculously overpriced as a gaming card. Even compared to the GTX 770, the 780 is not worth $250 more.
  • Principle - Friday, October 18, 2013 - link

    Your argument makes absolutely no sense. They were never promised at $600; maybe you're thinking of the 290, not the 290X. And MSRP and actual pricing are not usually the same.
  • Amoro - Friday, October 18, 2013 - link

    Was the 290X ever promised at $600? I just don't see how it's dubious when the 290X is faster than the GTX 780 and the Titan but priced only slightly above the 780, if it launches at ~$700. It seems to be slotted in well there. If anything, it's the fact that the 780 and Titan were crazy overpriced at launch which allows AMD to price the 290X at ridiculous prices as well. This is all assuming that they will launch the R9 290X at ~$700.

    However, if NVIDIA responds with immediate price cuts, then AMD could have a situation on their hands.
  • TheJian - Thursday, October 17, 2013 - link

    "As something of a counter-event to NVIDIA’s gaming showcase taking place in Montreal, Canada this week,"

    So where is the coverage of Nvidia's show? Day one is in the bag and they already announced 770/780/Titan holiday bundles that come with Batman, Splinter Cell: Blacklist and Assassin's Creed 4, plus a $100-off Shield coupon. Not bad, and even better if the 290X turns out decent, forcing a Black Friday price cut from NV on top cards. Anyone on the fence about Shield would surely be pushed over by a $200 price and 3 great games. Of course I want Shield rev2 with Maxwell... LOL. Watts/heat look bad on the 290X, and I need a cool card in AZ to replace my 5850 (which was cooler than NV's offerings at the time when I bought it to replace my 8800GT). It shouldn't be a surprise that watts and heat would go up on a 432mm²-class chip vs. the 7970GHz.
    http://wccftech.com/amd-radeon-r9-290x-radeon-r9-2...
    I'm guessing NV will just release a jacked-up 780/Titan pair in the next week or two running an extra 100MHz, to keep their top-end pricing the same. The 290X does not appear to be a Titan killer and usually sits between it and the 780. But hey, that does mean the price will drop on the 780/770 etc. All will be pushed down by the ULTRA models released soon (or whatever NV calls them) with 100-200MHz more (essentially OC'd versions of the same cards out now, no doubt).
  • Ryan Smith - Thursday, October 17, 2013 - link

    It's coming. Anand is in Montreal and the work queue is quite deep at the moment.
  • Shlomi - Thursday, October 17, 2013 - link

    WTF, a new series and new tech, plus the card is not released yet,
    just for 1/8 fps more??

    I say NVIDIA won before it even started...
  • Will Robinson - Thursday, October 17, 2013 - link

    I am glad all the NV fans are enjoying the pre launch rumors.
  • Principle - Friday, October 18, 2013 - link

    So, no big deal that your 780 got shown up; you are just in the first stage of grieving.
  • Roboyt0 - Thursday, October 17, 2013 - link

    Why does everyone keep going back to the likely fake results from that Chinese site, specifically the temperatures?! These are the first potentially REAL numbers we have to work with, with plentiful actual results just around the corner.

    If in fact the 290X does best the 780 by a bit, and matches Titan performance occasionally, this is still a win for the AMD camp. The 780/Titan houses 1.1 billion MORE transistors, +18%, on a die that is 25% larger: 551 mm² for the 780/Titan compared to 438 mm² for the 290X. AMD is doing more with less. Let's do some simple math (spelled out below): the Nvidia 780 GPU is 25% bigger, with 18% more transistors, and either has a minuscule lead or is losing... what happened? What would Nvidia be doing with a smaller die and transistor count? Not many of you green guys want to focus on this very important detail.
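
    Spelled out (using the figures above, which were the rumored specs at the time, so the implied transistor counts are only estimates):

        # Die area and transistor deltas as quoted in the comment above.
        gk110_mm2, hawaii_mm2 = 551, 438
        extra_bn, extra_frac = 1.1, 0.18

        print(round(gk110_mm2 / hawaii_mm2, 2))  # 1.26 -> roughly 25% larger die
        # If 1.1B extra transistors is an 18% lead, the implied counts are:
        hawaii_bn = extra_bn / extra_frac
        print(round(hawaii_bn, 1), round(hawaii_bn + extra_bn, 1))  # ~6.1B vs ~7.2B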

    Also, the exaggeration of the information surrounding the 290X 'crushing' the Titan is obnoxious. To clarify, AMD said the 290X will perform much better than the Titan SPECIFICALLY in BF4 while using the Mantle API. If you think the Titan will beat the 290X under these SPECIFIC conditions, you are sadly mistaken. We won't know exactly how this will play out until Mantle is released...

    http://attackofthefanboy.com/news/radeon-r9-290x-m...

    " Devon Nekechuck says that the new card will compete with the Titan and GTX 780 straight away, but “with Battlefield 4 running with Mantel, the card will be able to ridicule the Titan in terms of performance.” Nekechuck clarified that he means that Battlefield 4 will run much faster with the R9 290X and Mantle than it will with Titan. "

    At the moment we can only be certain of one thing...AMD has erupted the internet into a frenzy regarding their new hardware :D

    P.S.
    This post brought to you by someone currently running a GTX 670
    http://www.3dmark.com/3dm11/7331088
  • 6kle - Friday, October 18, 2013 - link

    We were already told that this card will do well with high resolutions. It is obvious to me that AMD did some cherry picking by saying you can only test at 4k resolution. They probably also chose Bioshock for the same reason.

    In my opinion AnandTech should have said in this article that AMD was likely to choose the most flattering test environment for their card for this early benchmark release, or simply not release this info, because it's borderline free advertising for AMD instead of real benchmarking.
  • Principle - Friday, October 18, 2013 - link

    They didn't only test it at 4K; they could only release that data and comment on it.

    What makes you think Bioshock is the most flattering for AMD?? How about picking one where the HD7970 could keep up with the 780, like Grid 2? AMD didn't even beat the 770 on BioShock before.
  • DZtruthseek - Friday, October 18, 2013 - link

    When you guys are able to, could you make an article with benchmarks of the R9 series in Crossfire? Or just post the 290X, 280X, and 270X Crossfire benchmarks?
  • wwwcd - Friday, October 18, 2013 - link

    Windows 8.1 is too new a version, so we need correct drivers for it. I think they deliberately tested with this version of the OS, and that this was done in favor of Nvidia, which has prepared its drivers for precisely such a test.
  • James5mith - Friday, October 18, 2013 - link

    Just need to get something off my chest...

    AA is not needed at that resolution in my opinion. Why would you try and use it?
  • Paullwar - Friday, October 18, 2013 - link

    Firstly, greetings from across the pond, and yes, ole Blighty has been foggy and cold.

    Ryan, in your honest opinion, are we saying that the reference board is middle ground between Titan and the 780, based on a subjective game review?

    If so - as you cannot disclose pricing - what was the power draw in comparison to the 780? This, I feel, shouldn't go against AMD's 'embargo', as they are openly asking AnandTech to review both boards.

    For the record, and IMO, AMD are being more than a tad cheeky with this request.
