You just called Ryan a "dummy", did you, without even checking the statement further down which reads:
"For anyone looking to pick up a 7790 today, this is being launched ahead of actual product availability (likely to coincide with GDC 2013 next week). Cards will start showing up in the market on April 2nd, which is about a week and a half from now."
If YOU had read the article, blah blah dumb idiot blah blah. As you've not replied to anybody in particular, your mistargeted rants could be construed as being directed toward the staff themselves, so keep it up and you won't HAVE to worry about what AT is reviewing in future.
Bottom line - it's faster than the 650 Ti, it's looking to be more efficient than the 650 Ti, and oh look, both have 1GB of GDDR5 on a 128-bit memory interface, which you seem to have forgotten when you leapt down AMD's throat about the 7790, and when you went on your childish tirade about the 5770's 128-bit memory interface earlier.
As far as I recall, Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?
I'm not sure if I've said this before, and apologies to everybody else if I have, but I'm done with you, full stop. I can only hope everybody else here decides that not feeding the ignorance you perpetuate on every single AMD article would save them time they could be devoting to something far less boring instead.
To the staff - is there anything you can do to introduce an Ignore List? Thanks in advance for your response.
You got eveything wrong again, and you failed to read the article not I, and you failed to read my reply addressing half your idiotic non points, so you're the non reader, fool. Now I have to correct you multiple times. And you're a waste. 650TI overclocks and it's only faster in a few amd favor games which are here, of course. Strike one for tardboy. 650Ti runs fine OC'd too, which it does well: " We pushed Gigabyte's GeForce GTX 650 as far as it'd go and achieved a maximum core overclock of 1125 MHz, with the GDDR5 memory operating at 1600. All it took was a 1.15 V GPU voltage. " http://www.tomshardware.com/reviews/geforce-gtx-65... The 128 bit bus - REPAYMENT for you FOOLS SQUEALING prior, what's so hard to understand ? Did you forget all your WHINING ? Did you forget your backing up the FAILED theorists with the nVidia dual speed memory ? ROFL You're up to strike 4 already. " Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?" NO, so why would it be mentioned if he didn't want anyone to buy it ? Why mention it, that would key in to save for release date, right ? Instead we get this gem first off in BOLD to start the article: " Who’s Titan For, Anyhow? "
Guess that just crushed your idiot backwards bullhockey forever. For all you know Ryan mentioned release date anyway.
You're not "done with me", you get everything WRONG, so you'll be opening your big fat piehole forever, that's how people like you do it. Idiot amd fanboys, all the same.
Also a beggar child for extra "control", since you "can't be an adult and control yourself" - please give me an ignore button ! I'm a crybaby who can't handle it ! ROFL
One question: does your 650Ti pay for itself? This AMD card will pay for itself via Bitcoin mining, even with the ASICs out there, especially if you heat your home with electric heat.
To Ryan and staff As a long-time admirer of AnandTech, I always enjoy reading pretty much every article you post, and have immense respect for all your writers. However, I am now utterly fed up with the direction the comment discussions have taken. The general pattern is they start out as debates and end up as pretty nasty personal attacks that have nothing to do with the articles. You may say 'don't read the comments', to which I reply that they used to be an extension of the articles themselves, and were always a source of valuable information. It pains me to say this, but if you don't start removing the trolls I will no longer come to this site at all, and I would guess I am not alone in having this opinion.
I agree the trolls are out of control and need some pruning back. They have massively lessened my enjoyment of the site the last couple of times I've visited.
Well said, and thanks. I no longer visit Dailytech for the same reasons. I enjoy reading comments, since they can offer other perspectives from like-minded people, but unmoderated is worse than nothing at all. This used to be my favorite tech site, but the comments section here has slowly been pushing me to avoid it most of the time.
The 7790 reminds me of the 4770. Sure, that was on a new process node, but it's a late addition to the line designed to take advantage of tweaks, process improvements, etc.
There may be a lot of transistors in a GCN design but I couldn't help feel that there were power savings to be had. For this reason, I'd hope that their next flagship doesn't exceed the 7970GE's power draw whilst providing a decent performance boost.
For those who want the most power in the smallest package and power drain, look no further than the Radeon 7790. The only disappointment was the heat factor, but more or less the same performance as the 7850 at half the power; that's great. Also, I don't mind that AMD went the 6 GHz VRAM route, because now there is even more reason to get 2 GB, which is especially needed if you apply dozens or hundreds of mods to your games. Also, it's the 128-bit interface that kept the power low, so despite everyone's cussing, AMD made the right choices. I have a GTX 460 which easily uses at least 200 watts. This 7790 is almost twice as fast and uses about 40% of the power. The pricing is acceptable; if they were to include 2GB by default, then why bother with the 7850? They still want people to buy that one.
Hey Anand, can you guys please do a video quality test? I mean I haven't seen any such test on any website for over 3 years. So please, can you do a video quality test in movies and games and please also use low quality video as well, not just top of the line 1080p type videos that would look amazing even on a GeForce 3.
107 Comments
vailr - Friday, March 22, 2013 - link
Didn't AMD originally supply an info-graphic (about a year ago, just prior to the release of their 7000 series cards) showing that the 7790 was to be their lowest-priced 7000 series card that still provided a 256-bit memory interface? Any explanation for the downgrade now to a 128-bit interface?
campcreekdude - Friday, March 22, 2013 - link
I thought the 7890 was supposed to be 256-bit.
Oxford Guy - Tuesday, March 26, 2013 - link
The big problem is that it doesn't have enough VRAM. 1 GB is too little for 2013 cards. Pretty ridiculous for one priced at $150.
vailr - Friday, March 22, 2013 - link
Info-graphic from a year ago that shows the 7790 with a 256-bit memory bus:http://wccftech.com/amd-radeon-hd7000-28nm-souther...
zshift - Friday, March 22, 2013 - link
Nice catch!
CeriseCogburn - Saturday, March 23, 2013 - link
AMD lies all the time and then fails to deliver, so the fanboys give them a thousand breaks, another chance, some more time - "in the future the AMD card I bought today will be great, it's 'futureproof'" (even though it doesn't work correctly right now)... LOL - goodbye AMD
So the beta driver, uhh, perma-beta driver, must have crashed out in Civ5 a lot.
[email protected] - Sunday, March 24, 2013 - link
I'm sorry, but if you must insist on invalid statements you shouldn't be on this website, because smarter people will correct you on YOUR fanboyness. First of all, AMD beats NVIDIA at every price point (check Linus Tech Tips on his YouTube video card channel) and they have perfect drivers. I've been running the same AMD driver for 6 months with no crashes. In fact, I tried a new Crysis 3 driver and it didn't crash then either, and that was a beta driver. And my Radeon 7770 destroyed the NVIDIA GTX 650, which is at the same price point.
medi01 - Sunday, March 24, 2013 - link
I hope you work for nVidia.
Because writing such crap for free is hard to imagine.
Gigaplex - Sunday, March 24, 2013 - link
How do you keep a straight face when accusing others of being fanboys? You're the biggest fanboy I've ever seen.
Deo Domuique - Sunday, March 24, 2013 - link
AMD Peasants, that's what we are... For the first time in my 15-plus-year "career" in PC gaming I went with AMD and a 7950, and you can't even imagine how much I regret it.
Sabresiberian - Monday, March 25, 2013 - link
A roadmap is nothing but a projection of what is PLANNED for the future, not some kind of "promise" or "guarantee". Calling AMD people liars because the released product didn't match the projection is childish at best.
And before you slap the "fanboy" label on me, I prefer Nvidia generally speaking (but I'm not going to cut off my proverbial nose to spite my face in order to be brand loyal; if AMD has the current best solution for my purposes, I'm going to buy AMD).
CeriseCogburn - Saturday, March 23, 2013 - link
128-bit bus is great, the HD5770 proved that.
BWHAHAHHAAA
dishayu - Friday, March 22, 2013 - link
Good eye. But then they mention the HD7790 as Pitcairn LE in that infographic. What they have launched as the HD7790 now is Bonaire.
ShieTar - Friday, March 22, 2013 - link
Maybe they surprised themselves by getting GDDR5 to run at 6GHz, and realized that they could stick with 128-bit at that speed?
Lonyo - Friday, March 22, 2013 - link
They were going to use a cut-down Pitcairn (the 7870/7850 GPU), trimming the core to use up dies that couldn't make the cut as full 7870/7850s.
They might have gone with 256-bit to simplify the product for AIB partners, who could just re-use their HD7850 designs rather than needing a new design for a smaller-run product.
The 7790 now is a new GPU designed to be cheaper to produce than Pitcairn (as it's smaller), and the fact that the memory can run at 6GHz is probably due in part to it being a new GPU rather than a cut-down Pitcairn.
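As a rough sanity check on the bandwidth trade-off being discussed here, a minimal calculation (the 7770 and 7850 memory clocks are stock reference figures quoted from memory, so treat them as assumptions; the 7790's 96GB/s matches the figure cited later in this thread):

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate)
def bandwidth_gb_s(bus_width_bits, effective_rate_gt_s):
    return bus_width_bits / 8 * effective_rate_gt_s

print(bandwidth_gb_s(128, 6.0))  # HD 7790: 128-bit @ 6.0 GT/s -> 96.0 GB/s
print(bandwidth_gb_s(128, 4.5))  # HD 7770: 128-bit @ 4.5 GT/s -> 72.0 GB/s (assumed clock)
print(bandwidth_gb_s(256, 4.8))  # HD 7850: 256-bit @ 4.8 GT/s -> 153.6 GB/s (assumed clock)
```

So the faster memory claws back a good chunk of what the narrower bus gives up relative to Pitcairn, but not all of it.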
CeriseCogburn - Friday, March 22, 2013 - link
I don't see a launch date in the whole article; it's NOT available. I guess that's another mystery freebie for AMD's products here.
Didn't see port config either, so what cabling do we have to buy to run 3 monitors, when the Asus 650 Ti runs 4 out of the box, 3 with DVI and VGA only?
Not impressed with the huge AMD-biased game line-up either, so expect your mileage to be less than shown.
No overclock talk really either - so it must blow at that.
Other sites are reporting AMD's beta driver, so maybe they won't even have a release driver for this card when they release it, as AMD is often known to do, for like a year sometimes or forever in terms of any sort of quality - LOL.
Civ5 has only 1 bench rez; it must have crashed in others.
Multi-monitor - no talk of that anymore since nVidia SPANKS amd to death on that now.
Hopefully you've fooled the internet tards again, because amd is bankrupt, for good reason.
Spoelie - Friday, March 22, 2013 - link
Let's feed the troll. Did you even read the article?
-Launch date is mentioned on page 1, in one and a half weeks
-Ports are clearly visible and standard, 2 DVI + HDMI + DisplayPort
-Lineup is consistent with every other review on Anandtech.
-There's an entire page on the new PowerTune and how it impacts overclocking, single sample OC investigation is irrelevant and best left for a dedicated vendor comparison.
-... really?
Who's the real tard here?
Spunjji - Friday, March 22, 2013 - link
Oh for a down-vote button. We expect no less than mindless bollocks from Cerise, but failing to read the article entirely is a new low.
CeriseCogburn - Saturday, March 23, 2013 - link
No, that's what you do all the time. But thanks for the compliment, since you know I always read the articles completely, yet you think I didn't this time, WRONG.
I've made a lot of money this past short week without a lot of rest, so I'll give you and dipsy doodle a point on the svengali launch date the article writer for the first time EVER declares "solid" before it even occurs, oh wait, he always does that when it's AMD, but if it's nVidia he says we'll have to wait and see as they are probably lying...
ROFL
Who cares, the card sucks, amd is dying, the drivers blow beta chunks, and amd is way late to the party.
ppeterka - Thursday, July 18, 2013 - link
Just a question: And how much will your favored brand of GPUs cost, if AMD really dies? 10 times? 100 times? An arm, a leg, and both kidneys? Grow up, and understand how an ecosystem works for us all.
BTW, I don't have GPU preferences, I just grab whatever gives the best bang for the buck. If it has an EasternElbonianVideoPigs GPU on it - so be it...
silverblue - Friday, March 22, 2013 - link
You must've missed the part about them simply not having as much time to test the 7790 as they'd have liked because they were at GTC. Other sites apologised for their lack of time as well.
There's a whole load of other reviews out there; only a few have overclocking results (Guru3D notably), and as far as I can see only AT, of the major sites, has both the 7790 and a factory overclocked 7790 in the same test. Guru3D is alone in providing a CrossFire test, and though two 7790s perform about the same as a sole 670, there are no power readings. There's a good number of different titles being benchmarked, so it's not strictly a list of AMD-says-test-these-titles; plus Tomb Raider, a Gaming Evolved title, performs better on NVIDIA hardware.
There are a few bugs with the beta drivers used for the 7790 in these reviews, most notably with latencies (a bug that has already been fixed with the next Catalyst release, so yes, we will see new drivers soon); however, the latency values are so far ahead on average of what we used to see from AMD that this can hardly be classed as an issue.
Testing has generally centred on 1920x1080 because that's really the limit where cards like this are supposed to be performing - there's little point in 1024x768 and an equal measure of futility trying for 5760x1080 or whatever; the former is ridiculously low res and the latter is ridiculously ambitious even for a 7970 or 670/680.
Sapphire's blurb about multi-monitor usage via the TweakTown website:
"Working or gaming with multiple monitors is becoming increasingly popular, and the SAPPHIRE HD 7700 series supports this with AMD Eyefinity, now in its second generation. The SAPPHIRE HD 7790 OC Edition has two DVI ports (DVI-I and DVI-D), HDMI and a single DisplayPort output, supporting up to four monitors.
The SAPPHIRE HD 7790 OC Edition model supports the FleX feature, pioneered by SAPPHIRE, that allows three digital displays to be connected to the DVI and HDMI outputs and used in AMD Eyefinity mode without the need for an external active adapter. All four outputs can be used in AMD Eyefinity mode, but the fourth display must be a DisplayPort monitor or connected with an active adapter."
I've heard AMD's launch date for this is today; Guru3D has the following to say:
"But I need to add this little note alright; AMD's Never Settle Reloaded promotion continues. At participating retailers beginning 02 April, 2013, gamers will be able to receive a free copy of BioShock Infinite with a purchase of their new AMD Radeon HD 7790 graphics card. See, now that's great value. The Radeon HD 7790 series cards will be available in stores starting April 2, 2013"
Trusting this is of some use to you...
CeriseCogburn - Sunday, March 24, 2013 - link
I've run 2 FleX edition cards, you idiot. Have you?
Run any MDT nVidia Galaxy cards, dummy?
How about all-DVI outs, so you don't have to have hundreds of dollars of soon-to-die AMD dangly cables?
Heck, a friend just got ANOTHER 6870; he usually runs 4 monitors, but that one could only run 2 OOB and he has loads of cables, so he had to buy another cable just to run a 3rd monitor - it took 2 weeks to arrive...
ROFL
AMD SUCKS with eyefinity / multiple monitors and nVidia DOES NOT - nVidia keeps the end user in mind and makes it INEXPENSIVE to set up 3 or 4 monitors !
Amd makes it cost prohibitive.
AMD SUCKS, they lost again, totally.
geniusloci - Saturday, March 23, 2013 - link
You are a pathetically simple little mind, aren't you?
geniusloci - Saturday, March 23, 2013 - link
What planet do you come from?
This card will run 4 monitors; Eyefinity has done this very well, forever. With discrete audio per monitor. Nvidia is really getting handed its ass by AMD in this category.
This card will spank its Nvidia competition in Civ5, since Civ uses OpenCL, and Nvidia sucks at OpenCL (and their current cards even suck at CUDA).
Crossfire: there's a crossfire port at the top, genius. It will obviously crossfire.
Too bad Nvidia's 2D quality and video quality is such utter shit. I might have actually used that gtx660 I bought instead of sending it back for a 7870.
CeriseCogburn - Sunday, March 24, 2013 - link
You have your DisplayPort monitor or $100 active display cable, dummy? LOL
4 monitors MY BUTT.
Another clueless amd fanboy.
[email protected] - Sunday, March 24, 2013 - link
You obviously don't have a clear understanding of GPU tech, so just stop blabbing your stupidity; even most NVIDIA-biased people can admit it. Check Linus Tech Tips, check your games - they all work much better on AMD with multi-monitor.
And no overclock talk maybe because AMD doesn't approve of people tampering with GPUs............ and because they want it to seem so good that you don't need an overclock...... and the specific hardware partners can make different port configs, so why would you say that?
And maybe comparing ASUS 650 Tis to this GPU is invalid because you didn't specify who made it, so the port config advantage is completely irrelevant.
AMD is not bankrupt because of their GPU business, and their CPU business isn't bad. I don't think that getting into the GPU and CPU of the top 3 consoles (PS4, Xbox, Wii U) is so bad either. And why would game biases not be true if AMD's drivers and games play better on the AMD-based systems, e.g. Crysis 3? And saying that the Civ 5 benches crashed is completely stupid, because a good website like AnandTech doesn't normally disregard such things. And AMD didn't pay them off if they are bankrupt, right? Yes, it can CrossFire, because there are CrossFire connectors on the top, so maybe they assumed things would be implied for the general crowd.
althaz - Friday, March 22, 2013 - link
I think it's also worth mentioning that the 7850 is a quite excellent overclocker. At stock I think it's definitely not worth the extra $30, but once overclocking is taken into account, if you can afford the $30 you are crazy not to spend it (assuming you are comfortable with overclocking of course).
Bob Todd - Friday, March 22, 2013 - link
Yeah, I'm curious what the pricing will look like on these a few weeks after introduction. I picked up a 7850 (2GB MSI Twin Frozr) for $170 AR a few weeks ago to put in a HTPC, and I've seen it at that price again already. It will be interesting to see if the regular sales on 7850s decrease once the 7790 is out. Kudos to AMD for offering BioShock Infinite with this.
Aikouka - Friday, March 22, 2013 - link
Given we're talking about gaming cards here, I think it's worthwhile to add that only the 7800s and 7900s come with AMD's Never Settle game promotion. So, if you're interested in Tomb Raider and/or Bioshock Infinite, the 7850 may have significantly more value to you. If you're not interested in them, people have been selling the coupons on eBay for about $50-60 each.
Bob Todd - Friday, March 22, 2013 - link
There's a small paragraph in the article explaining that this card _is_ part of the Never Settle Reloaded program. It's only getting BioShock Infinite since it sits at a lower price point, but still a nice addition. The bundles are a big part of the reason I'm curious to see how the pricing shakes out. I sold a TR/BS bundle and kept about ~$50 in my pocket after fees, so I basically got a very nice 2GB 7850 for $120. You could obviously sell the BioShock code you'd get with a 7790, but if the prices for that card stay at MSRP for too long they'll have some stiff competition from 7850s on sale. Unless of course the 7850 sales dry up since it doesn't have to cover such a large swath of AMD's lineup now price wise.
GivMe1 - Friday, March 22, 2013 - link
128-bit interface is going to hurt high-res textures...
CeriseCogburn - Sunday, March 24, 2013 - link
Oh no it won't! This is AMD, man! Nothing hurts when it's AMD! AMD, yes it can!
Quizzical - Friday, March 22, 2013 - link
Your chart shows Radeon HD 6870 FP64 performance as N/A. I think it's 1/20 of FP32 performance, but I'm not sure of that. It definitely can do FP64, as otherwise, it wouldn't be able to claim OpenGL 4 compliance.
MrSpadge - Friday, March 22, 2013 - link
No, it doesn't have any HARDWARE FP64 capabilities. It's always possible to emulate this at slow performance via software, though.
Quizzical - Friday, March 22, 2013 - link
It's basically the same as what the 7770, 7790, and 7850 do, but they're not listed as N/A. The relevant question isn't whether you can do it more slowly, but how much more slowly.
MrSpadge - Tuesday, March 26, 2013 - link
No, it's not the same, the GCN cards have hardware FP64 capabilities.
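To put a rough number on "how much more slowly" for the hardware path, a minimal sketch: the SP count and clock are taken from figures quoted elsewhere in this thread, while the 1/16 FP64:FP32 rate commonly cited for these small GCN parts is an assumption, not something from the article.

```python
# Rough FP32/FP64 throughput estimate for a small GCN part like the 7790.
# 896 SPs and 1.0 GHz are figures quoted in this thread; 2 FLOPs/SP/clock
# (FMA) and the 1/16 FP64 rate are assumptions for illustration.
sps, clock_ghz, fp64_ratio = 896, 1.0, 1.0 / 16

fp32_tflops = sps * 2 * clock_ghz / 1000        # ~1.79 TFLOPS
fp64_gflops = fp32_tflops * fp64_ratio * 1000   # ~112 GFLOPS

print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_gflops:.0f} GFLOPS")
```

Slow next to a compute card, but still far ahead of software emulation on a part with no FP64 hardware at all.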
Ryan Smith - Friday, March 22, 2013 - link
Let's be clear here. 85W is not the TDP. The TDP is higher (likely on the order of 110W or so). However AMD chooses not to publish the TDP for these lower end cards, and instead the TBP.
alwayssts - Friday, March 22, 2013 - link
Yeah, I figure ~85W TBP/105W TDP because that would be smack between 7770/7850 as well as having 20% headroom (which also allows another product to have their TBP between there and 7850's max TDP, with its max TDP above it within 150W... i.e. ~120-125/150W). IIRC, 80W is the PowerTune max (TDP) of 7770, 130W for 7850. 85W is the stock operation (TBP) of 7790.
I really, really dislike how convoluted this power game has become... can you tell?!
First it was max power. Then it was nvidia stating typical power (so products were within pci-e spec) with AMD still quoting max, which made them look bad. Then we get this 'awesome' product segmentation with 7000 having TBP and max powertune TDPs to separate them, while nvidia quotes TBP and hides the fact the TDP limits for their products exist unless you deduce them from the percentage you can up the boost power.
AAAAaaaarrrrrghhhhh. I miss when the product you had could do what you wanted it to, ie before software voltage control and multiple states, as for products like this it gives the user less control and the companies a ton to create segmentation. Low-end stock products may have been less-than-stellar back in the day, but with determination you could get something out of it without some marketing stating it should fit x niche so give it y max tdp so it doesn't interfere with the market of z product.
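A quick check of the arithmetic behind that guess (the 80W and 130W PowerTune limits are the poster's recollection above, not confirmed figures):

```python
# Sanity-check the ~105 W TDP estimate above.
tbp_7790 = 85                              # W, AMD's quoted typical board power
powertune_7770, powertune_7850 = 80, 130   # W, as recalled above (unverified)

midpoint = (powertune_7770 + powertune_7850) / 2   # 105.0 W
with_headroom = tbp_7790 * 1.2                     # 102.0 W, i.e. ~20% over TBP

print(midpoint, with_headroom)   # both land right around the ~105 W guess
```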
CeriseCogburn - Friday, March 22, 2013 - link
Maybe so you couldn't blow the crap out of it then return it for another one, then another one, as "you saved money" and caused everyone else to pay 25% more since you overclock freaks would blow them up, then LIE and get the freebie replacement, over and over again.
Maybe they got sick of dealing with scam artist liars... maybe they aren't evil but the end user IS.
Spunjji - Friday, March 22, 2013 - link
Why would the design power be higher than the total board power? :/ You're correct that the figure they're quoting isn't TDP, but then you just went and made up a number.
Here are some actual power consumption measurements of a 7770:
http://www.techpowerup.com/reviews/HIS/HD_7770_iCo...
So using Anand's figures to extrapolate, you can expect this thing to be ~90W max, usually lower than that at peak, right about where AMD put it.
Spunjji - Friday, March 22, 2013 - link
...forgive my stupidity. Actual figures of the 7790 here:http://www.techpowerup.com/reviews/Sapphire/HD_779...
Depends on whether we focus on Peak / Max figures to decide whether you or I am closer to the truth. :)
Ryan Smith - Friday, March 22, 2013 - link
Typical Board Power, not Total. TBP is an average rather than a peak like TDP, which is why it's a lower number than TDP.
dbcoopernz - Friday, March 22, 2013 - link
Any details on the UVD module? Any changes?
The Asus DirectCU II might make an interesting high-power but quiet HTPC card. Any chance of a review?
Ryan Smith - Friday, March 22, 2013 - link
There are no changes that we have been made aware of.
haplo602 - Friday, March 22, 2013 - link
somebody please make this a single slot card and I am sold ... otherwise I'll wait for the 8k radeons ...
Shut up and drink - Friday, March 22, 2013 - link
Has it occurred to anyone else that this is in all probability an OEM release of the "semi-custom" silicon that will find its way into Sony's Playstation 4 in the fall?
Word has it that Sony has some form of GPU switching tech integrated into the PS4.
- apologies for the link to something other than Anand but I don't think they ran anything on the story http://www.tomshardware.com/news/sony-ps4-patent-p...
Initially I presumed this to be some "Optimus"-esque dynamic context switching power saving routine. However, the patent explicitly states, "This architecture lets a user run one or more GPUs in parallel, but only for the purpose of increasing performance, not to reduce power consumption."
Which struck me as some kind of expansion on the nebulous "hybrid CrossFire" tech that AMD has been playing with since they birthed the 3000 series / 780G IGP.
Based on AMD's previous endeavors in this area on the PC side, I would be skeptical of the benefits/merit of pairing the comparatively anemic iGPUs of Kabini with a presumably Bonaire-derived GPU.
As an aside: since SLI/CFX work by issuing frames to the next GPU available, if one GPU is substantially faster than the other(s), frames get finished out of order and the IGP/slower GPU's tardy frames simply get dropped, which may make the final rendered video stuttery/choppy.
Pairing an IGP with a disproportionately powerful discrete GPU simply does not work for realtime rendering.
It is certainly possible that, given the static nature of the console and perhaps especially the unified nature of the GDDR5 memory pool/bank, performance gains could be had.
However, my digression on the merits of the tech aside, the arithmetic thus far is:
128 + 128 = 256, and 256 + 896 = 1152 (Anand's own deduction of 1152 SPs)
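As a toy illustration of the AFR behaviour described a few paragraphs up (the per-frame render times are invented for the example, not measurements of any real IGP/dGPU pair):

```python
# Toy AFR model: frames are issued alternately to a fast GPU and a slow IGP.
render_ms = [10.0, 30.0]        # hypothetical per-frame render times (ms)
free_at = [0.0, 0.0]            # when each GPU can start its next frame

done = []                       # completion time of each frame, in issue order
for frame in range(8):
    gpu = frame % 2             # alternate-frame rendering
    start = free_at[gpu]
    free_at[gpu] = start + render_ms[gpu]
    done.append(free_at[gpu])

# Frames must still be displayed in order, so a frame can't be shown
# before every earlier frame has been shown.
shown, last = [], 0.0
for t in done:
    last = max(last, t)
    shown.append(last)

gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print(gaps)   # [20.0, 0.0, 30.0, 0.0, 30.0, 0.0, 30.0] -> very uneven pacing
```

The alternating zero-length and long gaps are exactly the "tardy frames get dropped / output looks choppy" behaviour described above.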
Shut up and drink - Friday, March 22, 2013 - link
I pushed submit by mistake... damn... oh well... my last point of arithmetic was simply that one fully enabled 4-core Kabini, I'm suspecting, would have a 128-shader IGP. Factor in the much-ballyhooed 8-core CPU in the PS4 and we would have two Kabinis (128+128=256) + a Bonaire-derived 896sp GPU, all on some kind of custom MCM-style packaging, a "semi-custom APU" (rumor had it that the majority of Sony's R&D contributions were in the stacking/packaging dept.)
Anyone concur?
Shut up and drink - Friday, March 22, 2013 - link
...which jibes with Anand's own piece that ran on the console's unveiling: "Sony claims the GPU features 18 compute units, which if this is GCN based we'd be looking at 1152 SPs and 72 texture units"
http://www.anandtech.com/show/6770/sony-announces-...
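The numbers do line up, at least on paper; a minimal check (the 64 SPs per CU is the standard GCN figure, the 14 CUs is the 7790 count mentioned just below, and the two-Kabini split is the poster's speculation):

```python
# GCN groups 64 stream processors into each compute unit.
sps_per_cu = 64

ps4_cus = 18
print(ps4_cus * sps_per_cu)            # 1152 SPs, matching Anand's figure

bonaire_sps = 14 * sps_per_cu          # the 7790's 14 CUs -> 896 SPs
two_kabini_igps = 2 * 128              # speculative: two 128-SP Kabini IGPs
print(bonaire_sps + two_kabini_igps)   # 1152 again, hence the speculation above
```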
A5 - Friday, March 22, 2013 - link
Yeah, once this came in at 14 CUs with minor architecture changes, it seemed like a likely scenario to me.
Obviously it isn't going to give you PS4 performance on ports with only 1GB of memory, though.
crimson117 - Friday, March 22, 2013 - link
Good thought, but I sure hope Sony doesn't hamstring its PS4 with a 128-bit memory bus!
silverblue - Friday, March 22, 2013 - link
Not at 176GB/s, unless they're clocking that GDDR5 VERY high. The 7790 is good for 96GB/s.
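The bus-width arithmetic backs this up; a quick back-of-the-envelope check using the same bandwidth relationship as earlier in the thread:

```python
# Effective memory data rate needed to hit 176 GB/s at a given bus width.
def required_rate_gt_s(target_gb_s, bus_width_bits):
    return target_gb_s / (bus_width_bits / 8)

print(required_rate_gt_s(176, 128))  # 11.0 GT/s on 128-bit: implausible for GDDR5
print(required_rate_gt_s(176, 256))  # 5.5 GT/s on 256-bit: perfectly ordinary
```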
Shut up and drink - Friday, March 22, 2013 - link
Sony's previous two consoles (PS2 and PS3) have traditionally favored high-frequency/bandwidth proprietary interconnects between components (see Cell's EIB), so this is likely where the "secret sauce" Sony R&D came in, thus facilitating the 176GB/s.
AMD was quoted (can't find the link) as saying that the Sony engineering would be excluded if/when they release a PC variant of said APU.
Spunjji - Friday, March 22, 2013 - link
Very, very interesting indeed. It tallies well with the numbers. There was me thinking they had bolted Pitcairn onto the side of their CPUs but this combo might make more sense (and yet also less sense).
lopri - Friday, March 22, 2013 - link
Totally agree with memory size. At this performance and price level, 2 GB should be the default.
lopri - Friday, March 22, 2013 - link
Then again, it would be strange if AMD doesn't release "larger" cards based on this updated GCN core.
silverblue - Friday, March 22, 2013 - link
Perhaps the reason for the lack of a 2GB version would be that it would be too close to the 7850...?
CeriseCogburn - Sunday, March 24, 2013 - link
NO, it SLOWS THE CARD DOWN with its crappy amd core...
My apologies if you're an epileptic.
Tams80 - Monday, April 1, 2013 - link
I don't understand why you haven't been banned yet. You add nothing to the discussion with your posts other than vitriol. Please either be civil and logical, or go away.
CeriseCogburn - Sunday, March 24, 2013 - link
AMD always releases 1GB models and 2GB models so the amd fanboys can quote the 1GB model cheapo PowerColor low-end price, claim it wins price/perf, then go on raging about how the 2GB model covers the high end...
ROFL - That's what they do - they even do it when comparing to a 2GB nVidia, suddenly forgetting amd makes crapster 1GB they swore off years ago, even though that's the screamer amd fanboy price "they pay" because "it's such a deal! Man!"
Brainfart Bart they should be called.
R3MF - Monday, March 25, 2013 - link
Actually, they didn't with the 7770.
Your constant whining is about as welcome as a bout of herpes, scram.
Oxford Guy - Tuesday, March 26, 2013 - link
It's absolutely ridiculous to release a 1 GB card today when even games like Skyrim need 2 GB at 1080p.
Arnulf - Friday, March 22, 2013 - link
Regarding noise measurement: the weighting used for absolute measurements may be the "A" scale, but a card is not a certain number of dB(A) louder than another card; it is a certain number of dB louder (3 dB more corresponds to roughly double the sound intensity, measured under the same conditions and with the same weighting), since the weighting used to take the absolute measurements is the same.
This refers to all your statements along the lines of "... but over 3 dB(A) louder ..." etc.
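For reference, the relationship being invoked here, as a small runnable check (standard acoustics, included only to make the 3 dB point concrete):

```python
import math

# Level difference in decibels between two sound intensities I2 and I1:
#   delta_dB = 10 * log10(I2 / I1)
def delta_db(i2, i1):
    return 10 * math.log10(i2 / i1)

print(delta_db(2, 1))    # ~3.01 dB -> +3 dB is double the sound intensity
print(delta_db(10, 1))   # 10.0 dB  -> roughly "twice as loud" perceptually
```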
warezme - Friday, March 22, 2013 - link
Middle of the road filler products are so boring. They are usually a mishmash of memory from here, GPU features from there, all so confusing and boring. Just release a full line of new core and memory features, age down your older products according to how they perform compared to the new and be done with it.
piroroadkill - Friday, March 22, 2013 - link
I disagree. This is a compelling midrange card.
Spunjji - Friday, March 22, 2013 - link
Good for you. Many of us disagree and this card has an obvious place in their line-up.
CeriseCogburn - Sunday, March 24, 2013 - link
Oh, did you take a survey, or are you speaking for all amd fanboys just because?
CeriseCogburn - Sunday, March 24, 2013 - link
Hey, at least it isn't a THREE GENERATION IN A ROW CLONE!
HAHAHAHA
Like the 5770.
Shadowmaster625 - Friday, March 22, 2013 - link
Is there any actual evidence to support the conclusion that 1GB is not enough for 1080p? Given the choice between more compute units or more RAM, I would take more compute units.
Spunjji - Friday, March 22, 2013 - link
Right now I believe there would not be. What people are anticipating is an inflation in the size of game assets spurred on by the next generation of consoles. Some people want to keep these things for a few years so it's a legitimate concern for a change!
CeriseCogburn - Sunday, March 24, 2013 - link
LOLOLOLOL
Wow, how fanboys change from just prior releases with 2GB or 3GB amd crapcards - then it was an ABSOLUTE WIN according to you - necessary and "future proof!", especially for "skyrim mods!"
LOL
You're a good laugh.
ThomasS31 - Friday, March 22, 2013 - link
I don't think this is the Sea Island generation yet...
Ryan Smith - Friday, March 22, 2013 - link
This is Sea Islands. Oland and Bonaire are both part of the Sea Island family.
CeriseCogburn - Sunday, March 24, 2013 - link
Bonaire sounds (like Bel Air) all stuffy and prim and proper - too bad amd fanboys aren't classy.
At least it's not pitcairn, the in-the-pit card.
KnightRAF - Monday, March 25, 2013 - link
Congratulations Cerise, between this article and the HTC One article you have succeeded in injecting so much pointless crap that it is no longer worth the effort to sift your crap out to read the actual comments on the article.
extide - Friday, March 22, 2013 - link
PLEASE!!! Add folding@home benchmarks to your tests, please please!!
Thank You!
JarredWalton - Friday, March 22, 2013 - link
Might want to read a little better before posting:http://www.anandtech.com/show/6837/14
extide - Tuesday, March 26, 2013 - link
Sorry, doh, I feel dumb. I quickly scanned the compute page and didn't see it. Thank you VERY MUCH for including these!!
MrSpadge - Friday, March 22, 2013 - link
It's good to see clock & voltage states become more fine-grained and their choice smarter. Ultimately this is how I'd like a GPU to work: set targets and limits for power use, temperature and noise... and then crank it up as far as it goes. Vary chips by different amounts of execution units, not frequencies.
This includes simple user settings for lowering power consumption (call it the "green mode" or whatever), if people want to, which would automatically choose lower voltages to increase efficiency.
And of course something similar to nVidia's frame rate target: if performance is fine now, save power. And save some thermal headroom in case it's needed soon. Make smart use of the power budget. It's nice to see AMD making some progress!
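A minimal sketch of the kind of governor being described; this is purely illustrative and is not how AMD's PowerTune or NVIDIA's boost is actually implemented:

```python
# Hypothetical boost governor: raise the clock until a power, temperature,
# noise, or frame-rate target would be exceeded.
def pick_clock(clocks, fps_target, limits, measure):
    best = min(clocks)
    for mhz in sorted(clocks):
        power, temp, noise, fps = measure(mhz)   # telemetry sampled at this state
        if power > limits["power_w"] or temp > limits["temp_c"] or noise > limits["dba"]:
            break                                # over budget: keep the previous state
        best = mhz
        if fps >= fps_target:                    # target hit: bank the remaining headroom
            break
    return best

# Toy model of a card whose power, heat, noise and frame rate scale with clock.
fake_card = lambda mhz: (mhz * 0.08, 30 + mhz * 0.04, 28 + mhz * 0.01, mhz * 0.06)
limits = {"power_w": 85, "temp_c": 75, "dba": 40}
print(pick_clock(range(300, 1101, 100), 60, limits, fake_card))   # -> 1000
```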
Termie - Friday, March 22, 2013 - link
Ryan, just FYI, you're using Catalyst 13.2 beta 7 with the other cards, not Catalyst 13.7 betas as indicated in the text on "The Test" page.
Ryan Smith - Friday, March 22, 2013 - link
Whoops. Thanks.
bebimbap - Friday, March 22, 2013 - link
Would it be wrong of me to wait for the 7790 GHz Edition?
SithSolo1 - Friday, March 22, 2013 - link
There will be no GE of this card. Quote - "The Radeon HD 7790 runs at 1GHz, but is not going to be called a "GHz Edition" anymore. AMD feels that they have made the point about having 1GHz edition GPUs in the market in 2012, and did not feel a need to label this new one a GHz Edition. Therefore, it will just be known as Radeon HD 7790." - Brent @ HardOCP, Asus DCUII 7790 review
SithSolo1 - Friday, March 22, 2013 - link
Before someone gets confused, I'm not Brent. I just happened to read the review a bit ago and remembered that part about the GHz Edition.
Hardcore69 - Friday, March 22, 2013 - link
Don't see a point. As a PC gamer I want it all, not some laughably compromised card - just over 30FPS (if that) at 1080p with the settings turned up? What's the point, just buy a console. I'll stick with my 680.
cyan1d3 - Friday, March 22, 2013 - link
While I agree with your sentiment, this card was not designed with us in mind. There is a large portion of people on budgets who can't go ahead and blow ~$450 on a graphics card. There is also a large portion of people who see no need to play games at Ultra with high AA, etc. This card is a great line-up filler. I can see a use for this in a variety of budget gaming systems.
R3MF - Monday, March 25, 2013 - link
Agreed. I spent £400 on an MSI 7970 Lightning, but not everyone is that stupid! :D
evonitzer - Friday, March 22, 2013 - link
Well with a name like Hardcore69, of course you would want the best of the best of the best (with honors). But I'm a casual PC gamer, so this card looks pretty great to me. My 4870 is getting a little (ok, very) long in the tooth. Why would I buy a console and pay full price for games, only use a controller, and have to pay a monthly fee (x360) just to play casually when I could pick up a $150 card and drop it into my computer?
CeriseCogburn - Sunday, March 24, 2013 - link
His point wasn't your pathetic budget, his point was the graphics suck like a console. Just keep the 4870, it sucks too.
I am as mad as hell - Friday, March 22, 2013 - link
The day will come, in the not-so-distant future, when 700W-or-higher PSU requirements for high-end gaming machines come to an end (thankfully). The whole system will not consume more than 100W and will fit inside a mATX case or smaller (and there will be no need for Godzilla-sized cooling fans anymore either).
CeriseCogburn - Sunday, March 24, 2013 - link
It's called Haswell.
Death666Angel - Friday, March 22, 2013 - link
"pulling 7W more than the 7770, a hair more than the 5W difference in AMD’s TBP"That 5W is not at the wall though. Factoring in rounding PSU efficiencies, it's very possible that the cards are only drawing 5W more. :)
"The Sapphire card, despite being overclocked, draws 6W less than our reference 7790."
Seeing how the Sapphire runs cooler in FurMark, that might explain a watt or two of reduced power draw; coupled with the efficiency of the PSU, it might explain three or four even. :)
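To put rough numbers on it (illustrative only, assuming a ~85% efficient PSU, which is my assumption rather than a figure from the review): a 5W increase at the card shows up as roughly 5 / 0.85 ≈ 5.9W at the wall, so a 7W delta measured at the wall is entirely consistent with the cards themselves being only about 5-6W apart.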
pandemonium - Saturday, March 23, 2013 - link
"NVIDIA has for a long time set the bar on efficiency, but with the 7790 it looks like AMD will finally edge out NVIDIA."What is your definition of a long time? As far as efficiency standards, I consider AMD to be better for the end result when looking at the full definition and application of the word. See the spreadsheet I created here about 16 months ago to understand what I mean: http://forums.anandtech.com/showthread.php?t=21507...
silverblue - Saturday, March 23, 2013 - link
You just called Ryan a "dummy", did you, without even checking the statement further down, which reads:
"For anyone looking to pick up a 7790 today, this is being launched ahead of actual product availability (likely to coincide with GDC 2013 next week). Cards will start showing up in the market on April 2nd, which is about a week and a half from now."
If YOU had read the article, blah blah dumb idiot blah blah. As you've not replied to anybody in particular, your mistargeted rants could be construed as being directed toward the staff themselves, so keep it up and you won't HAVE to worry about what AT is reviewing in future.
Bottom line - it's faster than the 650 Ti, it's looking to be more efficient than the 650 Ti, and oh look, both have 1GB of GDDR5 on a 128-bit memory interface, which you seem to have forgotten when you leapt down AMD's throat about the 7790, and when you went on your childish tirade about the 5770's 128-bit memory interface earlier.
As far as I recall, Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?
I'm not sure if I've said this before, and apologies to everybody else if I have, but I'm done with you, full stop. I can only hope everybody else here decides that not feeding the ignorance you perpetuate on every single AMD article would save them time they could be devoting to something far less boring instead.
To the staff - is there anything you can do to introduce an Ignore List? Thanks in advance for your response.
silverblue - Saturday, March 23, 2013 - link
A note about threading - it doesn't look to be stepping in consistently, so sometimes it's a little difficult to see who replied to whom.
CeriseCogburn - Sunday, March 24, 2013 - link
You got everything wrong again; you failed to read the article, not I, and you failed to read my reply addressing half your idiotic non-points, so you're the non-reader, fool. Now I have to correct you multiple times. And you're a waste.
The 650 Ti overclocks, and it's only faster in a few amd-favoring games, which are here, of course.
Strike one for tardboy.
The 650 Ti runs fine OC'd too, which it does well: "We pushed Gigabyte's GeForce GTX 650 as far as it'd go and achieved a maximum core overclock of 1125 MHz, with the GDDR5 memory operating at 1600. All it took was a 1.15 V GPU voltage."
http://www.tomshardware.com/reviews/geforce-gtx-65...
The 128-bit bus - REPAYMENT for you FOOLS SQUEALING prior, what's so hard to understand?
Did you forget all your WHINING?
Did you forget your backing up the FAILED theorists with the nVidia dual-speed memory?
ROFL
You're up to strike 4 already.
" Ryan didn't mention anything about when Titan was available to buy, only that it had launched. Pretty much blows your theory of Ryan hating NVIDIA out of the water, doesn't it?"
NO, so why would it be mentioned if he didn't want anyone to buy it ? Why mention it, that would key in to save for release date, right ?
Instead we get this gem first off in BOLD to start the article: " Who’s Titan For, Anyhow? "
Guess that just crushed your idiot backwards bullhockey forever.
For all you know, Ryan mentioned the release date anyway.
You're not "done with me"; you get everything WRONG, so you'll be opening your big fat piehole forever. That's how people like you do it. Idiot amd fanboys, all the same.
Also a beggar child for extra "control", since you "can't be an adult and control yourself" - please give me an ignore button! I'm a crybaby who can't handle it!
ROFL
philipma1957 - Sunday, March 24, 2013 - link
One question: does your 650 Ti pay for itself? This amd card will pay for itself via bitcoin, even with the ASICs, especially if you heat your home with electric heat. Nuff said.
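As a back-of-the-envelope illustration only (every figure below is a placeholder, not a measured mining rate or price): payback_days ≈ card_price / (daily_coin_value - daily_electricity_cost). With a hypothetical $149 card earning $1.00/day in mined coins and drawing 85W around the clock at $0.12/kWh (about $0.24/day), that works out to roughly 149 / 0.76 ≈ 196 days - and if the waste heat offsets electric heating, the effective electricity cost falls toward zero and the payback shortens accordingly.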
Rajan7667 - Sunday, March 24, 2013 - link
@form @LinusTech This is new New app for intel lovers. http://www.intel.com/content/www/us/en/gamers/vip-... …
colonelclaw - Sunday, March 24, 2013 - link
To Ryan and staff
As a long-time admirer of AnandTech, I always enjoy reading pretty much every article you post, and have immense respect for all your writers.
However, I am now utterly fed up with the direction the comment discussions have taken. The general pattern is they start out as debates and end up as pretty nasty personal attacks that have nothing to do with the articles. You may say 'don't read the comments', to which I reply that they used to be an extension of the articles themselves, and were always a source of valuable information.
It pains me to say this, but if you don't start removing the trolls I will no longer come to this site at all, and I would guess I am not alone in having this opinion.
haze4peace - Sunday, March 24, 2013 - link
I agree 100% and actually sent off a few emails to the staff earlier in the day. I urge others to do so as well so we can put this problem behind us.
KnightRAF - Monday, March 25, 2013 - link
I agree the trolls are out of control and need some pruning back. They have massively lessened my enjoyment of the site the last couple of times I've visited.
Parhel - Monday, March 25, 2013 - link
Well said, and thanks. I no longer visit Dailytech for the same reasons. I enjoy reading comments, since they can offer other perspectives from like-minded people, but unmoderated is worse than nothing at all. This used to be my favorite tech site, but the comments section here has slowly been pushing me to avoid it most of the time.
medi01 - Monday, March 25, 2013 - link
Suddenly Fermi is forgotten and it's only now that AMD will edge out nVidia on power efficiency.
silverblue - Monday, March 25, 2013 - link
The 7790 reminds me of the 4770. Sure, that was on a new process node, but it's a late addition to the line designed to take advantage of tweaks, process improvements, etc. There may be a lot of transistors in a GCN design, but I couldn't help feeling that there were power savings to be had. For this reason, I'd hope that their next flagship doesn't exceed the 7970GE's power draw whilst providing a decent performance boost.
Lucian2244 - Tuesday, March 26, 2013 - link
Good article, very detailed. I think NVidia is replying to this with the new 650 Ti Boost.
Oxford Guy - Tuesday, March 26, 2013 - link
1 GB VRAM is ridiculous, especially for a $150 product.
ericore - Thursday, March 28, 2013 - link
For those who want the most power in the smallest package and power drain, look no further than the Radeon 7790. The only disappointment was the heat factor, but more or less the same performance as the 7850 at half the power; that's great. Also, I don't mind that AMD went the 6GHz VRAM route, because now there is even more reason to get 2GB, which is especially needed if you apply dozens or hundreds of mods to your games. Also, it's the 128-bit interface that kept the power low, so despite everyone's cussing, AMD made the right choices. I have a GTX 460, which easily uses at least 200 watts. This 7790 is almost twice as fast and uses about 40% of the power. The pricing is acceptable; if they were to include 2GB by default, then why bother with the 7850? They still want people to buy that one.
slickr - Tuesday, April 9, 2013 - link
Hey Anand, can you guys please do a video quality test? I mean I haven't seen any such test on any website for over 3 years. So please, can you do a video quality test in movies and games and please also use low quality video as well, not just top of the line 1080p type videos that would look amazing even on a GeForce 3.