97 Comments

  • YB1064 - Thursday, December 12, 2019 - link

    Poor man's NVidia indeed.
  • flashbacck - Thursday, December 12, 2019 - link

    What do you mean by that?
  • Valantar - Thursday, December 12, 2019 - link

    Apparently the same performance at the same price with a better encode/decode block and better idle power usage but slightly worse gaming power usage and buggy OC controls equals "poor man's Nvidia". Who knew?
  • assyn - Thursday, December 12, 2019 - link

    Actually, NVENC is the better encode/decode block because it has wider format support and can decode 8K video, while AMD's is only capable of 4K.
  • Valantar - Thursday, December 12, 2019 - link

    The 1650 and 1650S have a cut-down NVENC block lacking pretty much every single improvement made with the Turing generation. Not all NVENC is equal.
  • Ryan Smith - Thursday, December 12, 2019 - link

    1650S has a full NVENC block. It's based on TU116, not TU117.
  • jgraham11 - Friday, December 13, 2019 - link

    Seems like this will be similar to the GCN cards of the past (RX 5xx series): at the beginning it's OK, then over the years AMD's "fine wine" kicks in, with incremental driver releases improving overall performance. Do realize they are still using the GCN instruction set, but RDNA is a new architecture, one that will be in both the Xbox and PlayStation systems.
  • WaltC - Saturday, December 21, 2019 - link

    I feel as though AMD has hobbled this card. They put GDDR6 VRAM onboard, more or less doubling the per-pin bandwidth of GDDR5, but then they took it all back with the 128-bit bus...;) I think they missed an opportunity here--but I'm not privy to the manufacturing particulars for the product--so maybe not. Anyway, it seems like they might have made the 5500 XT an 8GB GDDR6, 256-bit bus GPU for $279, then made the 5500 a 4GB, 128-bit bus card for ~$129 or so. The SPs by themselves are cut down enough that even a 256-bit bus GPU like I've dreamed up here is going to be sludge next to a 5700--or that should certainly be the case, I would think. The card almost seems "overly hobbled" to my way of thinking. I think a 256-bit 5500 XT with 8GB of GDDR6 might've been a smash seller for AMD at a ~$279 MSRP, with the 5700 still clobbering it simply on its much higher SP count. I just see a lot of potential here that seems overlooked--unless they have plans for a 5600 XT, maybe, early in Q1 2020....;)
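
    (For rough context, assuming the reference memory specs -- 14Gbps GDDR6 on the 5500 XT's 128-bit bus versus 8Gbps GDDR5 on the 580's 256-bit bus -- the back-of-the-envelope peak bandwidth works out like this:)

        # Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte
        rx5500xt_bw = 14 * 128 / 8   # 224 GB/s (GDDR6, 128-bit)
        rx580_bw    = 8 * 256 / 8    # 256 GB/s (GDDR5, 256-bit)
        # The faster GDDR6 doesn't quite make up for the halved bus width.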
  • WaltC - Sunday, December 22, 2019 - link

    Will someone please set me straight on whether this is an 8-lane PCIe card? I'm reading this stuff you-know-where but it doesn't make any sense at all to me. It's claimed that, accordingly, the 5500 XT runs much faster on PCIe 4.0 than on PCIe 3.0. Eh?

    https://www.reddit.com/r/Amd/comments/edo4u8/rx_55...

    The RX 480/580/590 are all PCIe x16 GPUs with 256-bit buses running GDDR5. Is the article linked here factual? Thanks to whoever can answer....;) It doesn't sound probable to me--but if it's true, then AMD has hog-tied this card six ways to Sunday--trussed up tighter than a Christmas piglet...;)
  • WaltC - Sunday, December 22, 2019 - link

    OK, I saw the one line in the review I missed the first time about "I suspect 8 PCIe lanes"--just very strange--I don't understand this product at all! Perhaps the full incarnation of the GPU is going into the PS5/Xbox 2; it's just weird. To have to hobble it that much, the GPU must be very robust.
  • pcgpus - Friday, February 14, 2020 - link

    A really good comparison of the 5500 XT and its opponents is here:

    https://warmbit.blogspot.com/2020/01/rx5500xt-vs-g...
  • Smell This - Thursday, December 12, 2019 - link


    { snicker }

    So ...
    A cut-down GTX 1660 (TU116 - 284 mm2) with 6GB GDDR5 turns into a GTX 1650 **Super** -- now on TU116 (284 mm2) rather than the regular GTX 1650's TU117 (200 mm2) -- with 4GB GDDR6??

    1) You've been "Played"; and
    2) nVidia is going the wrong way.

    Good luck with that.
  • silverblue - Thursday, December 12, 2019 - link

    "Compared to AMD’s previous generation Polaris-based RX 500 series cards, the 1080p-focused RX 5500 XT delivers better performance"
    "Competitive performance, meanwhile, is a bit of a trickier subject. As the replacements to the RX 570/580/590 within AMD’s stack, the RX 5500 XT almost always beats AMD’s older cards"

    The elephant in the room here is the RX 590, at least in terms of gaming. In most titles, aside from those on page 6 as well as Total War in Ultra at 1080p, there doesn't appear to be enough of a gap between the 580 and the 5500 XT to really set the latter apart. If we assume a somewhat standard RX 580 clocked at 1,340MHz base/1,411MHz boost, the RX 590 could be 10% faster than this and would most likely beat the 5500 XT in a good number of cases. Additionally, Polaris looks like it has better 99th percentile performance.
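
    (As a rough sanity check on that 10% figure, assuming the RX 590's reference 1545MHz boost clock; back-of-the-envelope arithmetic, not a benchmark:)

        rx580_boost_mhz = 1411                          # the "somewhat standard" RX 580 above
        rx590_boost_mhz = 1545                          # AMD reference boost for the RX 590
        gain = rx590_boost_mhz / rx580_boost_mhz - 1    # ~0.095, i.e. roughly 10% higher boost clock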
  • drexnx - Thursday, December 12, 2019 - link

    The 590 does look like it'll perform better, but it comes at the cost of about 100W more power - for some people this might not be an issue, for others it might be.
  • PeachNCream - Thursday, December 12, 2019 - link

    If you're building a new PC from scratch or have a lower-wattage PSU in a system that is potentially getting an upgrade, and the choice is between the RX 590 and the 5500 XT, the additional money spent on power delivery may tilt the cost-versus-benefit situation in favor of the 5500 XT. That 100W difference is significant enough to possibly cause that sort of problem.
  • silverblue - Thursday, December 12, 2019 - link

    Agreed with both of you.
  • RSAUser - Friday, December 13, 2019 - link

    Wattage isn't really an issue; a normal 450W PSU will support the entire system. I'd still take the lower-wattage part, as less heat in the case means less fan spin.
  • FreckledTrout - Friday, December 13, 2019 - link

    With a typical white-box 300W PSU, though, you're left with only one choice.
  • HarryVoyager - Thursday, December 12, 2019 - link

    Honestly, right now the real elephant is the used 580 8GB cards currently flooding the market. At $100, they're about equivalent to the 5500 XT, and have the 8GB of video memory the author is worried about. At this performance range, it's very hard to recommend someone spend $70-100+ more for pretty much the same real performance.

    The 5500 XT replaces the 580 at the low end, but doesn't really offer much that seems compelling over it.

    I am hoping, however, that AMD can flesh out their competition next year with the Navi 2 cards. I'm seeing this as more of a Zen 1 launch. They're back in the game, and have a competitive foundation, and now they need to get it up to fighting trim.
  • Kangal - Friday, December 20, 2019 - link

    Agreed.
    I'm excited about RDNA, but not completely sold on the new cards. I think the product lineup is what's most disappointing. For instance, if I wanted a Low Profile card, then I would have to go with Nvidia. If I wanted top performance, again Nvidia. And if I wanted the mid-high segment, well, Nvidia's new Super range is quite competitive, and then there are the older high-end Pascal cards. And if I wanted the low-mid segment, well, now it's competing again with Nvidia's Super cards, older midrange Pascal cards, and now even older AMD Polaris cards.

    The product line is very important, as is marketing/sales, and these are both things that Nvidia absolutely gets right and dominates. After a decent 5 years, the new AMD is adapting, but they're not quite there yet. For instance, remember the launch of Pascal? Nvidia launched a high-end card and a slightly cut-down version, shortly after they released their mid-range version, then their low-power variants. After a year, they introduced their highest-end version and made slight tweaks to improve the performance and efficiency of the entire product line. If AMD were able to imitate that, they would've released the RX 5800 and a cut-down RX 5700. Then a month later it would be the RX 5600 and RX 5500. Then another month later it would be the RX 5400 (Low Profile) release. Then six months later all the cards would've gotten a slight refresh, maybe tagging a "T" suffix at the end. And they would've also released the RX 5900T as their highest-end product.

    So I do NOT see RDNA like Zen 1 at all; that was market-disruptive, and it forced Intel to compete. Whilst Zen 1.1 put Intel on the back foot, Zen 2 eventually annihilated them. If anything, the RX 480 was Zen 1: it kept/forced Nvidia to be competitive with their GTX 1060 cards. The RX 5500 XT is basically Zen 1.1, but it's still trying to keep up with Nvidia. For a Zen 2-like scenario, AMD's next card needs to cost less, AND perform much faster, AND use less power.

    I don't see that happening, as Nvidia improves their architecture, drivers, and ray-tracing, and moves onto the 7nm node. So when that happens, AMD is going to be on the back foot again, like they have been for a long time. For AMD to remain competitive they really need to release RDNA2 sooner rather than later, and have the entire product line ready to sell. Let's stay optimistic, and assume that is going to happen with more R&D thanks to the extra revenue AMD receives from consoles, CPUs, servers, and the stock market : )
  • Korguz - Friday, December 20, 2019 - link

    and nvidia will price most of their cards out of reach of most people.. like a large chunk of them are now.... do they need to charge that ?? probably not.. but they do... cause they can.. just like intel before zen...
  • Kangal - Friday, December 20, 2019 - link

    Not quite.
    Before, in the early 2000-2005 period, the Internet was not as accessible and prevalent as it is today. And to add to that, when it came to "perception", things like marketing were very effective. So despite AMD producing some of the best CPUs and GPUs at the time, they weren't winning. Intel and Nvidia had the mindshare, thanks to their large marketing departments.

    Now?
    There's Facebook and YouTube. Information is much more accessible; anyone can find more details about a product from more trusted third-party professionals. And on top of that, the past few years have changed the public's mindshare about AMD. They're now seen as the better CPU company, so that trickles down to the perception of their GPUs as well. It is no longer a "poor man's Nvidia".

    So what am I saying?
    The more things change, the more they stay the same. The market will always have winners and losers. However, this time AMD is under new management and heading in a new direction, and they have seized a real opportunity to turn things around. That might mean AMD only releasing a few cards, and instead focusing their attention on the things that make money: servers and consoles. I'm fine with that, and if things kind of sour, well, it was the (sheep) public's fault for allowing the monopoly to arise by not being more critical and skeptical (think pre-1950s attitude). However, I should also add that just because there is a monopoly does not mean the market isn't healthy or competitive or that it's bad for the consumer; it is likely to be the case, but not a certainty.
  • Korguz - Friday, December 20, 2019 - link

    nvidia had the mindshare.. but intel has its bribes and threats....

    either way.. unless amd can make a zen2 equivalent for radeon, nvidia will still overcharge and price some of their cards out of reach of most people, like they already have done
  • Gemuk - Thursday, December 12, 2019 - link

    I'd argue that it wasn't AMD selling the RX 570 and RX 580 for cheap that distorted the market, rather it was consumers' acceptance of the higher pricing across the board. The price/perf needle has barely moved in three years. It is the new norm. We're never going back to those days, are we?
  • Yojimbo - Thursday, December 12, 2019 - link

    Price/performance has moved up since 3 years ago. 3 years ago a 1060 cost about what a 1660 Ti costs now, and a 1660 Ti is about 36% faster than a 1060. You are getting an over 30% increase in performance per price, even if you throw out what happened to 1060 prices because of the crypto craze and the spike in DRAM prices. The gain in performance per dollar from the 960 to the 1060 was about 44%. So it's less this generation, but I don't think the difference qualifies as "barely moved up", and you can't take one example and call it a trend. As for the RTX cards, of course with NVIDIA taking up die space on new features that don't go into price/performance calculations, there is going to be a drag on the numbers that get spit out.

    AMD were selling the 570 and 580 for cheap because they had a bunch of them in inventory from the crypto hangover and they needed to get as much revenue for them as they could. If it took too long they would lose more value and they'd have to delay the 5500.
  • Alistair - Thursday, December 12, 2019 - link

    You mean the 1660 Super is the same price as the 1060; the 1660 Ti is still more expensive than the old 1060 from years ago.
  • jabbadap - Thursday, December 12, 2019 - link

    The GTX 1060 FE 6GB MSRP was $299, the same as the GTX 1660 Ti. While the GTX 1060 6GB had an "MSRP" of $250, finding an AIB SKU priced as such was next to impossible. The norm was $299, much like the GTX 1660 Tis.

    The 1660 Super is $229, which is lower than any official price point Nvidia gave to GTX 1060 6GB cards.
  • HardwareDufus - Thursday, December 12, 2019 - link

    Had one of my 24" 1920x1200 monitors fail the other day, and the other has a few lines. Therefore, I purchased 2 32" 4K monitors. Should arrive in days.

    Thinking about purchasing a discrete video card, as my current i7-3770K with its IGP won't drive these new monitors at 4K resolution.

    I don't game, so I would imagine I'd receive satisfactory performance with an inexpensive 5500 XT card driving both these monitors at 4K for really good size/readability of text?

    Comments?
  • haukionkannel - Thursday, December 12, 2019 - link

    For that usage... sure. But no gaming unless you use 50% scaling. With 50% scaling and AMD's sharpening, even gaming is good.
  • Zoomer - Saturday, December 21, 2019 - link

    Don't need sharpening with 50% scaling. Pixels will just be 4 times as big and map perfectly.
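
    (A minimal sketch of why the mapping is exact at 50% on a 4K panel: the render resolution is 1920x1080, so each rendered pixel covers a clean 2x2 block of panel pixels. The NumPy snippet is purely illustrative:)

        import numpy as np

        frame = np.zeros((1080, 1920, 3), dtype=np.uint8)    # rendered at 50% of 3840x2160
        panel = frame.repeat(2, axis=0).repeat(2, axis=1)    # 2160x3840: each pixel fills a 2x2 block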
  • skizlock - Thursday, December 12, 2019 - link

    Should be fine.
  • TheSkullCaveIsADarkPlace - Thursday, December 12, 2019 - link

    The 5500XT should be alright. But you didn't say what you are doing, you just said what you are not doing (no gaming). So I assume you care only about 2D, in which case you could probably select an even more inexpensive GPU that does 4K/60Hz. But then again, it's you who is calling the shots on the things you want to purchase. I am not daring to step further into the line of fire there... ;-)
  • PixyMisa - Friday, December 13, 2019 - link

    I have a 580 driving two 4K monitors for work, and it's great.
  • Ranguvar - Thursday, December 12, 2019 - link

    Hey Ryan, thank you for the excellent reviews...
    July 7th, you posted on the 5700 (XT) review that you had "15 pages of notes" on deeper RDNA details that you'd be posting later.
    Is this ever coming to fruition? Holding out hope!
    Cheers.
  • TEAMSWITCHER - Thursday, December 12, 2019 - link

    I seriously question your motives in this review... Excluding all 1440p results, but then including the Radeon 5700 (a card priced well above this range) just to ensure that AMD has the top position in all the graphs. And your conclusion page makes ZERO reference to the GTX 1660 Super card, which can be had for a very good price - $230. I'm seeing this trend all over... tilting reviews to benefit AMD. I get it... everyone loves an underdog... but these GPUs are just dogs.
  • Ryan Smith - Thursday, December 12, 2019 - link

    The Radeon 5700 is the next card up in AMD's current-generation stack. It's important to include it to show where the 5500 XT ends, and where the next card picks up.
  • Fataliity - Thursday, December 12, 2019 - link

    Ryan - Have you tried AMD's new performance boost on the 5500 XT to see how it performs? The resolution scaling when moving the screen, etc.? I imagine the algorithm was built for this card, considering its release and the driver release were in sync. So I'm just wondering how much it helps on a budget card?

    Thank you!
  • Ryan Smith - Thursday, December 12, 2019 - link

    I've tried it. I don't care for it. But I'll give AMD tons of credit for trying new ideas.

    But I'll save that for once I can write up something proper.
  • Zoomer - Saturday, December 21, 2019 - link

    Overclocking?
  • Hrel - Thursday, December 12, 2019 - link

    Nah, it's pretty BS you didn't include 4K results. The ONLY reason to buy the 8GB version is 4K, and now I gotta go elsewhere to find out how the GPU uses that extra VRAM. Very dumb not to include 4K.
  • Valantar - Thursday, December 12, 2019 - link

    What? This class of GPU is in no way whatsoever capable of gaming at 4K. Why include a bunch of tests where the results are in the 5-20fps range? That isn't useful to anyone.
  • Zoomer - Saturday, December 21, 2019 - link

    AT used to include it. I just ignored it for a card of this class; probably others did as well.
  • Ravynmagi_ - Thursday, December 12, 2019 - link

    I lean more Nvidia too, and I didn't get that impression from the article. I felt it was fair to both AMD and Nvidia in its comparison of the performance and facts. I wasn't bothered by where they decided to cut off their charts.
  • FreckledTrout - Friday, December 13, 2019 - link

    Same here. I don't need to see numbers elucidating how bad these low end cards are at 4k. Let's move on.
  • Dragonstongue - Thursday, December 12, 2019 - link

    I <3 how compute these days adamantly refuse to use the "old standard"
    i.e MINING

    this shows Radeon in vastly different light, as the different forms of such absolutely show difference generation on generation, more so Radeon than Ngreedia err I mean Nvidia.

    seeing as one can take the wee bit of time to have a -pre set that really needs very little change (per brand and per specific GPU being used)

    instead of using "canned" style benchmarks, that often are very much *biased* towards those who hold more market share and/or have the heavier fist to make sure they are shown as "best" even when the full story simply is NOT being fully told...yep am looking directly at INTC/NVDA ... business is business, they certainly walk that BS line constantly, to very damaging consequence for EVERYONE

    ............

    I personally think in this regard, AMD likely would have been "best off" to up the power budget a wee touch, so the "clear choice" between going with older stuff they probably and likely not want to be producing as much anymore (likely costlier) that is RX 4/5xx generation such as the 570-580 more importantly 590, this "little card" would be that much better off, instead, they seem to "adamant" want to target the same limiting factor of limited memory bus size (even though fast VRAM) still wanting to be @ the "claimed golden number" of "sub" $200 price point --- means USA or this price often moves from "acceptable" to, why bother when get older far more potent stuff for either not much more or as of late, about the same (rarely less, though it does happen)

    1080p, I can see this, myself still using a Radeon 7870 on a 144Hz monitor at "~3/4" jacked-up settings (granted it is not running at full rate, as the GPU does not support running this at full speed, but my Ryzen 3600 helps hugely).

    still, a wee bit more power budget or something would effectively "bury" or make moot 580 - 590, then wanting to sell for that "golden" $200 price point, would make much more sense, seeing as they launched the 480 - 580 "at same pricing" (for USA) in my mind, and all I have read, with the terrific yields TSMC has managed to get as well as the "reasonable low cost to produce due to very very few "errors" THIS should have targeted 175 200 max.

    They are a business, no doubt, though they in all honesty should have looked at the "logical side" that is, "we know we cannot take down the 1660 super / Ti the way we would like to, while sticking with the shader count / memory bus, so why not say fudge it, add that extra 10w (effectively matching 7870 from many many generations back in the real world usage) so we at least give potential buyers a real hard time to decide between an old GPU (570-580-590) or a brand spanking new one that is very cool running AND not at all same power use, I am sure it will sell like hotcakes, provided we do what we can to make sure buyers everywhere can get this "for the most part" at a guaranteed $200 or less price point, will that not tick our competition right off?"

    ..........
  • thesavvymage - Thursday, December 12, 2019 - link

    What are you even trying to say here.....
  • Valantar - Thursday, December 12, 2019 - link

    I was lost after the first sentence. If it can be called a sentence. I truly have no idea what this rant is about.
  • Fataliity - Thursday, December 12, 2019 - link

    I think the game bundle is what they chose as their selling point. I'm sure they get a good deal on Game Pass, being the supplier of the CPU/GPU in the Xbox, so the bundle most likely costs them almost nothing, which pushes the value up. Without the bundle I imagine the 5500 4GB being $130 and the 8GB being $180.
  • TheinsanegamerN - Sunday, December 15, 2019 - link

    That's a LOTTA words just to say "AMD just made another 580 for $20 less, please clap."
  • kpb321 - Thursday, December 12, 2019 - link

    The ~$100ish 570s still look like a great deal as long as they are still available. On raw numbers they have basically the same memory bandwidth and compute as a 5500, but the newer card ends up being slightly faster and uses a bit less power. It is overall more efficient, but IMO nowhere near enough to justify the price premium over the older cards. I'm not sure that the 570/580 or 5500 will have enough compute power for the 4GB vs 8GB of memory to really make a difference, but my 570 happens to be an 8GB card anyway.
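
    (For reference, the raw-spec arithmetic behind that claim, using published reference clocks and memory speeds -- rough figures, not benchmark results:)

        # RX 570 reference: 2048 SPs at ~1244MHz boost, 7Gbps GDDR5 on a 256-bit bus
        rx570_tflops = 2048 * 2 * 1244 / 1e6      # ~5.1 TFLOPS FP32 (SPs x 2 ops x MHz)
        rx570_bw     = 7 * 256 / 8                # 224 GB/s
        # RX 5500 XT reference: 1408 SPs at ~1845MHz boost, 14Gbps GDDR6 on a 128-bit bus
        rx5500_tflops = 1408 * 2 * 1845 / 1e6     # ~5.2 TFLOPS FP32
        rx5500_bw     = 14 * 128 / 8              # 224 GB/s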
  • dr.denton - Friday, December 13, 2019 - link

    Bought an RX570 8GB this summer and feeling pretty good about that decision right now. Especially seeing how well Polaris cards seem to hold up in modern titles like Metro and RDR2.
  • n0x1ous - Thursday, December 12, 2019 - link

    Only matching power numbers with 7nm vs Nvidia's 12nm. Sad. Nvidia on 7nm is going to dominate again.
  • Fataliity - Thursday, December 12, 2019 - link

    Nvidia's 7nm should be going against RDNA2, which I'm sure will help a lot. RDNA was rebuilt because of issues, and didn't get everything it was supposed to (similar to Zen vs Zen 2; they could only do so much with the budget).

    This is their first gen of a new arch.
    Personally though, the compression synthetics speak the loudest to me. They need to match Nvidia's delta color compression to save bandwidth. That's why AMD's cards need more TFLOPS to reach the same performance. If they can get their compression to Nvidia's level, then I think it will do wonders for their numbers.
  • eva02langley - Thursday, December 12, 2019 - link

    However, during that time, AMD's image quality is better.
  • peevee - Friday, December 20, 2019 - link

    Their cards have so much memory bandwidth relative to their FLOPS that they would not benefit much from compression, I'm afraid.
    Something else is terribly wrong. There are just not enough ALUs on the 5500 series, so it has to keep frequency up = bad efficiency. They needed to use TSMC's high-density/low-power libraries for GPUs, not the high-performance libraries... They would be immensely better with twice as many ALUs at 1GHz and low-power DDR5...
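
    (A first-order sketch of that wide-and-slow argument: dynamic power scales roughly with ALU count x frequency x voltage squared, and voltage drops with frequency. The clocks and voltages below are made-up illustrative values, not measured ones:)

        def rel_dynamic_power(alus, freq_ghz, volts):
            # crude CMOS dynamic-power proxy: P ~ N x f x V^2
            return alus * freq_ghz * volts ** 2

        narrow_fast = rel_dynamic_power(1408, 1.85, 1.05)   # 5500 XT-like: fewer ALUs pushed on clocks
        wide_slow   = rel_dynamic_power(2816, 0.95, 0.80)   # twice the ALUs near 1GHz, lower voltage
        # Throughput (alus x freq) is roughly equal, but wide_slow comes out ~40% lower by this model.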
  • eva02langley - Thursday, December 12, 2019 - link

    It is the first iteration of a new uarch; things will only get better and better.
  • jabber - Thursday, December 12, 2019 - link

    Oh well no need to swap out my 480 for another year at least.
  • eva02langley - Thursday, December 12, 2019 - link

    Hell no, you don't need to. This generation of cards is the first time I can remember seeing identical performance for the same price tag.
  • lightningz71 - Thursday, December 12, 2019 - link

    So, from what I'm seeing, if you're shopping for a sub-$200 video card, you have the following scenario:

    1) If you have no real concerns about power usage (i.e. you have a 750+ watt PSU and don't have high electricity costs), then the RX 590 8GB cards offer you the best bang for the buck.

    2) If power is a MAJOR issue for you, the GeForce 1650 Super is your best option, unless you absolutely need to have more VRAM, in which case it's the 5500 XT 8GB.

    3) If you need solid drivers and advanced video encoding/decoding codec support, the 1660 is what you need, as it has the full, current NVENC solution that appears to outstrip AMD for the moment.

    For me, with an 850W PSU, and being only an occasional gamer and desktop user, I'm going to be looking at good deals on the RX 590 8GB for now, unless the 1660 Super comes down drastically in price.
  • eva02langley - Thursday, December 12, 2019 - link

    Total disinformation; you don't need a 750W power supply for a 590. 500W is more than enough. Basically, if you don't have a 9900K or an OC Ryzen 9 3000, I don't see the need for more than 500-600 watts.
  • eva02langley - Thursday, December 12, 2019 - link

    Also, more wattage for a system that uses less than 50% of the load will result in poor efficiency, which you don't want.
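
    (As a rough worst-case budget -- using AMD's 130W board-power rating for the 5500 XT and ballpark figures for everything else -- the math looks like this:)

        gpu_w      = 130   # RX 5500 XT total board power, per AMD's spec
        cpu_w      = 105   # a higher-end desktop CPU under sustained load (ballpark)
        platform_w = 60    # motherboard, RAM, drives, fans (ballpark)
        total_w    = gpu_w + cpu_w + platform_w   # ~295W peak, comfortably inside a quality 500W unit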
  • silverblue - Friday, December 13, 2019 - link

    As it is, the 590 is slightly more efficient than the 580 per clock, but a minor undervolt will yield further savings. I have a TX650M which is severe overkill for a Ryzen 5 1600, 16GB DDR4 3000 and Sapphire Nitro+ RX590, which is undervolted by 8% and doesn't go above 1.1V.
  • PeachNCream - Friday, December 13, 2019 - link

    For a single CPU socket and a single GPU, buying a 500W PSU is more than reasonable. You are falling into the more-is-better trap, probably spending more money and losing efficiency by getting something in the 750W+ range. That sounds like something a person who has not done any research would suggest.
  • 29a - Friday, December 13, 2019 - link

    I wish they would start testing the codec ASIC on GPUs.
  • SethNW - Thursday, December 12, 2019 - link

    The RX 5500 XT has one issue: it's just an OK card. Like, it's not terrible, but the older RX 580/590 are just too close to it, and with prices dropping, they make more sense. And at the 8GB end, add a little and you get the better-performing 1660, or in a lot of cases a tiny bit more and you've got the 1660 Super. I don't think the extra 2GB of VRAM will make that huge a difference at 1080p; 4GB definitely is budget category, but 6GB is doable for the next 3 years. Only time will tell whether bandwidth or size will matter more.
  • Ravynmagi_ - Thursday, December 12, 2019 - link

    I was very torn up about what to buy. I wanted to stay in the $160-$180 range. I just play a couple games that are not too demanding (Civ 6 and Albion Online).

    I tried an RX 580, but mine has really bad coil noise. So I went with a GTX 1650. It's working perfectly for my games on a 1440p monitor.

    I was uncertain if I should have waited for the RX 5500 XT. I see now it seems to provide equal performance more or less and I do prefer Nvidia drivers over AMD. So this seems to work out for me.

    A part of me still nags that I should have tried another RX 580 or 590 with 8GB of RAM. But this is an entry-level gaming card. If I get into more demanding games, I would most likely replace whatever I bought in this price range eventually anyway, before I'd need the 8GB of RAM, I suspect.
  • Ravynmagi_ - Thursday, December 12, 2019 - link

    Looks like I can't edit my comment. But meant to say GTX 1650 Super.
  • eva02langley - Thursday, December 12, 2019 - link

    Yeah, makes more sense, since the GTX 1650 is a total disaster.
  • Hrel - Thursday, December 12, 2019 - link

    Looks like I'm stuck waiting for a significant price drop. I'd like to update my 280X, but it runs everything I play, so the only real reasons are power efficiency and system thermals. The thing's literally a space heater when gaming.

    I'm really just looking for 280X performance + 50% or so, which the 1650 Super does and the 5500 XT does, but paying nearly $200 just to knock 75 watts off my system.... does not appeal. Especially when the 4GB VRAM issue is real.

    The 280X has 3GB of RAM; my system monitors have shown it tapping system RAM. Not a lot, and the newest game I play is Dark Souls 3, 500MB. But I'm pretty sure DS3 is from 2013.

    Assuming a new game I want comes out that isn't an esports title like Rocket League, I'd like this card to run it well.

    Maybe that means I'm waiting another 2 to 4 years to replace my card. Maybe it just means waiting till next gen comes out and getting deals on these.

    Either way, this generation isn't enough. From either camp.
  • cmdrmonkey - Thursday, December 12, 2019 - link

    You already have so many other cards that are around this level of performance that I'm not sure why we needed another one. GTX 970, 1060, 1650 Super, RX 470, 480, 570, 580, 590 are all in the same ballpark as this thing depending on the game. Why did we need this? Seems like this performance segment is totally tapped out.
  • StrangerGuy - Friday, December 13, 2019 - link

    Yeah, in Singapore an RX 570 8GB is $100 and a 1660S is $200. I'm not even sure why anyone in that price segment should bother with anything between those two.
  • catavalon21 - Saturday, December 14, 2019 - link

    I guess AMD wanted something in this performance band which cannot run OpenCL (at least not worth a hoot)

    No, I'm not over it.
  • CHADBOGA - Thursday, December 12, 2019 - link

    Such a disappointing product. :(

    Now I am left with choosing between a 1660 Super and a 5700XT.

    Nothing else makes any sense to me.
  • PeachNCream - Friday, December 13, 2019 - link

    I recommend buying something that does make sense to you and makes you feel accomplished/empowered/etc. for a couple of months until your mind adjusts to the new normal and is no longer impressed by your own purchase and you begin to feel compelled to repeat. Go for it! None of us are going to turn up at your home to make comments about your computer's components.
  • GreenReaper - Friday, December 13, 2019 - link

    Speak for yourself . . .
  • lenghui - Friday, December 13, 2019 - link

    This still can't beat the value of the RX 570 4GB for $110-$120 (current prices on the egg). I am quite a bit disappointed with the lack of progress in value per dollar lately.
  • AntonErtl - Friday, December 13, 2019 - link

    I had hoped for a card that would form the basis for a replacement for my passively cooled Radeon HD 6770, but I guess I'll have to keep the 6770 for some more years (no, a semi-passive card that's quiet in the beginning and turns into a siren after a while does not cut it; BTDT).
  • a5cent - Friday, December 13, 2019 - link

    Until AMD supports MxGPU on these cards or on their APUs, I can't care about AMD's graphics division anymore. Intel has supported GVT-g in their iGPUs for years already.

    I'd instantly go with AMD if they had anything comparable.
  • philosofool - Friday, December 13, 2019 - link

    Could we see some 1440p results in the future? For me, a $200 card is always the sweet spot. My computer is not primarily a gaming device, but I really like my 27" 1440p monitor. I need to know whether a card meets my needs, not whether it would be great if I only had a different monitor.

    Besides, not everyone insists on 60fps or Ultra settings.
  • philosofool - Friday, December 13, 2019 - link

    I found a review that included 1440p results. In most games, including some benched here, the 99th percentile is north of 30fps, which I consider totally playable. However, the GTX 1660 Super appears to increase its lead there: 99th percentile @ 56fps in a 12-game average.
  • Freeb!rd - Friday, December 13, 2019 - link

    For those of us with older cards, I would've appreciated the GTX 1060 6GB & GTX 1070 being included in the performance graphs.
  • Freeb!rd - Friday, December 13, 2019 - link

    Especially since I have both those and some 580s lying around.
  • sheh - Friday, December 13, 2019 - link

    https://www.anandtech.com/bench/product/2577?vs=25...
  • alfatekpt - Friday, December 13, 2019 - link

    Depending on price, it seems it can actually fight with the 1660 Super, because the differences are a few FPS, with lower noise/power consumption.
  • ballsystemlord - Saturday, December 14, 2019 - link

    @Ryan If you can't run OpenCL on Windows, why not run your tests on Linux? I'd imagine that some of the programs have a version that runs on Linux.
  • isthisavailable - Saturday, December 14, 2019 - link

    The 8GB version should not exist, and the 4GB version should be $10 lower to match the 1650 Super in price while beating it in perf. Why would you buy the 8GB version when NVIDIA has the 1660 for $10 more?
    Also, RX 5500 gaming laptops when?
  • Maxxie - Saturday, December 14, 2019 - link

    This looks like a gaming card, again. For the professional segment, AMD needs to focus more on compute and on reducing operating power/temperature. I was forced to upgrade a working AMD RX 580 4GB to Nvidia's RTX 2060 Super 8GB to get GPU acceleration working in MATLAB. A major benefit is the doubled graphics memory, and the new card runs noticeably cooler. Still, I spent $300 on the RX 580 earlier this year, wasting a bit of budget. That's going to keep me from trying any AMD cards until they give focus to more than the entertainment segment.
  • dr.denton - Sunday, December 15, 2019 - link

    I really don't know much about professional use, but has this not been AMD's biggest problem in past years? Having GPUs with great compute abilities, but somewhat lacking for gaming?
    Personally, I'm glad they finally have the resources to develop two lines of architectures, one focused on gaming and one for professional use.
  • ballsystemlord - Saturday, December 14, 2019 - link

    One grammar mistake:

    "...one that TSMC's customers are jockeying to secure wafer starts due to very high demand."
    Missing "for":
    "...one that TSMC's customers are jockeying to secure wafer starts for due to very high demand."
  • Father Time - Saturday, December 14, 2019 - link

    Closing Thoughts -> 4th last paragraph:

    "6% more perforamnce".

    Great article though.

    Does AMD still suffer from Day 1 performance not reflecting the huge gains they get over NVidia as drivers mature? It was a large issue in generations past (ever since the 2900 XT), and considering how close this card is to the NV equivalents, do we expect it to win out over time, or is that a thing of the past now that the architecture has changed significantly, along with the technology and the people involved?
  • qwertymac93 - Sunday, December 15, 2019 - link

    As this card is based on the same RDNA1 architecture as the 5700 series AMD has had months to optimize performance. It isn't likely the overall rankings will change much unless the new consoles bring a major shift in game developer optimization priorities.
  • peevee - Monday, December 16, 2019 - link

    "the company is also bundling the forthcoming “Master Edition” of Monster Hunter: Iceborne. This is the Iceborne expansion bundled with the base game"

    They'd better reduce price by $10-20 to be price-competitive with 1650 Super.
  • marees - Wednesday, December 18, 2019 - link

    Given the average performance value of this card, it seems to me gamers who want a budget card for 1080p are better off waiting for the Xbox Series S!?
  • Korguz - Wednesday, December 18, 2019 - link

    and what if the games the person plays.. are not on a console ? then what ?
  • peevee - Friday, December 20, 2019 - link

    So why would you prefer this over 1660?
  • kayfabe - Tuesday, December 24, 2019 - link

    Because the 4GB version is ~$30 cheaper and some gamers like Monster Hunter or quiet computing. The bottom line is that these products are far enough down the pricing totem pole that a good rebate or bundle can sway people pretty easily--at this range you're hunting for minimally enjoyable functionality, not future-proofing. Personally, I'm holding out until at least Ampere arrives before I start throwing money around again.
  • toke - Saturday, August 8, 2020 - link

    Has anybody seen any comparisons of IDLE power use of real cards?
    I'd like to choose the one with the least among the 570, 580, 590, or 5500.

    Are all the reviews like this one, measuring idle power use only against manufacturer reference cards?
