49 Comments

  • PeachNCream - Friday, September 28, 2018 - link

    Interesting analysis, though it's a bit of a foregone conclusion these days to expect a GPU overclock to improve performance in games more than a CPU overclock since the central processor, after a point, has very little role in increasing framerates.

    This one struck me as odd though - "...Ryzen APUs are marketed for 720p gaming, and while resolutions such as 2160p and 1440p are out of reach purely for performance reasons, we have opted to use moderate settings at 1080p for our testing."

    Were the tests executed at 1080p so they would align better in the Bench? It seems more reasonable to test at 720p given the various limits associated with iGPUs in general, and the use of 1080p just comes across as lazy, in the same way AnandTech tests CPU performance in games at resolutions so high that GPU performance masks the differences between processors. Tom's Hardware, back when the good doctor actually ran it, yanked resolution down as low as possible to eliminate the GPU as a variable in CPU tests, and it was a good thing.
  • stuffwhy - Friday, September 28, 2018 - link

    Just purely speculating, is it possible that 720p results are just great (60+ fps) and need no testing? One could hope.
  • gavbon - Friday, September 28, 2018 - link

    My reasoning for selecting 1080p gaming tests over 720p was mainly that the other scaling pieces were running at the same resolution - not just the iGPU tests, but the dGPU testing with the GTX 1060 too. It wasn't a case of being 'lazy': the majority of gamers who currently use Steam play at 1080p, and as it's the most popular resolution for gamers, I figured that's where I would lay it down.
  • neblogai - Friday, September 28, 2018 - link

    Even if the monitor is 1080p, a lot of 2200G users may want to run games at 1080p with resolution scaling for better fps - in effect, at 720p or 900p. Most games support it these days. So the popularity of 1080p monitors does not really make 720p tests less useful for this level of GPU performance.
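To make the render-scale point concrete, here's a quick back-of-envelope sketch, assuming the scale slider applies per axis (how most games implement it); the percentages are illustrative, not from the article:

```python
# Internal render resolution from a resolution-scale slider, assuming the
# scale factor applies per axis (how most games implement it).
def render_res(base_w, base_h, scale):
    return int(base_w * scale), int(base_h * scale)

for scale in (1.00, 0.833, 0.667):  # 100%, ~83%, ~67% render scale
    w, h = render_res(1920, 1080, scale)
    print(f"{scale:.0%}: {w}x{h} (~{w * h / 1e6:.2f} MP)")
# ~83% of 1080p renders roughly 900p's pixel count; ~67% renders roughly
# 720p's, so a 1080p monitor can still present a 720p-class GPU load.
```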
  • V900 - Friday, September 28, 2018 - link

    Would be great if you had tested just one game at 720p.

    I know this is what I would be interested in knowing/reading if I was a possible customer.
  • usernametaken76 - Sunday, September 30, 2018 - link

    I honestly think this "majority of gamers who currently use Steam use 1080p" argument is skewed by laptop users (see the high number of 1366x768 entries), who game at whatever resolution their laptop panel runs at...
    Which leads one to ask what the point of testing desktop parts is when you use that as the basis for what and how to test.
  • TheJian - Friday, October 5, 2018 - link

    Agree 100%. They do a lot of dumb testing here. Ryan has been claiming 1440p is the "enthusiast resolution" since the GeForce 660 Ti. I don't think you can even say that TODAY, as I'm staring at my two monitors (I have a third, also 1080p), which are 1200p and 1080p.

    For me, I need everything turned on, and you need to show where it hits 30fps at that level. Why? Because the designers of the games didn't want you to play their game at "MODERATE SETTINGS"...ROFL. Just ask them. They design EXACTLY WHAT THEY WANT YOU TO SEE. Then for some reason reviewers ignore this and benchmark the crap out of situations I'd avoid at all costs. I don't play a game until I can max it on one of my two monitors with my current card. If I want to play something early that badly, I'll buy a new card to do it. All tested resolutions should be MAXED OUT settings-wise. Why would I even care how something runs degraded? Show me the best, or drop dead. This is why I come to AnandTech only if I haven't had my fill from everywhere else.

    One more point: they also turn cards down. Run them as sold, PERIOD. If it's an OC card, show it running with the simple checkbox that OCs it to the max the card allows by default. Most have a game mode, etc.; choose the fastest of the 3-4 default presets their software offers. I'm not talking about overclocking manually yourself, I mean the 3-4 presets they ship in the software. A monkey could do this, so why pretend the card isn't sold to be used like this? What user comes home with an OC card and reverts to NV/AMD reference card speeds? ROFL. Again, this is why I dropped this site for the most part. Would you test cars with 3 tires? Nope, they come with 4...LOL. I could go on, but you should get the point. Irrelevant tests are just that.
  • flyingpants265 - Tuesday, March 5, 2019 - link

    Hate to tell you this, but 4k is the enthusiast resolution now.
  • 808Hilo - Saturday, October 13, 2018 - link

    Is it just me, or are these tests just for borderline ill people?

    I play at 4K with a 1080/1800/32/S970, and it works reasonably well. I also do everything else in 4K. Would I go back to lower res? No way. Artificial benchmarking is one thing; the real world is 4K. Test this res and we get a mixed GPU, APU, and CPU bench. Build meaningful systems instead of artificially pushing single building blocks. Push for advancements.
  • Targon - Friday, September 28, 2018 - link

    The big problem with these APUs is that they limit the number of PCI Express lanes, so if you DO decide to add a video card, the APU in this case will reduce performance compared to a normal CPU without the graphics.
  • gavbon - Friday, September 28, 2018 - link

    Yeah, I do agree with you there, but the main purpose of an APU is to utilize the onboard graphics. Okay, sure, you lose bandwidth due to the limited PCIe lanes, but the specs have to be cut down somewhere, and better that than CPU or iGPU power.
  • seamonkey79 - Friday, September 28, 2018 - link

    By less than 1% in the vast majority of games. For PCIe 3.0/3.1, 8 lanes is still plenty for a card, to the point where benchmarks are within the margin of error. That is, until you start looking at SLI on the level of (at least) GTX 1080 Ti or Titan V cards. At that point you would be building a new system anyway, because you'd need a board capable of handling two x16 slots, which you wouldn't have bought, since a board with video outputs for the APU doesn't come with dual x16 slots. You're also buying a new CPU, because you're not sticking $6k worth of hardware on a sub-$200 CPU/chipset.

    https://www.gamersnexus.net/guides/2488-pci-e-3-x8...

    https://www.gamersnexus.net/guides/3176-dual-titan...

    In their testing, even on the Titan V, with a single card, there was no difference between x8 and x16.

    So, no, you're not really in a situation where the APU will 'reduce performance', unless you're buying a sub-$200 CPU to stick in a system with around $6000 worth of dGPU. Which you can't do because you only have a single x8 slot anyway...
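For rough context on those links, a minimal sketch of the theoretical PCIe 3.0 link bandwidth (assuming 8 GT/s per lane with 128b/130b encoding, before protocol overhead):

```python
# Theoretical one-direction PCIe 3.0 bandwidth: 8 GT/s per lane with
# 128b/130b encoding, before protocol overhead.
GTRANSFERS_PER_S = 8e9
ENCODING = 128 / 130

def link_bandwidth_gb_s(lanes):
    return lanes * GTRANSFERS_PER_S * ENCODING / 8 / 1e9  # bits -> bytes -> GB

for lanes in (4, 8, 16):
    print(f"PCIe 3.0 x{lanes}: {link_bandwidth_gb_s(lanes):.2f} GB/s")
# x8 -> ~7.88 GB/s, x16 -> ~15.75 GB/s: a single card rarely saturates x8,
# which lines up with the GamersNexus results linked above.
```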
  • nathanddrews - Tuesday, October 2, 2018 - link

    Correct, the only limitation is the 2200G/2400G itself (even at 4.1GHz), not the lanes. I've got a 1080Ti in my 3570K setup and I know full well when my CPU is the bottleneck.
  • eva02langley - Saturday, September 29, 2018 - link

    x8 performs basically the same for gaming. If you were doing compute, that would be another story.

    It is actually a non-issue.
  • msroadkill612 - Wednesday, October 17, 2018 - link

    Which is why it saddens me that the one true single-CCX Zen+ CPU - the 2500X - is OEM-only.
  • t.s - Friday, September 28, 2018 - link

    Please fix: the Civ 6 graphs are AVG FPS and AVG FPS, not AVG FPS and 99th percentile.
  • Ryan Smith - Friday, September 28, 2018 - link

    Whoops. Thanks for the heads up. Fixed!
  • Valantar - Friday, September 28, 2018 - link

    There are quite a few errors like this throughout the article. The F1 graphs for the 2200G are both 99th percentile. TW:W2 (same page) has the correct titles but the same data in both images. There were more on earlier pages too, but since I'm on my phone I can't look through it while writing this. Hope you can take a look.
  • gavbon - Friday, September 28, 2018 - link

    Thanks Valantar, fixed them now! Appreciate the heads up!
  • ET - Friday, September 28, 2018 - link

    My conclusion from this is that it's worth overclocking the 2200G to 1200 MHz, because the default clock performs badly in some cases.
  • Lolimaster - Saturday, September 29, 2018 - link

    There was an issue on the APUs where, at certain frequencies, the clock rate would jump around, giving you nasty minimums (shown in many early reviews); past a certain threshold, the clock no longer jumps around like crazy, and the chip consistently beats the GT 1030 in almost any scenario.
  • ipkh - Friday, September 28, 2018 - link

    So how about a memory plus GPU overclock, since those two combined would make the most difference?
  • lightningz71 - Saturday, September 29, 2018 - link

    Buildzoid showed that a significant overclock on both the iGPU and the RAM was quite difficult. The iGPU reacted negatively to lower SOC voltage, whereas the memory controller disliked higher SOC voltages. The happy medium seemed to be the iGPU at 1600 MHz and the RAM at 3200-3400 MHz with the tightest possible timings. Leave the CPU cores at stock to maximize the package power and thermal budget for the iGPU.
  • The_Assimilator - Friday, September 28, 2018 - link

    Anyone who knows discrete Vega knows it runs hungry and hot at stock frequencies and even worse when overclocked, but is far better behaved when underclocked and/or undervolted. Hence why these Vega iGPUs have so much OC headroom: they're deliberately being run slow in order to hit an acceptable power/heat target.

    Given that, the omission of power usage and temperature data from this review is glaring, to say the least.
  • jensend - Saturday, September 29, 2018 - link

    I agree that 'overclocking scaling' reviews that don't show how power and temperature scale are failures. I wouldn't overclock the 2400G for tiny gains and large power/temperature/etc drawbacks.

    The one interesting conclusion that can be drawn from this piece is that the 2400G's shader (etc.) performance at stock is high enough that at 1080p the bottlenecks are generally elsewhere (especially memory), while that's not as true of the 2200G. (We already kind of knew that from the memory scaling article.)
  • Nagorak - Saturday, September 29, 2018 - link

    I agree, just posted the same myself.
  • Lolimaster - Saturday, September 29, 2018 - link

    Still the best reviewer for APUs, especially OC scaling and different resolutions, is TechEpiphany on YouTube. Shame on AnandTech; they can't even do a proper review for an APU, just a lazy thing.
  • Nagorak - Saturday, September 29, 2018 - link

    I kind of feel like this is incomplete without some comparison of power use and heat generation. Maybe not at every frequency but it would be nice to see stock vs max OC at least. Based on the results with Vega GPUs it seems likely that efficiency craters as you go higher. It would be nice to know at least.
  • neblogai - Saturday, September 29, 2018 - link

    ~50W higher power use is usually not an issue if total system power use is ~150W. But it would be useful to look at it from the standpoint of the motherboard's SOC VRM capability. Raven Ridge chips are a budget option, so they are usually used with cheaper B350 motherboards; however, those often do not have good SOC VRMs (or heatsinks on them) for the extra power an overclocked Vega iGPU consumes.
  • notashill - Saturday, September 29, 2018 - link

    The other big practical concern for budget builds is the need to spend more money on cooling. The cooler used in this review costs more than the 2400G so it would be totally nonsensical to actually use. Do they actually need crazy cooling to hit these overclocks, or is something like a $30 212 EVO enough?
  • neblogai - Saturday, September 29, 2018 - link

    Or a €10 Deepcool Gammaxx 200T, €15 Gammaxx 300, etc.
  • lightningz71 - Saturday, September 29, 2018 - link

    If you keep the overclock reasonable. Keep the CPU at stock and push the iGPU to 1600 MHz. If you push higher, it starts demanding a lot of SOC voltage, which hurts memory controller performance. There are a few B450 boards with decent two-phase SOC sections that would be fine there. The problem is having the option in the BIOS to actually overclock the iGPU.
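A back-of-envelope for that SOC VRM concern; the 50 W extra draw and 1.15 V SOC voltage below are assumptions for illustration, not measured values:

```python
# Per-phase current through the SOC VRM feeding an overclocked Vega iGPU.
# The 50 W extra draw and 1.15 V SOC voltage are assumed, not measured.
def per_phase_amps(extra_watts, vsoc, phases):
    return extra_watts / vsoc / phases

EXTRA_IGPU_W = 50.0   # assumed extra SOC-rail draw from the iGPU overclock
VSOC = 1.15           # assumed overclocked SOC voltage

for phases in (2, 4):
    print(f"{phases}-phase VRM: ~{per_phase_amps(EXTRA_IGPU_W, VSOC, phases):.0f} A/phase")
# ~22 A/phase on a bare two-phase B350 VRM runs hot without a heatsink;
# four phases roughly halve the per-phase load.
```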
  • dromoxen - Sunday, September 30, 2018 - link

    I don't think people are able to push higher; 1600 MHz seems to be pretty much the outer limit. It appears that the greatest increase comes from going from 1100 MHz to 1150 MHz; after that, the gains are much smaller. Power draw and heat would be nice to know. I would like to have one of these in a NUC-style system as a telly box; maybe the GE models will suit better. They can only get more popular if Intel can't supply budget chips for the next nine months.
  • eastcoast_pete - Saturday, September 29, 2018 - link

    Gavin, thanks for this last chapter of the Zen APU reviews! As others here have mentioned, it would be helpful if you could post power draw figures (even just overall); they usually give a good idea of how much of a sweat the silicon works up. The other point is that if I were to OC either the CPU or GPU on a 2200G or 2400G, I would at least try to OC the memory a bit too. With graphics likely limited by memory speed, that seems the logical thing to try. If you did, any information on that combo (OC'd graphics plus OC'd memory) would be appreciated! It doesn't have to be a full list of all OC permutations.
  • ballsystemlord - Saturday, September 29, 2018 - link

    You wrote the wrong APU name in the paragraph reading "Performance in Shadow of Mordor wasn't the best we have seen from the game testing, with average framerates gradually increasing on the 2400G."
    You were testing the 2200G in that section.
    I know it's an easy mistake to make; I do it too.
  • Haawser - Saturday, September 29, 2018 - link

    "For any compute related workloads, the integrated graphics frequency is ineffective whereas pure MHz on the Ryzen cores and memory frequency can play a major part in improving performance throughput. "

    Implying that memory overclocking isn't useful for graphics too? Why didn't you try overclocking the graphics *and* the memory, seeing as both are easy to do? And 90% of APU users interested in gaming will do both anyway.
  • eva02langley - Saturday, September 29, 2018 - link

    I have a similar motherboard, the MSI B450I Gaming AC Plus, and the same behavior happens between 1350-1400 MHz.

    I just set the iGPU to 1500 MHz at 1.2 V and it runs flawlessly in games.

    Also, I'm experiencing a strange black-screen blink once in a while - like the blink of an eye. It's kind of annoying.

    Do you experience the same thing?
  • wilsonkf - Sunday, September 30, 2018 - link

    Why don't you test at least one more, faster memory speed (e.g. 3466 MHz)? That would be interesting.
  • hashish2020 - Tuesday, October 2, 2018 - link

    I hope that they end up doing that, because that will likely be the sweet spot for casual gamers like me who are thinking about getting back into building desktops.
  • lightningz71 - Tuesday, October 2, 2018 - link

    The memory controller on the 2400G is not improved over Zeppelin in any measurable way. It quickly gets unstable past 3200/3333 (with very loose timings and good samples there are OC examples in the 3400 range, though stability is questionable).

    Buildzoid on Reddit has a video on testing this. He showed that B-die can net you tight timings at 3200/3333 but gets VERY hot above that. E-die showed stability and lower temperatures at 3400, but the timings had to be looser to get there. No matter what, his performance testing showed that 3200/3333 with tight timings was just as fast as 3400 with loose timings, and more achievable. His absolute fastest times were with UNSTABLE runs of B-die at 3400+ MHz with passable timings, but it required considerable cooling to occasionally complete a test.

    The advantage of keeping the memory at 3200/3333 with tight timings is that you can run a bit more SOC voltage and maybe get a bit more out of the iGPU. 1600 MHz is a reasonable overclock, but with a bit more voltage, people are getting past 1650 and sometimes 1700.

    I haven't seen delidding results on iGPU overclocking with memory overclocking at the same time.
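A quick worked example of why tight timings at 3200 can match looser 3400: first-word CAS latency in nanoseconds (the CL values here are illustrative, not figures from the thread):

```python
# First-word CAS latency in nanoseconds: CL cycles divided by the memory
# clock (half the DDR data rate). CL values here are illustrative.
def cas_latency_ns(data_rate_mts, cl):
    return cl / (data_rate_mts / 2) * 1000

for rate, cl in ((3200, 14), (3333, 14), (3400, 16)):
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
# DDR4-3200 CL14 -> 8.75 ns beats DDR4-3400 CL16 -> 9.41 ns, while the higher
# data rate still helps bandwidth-bound iGPU workloads.
```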
  • msroadkill612 - Sunday, September 30, 2018 - link

    Sorry to be a dick Gavin, but...

    poignant
    /ˈpɔɪnjənt/
    adjective
    evoking a keen sense of sadness or regret.
    "a poignant reminder of the passing of time"
    synonyms: touching, moving, sad, saddening, affecting, pitiful, piteous, pitiable, pathetic, sorrowful, mournful, tearful, wretched, miserable, bitter, painful, distressing, disturbing, heart-rending, heartbreaking, tear-jerking, plaintive, upsetting, tragic
    "the father of the murder victim bade a poignant farewell to his son"

    It's a sore point. I misused it similarly in a speech.
  • msroadkill612 - Sunday, September 30, 2018 - link

    Good effort, but a shame temps aren't included.

    There is a suspicious pattern here.

    On the 2400G with bumped voltage and a 1600 MHz GPU clock, it gets good results... mostly. Its minimum frames are erratic and at times even lower.

    This could well be heat.

    My takeaway is that the APU is good enough to be worth decent cooling and a low-latency NVMe system disk with a direct PCIe link to the CPU.

    The APU seems to be a hit in the Linux world, and it is becoming a kind of default Zen/Vega platform - a target for some intimate APU driver tweaking.

    It's fine wine indeed.
  • hashish2020 - Tuesday, October 2, 2018 - link

    I am having trouble finding it, but was the memory overclocked as well or just the iGPU?
  • piteq - Tuesday, October 2, 2018 - link

    I was reluctant to buy any new graphics card because of the sick prices, so I'm keeping my (really) old Radeon HD 6790 (-ish GPU, I can't remember exactly - GPU-Z states it has a Barts core and 1 GB of GDDR5, clocked at 900/1100 MHz respectively). It's paired with a Ryzen 5 1600 on an Asus Prime B350-Plus board. Do you think switching the CPU to a Ryzen 2400G would be an upgrade in the GPU area? I'd like to return to WoW and also play some newer, but not demanding, titles. (A slightly slower CPU won't be much of an issue for me.)
  • neblogai - Wednesday, October 3, 2018 - link

    It would be better for you to start looking for a new GPU - prices are getting good. An RX 570 for $150 would solve your problem better than switching to a 2400G.
  • V900 - Thursday, October 4, 2018 - link

    Just buy a used card, or a GT1030/1050.

    They're available for roughly what the 2400G costs, but will give you a much bigger GPU boost.
  • mikato - Thursday, October 4, 2018 - link

    The video ads are starting to bother me now, darn.
  • msroadkill612 - Wednesday, October 17, 2018 - link

    OP, you misuse "poignant" in your opening. It has a "sad" connotation.
  • msroadkill612 - Wednesday, October 17, 2018 - link

    Heresy, given the curiously zealous loyalty folks have to SATA SSDs over the clearly superior NVMe, but it would not surprise me if the HDD-era SATA interface - and the even greater lag of the chipset the SATA ports run off - added some significant random delays by gaming standards.

    There is no comparison in IOPS, processing overhead, or access times.

    I'm sure I have seen reviews that don't even state the test rig's SSD details.
