BTW, Octane 2.0 has been retired by Google (just check their GitHub), and even they endorse using Mozilla's Speedometer 2.0 (darn, can't find the relevant blog post).
I know; in the same way we have legacy benchmarks up, some people like to look at the data.
Not directed at you specifically, but don't worry if 100% of the benchmarks aren't important to you: if there are 40 you care about, and we have 80 that include those 40, don't worry that the other 40 aren't relevant to what you want. I find it surprising how many people want 100% of the tests to be relevant to them, even if it means fewer tests. Octane was easy to script up and a minor addition, just like CB11.5 is. As time marches on, we add more.
I'm happy with 1366x768 and I'm seriously considering the 2400G because it looks like it can handle max detail settings at that resolution. I'm not interested in playing at high resolutions, but I do like having all the other non-AA eye candy turned on.
Some more realistic gaming settings might be nice. No one is going to play at settings that result in ~20 fps, and the GPU/CPU scaling can tilt quite a bit if you reduce the settings.
I can see why you might not like it, because it takes the focus away from the GPU a bit and makes comparisons against a dGPU harder (unless you run it on the exact same hardware, which might mean you have to re-run it every time), but this is a combined product, so testing both against other iGPU products would be useful info.
20 FPS is playable. I have a 2-in-1 with a Skylake i3-6100U, and 20 FPS is what it gets in Skyrim. Any notion of things being "unplayable" under 30/60 FPS is like an audiophile saying songs are unlistenable on speakers that cost less than $10,000.
I'd rather reduce settings a bit to go up in FPS than look at a 20 fps average. There are often many things one can turn off without a huge visual impact to achieve much better performance.
Yes, sorry, I didn't mean to nitpick. Being a web developer myself dealing mostly with frontend code, I just wanted to mention that Speedometer is actually considered to be fairly representative by both Mozilla and Google (and true enough, the frameworks it uses are actual frontend JS frameworks rendering TodoMVC). If you are already aware of that, then that's excellent.
Yeah this review should have used medium or low settings, something that is actually playable on the CPUs tested. 25 fps might work for Civ6 but not a shooter.
Not too shabby: 2-3x the iGPU performance of Intel and comparable CPU performance in the same price range. And it will likely pull ahead even further in the coming weeks as faster memory becomes supported.
I have been holding on to my Intel 2500K desktop for what seems like forever. It has been a trusty companion but with a TDP of 95W and a dedicated GPU pulling 100W+ I'm looking for something a little less power hungry. AMD seems to have what I've been looking for and the price is right :-)
Amazon has the 2400G in stock, http://amzn.to/2BVzSSn and I think I'm going to bite the bullet!
PS does anybody have a mobo recommendation for pairing with the 2400G? (stability is my main concern, probably won't OC since the 2400G should be a nice step up from my 2500K)
If you already have a decent GPU, it is better to get the Ryzen 1600 instead. It is only $10 or $20 more, but you will get two extra cores and 8 extra PCIe lanes. These APUs only make sense as a placeholder to get something better, for example building a working PC now and then adding a dedicated GPU later.
EXACTLY!!! This is the market for Ryzen with Vega: business PCs and laptops, and also economy gaming for the markets that cannot afford a discrete GPU AIB.
Cool, thanks for the tip! How is discrete non-gaming (desktop, Photoshop) GPU power usage these days? I live off the grid, so energy efficiency is a big plus. I do not game much (SC2 and some lower-end Steam games).
Also, any suggestions for motherboards for 1600 or 2400G? Again, stability is top criteria for me.
Last question, what's the max number of video outputs for the 2400G? Thx!
Seriously, AMD needs to release something akin to a NUC using Raven Ridge. They could rake in quite a lot of market share with that. I would replace my office's PCs with those: better GPU and comparable CPU.
I'm using the ASRock AB350M Pro4 with a Ryzen 3 1300X, 16GB Crucial Ballistix 2400MHz DDR4 memory, and a GTX 1060 SC. It's been a rock solid board so far, and it has two PCI-E storage slots (one is NVMe, the other is SATA) so you can use it comfortably in a case with limited storage options.
I was nervous about it after I read some reviews on Newegg talking about stability issues, but it turned out pretty much all of those people were trying to overclock it far beyond its rated capabilities. It's perfectly stable if you don't try to burn it up on purpose.
Seriously. It's now obvious why Intel is using AMD graphics. Considering that it's mostly on par (sometimes faster, sometimes slower) with a GT 1030, a $100 GPU that uses 30 watts on its own, Intel made the right choice using Vega.
Wow, that's some impressive numbers for the price point (either of them). I think the R5 2400G would cover the vast majority of users' CPU and GPU needs to the point where they wouldn't notice a difference from anything more expensive. Anyone short of a power user or hardcore gamer could buy one of these and feel like they'd bought a real high-end system, with a $169.99 CPU. That's real value. I kinda want one to play around with, I don't know how I'll justify that to myself... Maybe I'll give it to my father next Christmas.
Was hoping to see GPU OC performance and power; it won't scale great unless the memory controller can take faster sticks (than Summit Ridge), but we still need to figure it all out.
Anyway, they do deliver here for folks that can't afford discrete or have other reasons to go with integrated. Even the 2400G is OK if one needs 8 threads.
Where is the i5-8400 that has the same price as the 2400G? Oh, yeah, they totally left it out of the benchmarks since it would have proved the absolute supremacy of the Intel offering. Oops.
"Where is the i5-8400 that has the same price as the 2400G? Oh, yeah, they totally left it out from the benchmarks since it would have proved an absolute supremacy of the Intel offering. Ops."
In which benchmarks do you expect to see the i5-8400 prove its "absolute supremacy" where the i5-7400 didn't? Seriously, I'd like to know.
Because what I see is either the i5-7400 beating the 2400G or going punch to punch with it, or being thoroughly decimated by it.
If the i5-7400 beats or competes with the 2400G, the i5-8400 refresh chip will do the same. If the i5-7400 gets trounced by the 2400G, the i5-8400 refresh chip isn't suddenly and magically going to beat it.
I fail to see anything in the article to indicate a pro-AMD bias on AT's part, either intentional or unintentional.
What I do see is a fanboy who's upset to see his team losing some benchmarks.
Fair point, and my apologies. I keep forgetting that they upped the i5's to 6 cores after a decade of 4c4t i5's (including the 4690K I currently use).
That being said, the i5-8400 itself is the same price as the 2400G, but getting the i5-8400 running is not the same price as getting the 2400G running. The 2400G was tested on an MSI B350I Pro AC (https://www.anandtech.com/show/12227/msi-releases-...), which is new and doesn't yet have a publicly-known MSRP, but is built and featured like other $70-80 B350 motherboards. What motherboards are on the market today for $70-80 that support the i5-8400?
So we've taken into account the additional 2 cores and the subsequent boost to the CPU-focused benchmarks, which the 7400 sometimes lost and sometimes won against the 2400G, and put a couple small notches into the 8400's belt. For another 50 bucks or so on the motherboard just to use the 8400, that's not too bad I suppose. It's what I would expect pitting a 6c6t CPU against a 4c8t CPU in CPU benchmarks. It's certainly not "absolute supremacy" but it's something, right?
Were you expecting that "absolute supremacy" to show up in iGPU gaming? I'll just laugh about that and move on.
Sure, the 8400 could probably step past the 2400G in gaming and graphics if you paired it with a $120-or-so graphics card (assuming you can find one at $120 or so), but then you're comparing a dGPU to an iGPU and you're still only barely stepping past.
So the only real way to make the 8400 show "absolute supremacy" over the 2400G is to cherry-pick just the benchmarks you like, and bolster the 8400 with another $200 of additional hardware.
No, it's not. Versus the 8400, it's a mixed bag. For programs that favor Intel CPUs there is a clear advantage. For programs that favor AMD, the advantage swings the other way. For everything else that's generally processor-agnostic, they tie, pull ahead slightly, or get beaten relatively evenly in terms of CPU performance.
Now, GPU-wise, the 8400 gets crushed. It's obvious that's going to happen.
If you plan on getting a dGPU with some beef, either is good. If you're looking to game on the cheap, which is the target of the AMD processor in this review, it's the hands-down winner: comparable performance, but with a beefier iGPU that can hang with a 1030. Also, it gives you the option of adding a dGPU later when you need more grunt. It's clearly the better buy this go-around. No other site that I've seen has argued against this.
Are these going to get a 12nm refresh, like all the other Ryzen CPUs? I am thinking of upgrading to either an i5-8400 or an R5 1600/1700, or possibly a 2400G... decisions, decisions...
That will be fixed when lower tier 300-series chipsets launch. However, it's a significant problem for those wanting to build a cheap setup until then.
I used the chips I had on hand for the tests and forgot to add already-tested chips - we haven't tested the i5-8400 IGP, but the CPU results are on hand in Bench. I can add those results to the graphs when I get a chance.
Ian, I don't know if this is just when browsing from a phone, but in Bench, when listing CPUs alphabetically, because of the chip names (~Lake, etc.) the listing jumps all over the place: 8-series before 4-series, then 7-series. Can y'all fix this? Thanks.
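For what it's worth, that jumbled ordering is what a plain string sort of model names produces; a numeric-aware "natural" sort key fixes it. A minimal Python sketch with a few example names (illustrative only, not Bench's actual code):

```python
import re

def natural_key(name):
    # Split into text and number chunks so model numbers compare as integers
    # instead of character by character.
    return [int(chunk) if chunk.isdigit() else chunk.lower()
            for chunk in re.split(r"(\d+)", name)]

cpus = ["Core i7-8700K", "Core i7-920", "Core i7-4790K"]

print(sorted(cpus))                   # plain sort: 4790K, 8700K, 920 (the old i7-920 lands last)
print(sorted(cpus, key=natural_key))  # natural sort: 920, 4790K, 8700K
```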
Well, not great, but it can still run a RAID controller off the CPU lanes and a single port of 10GbE from the chipset, or run dual-port 10GbE from the CPU and a lower-end SATA HBA from the chipset's PCIe x4 with software RAID. The 2200G could make a decent storage server with a decent B350 board. I could do more with 16 lanes, but 8 is still workable. It's far cheaper than running a Ryzen 1200 with an X370 board and a graphics card, for the same number of lanes available for IO use and a faster CPU.
What's with the gaming benchmarks... Is there a valid reason that no games were benchmarked at playable settings? I'm going to have to go to another site to find out if these can get 60-ish fps on medium or low settings... And I thought these were being pitched at esports, so some Overwatch and Dota numbers might have been appropriate.
"and can post 1920x1080 gaming results above 49 FPS in titles such as Battlefield One, Overwatch, Rocket League, and Skyrim, having 2x to 3x higher framerates than Intel’s integrated graphics. This is a claim we can confirm in this review."
"These games are a cross of mix of eSports and high-end titles, and to be honest, we have pushed the quality settings up higher than most people would expect for this level of integrated graphics: most benchmarks hit around 25-30 FPS average with the best IGP solutions, down to 1/3 this with the worst solutions. The best results show that integrated graphics are certainly capable with the right settings, but also shows that there is a long way between integrated graphics and a mid-range discrete graphics option."
I would love to see which settings give BF1 49 FPS, please. Is it with everything on low, or medium?
Typo page 1: "Fast forward almost two years, to the start of 2018. Intel did have a second generation eDRAM product" The linked article is from 2 May 2016, not the start of 2018.
This is true - technically we were sampled a different DDR4-3200 kit to be used. Normally our policy here is to use the maximum supported DRAM frequency of the processor for these tests - in the past there has been a war of words, from readers and companies, when reviews do not. When we do our memory scaling piece, it'll be with a wide range of offerings.
Hey Ian, thanks for the great review. I think your Cinebench-1T scores should be higher, in the 151-160 range for the 2200G and 2400G respectively. AMD pushed a microcode update through BIOS to testers very very late last week. A lot of the changes significantly boosted single-thread performance in general, even in some games. Did you folks end up getting this?
I only started testing with the new BIOS: can you confirm the difference is on both the motherboards AMD sampled? Some got MSI, others got GIGABYTE. We had MSI.
Ah okay. I believe it should have been updated on both MSI and Gigabyte...at least, I was told it should have landed on both platforms for standardization.
I would've loved to see you compare the 2400G against the i7-5775C with regards to 1080p gaming, as I can play games like Borderlands, WoW or Diablo in 1080p with medium settings on my Broadwell Iris graphics just fine.
If the 2400G doesn't allow for higher graphics settings than the i7-5775C, then I don't really see it taking the crown for integrated graphics. Intel is just too stoopid to use what they have, it seems.
When that i7 costs less than $150, then it will make the chart. At the price point it's at, I can buy one of these chips and a $200 graphics card and run laps around the i7 all day.
Your i7-5775C isn't even as fast as an old Kaveri A10 with 512 GCN2 SPs (it's close, but no cigar), so vs Vega 8 & 11 it gets its ass absolutely handed to it... like, by a lot - https://youtu.be/sCWOfwcYmHI
When I look at all the available benchmarks so far, there's nothing this chip can play that I can't already play with my 5775C. 1080p with medium settings is no problem for most games like Overwatch, Borderlands, WoW, Diablo, etc. So if the 2400G can't run them at high settings, as it looks like it can't, then I see no reason to call it the king of integrated graphics, really.
How on God's green Earth can you compare a $600+ CPU to the 2400G? The whole point of an iGPU is to be cheap. The 2400G outperforms a CPU that costs over 3x as much in the exact area this chip was built for: low-end gaming.
Back when I bought it, the Euro and the Dollar were almost 1:1, and to get the Dollar price you need to subtract the 24% VAT I pay over here, so yeah, back then it was around $300. Hell, the Intel list price was $328.
So what you're saying is that you paid twice the money to have under half the graphics performance and 20% lower CPU performance of a 2400G.
Graphics-wise the 5775C was pretty bad and got beaten by ALL AMD APUs at the time. It was close but it was never very good. Time has not been kind to it.
I noticed with some sadness that there are no DOTA 2 benchmarks. Was this due to time constraints or unforeseen issues? I'm crossing my fingers that DOTA 2 hasn't been dropped for good, as it's a great benchmark for silicon such as this, though the other benchmarks of course do let us ballpark where it would land.
Mistake on the Blender benchmark. The latest version is 2.79 but you've put "2.78". Being as you also have a nightly build you might even have 2.8 if you've got it from the 2.8 nightly branch. Either way you will have at least 2.79.
Great article. However, because I am a pervert, I would LOVE to see some heterogeneous GPU action going on. "Does an AMD 2400G and an NVIDIA 1050 make a baby that is like a 1050 Ti? What about if it mated with a Vega 56 or a 580?" Know what I mean? [Nudge-nudge] Know what I mean?
Heterogeneous would be an APU, not CrossFire. As far as AMD's plans with HSA go, who knows? They haven't been talking about it much since Zen came out. Maybe they don't need it now that their single-thread performance is competitive?
I think I have to make this clear: the quoted processor (Core i7-8809G) will crush the Ryzen 5 2400G, and some other cheaper models in its series will also perform better; the superiority just won't be as great in the test results, but it will be there in terms of price/performance.
I'll be holding on to my Haswell for another year or two. Fingers crossed for a 7nm quad core (6 core maybe???) with HT and Vega 16 (or 18) APU. When that's out, I'll be upgrading promptly, both laptop and desktop machines.
The article mentioned it's going to compare against the A12-9800, but this APU is nowhere to be seen in the benchmarks. Then out of nowhere comes the A10-7870K, which is fine I guess, but then there's the A10-8750, which doesn't exist. I can assume it's meant to be the 7850, yet a 7850 non-K APU doesn't exist either, so what's happening here?
Will there be any analysis of current and potential future HTPC performance? While it won't support Netflix 4K or UHD Blu-ray (yet; thanks, PlayReady 3.0), I for one would still like to know how it handles HDR for local media playback and YouTube, and whether it will have the CPU grunt to software-decode AV1.
It is meant to support up to 4K H.264/H.265 at 30/60/120 FPS for 4K/1440p/1080p resolutions respectively. Obviously it'd be nice to see people testing this out, and the quality of the resulting video.
Still not quite getting the point of this product. Back when it made sense to build an HTPC, I liked the idea of the Bulldozer-era APU, so that I could play games on the TV without having a noisy gaming rig in the living room. But the performance is just never quite there, and it looks like it will be some time before you can spend ~$400 and get 4K gaming in the living room. So why not just buy an Xbox One X or PS4? I also bought a Shield TV recently for $200 and that streams games from my VR/4K rig just fine onto the TV. I'm just not seeing the need for a budget product that's struggling at 1080p and costs about the same as a 4K console.
There are 7+ billion people on this planet and the vast majority of them will never be able to afford a console or to pay a single cent for software - consoles are cheap because they screw you on the software side. Versus the global average, you are swimming in money. And of course the majority of the PC market is commercial, as consumer has been declining hard this decade. Most humans can barely put food on the table, if that, and even a $200 TV is a huge investment they can afford once every 15 years.
I don't get the idea of desktops, except if you want the ultimate gaming PC: go with a high-end CPU along with a high-end GPU. Otherwise go mobile. You can pretty much go that route unless you desire extreme top-end performance.
If you're primarily into gaming, get an Xbox One X or S (HDTVs are cheap) or a PS4.
But lower-end desktop PCs? I see no need for them now. Times have changed.
To me laptops are annoying, and only convenient for basic tasks because of their mobility. Otherwise they are slow, have a small screen, often don't have a mouse, and have no number pad on the keyboard. As a result, typing is slower, pointing is slower, app speed is slower, and gaming performance is worse. With the smaller screen, juggling things, dragging files, etc. is more difficult. I just can't get stuff done as well on a laptop as on a desktop.
They use the officially supported frequencies; anything above that is OC and would fall into the OC section. It's debatable how right or wrong that is, but that's what AT does.
Good to see you started testing CPUs with the maximum supported RAM speed instead of the JEDEC frequency. These APUs would have really suffered if tested with 2133MHz DDR4 RAM.
For a low-end graphics part like this, it would be really interesting to have a section in the review exploring the "comfortable" settings in various games.
It could be really useful information for potential buyers to know what kind of settings they'd need to run in a game to reach their preferred performance level (99th percentile), whether it's 30, 45 or 60 fps, and also to know if a product simply can't reach a certain performance level no matter how low you turn the settings.
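To make the request above concrete, here is a rough Python sketch of checking a preferred performance target from per-frame render times. The frame times below are made-up placeholders, not data from this review:

```python
# Made-up frame times in milliseconds; real data would come from a frame-time log.
frame_times_ms = [28.0, 30.5, 27.2, 33.1, 29.8, 55.4, 31.0, 26.9, 30.2, 62.3]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# 99th-percentile frame time: the value near the slow end that 99% of frames beat.
times_sorted = sorted(frame_times_ms)
idx = min(len(times_sorted) - 1, round(0.99 * (len(times_sorted) - 1)))
p99_fps = 1000.0 / times_sorted[idx]

for target in (30, 45, 60):
    verdict = "meets" if p99_fps >= target else "misses"
    print(f"{target} fps target: {verdict} (avg {avg_fps:.0f} fps, 99th percentile {p99_fps:.0f} fps)")
```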
Why do you only report total power consumption? I'd like to see power efficiency!!! Since I don't know what each CPU's performance is alongside its power draw, these power measurements mean almost nothing. Also, the efficiency will change with the workload, so Prime95 is a very one-dimensional test of efficiency. Look at your power measurement graphs: they tell you what we already know - single-core speeds are lower for Ryzen, and lower-TDP CPUs use less power. That's kinda duh...
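The metric the comment above is asking for is simple in itself: benchmark score divided by average power during the run. A minimal sketch with placeholder numbers (not measurements from this review):

```python
def perf_per_watt(score, avg_power_w):
    # Efficiency: work done per unit of power; higher is better.
    return score / avg_power_w

# Hypothetical scores and average package power draws, purely for illustration.
chips = {
    "Chip A (65 W class)": {"score": 820.0, "power_w": 62.0},
    "Chip B (35 W class)": {"score": 610.0, "power_w": 34.0},
}

for name, d in chips.items():
    print(f"{name}: {perf_per_watt(d['score'], d['power_w']):.1f} points/W")
```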
I'm confused. The AMD vs. AMD section led me to believe there was going to be a comparison of Raven Ridge against Bristol Ridge APUs, which makes sense as it would have allowed the use of the same motherboard for both APUs, even if the Bristol Ridge DDR4 memory was clocked slower. But then the actual benchmarks show Kaveri parts?
Comparing with a competitive Intel platform with a dGPU is kinda tricky right now. It's not just the dGPUs that are ridiculously priced right now; RAM is too. And to maximize performance on the R5 2400G, you WILL need to spend more than the basic $90 8GB 2400 kit. The cheapest 16GB Samsung B-die 3200 kit I found was $220. And you will want to go with a 16GB kit, because some newer games already use more than 8GB, and they use MORE when paired with graphics cards that have less than 4GB. The iGPU takes some of that 8GB for itself... if it runs out of system RAM, it has to use your system disk... enjoy the single-digit frame rates...
Here is what I found on Newegg:
INTEL
$130 - i3-8100
$90 - 8GB 2400 RAM (or $170 for 16GB)
$120 - Z370 motherboard (no mainstream chipset YET)
=== $340
The Intel system is a full $130 cheaper (or $50 if you spring for the 16GB), and that gap will only increase with the upcoming cheaper chipsets and/or upcoming Coffee Lake models. Now, I haven't included the dGPU yet, but the GTX 1050 2GB currently goes for $150, making the Intel system total only $20 more than the AMD system, and running rings around it in most games (although neither would be ideal for the latest games... the 2GB-GPU/8GB-sysRAM Intel system would run out of memory and the Vega 11 just doesn't have the horsepower).
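Putting the arithmetic from the last two comments in one place: the Intel prices are the early-2018 Newegg figures quoted above, and the AMD build is my reading of the parts implied in the thread (2400G, a ~$80 B350 board, the $220 16GB 3200 kit), so treat that itemization as an assumption:

```python
# Early-2018 prices quoted in the comments above, not current prices.
intel_8gb  = {"i3-8100": 130, "8GB DDR4-2400": 90,   "Z370 board": 120}
intel_16gb = {"i3-8100": 130, "16GB DDR4-2400": 170, "Z370 board": 120}
# Assumed AMD build implied by the thread.
amd_16gb   = {"R5 2400G": 170, "16GB DDR4-3200": 220, "B350 board": 80}

for label, parts in (("Intel, 8GB", intel_8gb),
                     ("Intel, 16GB", intel_16gb),
                     ("AMD, 16GB", amd_16gb)):
    print(f"{label}: ${sum(parts.values())}")
# -> $340, $420 and $470: the $130 and $50 gaps mentioned above, before adding any dGPU.
```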
What would put things in favor of AMD would be if they made it clear that the iGPU would still be of use when using a dGPU (such as with the new "TrueAudio Next") in the future.
What I would REALLY like to see, though, is AMD pairing the beefier Vega GPU Intel is getting with their own 12nm Zen+ chips and slapping on some HBM memory. THAT I could go for.
1. HBM on budget chips pushes them into the $250 range just by adding HBM.
2. IGP solutions are not a GTX 1060 replacement; it's not magic.
System comparison: the i3-8100 is inferior in CPU tasks. It has half the memory (Intel's IGP still uses system memory, you know? It's configurable on both systems). The benchmarked system runs at 2933, not 3200. And it has an inferior GPU, roughly 3x slower; even at the same memory speed it would still be about 2x as slow, on top of the slower CPU.
So what is the point of the argument? You don't need the extra memory frequency, but if you want to replace an $80 dedicated GPU you do, and you definitely need to add 8GB of extra memory, and that's where the cost comes in. As a valid comparison: subtract $20 from the AMD side for 2933 RAM and add $80-100 for a GT 1030 on the Intel side, and you still end up with an Intel rig with higher system power consumption, equal gaming performance and an inferior CPU, and you will have to buy a G-Sync monitor if you want tear-free gaming, while FreeSync is thrown at you at any price range as an added bonus.
The systems are comparable, and Intel's i3 line is destroyed, along with AMD's old R3 line too. The i5, i7 and R5 still stand tall, and the R7 has its place at times too.
Did you not even read my post? Or the review, for that matter? Did you think AT ALL about the real-life application of anything said before posting? "The i3-8100 is inferior in CPU tasks": WHAT tasks? I'll answer that for you: rendering. If you are trying to get the cheapest CPU possible for rendering, with as little RAM as possible to shave off as much money as possible... you are doing it wrong. Since we (or at least I am... not sure what you are going on about) are talking about GPUs, you can safely assume we are concerned about GAMES.
"It has half the memory." No s--t, Sherlock; read the post again (or for the first time, apparently).
I could go on, but apparently you were just scouring the posts for someone to disagree with, armed with a pre-defined little rant, so I won't bother.
Why do you need 16GB? If you bought a Ryzen APU, you probably only play e-sports titles anyway, and some older games... E-sports titles don't need huge amounts of RAM. And it already crushes the Intel counterparts, both in performance and price.
You guys from first-world countries are always complaining. Jeez. Try living in poorer places like Southeast Asia.
An APU at $100 would still be expensive, especially when people in the developing world build machines with Pentium G chips. The speedy APU graphics would negate the need for a low-end discrete GPU though.
Well, not really. While some use a Pentium G with a GT 730 or lower, many use AMD A-series APUs too (since they don't need a low-end discrete GPU to be on par).
And the Ryzen 2200G is also priced about the same as a Pentium G with a GT 730, though. The exception is RAM prices...
If AMD uses a beefier Vega iGPU, the question is whether you are willing to pay for it. I feel an iGPU only makes sense if the price is low, or if the power consumption is low. Where Intel is using AMD graphics, it is likely for a fruity client. Outside of that, you won't see many manufacturers using it because of the cost. For the same amount of money Intel is asking for the chip alone, there are many possible configurations with dedicated graphics that you can think of. Also, the supposedly beefier AMD graphics is about as fast as a GTX 1050-class card. You are better off buying a GTX 1050 Ti.
I feel the R3 2200G is still a better deal than the R5 2400G. The price gap is too big relative to the difference in performance. And because these chips are overclocking-friendly, despite the R3 being a cut-down chip, there could be some performance catch-up with some overclocking. Overall, I feel both are great chips, especially for some light/casual gaming. If gaming is the mainstay, then there is no substitute for a dedicated graphics solution.
The 2200G is a sweet deal because it offers most of the 2400G's performance at a sub-$100 price point. For most business and home desktops, it's more than enough for both CPU and GPU performance. And with discrete GPUs being so hard to get now, good-enough APU graphics will do for the majority of home users. Hopefully AMD can translate all this into actual shipping machines.
I'm going to sound like a broken record but AMD could send another boot up Intel's behind by making an Atom competitor. A dual-core Zen with SMT and cut-down Vega graphics would still be enough to blow Atom out of the water.
Looks like AMD owns the good-enough category. As I said previously, let's hope this translates into actual machines being shipped, seeing as OEMs previously made some terrible AMD-based systems at the low end.
Finally, one review where I can see the driver version... So this is the same driver also used for the Ryzen mobile APUs. Can you check if you can force/manually install the latest Adrenalin drivers? That works on some of the Ryzen 2500U chips and actually increases performance by some 15+%...
I was considering the GT 1030 + Intel route for H.265 and HDR10 playback and was really looking forward to Zen APUs, but there don't seem to be any motherboards with HDMI 2.0?!
Also, I wonder if the chips can be undervolted and underclocked to bring them to a near silent noise level for the living room.
What most writers and critics of integrated graphics processors, such as AMD's APUs or Intel's iGPs, all seem to forget is that not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan, etc. Not everyone can afford bleeding-edge gaming PCs or laptops. Food, housing and clothing must come first for 80% of the population of the world.
An APU can grant anyone who can afford at least a decent basic APU the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again, these same folks who are happy to have an APU can barely afford a 750p CRT monitor, much less a 4K flat screen.
This simple idea is huge, not only for the laptop and PC market, but especially for game developers, who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody, as broader markets help reduce the cost of development.
This, in fact, was the whole point behind AMD's release of Mantle, and Microsoft's and the Khronos Group's release of DX12 and Vulkan respectively.
Today's AMD APU has all of the power of a GPU add-in board from not more than a few years back.
Because these CPUs, while in the same price range, outperform these Raven Ridge chips. That would have been bad press for AMD, and it seems like AnandTech wants to remain extremely loyal to AMD these days.
"the data shows both how far integrated graphics has come, and how far it still has to go to qualify for those 'immerse experiences' that Intel, AMD, and NVIDIA all claim are worth reaching for, with higher resolutions and higher fidelity. "
This assumes a static situation, which is rot.
What it reveals is that in the current paradigm, coders have coded accordingly for satisfactory results. If the paradigm changes and other ways work better, then the code evolves.
This unprecedented integration of new-gen, sibling CPU & GPU offers many performance upsides for future code too.
Picture a mobo with a discrete GPU like an equivalent 1030, then picture a ~matchbox-footprint APU: there is a huge difference in the size of the respective circuits, yet they both do the same job and have to send a lot of data to each other.
It's not hard to figure out which is inherently superior in many ways.
They do. It's called Vega. Very efficient in the mid to low range and in compute, and if I'm not mistaken that's where the money is. High-end gaming is just wi**ie waving for us geeks.
Keep in mind this is a mobile chip; this new mobile chip is quite powerful. I'm thinking of actually getting one; my only big concern is compatibility with the Vega chip.
Any idea where I could buy the MSI B350I Pro AC? I have searched every retailer I've ever bought from and can not find the damn thing. I'm hoping it can run a 2400G out of the box, at least to update to the newest BIOS.
They REALLY should not have cut back the L3 cache SO MUCH... beyond that, they truly are amazing for what they are... they should have also made a higher-TDP version, such as 125-160W, so they could cram in more CPU cores or at the very least a more substantial graphics portion, and not limit dGPU access to 8x PCIe (from what I have read).
Graphics cards and memory are anything but low cost.
2200 IMO is "fine" for what it is, the 2400 should have had at least 4mb l3 cache (or more) then there should have been "enthusiast end" with the higher TDP versions so they could more or less ensure someone trying to do it "on a budget" really would not have to worry about getting anything less than (current) RX 570-580 or 1060-1070 level.
many cpu over the years (especially when overclocked) had a 140+w TDP, they could have and should have made many steps for their Raven Ridge and not limit them so much..IMO...they could have even had a frankenstein like one that has a 6pin pci-e connector on it to feed more direct power to the chip instead of relying on the socket alone to provide all the power needed (at least more stable power)
AM4 socket has already been up to 8 core 16 thread, and TR what 16 core 32 thread says to me the "chip size" has much more room available internally to have a bigger cpu portion and/or a far larger GPU portion, now, if they go TR4 size, TR as it is already has 1/2 of it "not used" this means they could "double up" the vega cores in it to be a very much "enthusiast grade" APU, by skimping cost on the HBM memory and relying on the system memory IMO there is a vast amount of potential performance they can capture, not to mention, properly designed, the cooling does not really become an issue (has not in the past with massive TDP cpu afterall)
anyways..really is very amazing how much potency they managed to stuff into Raven Ridge, they IMO should not have "purposefully limited it" especially on the L3 cache amount, 2mb is very limiting as far as I am concerned especially when trying to feed 4 core 8 thread at 65w TDP alojng with the gpu portion.
Either they are asking a bit much for the 2400g or, they are asking enough they just need to "tweak" a bit more quickly to make sure it is not bottlenecking itself for the $ they want for it ^.^
either way, very well done....basically above Phenom II and into Core i7 level performance with 6870+ level graphics grunt using much less power...amazing job AMD...Keep it up.
Both these APUs are extremely attractive. The R5 just screams upgradable. You get a very capable 4-core/8-thread CPU packaged with an entry-level dGPU for less than the competition charges for the CPU (with an abysmal iGPU) alone. In the current market with astronomical, even comical, dGPU prices, this is a clear winner for anyone wanting to build a powerful mid-tier system who doesn't have the means to fork out ridiculous cash for a higher-tier dGPU now.
The R3 screams HTPC or small gaming box: a good low-end CPU paired with a bare-bones but still decently performing iGPU. Add a motherboard, RAM, PSU, and HDD/SSD and you're good to go. I imagine these will sell like hotcakes in markets with lower overall GDP and in the brick-and-mortar retail market.
The question now is: is Intel ever going to produce a decent iGPU for the low-end market? They've had plenty of time to do so, but before Ryzen, AMD APUs just weren't that attractive. Now, though, you really have to think hard for a reason to justify buying a low-end Intel CPU at all.
I have been doing some digging and found that although current-generation AM4 motherboards lack formal HDMI 2.0 certification, just like many HDMI 1.4 cables will pass an HDMI 2.0 signal seamlessly without a hitch, the same appears to be the case for these boards whose HDMI traces and connectors may indeed be agnostic to the differences, if any. Could you do a quick test to see if HDMI 2.0 signals work for the Raven Ridge APUs on the AM4 motherboards you have access to? For further reference on the topic, see this forum thread “Raven Ridge HDMI 2.0 Compatibility — AM4 Motherboard Test Request Megathread” at SmallFormFactor.
These chips will cause a major segment of low-end graphics cards to be buried, or have their prices cut down. I can't wait for the benchmarks, and to see how many generations will be affected.
What an amazing APU... this pushes all Celerons/Pentiums and half of the i3s completely out of the market! This APU is so good and so cheap, it kills all the entry-level GPUs and basically all those Intel CPUs mentioned. Way to go, AMD! Now, about those drivers and BIOS...
This is amazing by AMD; they need to keep the hits coming and keep chipping away market share from Intel. Intel has not made any progress on processor graphics in years; this is a super-aggressive strategy by AMD that I think will work. NVIDIA basically has to cancel any GPU line below the GT 1030 by next year. Hopefully AMD comes out with a GTX 1050 killer by next year.
Do these APUs not compete directly with the new Intel-Vega chips? Why did AMD give Intel Vega, when AMD could have had the APU market all to themselves?
Not really. The i7-8809G, as it's called, is supposedly a mobile part, if you can call a 100W TDP part mobile by any stretch of the imagination. It's based on a Vega with 22 (or 24) CUs and will be much, much more expensive. Expect it to land somewhere in the $400+ range, i.e. more expensive than an 8700K.
Hi Ian. Two questions: I have to build a cheap PC for video editing (Adobe Premiere, After Effects, etc.) and I thought about a Ryzen 3 1200 and a GeForce GT 1030. Do you think this Ryzen 5 2400G would work for video editing given its Vega 11? Another question: using DDR4-2400, would I see a big performance drop, or would it be okay? Thank you!
A question for Ian or anyone who knows: does the 2400G not support HW encoding of H.264/H.265, and only limited decoding? Or is it that Handbrake doesn't support it yet? Looking at the encoding score, it would seem the 8400 is miles ahead, even though the GPU in the Ryzen is much stronger.
Hi. Do you know the max video RAM? Will it be possible to use CrossFire between the integrated graphics (using shared RAM) and a graphics card? If CrossFire is limited to Vega graphics cards, I hope AMD will introduce a cheap graphics card with just 8 or 11 Vega compute units and maybe HDMI 2.0.
On the subject of allocated VRAM: it should be set as low as possible. The minimum is 64MB, and you should never set it higher. Since the VRAM is system RAM, there are no GPU performance gains to be had from setting it higher than the minimum and just letting Windows sort out the spill-over, but you are potentially limiting CPU performance a lot, as allocated VRAM eats available system RAM. So if you have 8GB but set VRAM to 2GB, the system only has 6GB remaining. This can seriously hurt performance in some cases. So, as little VRAM as possible is the correct setting.
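A quick illustration of the trade-off described above; the 64MB minimum and the 8GB/2GB example are from the comment, the rest is just arithmetic:

```python
total_ram_gb = 8.0

# UMA frame-buffer carve-outs in GB: the 64 MB minimum plus two common BIOS presets.
for uma_gb in (0.0625, 1.0, 2.0):
    remaining = total_ram_gb - uma_gb
    print(f"UMA frame buffer {uma_gb:g} GB -> {remaining:g} GB left for the OS and applications")
# -> roughly 7.94 GB, 7 GB and 6 GB respectively; the 2 GB case matches the
#    "8GB minus 2GB leaves 6GB" example above.
```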
For a Ryzen 5 2400G + X370 mATX motherboard: what 16GB (4x4GB) and 32GB (4x8GB) memory kits (brand + model) would easily overclock to much higher than DDR4-2933 for the best GPU performance?
(I'm looking at some 4000MHz kits but if that is not compatible it will be a waste.)
AnandTech reviews continue to show us what the graphics chips CAN'T do well instead of what they CAN do well. I've heard all the reasons before, and they don't help. Gaming benchmarks should be presented for PLAYABLE settings rather than can-just-barely-run-it settings, i.e. High/Ultra. No gamer will settle for jerky, stuttery gameplay, so why not show us what the items under review can do on, say, Medium settings, with a few High-setting benchmarks thrown in for good measure?
Gideon - Monday, February 12, 2018 - link
BTW Octane 2.0 is retired for Google (just check their github), and even they endorse using Mozillas Speedometer 2.0 (darn can't find the relevant blog post).Ian Cutress - Monday, February 12, 2018 - link
I know; in the same way we have legacy benchmarks up, some people like to look at the data.Not directed to you in general, but don't worry if 100% of the benchmarks aren't important to you: If there's 40 you care about, and we have 80 that include those 40, don't worry that the other 40 aren't relevant for what you want. I find it surprising how many people want 100% of the tests to be relevant to them, even if it means fewer tests. Optane was easy to script up and a minor addition, just like CB11.5 is. As time marches on, we add more.
kmmatney - Monday, February 12, 2018 - link
In this case, a few 720p gaming benchmarks would have been useful, or even 1080p at medium or low settings.III-V - Tuesday, February 13, 2018 - link
Who uses 720p and is in the market for this?PeachNCream - Tuesday, February 13, 2018 - link
I'm happy with 1366x768 and I'm seriously considering the 2400G because it looks like it can handle max detail settings at that resolution. I'm not interested in playing at high resolutions, but I do like having all the other non-AA eye candy turned on.atatassault - Tuesday, February 13, 2018 - link
People who buy sub $100 monitors.WorldWithoutMadness - Tuesday, February 13, 2018 - link
Just google GDP per capita and you'll find huge market for 720p budget gaming pc.Sarah Terra - Wednesday, February 14, 2018 - link
Wow, i just came here after not visiting in ages, really sad to see how far this site has fallen.Ian Cutress was the worst thing that ever happened to Anandtech.
At one point AT was the defacto standard for tech news on the web, but now it has simply become irrelevant.
Unless things change i see AT slowly but surely dying
lmcd - Friday, March 22, 2019 - link
Wow, I just came to this article after not visiting for ages, really sad to see how the comment section has fallenmikato - Thursday, February 15, 2018 - link
Me. My TV is 720p and still kicking after many years. These CPUs would make for a perfect high end HTPC with some solid gaming ability. Awesome.nevcairiel - Tuesday, February 13, 2018 - link
Some more realistic gaming settings might be nice. Noone is going to play on settings that result in ~20 fps, and the GPU/CPU scaling can tilt quite a bit if you reduce the settings.I can see why you might not like it, because it takes the focus away from the GPU a bit and makes comparisons against a dGPU harder (unless you run it on the exact same hardware, which might mean you have to re-run it every time), but this is a combined product, so testing both against other iGPU products would be useful info.
atatassault - Tuesday, February 13, 2018 - link
20 FPS is playable. I have a 2 in 1 with a Skylake i3-6100u, and 20 FPS is what it gets in Skyrim. Any notion of things being "unplayable" under 30/60 FPS is like an audiohile saying songs are unlistenable on speakers less than $10,000.lmcd - Tuesday, February 13, 2018 - link
Any notion of things being "unplayable" under 30/60 FPS is like an audiophile saying songs are unlistenable on speakers less than $100.Fixed it for you (FIFY).
nevcairiel - Thursday, February 15, 2018 - link
I rather reduce settings a bit to go up in FPS then look at 20 fps average. There often is many things one can turn off without a huge visual impact to achieve much better performance.29a - Saturday, October 26, 2019 - link
What a useless review. I came here to see if this thing can do some low end gaming and you didn't even test on 720p.Gideon - Tuesday, February 13, 2018 - link
Yes sorry, I didn't mean to nitpick. Just being a web developer myself dealing mosrly with frontend code, I just wanted to mention that Speedometer is actually considered to be fairly representative by both Mozilla and Google (and true enough the frameworks they use are actual frontend JS frameworks rendering TodoMVC) If you are already aware of that then that's excellent.richardginn - Monday, February 12, 2018 - link
An article looking at how memory speed affects FPS on the 2400G and 2200G is s must.I say you can 1080P game with this although it looks like for a bunch of games you will be on low settings
stanleyipkiss - Monday, February 12, 2018 - link
Check out Hardware Unboxed's review on YouTube. They did just that.beginner99 - Tuesday, February 13, 2018 - link
Yeah this review should have used medium or low settings, something that is actually playable on the CPUs tested. 25 fps might work for Civ6 but not a shooter.iter - Monday, February 12, 2018 - link
Not too shabby, 2-3x the igpu perf of intel and comparable cpu perf in the same price range. And it will likely pull ahead even further in the upcoming weeks as faster memory becomes supported.coolhardware - Monday, February 12, 2018 - link
I have been holding on to my Intel 2500K desktop for what seems like forever. It has been a trusty companion but with a TDP of 95W and a dedicated GPU pulling 100W+ I'm looking for something a little less power hungry. AMD seems to have what I've been looking for and the price is right :-)Amazon has the 2400G in stock, http://amzn.to/2BVzSSn and I think I'm going to bite the bullet!
PS does anybody have a mobo recommendation for pairing with the 2400G? (stability is my main concern, probably won't OC since the 2400G should be a nice step up from my 2500K)
zaza - Monday, February 12, 2018 - link
if already have a decent GPU it is better to get the Ryzen 1600 instead. it is only 10 or 20$ more but you will get two extra and 8 extra PCIe lanes. These APU only make sense as a placeholder to get something better, for example building a working PC, then add a dedicated GPU later.haukionkannel - Monday, February 12, 2018 - link
Or this will get you very good office computer, without ever needing external GPU...forgerone - Tuesday, February 13, 2018 - link
EXACTLY!!! This is the market for Ryzen with Vega. Business PC's and Laptops and also economy gaming for the markets that can not afford discrete GPU AIB.coolhardware - Monday, February 12, 2018 - link
Cool, thanks for the tip! How is discrete non-gaming (desktop, Photoshop) GPU power usage these days? I live off the grid and so energy efficiency is a big plus. I do not game much so (SC2 and some lower end Steam games).Also, any suggestions for motherboards for 1600 or 2400G? Again, stability is top criteria for me.
Last question, what's the max number of video outputs for the 2400G? Thx!
coolhardware - Monday, February 12, 2018 - link
PS I currently have a GTX 960. It does look like a step down versus the 2400G, as ~1030 (similar benchs to 2400G) is quite a bit lower speed than a 960:http://gpu.userbenchmark.com/Compare/Nvidia-GTX-96...
Cellar Door - Monday, February 12, 2018 - link
The 960 is 2-3x the performance of this.Samus - Monday, February 12, 2018 - link
GTX960 is a $200+ GPU. It's substantially faster than any integrated graphics and probably will be for the next few years.msroadkill612 - Tuesday, February 13, 2018 - link
" I live off the grid and so energy efficiency is a big plus." - apuS are exactly what you should be using.WorldWithoutMadness - Monday, February 12, 2018 - link
then might as well wait for ryzen+ version.Seriously AMD need to release something akin to NUC using the Raven Ridge. They can rake quite a lot of market with that. I will change my office's PCs with those, better GPU and comparable CPU.
Lolimaster - Monday, February 12, 2018 - link
I would get the Asus X370 pro and the G.Skill Flare X 3200 CL14 (ram is expensive no matter how "cheap" you wanna go)coolhardware - Monday, February 12, 2018 - link
Thank you for the recommendation!!! :-)kaidenshi - Tuesday, February 13, 2018 - link
I'm using the ASRock AB350M Pro4 with a Ryzen 3 1300X, 16GB Crucial Ballistix 2400MHz DDR4 memory, and a GTX 1060 SC. It's been a rock solid board so far, and it has two PCI-E storage slots (one is NVMe, the other is SATA) so you can use it comfortably in a case with limited storage options.I was nervous about it after I read some reviews on Newegg talking about stability issues, but it turned out pretty much all of those people were trying to overclock it far beyond its rated capabilities. It's perfectly stable if you don't try to burn it up on purpose.
Samus - Monday, February 12, 2018 - link
Seriously. It's now obvious why Intel is using AMD graphics. Considering that its mostly on par (sometimes faster, sometimes slower) with a GT 1030, a $100 GPU that uses 30 watts alone, Intel made the right choice using VEGA.Flunk - Monday, February 12, 2018 - link
Wow, that's some impressive numbers for the price point (either of them). I think the R5 2400G would cover the vast majority of users' CPU and GPU needs to the point where they wouldn't notice a difference from anything more expensive. Anyone short of a power user or hardcore gamer could buy one of these and feel like they'd bought a real high-end system, with a $169.99 CPU. That's real value. I kinda want one to play around with, I don't know how I'll justify that to myself... Maybe I'll give it to my father next Christmas.jjj - Monday, February 12, 2018 - link
Was hoping to see GPU OC perf and power, won't scale great unless the memory controller can take faster sticks (than Summit Ridge) but we still need to figure it all out.iter - Monday, February 12, 2018 - link
Most other sites' reviews feature overclocking and power.Ian Cutress - Monday, February 12, 2018 - link
I started an initial run with higher speed memory, but nothing substantial enough to put in the article just yet. I'm planning some follow ups.jjj - Monday, February 12, 2018 - link
Looking forward to all of that.Anyway, they do deliver here for folks that can't afford discrete or got other reasons to go with integrated. Even the 2400G is ok if one needs 8 threads.
Kamgusta - Monday, February 12, 2018 - link
Where is the i5-8400 that has the same price as the 2400G?Oh, yeah, they totally left it out from the benchmarks since it would have proved an absolute supremacy of the Intel offering.
Ops.
speely - Monday, February 12, 2018 - link
"Where is the i5-8400 that has the same price as the 2400G?Oh, yeah, they totally left it out from the benchmarks since it would have proved an absolute supremacy of the Intel offering.
Ops."
In which benchmarks do you expect to see the i5-8400 prove its "absolute supremacy" where the i5-7400 didn't? Seriously, I'd like to know.
Because what I see is either the i5-7400 beating the 2400G or going punch to punch with it, or being thoroughly decimated by it.
If the i5-7400 beats or competes with the 2400G, the i5-8400 refresh chip will do the same. If the i5-7400 gets trounced by the 2400G, the i5-8400 refresh chip isn't suddenly and magically going to beat it.
I fail to see anything in the article to indicate a pro-AMD bias on AT's part, either intentional or unintentional.
What I do see is a fanboy who's upset to see his team losing some benchmarks.
Kamgusta - Monday, February 12, 2018 - link
Ehm sir, 7400 is 4 core and 8400 is 6 core.Other reviews shows a 30% performance dominance of i5-8400 over the 2400G.
speely - Monday, February 12, 2018 - link
Fair point, and my apologies. I keep forgetting that they upped the i5's to 6 cores after a decade of 4c4t i5's (including the 4690K I currently use).That being said, the i5-8400 itself is the same price as the 2400G, but getting the i5-8400 running is not the same price as getting the 2400G running. The 2400G was tested on an MSI B350I Pro AC (https://www.anandtech.com/show/12227/msi-releases-... which is new and doesn't yet have a publicly-known MSRP, but is built and featured like other $70-80 B350 motherboards. What motherboards are on the market today for $70-80 that support the i5-8400?
So we've taken into account the additional 2 cores and the subsequent boost to the CPU-focused benchmarks, which the 7400 sometimes lost and sometimes won against the 2400G, and put a couple small notches into the 8400's belt. For another 50 bucks or so on the motherboard just to use the 8400, that's not too bad I suppose. It's what I would expect pitting a 6c6t CPU against a 4c8t CPU in CPU benchmarks. It's certainly not "absolute supremacy" but it's something, right?
Were you expecting that "absolute supremacy" to show up in iGPU gaming? I'll just laugh about that and move on.
Sure, the 8400 could probably step past the 2400G in gaming and graphics if you paired it with a $120-or-so graphics card (assuming you can find one at $120 or so), but then you're comparing a dGPU to an iGPU and you're still only barely stepping past.
So the only real way to make the 8400 show "absolute supremacy" over the 2400G is to cherry-pick just the benchmarks you like, and bolster the 8400 with another $200 of additional hardware.
"Absolute supremacy".
Manch - Monday, February 12, 2018 - link
No it's not.In regards to vs the 8400, its a mixed bag. For programs that favor Intel CPU's there is a clear advantage. For programs that favor AMD the advantage swings the other way. For everything else that's generally proc agnostic they tie, pull ahead slightly or gets beat relatively evenly in regards to CPU performance.Now GPU wise, it gets crushed. That's obvious that is gonna happen.
If you plan on getting a DGPU with some beef, either is good, If you looking to game on the cheap, which is the target of the AMD proc in this review, its the hands down winner. Comparable perf, but with a beefier iGPU that can hang with a 1030. Also it gives you the option of adding a DGPU later when you need more grunt. It's clearly the better buy this go around. No other site that Ive seen has argued against this.
dromoxen - Tuesday, February 13, 2018 - link
Are these going to get a 12nm refresh , as all the other ryzen cpus? I am thinking of upgrade either i58400 or r5 1600/1700 or possibly 2400g.. decision decisions ...Manch - Wednesday, February 14, 2018 - link
Originally it was labeled as 12nm, now referred to 14nm+.Probably will be updated.cheshirster - Monday, February 12, 2018 - link
You need Z370 for the "supremacy" to work.Ops.
bug77 - Monday, February 12, 2018 - link
That will be fixed when lower tier 300-series chipsets launch. However, it's a significant problem for those wanting to build a cheap setup until then.Ian Cutress - Monday, February 12, 2018 - link
I used the chips I have on hand for the tests, forgot to add already tested chips - we haven't tested the i5-8400 IGP, but the CPU results are on hand in Bench. I can add those results to the graphs when I get a chance.Manch - Monday, February 12, 2018 - link
Ian, I dont know if fhis is just when browing from a phone but the bench when listing CPUs while alphabetic, bc of the chips names ~lake, etc. The listing jumps all over the place. 8 series before 4 seriez then 7 series. Can yall fix this? ThanksAndrewJacksonZA - Monday, February 12, 2018 - link
Hi Ian. I'm still on page one but I'm so excited! Can a 4xx Polaris card be Crossfired with this APU?prtskg - Tuesday, February 13, 2018 - link
No crossfire supported by these apus, according to AMD. You can check it out on AMD's product page.dgingeri - Monday, February 12, 2018 - link
How many PCIe lanes are available on them? I didn't see that info anywhere in the article.iter - Monday, February 12, 2018 - link
Only 8dgingeri - Monday, February 12, 2018 - link
Well, not great, but it can still run a RAID controller off the CPU lanes and a single port of 10Gbe from the chipset, or run a dual port 10Gbe from the CPU and a lower end SATA HBA from PCIex4 from the chipset with software RAID. The 2200G could make a decent storage server with a decent B350 board. I could do more with 16 lanes, but 8 is still workable. It's far cheaper than running a Ryzen 1200 with a X370 board and a graphics card with the same amount of lanes available for IO use and a faster CPU.Geranium - Monday, February 12, 2018 - link
8 PCIe Gen3 for gpu+4 Gen3 for SSD+4 Gen3 for Chipset.andrewaggb - Monday, February 12, 2018 - link
What's with the gaming benchmarks... Is there a valid reason that no games were benchmarked at playable settings? I'm going to have to go to another site to find out if these can get 60ish fps on medium or low settings.... And I thought these were being pitched at esports... so some overwatch and dota numbers might have been appropriate.AndrewJacksonZA - Monday, February 12, 2018 - link
"and can post 1920x1080 gaming results above 49 FPS in titles such as Battlefield One, Overwatch, Rocket League, and Skyrim, having 2x to 3x higher framerates than Intel’s integrated graphics. This is a claim we can confirm in this review.""These games are a cross of mix of eSports and high-end titles, and to be honest, we have pushed the quality settings up higher than most people would expect for this level of integrated graphics: most benchmarks hit around 25-30 FPS average with the best IGP solutions, down to 1/3 this with the worst solutions. The best results show that integrated graphics are certainly capable with the right settings, but also shows that there is a long way between integrated graphics and a mid-range discrete graphics option."
I would love to see which settings BF1 would have 49FPS please. Is it with everything on low, medium?
Ian Cutress - Monday, February 12, 2018 - link
I've added some sentences to the IGP page while I'm on the road. We used our 1080 high/ultra CPU Gaming suite for two reasons.Manch - Monday, February 12, 2018 - link
Which are?jjj - Monday, February 12, 2018 - link
Playable means 30FPS and has been that for.. ever.AndrewJacksonZA - Monday, February 12, 2018 - link
Typo page 1: "Fast forward almost two years, to the start of 2018. Intel did have a second generation eDRAM product"The linked article is from 2 May 2016, not the start of 2018.
richardginn - Monday, February 12, 2018 - link
If only the ram used in the 2400G review kits was not so god damn expensive. It is more expensive than the CPU.Ian Cutress - Monday, February 12, 2018 - link
This is true - technically we were sampled a different DDR4-3200 kit to be used. Normally our policy here is to use the maximum supported DRAM frequency of the processor for these tests - in the past there is a war of words when reviews do not, from readers and companies. When we do our memory scaling piece, it'll be with a wide range of offerings.richardginn - Monday, February 12, 2018 - link
I certainly want to see the memory scaling piece before making a purchase.RBD117 - Monday, February 12, 2018 - link
Hey Ian, thanks for the great review. I think your Cinebench-1T scores should be higher, in the 151-160 range for the 2200G and 2400G respectively. AMD pushed a microcode update through BIOS to testers very very late last week. A lot of the changes significantly boosted single-thread performance in general, even in some games. Did you folks end up getting this?Ian Cutress - Monday, February 12, 2018 - link
I only started testing with the new BIOS: can you confirm the difference is on both the motherboards AMD sampled? Some got MSI, others got GIGABYTE. We had MSI.RBD117 - Monday, February 12, 2018 - link
Ah okay. I believe it should have been updated on both MSI and Gigabyte...at least, I was told it should have landed on both platforms for standardization.jrs77 - Monday, February 12, 2018 - link
I would've loved to see you compare the 2400G against the i7-5775C with regards to 1080p gaming, as I can play games like Borderlands, WoW or Diablo in 1080p with medium settings on my Broadwell Iris graphics just fine. If the 2400G doesn't allow for higher graphics settings than the i7-5775C, then I don't really see it taking the crown for integrated graphics. Intel is just too stoopid to use what they have, it seems.
nierd - Monday, February 12, 2018 - link
When that i7 costs less than $150, then it will make the chart. At the price point it's at, I can buy one of these chips and a $200 graphics card and do laps around the i7 all day.Cooe - Monday, February 12, 2018 - link
Here's an article with a bunch of graphs that include the i7-5775C, if you'd prefer to peep this instead of that vid. https://hothardware.com/reviews/amd-raven-ridge-ry...
Cooe - Monday, February 12, 2018 - link
Your i7-5775C isn't even as fast as an old Kaveri A10 with 512 GCN2 SPs (it's close, but no cigar), so vs Vega 8 & 11 it gets its ass absolutely handed to it... like, by a lot - https://youtu.be/sCWOfwcYmHI jrs77 - Monday, February 12, 2018 - link
When I look at all the available benchmarks so far, there's nothing this chip can play that I can't already play with my 5775C. 1080p with medium settings is no problem for most games like Overwatch, Borderlands, WoW, Diablo, etc. So if the 2400G can't run them at high settings, as it looks like it can't, then I see no reason to call it the king of integrated graphics, really.Holliday75 - Monday, February 12, 2018 - link
How on God's green Earth can you compare a $600+ CPU versus the 2400g? The whole point of iGPU is to be cheap. The 2400g out performs a CPU that costs over 3x as much in the exact area this chip was built for. Low end gaming.jrs77 - Monday, February 12, 2018 - link
$600?!? I paid €400 for my 5775C incl. 24% VAT. So that would be $300 then. And again: I can play games in 1080p with low to medium settings just fine, so I don't see a reason to upgrade.
acidtech - Monday, February 12, 2018 - link
Need to check your math. €400 = $491.jrs77 - Tuesday, February 13, 2018 - link
Back when I bought it, the Euro and the Dollar were almost 1:1, and to get the Dollar price you need to subtract the 24% VAT I pay over here, so yeah, back then it was around $300. Hell, the Intel list price was $328.SaturnusDK - Wednesday, February 14, 2018 - link
So what you're saying is that you paid twice the money to have under half the graphics performance and 20% lower CPU performance than a 2400G. Graphics-wise the 5775C was pretty bad and got beaten by ALL the AMD APUs at the time. It was close, but it was never very good. Time has not been kind to it.
SSNSeawolf - Monday, February 12, 2018 - link
I noticed with some sadness that there's no DOTA 2 benchmarks. Was this due to time constraints or unforeseen issues? I'm crossing my fingers that DOTA 2 hasn't been dropped for good as it's a great benchmark for silicon such as this, though the other benchmarks of course do let us ballpark where it would land.Ian Cutress - Monday, February 12, 2018 - link
That's in our GPU reviews; different editors with different benchmark sets. We're looking at unifying the two.SSNSeawolf - Monday, February 12, 2018 - link
Wonderful, that's understandable. Always appreciate the time you take to slog through the comments, Ian.HStewart - Monday, February 12, 2018 - link
It might be me, but unless you are a really serious gamer and need high-end performance, I see no reason to use a desktop CPU and GPU in today's world.Holliday75 - Monday, February 12, 2018 - link
That appears to be the case. This CPU would be my go to option for any family member wanting a PC these days. The flexibility it offers is remarkable.B3an - Monday, February 12, 2018 - link
Mistake on the Blender benchmark. The latest version is 2.79 but you've put "2.78". Being as you also have a nightly build you might even have 2.8 if you've got it from the 2.8 nightly branch. Either way you will have at least 2.79.milkod2001 - Monday, February 12, 2018 - link
Looks decent, but still 720p gaming at best. How far away are we from 40-50 fps 1080p gaming from an APU?richardginn - Monday, February 12, 2018 - link
Depending on the game you are going to play, you will need low settings to get 40-50 fps at 1080p from this APU.Yaldabaoth - Monday, February 12, 2018 - link
Great article. However, because I am a pervert, I would LOVE to see some heterogeneous GPU action going on. "Does an AMD 2400G and a nVidia 1050 make a baby that is like a 1050 TI? What about if it mated with a Vega 56 or 580?" Know what I mean? [Nudge-nudge] Know what I mean?Threska - Wednesday, February 14, 2018 - link
Heterogeneous would be an APU, not crossfire. Far as AMD's plans with HSA who knows? They're not doing much talk about it since Zen came out. Maybe they don't need it now that their single thread performance is competitive?Pork@III - Monday, February 12, 2018 - link
The Core i7-8809G will easily smash the Ryzen 5 2400G.anactoraaron - Monday, February 12, 2018 - link
If cost is no concern, then yes.Pork@III - Monday, February 12, 2018 - link
I think I have to make it clear: the quoted processor (Core i7-8809G) will crush the Ryzen 5 2400G, and some other, cheaper models in its series will also perform better; their superiority won't be as great in the raw test results, but it will be there in terms of the price/performance ratio.Manch - Monday, February 12, 2018 - link
Stfu trollHolliday75 - Monday, February 12, 2018 - link
I don't know any idiots that would buy that CPU to build a low end gaming rig that can still handle facebook and Office products. Worthless comment.lilmoe - Monday, February 12, 2018 - link
Welcome back AMD :) I'll be holding on to my Haswell for another year or two. Fingers crossed for a 7nm quad core (6 core maybe???) with HT and a Vega 16 (or 18) APU. When that's out, I'll be upgrading promptly, both laptop and desktop machines.
REALLY excited.
ToTTenTranz - Monday, February 12, 2018 - link
Thanks for the review! What are the system specs for the GT 1030 results? I can't find them in the review.
thevoiceofreason - Monday, February 12, 2018 - link
They need to release a variant with halved CPU clocks and TDP for HTPC use.Manch - Monday, February 12, 2018 - link
Can't you just undervolt and downclock it?lilmoe - Monday, February 12, 2018 - link
You don't need to halve the CPU clocks to reach half the TDP; you can keep 70-80% of the performance while halving the TDP. That would actually be very appealing at 35-40 watts.Manch - Tuesday, February 13, 2018 - link
It's funny you said that bc you're spot on in regards to the GE variants!jjj - Monday, February 12, 2018 - link
There was a leak over the weekend about GE SKUs at 35W and lower clocks.Lolimaster - Monday, February 12, 2018 - link
You don't need another model, just disable the higher-clocked P-states till you get the power consumption you want. I can lock my Athlon II X4 to 800 MHz if I desire.
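For anyone who wants to script that rather than dig through BIOS menus, here is a rough sketch of capping the maximum CPU frequency from software. It assumes a Linux host whose cpufreq driver exposes per-policy limits under sysfs (the exact paths, the supported frequencies, and whether the firmware honours the cap all vary by platform); on Windows the closest equivalent is lowering the "maximum processor state" in the power plan or using cTDP in the BIOS.

```python
# Rough sketch: cap every core's maximum frequency via the Linux cpufreq
# sysfs interface. Assumes a cpufreq driver is loaded; must be run as root.
from pathlib import Path

TARGET_KHZ = 2_000_000  # illustrative cap of ~2.0 GHz (value is in kHz)

for policy in sorted(Path("/sys/devices/system/cpu/cpufreq").glob("policy*")):
    max_file = policy / "scaling_max_freq"
    print(f"{policy.name}: scaling_max_freq was {max_file.read_text().strip()} kHz")
    max_file.write_text(str(TARGET_KHZ))  # the governor will not exceed this cap
```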
Lolimaster - Monday, February 12, 2018 - link
You can simply set a P-state for a lower base clock and also undervolt if you want to reduce power consumption even more. Or the lazy way: cTDP in the BIOS to 45W.Manch - Tuesday, February 13, 2018 - link
Manch - Tuesday, February 13, 2018 - link
Ask and ye shall receive: https://www.anandtech.com/show/12428/amd-readies-r...
Cryio - Monday, February 12, 2018 - link
This review kind of confused me. It mentioned it's going to compare against the A12-9800, but this APU is nowhere to be seen in the benchmarks.
Then out of nowhere comes the A10-7870K, which is fine I guess, but then there's the A10-8750, which doesn't exist. I can assume it's meant to be the 7850, yet a 7850 non-K APU doesn't exist either, so what's happening here?
Simon_Says - Monday, February 12, 2018 - link
Will there be any analysis on current and potential future HTPC performance? While it won't support Netflix 4k or UHDBR (yet, thanks Playready 3.0) I for one would still like to know how it handles HDR for local media playback and Youtube, and if it will have the CPU grunt to software decode AV1.Drazick - Monday, February 12, 2018 - link
Does the Ryzen have any hardware-based unit for video transcoding? Could you test that as well (speed and quality)?
It would be interesting, as this CPU could be heaven for an HTPC and for a NAS with multimedia capabilities.
Thank You.
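One rough way such a transcode test could be scripted, for anyone who wants to try it themselves: the sketch below assumes a Linux box with a VAAPI driver stack, an ffmpeg build that includes the h264_vaapi encoder, and that /dev/dri/renderD128 is the APU's render node - all of which are assumptions, not something verified in this review. Timing it against a plain software x264 encode of the same clip gives a speed comparison, and the two output files can be compared for quality.

```python
# Sketch: drive a VAAPI hardware H.264 transcode from Python and time it.
# Everything hardware-specific here (device node, encoder availability) is
# an assumption about the test system, not a confirmed capability.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-vaapi_device", "/dev/dri/renderD128",  # assumed render node for the iGPU
    "-i", "input.mkv",                       # hypothetical source clip
    "-vf", "format=nv12,hwupload",           # upload frames to the GPU
    "-c:v", "h264_vaapi",                    # hardware H.264 encoder
    "-b:v", "8M",
    "output_hw.mp4",
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"hardware transcode took {time.time() - start:.1f} s")
```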
GreenReaper - Wednesday, February 14, 2018 - link
It is meant to support H.264/H.265 at up to 30/60/120 FPS for 4K/1440p/1080p resolutions respectively. Obviously it'd be nice to see people testing this out, and the quality of the resulting video.gerz1219 - Monday, February 12, 2018 - link
Still not quite getting the point of this product. Back when it made sense to build an HTPC, I liked the idea of the Bulldozer-era APU, so that I could play games on the TV without having a noisy gaming rig in the living room. But the performance is just never quite there, and it looks like it will be some time before you can spend ~$400 and get 4K gaming in the living room. So why not just buy an Xbox One X or PS4? I also bought a Shield TV recently for $200 and that streams games from my VR/4K rig just fine onto the TV. I'm just not seeing the need for a budget product that's struggling at 1080p and costs about the same as a 4K console.jjj - Monday, February 12, 2018 - link
There are 7+ billion people on this planet and the vast majority of them will never be able to afford a console or to pay a single cent for software - consoles are cheap because they screw you on the software side.Vs the global average you are swimming in money.
And ofc the majority of the PC market is commercial as consumer has been declining hard this decade.
Most humans can barely put food on the table, if that and even a 200$ TV is a huge investment they can afford once every 15 years.
Pinn - Monday, February 12, 2018 - link
But $10 per day on cigarettes is fine?HStewart - Monday, February 12, 2018 - link
I don't get the idea of desktops except if you want the ultimate gaming PC - go with a high-end CPU along with a high-end GPU. Otherwise go mobile. You can pretty much go that route unless you desire extreme top-end performance. If you're primarily into gaming, get an Xbox One X or S (HDTVs are cheap) or a PS4.
But a lower-end desktop PC - I see no need for them now. Times have changed.
Lolimaster - Monday, February 12, 2018 - link
If you wanna upgrade a laptop, be prepared for a bunch of cabling.Have 3-4 drives on mobile?
Dedicated capture/sound card?
Keep your thermals in check?
Upgrade your cpu/apu whenever you like?
mikato - Saturday, February 17, 2018 - link
To me laptops are annoying, and only convenient for basic tasks with their mobility. Otherwise they are slow, have a small screen, often don't have a mouse, and have no number pad on the keyboard. As a result, typing is slower, pointing is slower, app speed is slower, and gaming performance is worse. With the smaller screen, juggling things, dragging files, etc. is more difficult. I just can't get stuff done as well on a laptop as on a desktop.oldschool_75 - Monday, February 12, 2018 - link
Why do the Intel systems have 32 gigs of RAM while the AMD systems only have 16? Also, Bulldozer was not 2 cores / 4 threads; it was two modules with two cores each sharing module resources, so 4 cores.
Lolimaster - Monday, February 12, 2018 - link
Why use 2933 memory? As far as I know, AMD sent 3200 CL14 Flare X kits to pretty much everyone for the sake of testing the GPU at 3200 CL14!
jjj - Monday, February 12, 2018 - link
They use the frequencies officially supported; anything above that is OC and would fall into the OC section. It's debatable how right or wrong that is, but that's what AT does.Lolimaster - Monday, February 12, 2018 - link
Guru3D got the reviewer's kit with 3200 CL14 Flare X, as did 100% of the techtubers.ScottSoapbox - Monday, February 12, 2018 - link
The number of typos in the first two sentences was enough for me to stop reading.Lolimaster - Monday, February 12, 2018 - link
The average L3 latency for the non-APU, multi-CCX Ryzens was around 11-12 ns; on the single-CCX APU it is around 9.5 ns.
Memory latency, Ryzen:
91 ns @ DDR4-2400
77 ns @ DDR4-3200
2400G:
66 ns @ DDR4-3200
Macpoedel - Monday, February 12, 2018 - link
Good to see you started testing CPUs with the maximum supported RAM speed instead of the JEDEC frequency. These APUs would have really suffered if tested with 2133MHz DDR4 RAM.Hurr Durr - Monday, February 12, 2018 - link
I don't care about these. I want to see how AMD is holding up in notebooks, the 15W range specifically.Hul8 - Monday, February 12, 2018 - link
For a low-end graphics part like this, it would be really interesting to have a section in the review exploring the "comfortable" settings in various games. It could be really useful information for potential buyers to know what kind of settings they'd need to run in a game to reach their preferred performance level (99th percentile), whether it's 30, 45 or 60 fps, and also to know if a product simply can't reach a certain level of performance no matter how low you turn the settings.
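For reference, the 99th-percentile figure mentioned above is straightforward to derive once per-frame times are logged; a small sketch follows, assuming a plain text log of frame times in milliseconds, one per line, as frame-time capture tools can export (the file name here is hypothetical):

```python
# Sketch: turn a log of frame times (milliseconds, one per line) into the
# average FPS and the 99th-percentile ("1% low") FPS figures reviews quote.
def summarize(path: str) -> None:
    with open(path) as f:
        frame_ms = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    # The 99th-percentile frame time sits near the slow end of the sorted list.
    idx = min(len(frame_ms) - 1, round(0.99 * (len(frame_ms) - 1)))
    p99_fps = 1000.0 / frame_ms[idx]
    print(f"average {avg_fps:.1f} FPS, 99th percentile {p99_fps:.1f} FPS")

summarize("frametimes.txt")  # hypothetical log file
```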
DrizztVD - Monday, February 12, 2018 - link
Why do you only report total power consumption? I'd like to see power efficiency! Since I can't relate the power draw to each CPU's performance, these power measurements mean almost nothing. Also, the efficiency will change with the workload, so Prime95 is a very one-dimensional test of efficiency. Look at your power measurement graphs: they tell us what we already know - single-core speeds are lower for Ryzen, and lower-TDP CPUs use less power. That's kinda duh...
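As a rough illustration of the efficiency metric being asked for here: dividing a benchmark score by the measured power draw gives a points-per-watt figure, which is trivial to compute once both numbers are published. The sketch below uses made-up placeholder values, not measurements from this review:

```python
# Sketch: performance per watt from a benchmark score and measured power.
# The scores and wattages below are illustrative placeholders only.
results = {
    "Chip A": {"score": 820.0, "watts": 118.0},
    "Chip B": {"score": 760.0, "watts": 95.0},
}

for name, r in results.items():
    print(f"{name}: {r['score'] / r['watts']:.2f} points per watt")
```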
JHBoricua - Monday, February 12, 2018 - link
I'm confused. The AMD vs. AMD section led me to believe there was going to be a comparison of Raven Ridge against Bristol Ridge APUs, which makes sense as it would have allowed the use of the same motherboard for both APUs, even if the Bristol Ridge DDR4 memory was clocked slower. But then the actual benchmarks show Kaveri parts?prtskg - Tuesday, February 13, 2018 - link
Kaveri was better at gaming/performance than Bristol. The latter had the advantage of efficiency.nwarawa - Monday, February 12, 2018 - link
Comparing with a competitive Intel platform with a dGPU is kinda tricky right now. It's not just the dGPUs that are ridiculously priced at the moment; RAM is too. And to maximize performance on the R5 2400G, you WILL need to spend more than the basic $90 8GB 2400 kit. The cheapest 16GB Samsung B-die 3200 kit I found was $220. And you will want to go with a 16GB kit, because some newer games already use more than 8GB, and they use MORE when paired with graphics cards that have less than 4GB. The iGPU takes some of that 8GB for itself... if it runs out of system RAM, it has to use your system disk... enjoy the single-digit frame rates... Here is what I found on Newegg:
INTEL
$130 - i3-8100
$90 - 8GB 2400 RAM (or 170 for 16GB)
$120 - Z370 motherboard (no mainstream chipset YET)
===
$340
AMD
$170 - R5 2400g
$220 - 16GB 3200C14 RAM
$80 - Motherboard(cheapest decently reviewed AM4)
===
$470
The Intel system is a full $130 cheaper (or $50 if you spring for the 16GB), and that gap will only increase with the upcoming cheaper chipsets and/or upcoming Coffee Lake models. Now, I haven't included the dGPU yet, but the GTX 1050 2GB currently goes for $150 - making the Intel system total only $20 more than the AMD system, and running rings around it in most games (although neither would be ideal for the latest games... the 2GB-GPU/8GB-sysRAM Intel system would run out of memory and the Vega 11 just doesn't have the horsepower).
What would put things in favor of AMD would be if they made clear that the iGPU will still be of use when a dGPU is installed (such as with the new "TrueAudio Next") in the future.
What I would REALLY like to see, though, is AMD use the beefier Vega iGPUs Intel is using with their own 12nm Zen+ chips and slap on some HBM memory. THAT I could go for.
oleyska - Monday, February 12, 2018 - link
Jeez.
1. HBM on budget chips pushes them into the $250 range just by adding the HBM.
2. IGP solutions are not a GTX 1060 replacement; it's not magic.
System comparison:
The i3-8100 is inferior in CPU tasks.
It has half the memory (Intel's IGP still uses system memory, you know; it's configurable on both systems).
The benchmarked system runs at 2933, not 3200, so o.0
It has an inferior GPU, by about 3x.
Even at the same memory speeds it would still be about 2x as slow, with a slower CPU.
So what is the point of the argument?
The extra memory frequency isn't needed, but if you want to replace an $80 dedicated GPU you do need it, and you definitely need to add 8GB of extra memory - that's where the cost comes in for a valid comparison (and subtract ~$20 from the AMD side for 2933).
Add $80-100 for a GT 1030 and you still end up with an Intel rig with higher system power consumption, equal gaming performance and an inferior CPU, and you will have to buy a G-Sync monitor if you want tear-free gaming, while FreeSync is thrown at ya at any price range as an added bonus.
The systems are comparable, and Intel's i3 line is destroyed, along with AMD's old R3 line too.
The i5, i7 and R5 still stand tall, and the R7 has its place at times too.
nwarawa - Monday, February 12, 2018 - link
Did you not even read my post? Or the review for that matter? Did you think AT ALL about the real life application of anything said before posting? "I3 8100 is inferior in cpu tasks" WHAT tasks? I'll answer that for you. Rendering. If you are trying to get the cheapest CPU possible for rendering with as little RAM as possible to shave as much money off as possible... you are doing it wrong. Since we (or at least I am... not sure what you are going on about) are talking about GPUs, you can safely assume we are concerned about GAMES."it has half the memory" no s--t, sherlock, read the post again (or for the first time, apparently)
I could go on, but apparently you were just scouring the posts for someone to disagree with with a pre-defined little rant, so I won't bother.
Fritzkier - Monday, February 12, 2018 - link
Why do you need 16GB? If you bought a Ryzen APU, you probably only play e-sports titles anyway, and some older games... E-sports titles don't need a huge amount of RAM. And it already crushes the Intel counterparts, both in performance and price. You guys from first-world countries are always complaining. Jeez. Try living in poorer countries, like those in South East Asia.
serendip - Tuesday, February 13, 2018 - link
An APU at $100 would still be expensive, especially when people in the developing world build machines with Pentium G chips. The speedy APU graphics would negate the need for a low-end discrete GPU though.Fritzkier - Tuesday, February 13, 2018 - link
Well, not really. While some use a Pentium G with a GT 730 or lower, many use AMD A-series APUs too (since they don't need a low-end discrete GPU to be on par). And the Ryzen 2200G is also priced about the same as a Pentium G with a GT 730, though. The exception is RAM prices...
watzupken - Tuesday, February 13, 2018 - link
If AMD used a beefier Vega iGPU, would you be willing to pay for it? That's the question. I feel an iGPU only makes sense if the price is low, or if the power consumption is low. Where Intel is using AMD graphics, it is likely for a fruity client. Outside of that, you won't see many manufacturers using it because of the cost. For the money Intel is asking for that chip alone, there are many possible configurations with dedicated graphics you can think of. Also, the supposedly beefier AMD graphics is only about GTX 1050 class. You are better off buying a GTX 1050 Ti.iwod - Tuesday, February 13, 2018 - link
Well, unless we solve the GPU crypto-mining problem in the near future (which we won't), I think having better Vega graphics combined with the CPU is a good deal.Gadgety - Monday, February 12, 2018 - link
Will these APUs do HDR UHD 4k Bluray playback (yes I know it's a tiny niche), or is that still Intel only?GreenReaper - Wednesday, February 14, 2018 - link
Probably best to just get an Xbox One S for it. As a bonus you could play a few games on it, too!watzupken - Tuesday, February 13, 2018 - link
I feel the R3 2200G is still a better deal than the R5 2400G. The price gap is too big relative to the difference in performance. And because these chips are overclocking friendly, despite the R3 being a cut-down chip there could be some performance catch-up with overclocking. Overall, I feel both are great chips, especially for some light/casual gaming. If gaming is the mainstay, then there is no substitute for a dedicated graphics solution.serendip - Tuesday, February 13, 2018 - link
The 2200G is a sweet deal because it offers most of the 2400G's performance at a sub-$100 price point. For most business and home desktops, it's more than enough for both CPU and GPU performance. And with discrete GPUs being so hard to get now, good-enough APU graphics will do for the majority of home users. Hopefully AMD can translate all this into actual shipping machines. I'm going to sound like a broken record, but AMD could send another boot up Intel's behind by making an Atom competitor. A dual-core Zen with SMT and cut-down Vega graphics would still be enough to blow Atom out of the water.
msroadkill612 - Tuesday, February 13, 2018 - link
It's a pity they don't get HBCC.msroadkill612 - Tuesday, February 13, 2018 - link
Simply put, AMD now owns the entry level up through most 1080p gaming, and it's a daunting jump in cost to improve on that by much. It's polite and nice of this review to pretend Intel has competitive products, and to include them for old times' sake.
serendip - Tuesday, February 13, 2018 - link
Looks like AMD owns the good-enough category. As I said previously, let's hope this translates into actual machines being shipped, seeing as OEMs previously made some terrible AMD-based systems at the low end.haplo602 - Tuesday, February 13, 2018 - link
Finally one review where I can see the driver version ... So this is the same driver used also for the Ryzen mobile APUs. Can you check if you can force/manual install the latest Adrenaline drivers ? That works on some of the Ryzen 2500u chips and actually increases the performance by some 15+% ...haplo602 - Tuesday, February 13, 2018 - link
I hope there's a memory scaling article in the future with frequency and CL scaling for the APU part ...crotach - Tuesday, February 13, 2018 - link
What about HTPC use? I was considering the GT 1030 + Intel route for H.265 and HDR10 playback and was really looking forward to the Zen APUs, but there don't seem to be any motherboards with HDMI 2.0?!
Also, I wonder if the chips can be undervolted and underclocked to bring them to a near silent noise level for the living room.
Lolimaster - Tuesday, February 13, 2018 - link
You can undervolt and underclock ANY intel or amd cpu.forgerone - Tuesday, February 13, 2018 - link
What most writers and critics of integrated graphics processors such as AMD's APUs or Intel's iGPs seem to forget is that not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan, etc. Not everyone can afford bleeding-edge gaming PCs or laptops. Food, housing and clothing must come first for 80% of the population of the world. An APU can grant anyone who can afford at least a decent basic one the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again these same folks who are happy to have an APU can barely afford a 750p CRT monitor, much less a 4K flat screen.
This simple idea is huge not only for the laptop and PC market but especially for game developers, who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody, as broader markets help reduce the cost of development.
This in fact was the whole point behind AMD's release of Mantle and Microsoft's and the Khronos Group's release of DX12 and Vulkan respectively.
Today's AMD APU has all the power of a GPU add-in board from not more than a few years back.
krazyfrog - Tuesday, February 13, 2018 - link
Why did you leave out the 8400 and the 1500X in these comparisons?Kamgusta - Wednesday, February 14, 2018 - link
Because those CPUs, while being in the same price range, outperform these Raven Ridge chips. That would have been bad press for AMD, and it seems like AnandTech wants to remain extremely loyal to AMD these days.msroadkill612 - Tuesday, February 13, 2018 - link
"the data shows both how far integrated graphics has come, and how far it still has to go to qualify for those 'immerse experiences' that Intel, AMD, and NVIDIA all claim are worth reaching for, with higher resolutions and higher fidelity." This assumes a static situation, which is rot.
What it reveals is that, in the current paradigm, coders have coded accordingly for satisfactory results. If the paradigm changes and other ways work better, then the code evolves.
This unprecedented integration of new-gen, sibling CPU & GPU also offers many performance upsides for future code.
Picture a motherboard with a discrete GPU like an equivalent 1030, then picture an APU with roughly a matchbox footprint - there is a huge difference in the size of the respective circuits - yet they both do the same job and have to send a lot of data to each other.
It's not hard to figure out which is inherently superior in many ways.
I strongly disagree with your blinkered bias.
Pork@III - Tuesday, February 13, 2018 - link
There is something unfinished, something inconsolable.elites2012 - Tuesday, February 13, 2018 - link
Anything this chip lost to Intel at was most likely outdated. Adobe, FCAT, Dolphin, POV are all outdated benchmarks.sonicmerlin - Tuesday, February 13, 2018 - link
Now if only AMD had a competent GPU arch. The APU performance could be given a huge boost with Nvidia’s techdr.denton - Thursday, February 15, 2018 - link
They do. It's called Vega. Very efficient in mid- to low range and compute, and if I'm not mistaken that's where the money is. Highend gaming is just wi**ie waving for us geeks.HStewart - Tuesday, February 13, 2018 - link
Check out the performance of the up-and-coming i8809G with Vega graphics compared to Ryzen 7: http://cpu.userbenchmark.com/Compare/Intel-Core-i7...
Keep in mind this is a mobile chip - this new mobile chip is quite powerful - I'm thinking of actually getting one; my only big concern is compatibility with the Vega chip.
haplo602 - Wednesday, February 14, 2018 - link
the i8809G is a desktop chip, 100W TDP ....hansmuff - Tuesday, February 13, 2018 - link
Any idea where I could buy the MSI B350I Pro AC? I have searched every retailer I've ever bought from and can not find the damn thing. I'm hoping it can run a 2400G out of the box, at least to update to the newest BIOS.Dragonstongue - Tuesday, February 13, 2018 - link
They REALLY should not have cut back the L3 cache SO MUCH... beyond that, these truly are amazing for what they are... they should have also made a higher-TDP version, such as 125-160W, so they could cram in more CPU cores or at the very least a more substantial graphics portion, and not limit dGPU access to 8x PCIe (from what I have read). Graphics cards and memory are anything but low cost.
The 2200G IMO is "fine" for what it is; the 2400G should have had at least 4MB of L3 cache (or more), and then there should have been an "enthusiast end" with the higher-TDP versions, so they could more or less ensure someone trying to do it "on a budget" really would not have to worry about getting anything less than (current) RX 570-580 or 1060-1070 level performance.
Many CPUs over the years (especially when overclocked) had a 140+W TDP; they could have and should have made more steps for Raven Ridge and not limited them so much, IMO. They could even have had a Frankenstein-like one with a 6-pin PCIe connector on it to feed more power directly to the chip instead of relying on the socket alone to provide all the power needed (or at least to provide more stable power).
The AM4 socket has already hosted 8 cores / 16 threads, and TR with its 16 cores / 32 threads says to me the "chip size" has much more room available internally for a bigger CPU portion and/or a far larger GPU portion. And if they go to the TR4 size - TR as it is already has half of it "not used" - they could "double up" the Vega cores to make a very much "enthusiast grade" APU; by skimping on HBM and relying on system memory, IMO there is a vast amount of potential performance they could capture. Not to mention that, properly designed, cooling does not really become an issue (it has not in the past with massive-TDP CPUs, after all).
Anyways... it really is very amazing how much potency they managed to stuff into Raven Ridge. They IMO should not have "purposefully limited it", especially on the L3 cache amount; 2MB is very limiting as far as I am concerned, especially when trying to feed 4 cores / 8 threads at a 65W TDP along with the GPU portion.
Either they are asking a bit much for the 2400G, or they are asking enough that they just need to "tweak" a bit more quickly to make sure it is not bottlenecking itself for the $ they want for it ^.^
Either way, very well done... basically above Phenom II and into Core i7 level CPU performance with 6870+ level graphics grunt using much less power... amazing job AMD... keep it up.
SaturnusDK - Wednesday, February 14, 2018 - link
Well done AMD. Well done. Both these APUs are extremely attractive. The R5 just screams upgradeable. You get a very capable 4-core / 8-thread CPU packaged with entry-level dGPU-class graphics for less than the competition charges for the CPU (with its abysmal iGPU) alone. In the current market, with astronomical, even comical, dGPU prices, this is a clear winner for anyone wanting to build a powerful mid-tier system who doesn't have the means to fork out ridiculous cash for a higher-tier dGPU right now.
The R3 screams HTPC or small gaming box. A good low-end CPU paired with a bare-bones but still decently performing iGPU. Add a motherboard, RAM, PSU, and HDD/SSD and you're good to go. I imagine these will sell like hot cakes in markets with lower overall GDP and in the brick'n'mortar retail market.
The question now is: is Intel ever going to produce a decent iGPU for the low-end market? They've had plenty of time to do so, but before Ryzen, AMD APUs just weren't that attractive. Now, though, you really have to think hard for a reason to justify buying a low-end Intel CPU at all.
yhselp - Wednesday, February 14, 2018 - link
"Now with the new Ryzen APUs, AMD has risen that low-end bar again."You had to do it. I understand. And thank you.
dr.denton - Thursday, February 15, 2018 - link
<3Hifihedgehog - Wednesday, February 14, 2018 - link
I have been doing some digging and found that although current-generation AM4 motherboards lack formal HDMI 2.0 certification, the same thing appears to apply as with HDMI 1.4 cables, many of which will pass an HDMI 2.0 signal without a hitch: these boards' HDMI traces and connectors may well be agnostic to the differences, if any. Could you do a quick test to see if HDMI 2.0 signals work with the Raven Ridge APUs on the AM4 motherboards you have access to? For further reference on the topic, see the forum thread "Raven Ridge HDMI 2.0 Compatibility - AM4 Motherboard Test Request Megathread" at SmallFormFactor.Gonemad - Wednesday, February 14, 2018 - link
These chips will cause a major segment of low-end graphics cards to be buried, or have their prices cut down. I can't wait for the benchmarks, and see how many generations will be affected.callmesissi - Wednesday, February 14, 2018 - link
What an amazing APU... this makes all the Celerons/Pentiums and half of the i3s completely uncompetitive! This APU is so good and so cheap it kills all the entry-level GPUs and basically all those Intel CPUs mentioned. Way to go AMD! Now, about those drivers and BIOSes...
edlee - Thursday, February 15, 2018 - link
This is amazing by AMD; they need to keep the hits coming and keep chipping away market share from Intel. Intel has not made any progress on processor graphics in years; this is a super aggressive strategy by AMD that I think will work. Nvidia basically has to cancel any GPU line below the GT 1030 by next year. Hopefully AMD comes out with a GTX 1050 killer by next year.CSMR - Wednesday, February 14, 2018 - link
I think the relevant Intel comparisons would be processors with Iris Pro or Iris Plus.Hixbot - Wednesday, February 14, 2018 - link
Do these APUs not compete directly with the new Intel-Vega chips? Why did AMD give Intel Vega, when AMD could have had the APU market all to themselves?SaturnusDK - Wednesday, February 14, 2018 - link
Not really. The i8809G as it's called is supposedly a mobile part if you can call a 100W TDP part mobile by any stretch of the imagination. It's based on a 22 (or 24) CU Vega and will be much much more expensive. Expect it to land somewhere in the $400+ range, ie. more expensive than an 8700Kfranco1961 - Thursday, February 15, 2018 - link
Hi Ian. Two questions: I have to build a cheap PC for video editing (Adobe Premiere, After Effects, etc.) and I thought about a Ryzen 3 1200 and a GeForce GT 1030. Do you think this Ryzen 5 2400G works for video editing, given its Vega 11? Another question: using DDR4-2400, would I see a big performance drop or could it be okay? Thank you!rexian96 - Thursday, February 15, 2018 - link
A question for Ian or anyone who knows - does the 2400G not support HW encoding of H.264/H.265, only limited decoding? Or is it that Handbrake doesn't support it yet? Looking at the encoding scores it would seem the 8400 is miles ahead, though the GPU in the Ryzen is much stronger.csell - Friday, February 16, 2018 - link
Hi. Do you know the max video RAM? Will it be possible to use CrossFire between the integrated graphics (using shared RAM) and a discrete graphics card?
If CrossFire is limited to Vega graphics cards, I hope AMD will introduce a cheap graphics card with just 8 or 11 Vega compute units and maybe HDMI 2.0.
SaturnusDK - Friday, February 16, 2018 - link
There will be no CrossFire with dGPUs. It's not possible at all.SaturnusDK - Saturday, February 17, 2018 - link
On the subject of allocated VRAM: it should be set as low as possible. The minimum is 64MB; you should never set it higher. Since the "VRAM" is system RAM, there are no GPU performance gains from setting it higher than the minimum and just letting Windows sort out the spill-over, but you are potentially limiting CPU performance a lot, as allocated VRAM eats available system RAM. So if you have 8GB but set VRAM to 2GB, the system only has 6GB remaining. This can seriously hurt performance in some cases. So, as little VRAM as possible is the correct setting.
IntoGraphics - Friday, March 9, 2018 - link
For a Ryzen 5 2400G + X370 mATX motherboard: what 16GB (4x4GB) and 32GB (4x8GB) memory kits (brand + model) would easily overclock to much higher than DDR4-2933 for the best GPU performance?
(I'm looking at some 4000MHz kits but if that is not compatible it will be a waste.)
andrewbaggins - Tuesday, May 1, 2018 - link
Anandtech reviews continue to show us what the graphics chips CAN'T do well instead of what they CAN do well. I've heard all the reasons before, and they don't help. Gaming benchmarks should be presented for PLAYABLE settings rather than can-just-barely-run-it settings, i.e. High/Ultra. No gamer will settle for jerky, stuttery gameplay, so why not show us what the items under review can do on, say, Medium settings with a few High setting benchmarks thrown in for good measure?