They paid him with new old stock: a dozen first-gen Bulldozer CPUs. Just having the AMD logo on them will make him blissfully happy. No need to waste good products on him ;-)
I genuinely want to understand why you think so. Do you really believe this article puts Intel's chips in a good light? I'd have preferred to see a different/more recent set of processors in the comparison, but your comment is confusing lol.
I guess one point might be that in most of the comparisons, there are no higher-end AMD CPUs included. So you see that Intel's higher-end processors are better for gaming, but not that there are AMD options as well further up the chain.
Even so, I think Intel holds the gaming FPS crown anyway for the moment, with their new 10900K (which isn't on this chart). That 5.3 GHz boost clock should be pretty good for achieving maximal framerates.
Other than losing to Intel at max-FPS gaming though, AMD dominates every segment on price/performance, raw performance, and power efficiency: server, mobile, workstation, gaming, etc.
The narrative seems forced in the absence of current, price-competitive offerings from Intel. Hard to blame AT though; must publish. Regardless, AMD is absolutely ballin'.
Unbelievable... this chip is a quarter of the price and is a fucking steal and totally embarrasses Intel's best... and he thinks it's Intel-biased. What the actual F?
I'm hard pressed to tell whether these comments are from actual dyed-in-the-wool AMD fanboys or a few Intel nuggets doing their best impersonations of how they think one would behave. 🤔
"AMD's new processor is better than Intel's recent flagship at a third of the price, and Intel's i3 chips are barely available so it will be great for budget gaming" "HOW MUCH IS INTEL PAYING YOU"
What, should he have manipulated the benchmarks?
Or should he have unprofessionally craptalked Intel because his identity revolves around an alliterative antagonistic absolutist dogmatic dichotomy between fooking TECH companies?
Don't mind Korguz - he never offers anything to the conversation... he is a hardcore AMD shill who thinks he makes his point by calling other people Intel shills. He is a bit creepy - he will follow you on multiple forums...
And you can also ignore Deicidium369: he can't get his OWN personal facts straight and also gives false information on another site, so anything he says is lies and BS. He calls me a hardcore AMD shill, yet says NOTHING positive about AMD and praises Intel like a god. Go figure.
" He is a bit creepy - he will follow you on multiple forums... " FYI there are others that know about your other post that are full of mis information, and call you out on it to, and your response, just like here, name calling, insults, and condescending remarks, this is why you were banned from tom's, was it not ?
" Just ignore him - he has nothing. " oh i do, do " ? take a look that these 2 pics here :
In the first pic, he states he retired @ 28, 20 years ago (year 2000); then a few days later he claims he started, and built up, a complete woodshop business 40 years ago, which, using his "retired at 28" BS as a reference, would have made him EIGHT years old. So either he is a liar, or he can't get his facts straight. The only reason to listen to him is that his posts are comedy gold.
This comment is absolutely hilarious. Korguz mocks PeterCollier for accusing Ian and AnandTech of anti-Intel bias, and Deicidium369 chimes in saying it's because... *drumroll* Korguz is a "hard core AMD shill". Even better is that PeterCollier goes on to agree.
Peter and Deicidium make a great pairing - two sides to the same demented coin.
Considering it shows the $120 AMD offering comprehensively beating the old i7-7700K, and says the current Intel budget offerings will be slower, and recommends the AMD processors, I find this comment rather brain dead.
One question I had is why AT chose to use the 2600 instead of the 3600...? Makes no sense to me, as the 3600 runs at 65W just like the 2600 (the 3600X runs at 95W), only the 3600 is appreciably faster--but costs the same! The 3600 is MIA. No question that the review benchmarks clearly demonstrate the superiority of the AMD offerings, but we already knew that. I see the omission here--deliberate--of the 3600, while including $425 Intel 6c/12t offerings, as surely an apology for Intel's inability to compete. Such is not needed, really. Apologizing in subtle ways for Intel is, I think, a pretty poor way to write a review of CPUs Intel cannot at the present time compete with--the 3100/3300. Getting right down to it, there was no need to include *any* 6c/12t CPUs here, right? It should have been comparisons only with Intel/AMD 4c/8t CPUs, exclusively, imo. The selection of CPUs for *this review* didn't make any objective sense that I could see--beyond the obvious, of course (at least you didn't forget and leave the 3100/3300 out...;))
I'm guessing the omission of the 3600X has something to do with, at the time I read this, they hadn't even finished all the benchmarks for the 3100. You know, the one in the headline. I don't think it's a conspiracy, just a time constraint.
They didn't rerun the 2600 for this, they used existing benchmarks.
They haven't ever benchmarked the 3600 previously, so it's not listed here. They do have the 3700X, however, which is essentially the same performance as a 3600 (except in heavily threaded workloads): https://www.anandtech.com/bench/product/2520?vs=25...
I saw other testers on YouTube use the 3600, and the 3300X was VERY surprisingly close to its performance; the 3300X's clearly quite strong threads and lack of inter-CCX/RAM latency issues are reaping benefits!
The choices they use to compare are utterly bizarre. A three-and-a-half-year-old Intel i7 and last-generation Ryzen parts...?
Legitimately, this review is useless if you are shopping *today* - not just from a team red versus team blue standpoint, but for where this processor sits in today's market; no clue after reading this. One of my friends was looking for a budget gaming build and I was looking at a 3200G/B450 setup - how does this compare? Instead, let's assume people have a time machine and are cross-shopping two-gen-old Ryzen and three-gen-old Intel parts...?
The charts aren't bad, they are terrible. Have an old i7 in there for reference, OK, but put current Ryzen 3 and i3 in there, and if you don't have enough time, *only* include them.
Bro, they try to make a point with the reviews. If you want this comparison, you use the CPU Bench feature of this website and compare any chip they tested on any of the tests they have. It's an actual feature, not a bug. The point of this article and tests is to show that AMD's entry-level $100 price point is as powerful as a three-year-old flagship-ish Intel chip for the mainstream. It shows, against the Zen and Zen+ hexacores, that it catches up to them in many situations despite lacking in cores. This shows you AMD is not just throwing cores at Intel anymore - they have IPC too! OK, any more spoon-feeding? Would you prefer a spork?
You just called them AMD shills. They went into this review to prove how great AMD is, that is not journalism, that is not a review, that is a marketing campaign.
Literally zero need to use the CPU bench tool they have, literally every other site I've checked has a useful, much better review, although it doesn't hit the level of marketing you are looking for.
"Openly stating the obvious conclusion that your empirical testing led you to is a marketing caompaign" is exactly the sort of anti-intellectual, brain-dead take I have come to expect from you, Ben.
He didn't call them AMD shills - you did, and all for daring to have an opinion.
That is to the letter the opposite of objectivity; that is precisely what shilling is, and I'm not the one that said it. Is English not your first language? You truly shouldn't bring up intellect if you don't comprehend the words being used.
Oh, because Intel's offerings got panned. How sad. I guess that means Anandtech has a bias that tracks roughly with which parts are the best at any given time. It's almost like... objectivity 🤔
The world does not revolve around what amdownzjoo.com has as a recommended processor. Even if we were limiting ourselves to that, what about the 3200G? Every other site I found handled their reviews of this product much better.
What *about* the 3200G? It's an older, slower CPU. If you're going to add a dGPU, it's pointless. If you're not, you're still better off waiting for the 4000 series.
The desperate scraping for even a semblance of a point in your posts is positively painful.
Hmmm.. looks like most of your posts in this thread are responding to Korguz, they're primarily focused on being critical of him, and they add nothing to the discussion.
If you were projecting any harder you'd burn a hole through the screen.
Yes, I'm sure Intel was quite pleased and eager to have both their famous 7700K and 8086K equaled or surpassed in most games by a $120 CPU...; they insisted and 'bribed' the powers that be that their former flagships be included! :)
What a worthless comment. Sure, they included high-end Intel CPUs in the gaming sections - but the review isn't *about* high-end CPUs, it's about what you get for the money with these specific AMD CPUs. Comparing AMD's budget gaming CPU to the best available to see how little you lose is a valid comparison to make.
Anandtech can no longer write a simple factual article about any processor. Even this article, which is supposed to be a simple article about a new low-cost processor, mysteriously uses the word "Bonanza" in the title. It also takes multiple jabs at Intel in the body, even though that serves no purpose to the actual content. Every Anand article is now an opinion piece instead of responsible reporting.
And what's the point of these new benchmarks? I prefer PCMark and Userbench. Basically no one is using their new CPU to simulate the neurons of a sea slug, for example. Utterly irrelevant to real-life usage.
The purpose of a benchmark is to produce repeatable and reliable numbers. Just "doing real-life stuff" is not repeatable and will generate different numbers for everyone. If you have a specific use case in mind, you can observe relevant or related benchmarks.
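To make that concrete, here's a toy sketch in Python (a hypothetical workload, purely illustrative - the point is fixed inputs and aggregated repeats, not this particular code):

    import statistics
    import time

    def workload():
        # Deterministic, fixed-size work: every run measures the same thing
        return sum(i * i for i in range(1_000_000))

    def bench(fn, repeats=10):
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            fn()
            times.append(time.perf_counter() - start)
        # The median and the spread are what make the number repeatable
        # and comparable, unlike a one-off "real-life" run
        return statistics.median(times), max(times) - min(times)

    median, spread = bench(workload)
    print(f"median {median:.4f}s, spread {spread:.4f}s")

Run the same script on two CPUs and the medians are comparable; run "real-life stuff" and you've mostly measured your browser tabs.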
I read articles from all sources, including the silicon-equivalent of Faux News. I find it a good practice to read from sources that you disagree with, or worse, purposely mislead you, because it's important not to create an echo chamber for one's self.
Ah, we have an "enlightened centrist" here. No take is too worthless, malformed or ignorant for him. Anything less than subjecting yourself to the dribblings of fools and disinformation artists is an "echo chamber". Such rational, many smart.
You do realise that not all of the benchmarks have to conform to your use case?
There are two roads to take:
1) Out of ABCXYZ, benchmarks XYZ are relevant to me. That's good.
2) Out of ABCXYZ, only benchmarks XYZ are relevant to me. Why did you even bother testing ABC?
Not 100% of benchmarks have to be relevant to you. Plenty of other folks have requested these.
I'm sorry, WHAT? Userbench? I'll assume you're joking, because that's a god-awful "benchmark". PCMark is fine, but they gave many gaming results, so actually they've done better work than a lazy PCMark result.
They did 7-Zip/WinRAR, image editing, video encoding, and browser tests as well.
Exactly what would using PCMark and Userbench add?
Userbenchmark is so bad, it now gets ridiculed constantly and has become the number 1 meme in the community. PCMark measures, like you said yourself, SYSTEM responsiveness, so it's understandable if it's not the top priority here. Furthermore, if you aren't completely braindead you can extrapolate system behaviour from a CPU test/benchmark. Basically everything you spew here demonstrates your total ignorance and lack of knowledge of everything. You should be utterly ashamed of yourself.
Facts! Okay: It has a known Intel bias, it doesn't actually do anything that isn't covered by the tests shown here, and even if it did *some of us would still be happier with these real-world application benchmarks*.
Oh for crying out loud. The first two posts in the comments wasted on a feckless simpleton blubbering that this site doesn't do the exact same things every other site does.
Bit disappointed with the CPU selection used in this article.
Thought it might be interesting to see how it fares against the 1600AF or R5 3600 - different CPUs people might be considering, to see if they're worth the extra.
Exactly. I'd have loved to see it up against the i3-9100F and i5-9400F. Comparison to the 3600 and 1600AF would be very welcome given the 1600AF's price range, and 3300X vs 3600 would be very interesting from a gaming standpoint.
None of those CPUs are in Bench, so Ian doesn't have any of them to test. The original 1600 is present for the older benchmarks, but not the newer ones. I wouldn't expect many CPUs to be run on the new ones until all of them are finished and ready to go in order to minimize the amount of time spent dis/reassembling test systems.
Bench is terrible, it's missing way too many options. Most common problem I run into is when I'm trying to compare an older card (7870 or gtx660 or something) to a newer card, and they are just missing from the list. Just needs to be expanded and improved.
I'd like for there to be a checkbox that says "these aren't scientific, you can compare incomplete, incorrect data at your own risk" etc, but there's a reason they aren't in the same listing; they're not the same.
The problem is that such data is often very misleading. By far the most important issue: there are many drivers touting >10% speedup in game X. You might as well guess which GPU is faster if they end up very close over many generations. Heck, simple "the new GPU has X% more CU and Y% higher frequency vs old one" is likely to lead to a better estimate than trying to correct for all that mess.
The worst part of the test is the RAM. That set of 3200C14 memory is very expensive - $160. That makes the entire build pointless. For that money you can get a 32 GB kit with a 6-core Core i5-10400.
It was mentioned that Intel didn't even send these CPUs out for review, and that they're hard to obtain because Intel isn't making many of them.
However, a few more data points would be nice. I think Ian needs to set up a system test datacentre like Phoronix so the rebuilding is kept to a minimum!
AMD must have sent the 7700K or specified its use. I've noticed every review using that specific CPU. AMD is aiming for the used-market upgraders, it seems.
I believe that's the last Intel chip that was 4C/8T as well, right? Seems a fair comparison, I guess if AMD really think that's the market.
Anyway, TechPowerUp went ahead and lined up the 3300X against a bunch of other relevant chips (https://www.techpowerup.com/review/amd-ryzen-3-330...). It's 1% slower than the 3600 at 720p gaming, 16.5% slower than the 9900K at 720p gaming.
CPU tests show the 4C/8T 3300X holding up well to the 6C/6T 8600K and 9400F. It pretty well trounces the 9100F.
Hence why most review sites use 1080p. 720p benchmarking on modern hardware is akin to Quake 3 benchmarking at 640x480 back in 2000. All you end up seeing are crazy high numbers that don't mean anything. And we see it all the time that CPU A is faster at 720p but then slower at 1080p.
@schujj07 Interesting. Your claim sounds totally alien to me, so can you show us some examples where a CPU is significantly slower at 1080p than at 720p when the GPU isn't the bottleneck, please?
@superdawgwtfd - If the resolution is too low then you artificially amplify the differences between CPUs. Meanwhile, at 1080p you're testing a resolution people will actually use for high-frame-rate displays, and a decent GPU is still not going to be the primary limit at that resolution.
Also, a 7700K should be similar to the new 10th-gen parts with the same number of cores. It's the same arch/node; just the frequency changes (and I think the low-end new ones are the same or slightly lower).
The 9100F is 4c/4t with a 3.6/4.2 clock. The 7700K is 4c/8t with a 4.2/4.5 clock. Since the 7th and 9th gen are both Skylake, they will have identical IPC. Based on that, we know the 9100F will perform worse than the 7700K, which makes that inclusion pretty pointless. Not to mention that Ian said he never got review samples of the 9th-gen i3s. In a lot of the benchmarks we see the R5 1600 & 2600, and the 1600AF will be right between those 2 CPUs in performance. The inclusion of the 4790K and 8086K is nice, as they show comparisons against the top CPUs of 2014 and 2018. When it comes to single-threaded applications, a stock 8086K will be as fast as a stock 9900K due to having the same boost and IPC. Therefore we are able to extrapolate a lot of data from this whole thing.
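To put rough numbers on that inference (a back-of-the-envelope Python sketch; the clocks are spec-sheet boosts, and the equal-IPC assumption is doing all the work - SMT, core count and sustained clocks still matter for MT):

    # Single-thread speed ~ IPC x clock. With identical Skylake IPC,
    # the estimate reduces to a clock ratio. (Spec-sheet boost clocks.)
    boost_ghz = {"i7-7700K": 4.5, "i3-9100F": 4.2, "i3-10320": 4.6}

    baseline = boost_ghz["i7-7700K"]
    for chip, clk in boost_ghz.items():
        print(f"{chip}: ~{clk / baseline:.0%} of 7700K single-thread")
    # i7-7700K: ~100%, i3-9100F: ~93%, i3-10320: ~102%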
You made a succession of excellent points here. Alas, I feel some people would rather use their brain for trolling than for processing the information they claim to want in the course of said trolling.
You don't come to anywhere. You are not going anywhere. Your life sucks, and will forever suck. It still has a purpose, though: To serve as a warning to others.
He is a stalker - if you are posting on Toms or Wcc - he is stalking you... he has nothing to offer, just calling people Intel shills. all the while being an AMD Shill.
This forum needs an ignore function. He needs to just go clean the basement.
Ahh, so reading multiple websites, where someone is stupid enough to use the same name on 2 of those sites, is considered stalking? What's the matter, Deicidium369, you have nothing else? You can't prove any of your BS, so you have to, once again, resort to BS replies where you just insult, call people names, and be condescending? BTW, did you get your personal facts straight?
Except that these chips (3100, 3300) are available, so one can buy them now; the Comet Lake i3s aren't. If Intel wanted the Comet Lake i3 to be included, they could have shipped a review sample to Ian. I don't believe he would have refused it.
Did you really feel the need to add 40 comments a week after the article was published? And on a Monday? Please tell us which company, Intel or AMD, you work for.
Yes, reading article comments five days "late" definitely indicates that I work for a tech company, and not that I only visit this place once or twice a week during my lunch break. But hey, bashing trolls is what I do for amusement. What's your excuse for trolling in the first place?
I suspect that the 2600 was put into the charts as a stand-in for the 1600AF. The 1600AF should perform very close to the 2600 in most benchmarks - just a touch slower due to reduced clock speeds.
The 1600AF is represented by the 2600; the 2600 is just slightly better than it, so this is not a problem. Also, Amazon is out of them. The only benchmarks those processors win are the high-core-count ones. In daily life the $100 3100 will be faster, so why hobble yourself with the smaller cache and 12nm?
Something I've been wondering about for a while: when the first Ryzen announcements were made, I read articles saying that the inclusion of some SATA/USB on the CPU itself would allow for cheaper entry-level mobos/systems that used the CPU as an SoC without a chipset at all. However, I don't recall ever seeing anything built that way. I'm inclined to think it is possible but that no one has ever done it, because for a low-end system without a discrete GPU the CPU appears to have enough IO to cover all the bases. Were the initial reports wrong? Is it something that's only shown up in cheap OEM systems but never DIY boards?
The Deskmini A300 is made this way - although it has no dGPU slot and requires an APU. This also allows the PC to idle at just 8-10W, compared to ~20W(?) for PCs that use motherboards with chipsets.
Just one system by a single vendor. Kinda disappointing IMO, since the CPU has enough connectivity for a basic no-frills system and would've been a reasonable option for a budget mITX board.
Agreed. I'd love to build a system in my existing USFF ITX case using something like an ITX version of the Deskmini A300 board, but for some reason nobody's doing it. I'm genuinely clueless as to what "some reason" might be, too, as the A300 proves the concept just fine.
A lot of focus in this text is about how 3300X ($120) compares to 7700K ($329, 3 years ago). Wouldn't it be nice to see how 3300X compares to high-end original Ryzen? ;)
The choice of CPUs is really weird. 8086K? 4790K? Really?
1600X, 1700, 1700X, 1800X - all probably beaten in most games and synthetic single-thread. In a lot of software as well (since 7700K used to beat 1800X occasionally).
And that's obviously great. But with that approach you could just write "3100 and 3300X added to Bench", right? :)
I have nothing against the factual layer of this article. Results are as expected and they look consistent. But it's essentially a story how an entry-level $120 CPU from company A beats a not-so-ancient flagship from company B.
So I'm merely wondering why you decided to write it like this, instead of comparing to a wider choice of expensive CPUs from 2017. Because in many of your results the 3300X beats 1st-gen Ryzens that were even more expensive than the 7700K. Or you could include older 4C/8T Ryzens (1500X), showing how much faster Zen 2 is.
Instead you've included the older 6-core Ryzens, which are neither similar in core count nor in MSRP.
This is a highly impressive little CPU for the money.
I particularly liked the 3300X's good showing. If this is at least in part due to it using only one CCX, it should bode well for Zen 3, which should have an eight-core CCX.
Look at some tests where Ryzen did not do so well vs. its Intel counterpart, like Kraken and Octane - the 3300X now does very well. It even scores slightly better than the 3700X.
Look at the spec diagrams. Note the only difference from the old B450 is pretty much that it provides PCIe 3.0 lanes instead of PCIe 2.0.
Now, when was the last time you saw a PCIe 3.0-based chipset hub needing active cooling?
As an aside, while I am kinda glad the B550 is finally coming, I am also a bit disappointed to see AMD (and their design/manufacturing partners) needing the better part of a year just to manage a bump from PCIe 2.0 to PCIe 3.0. PCIe 3.0 has been in the market for around eight years now; there is no excuse for AMD taking this long to figure this s*it out.
AMD's B550 slide tricked me for a moment, as it makes it appear as if the CPU only has 20 PCIe lanes total. Which is of course bollocks, Ryzen has 24 PCIe lanes total (20 usable + 4 chipset link).
Does it mean AMD artificially only allows 16 of the 20 CPU PCIe lanes to be used on B550 motherboards? Really? I am confused whether that is a mistake in the slide, or whether that will be the actual reality. I hope, for AMD's sake, it is the former...
If you're talking about the "The New AMD B550 chipset" slides, the problem is they're poorly designed and you've misread them. On the left side of the first one you've got a box with 20 PCIe lanes: 16 for graphics and 4 for the chipset. Below that you've got a box with what is either 4 lanes for a single x4 4.0 SSD, 2 sets of 2 lanes for a pair of x2 4.0 SSDs, or an x2 PCIe link and 2 SATA ports. Below that, in the list of text, it has 16 lanes and 8 lanes as the first two items.
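If it helps, here's the lane budget those boxes are drawing, as plain arithmetic (a quick sketch; the 16+4+4 split is AMD's published Matisse configuration):

    # Ryzen 3000 "Matisse" PCIe lanes from the CPU package
    lanes = {
        "x16 slot (graphics)": 16,
        "x4 NVMe / SATA flex": 4,
        "x4 chipset downlink": 4,  # reserved for the B550/X570 link
    }

    total = sum(lanes.values())                    # 24 lanes physically
    usable = total - lanes["x4 chipset downlink"]  # the "20 usable" figure
    print(f"{total} total, {usable} user-facing")  # 24 total, 20 user-facing

So nothing is being artificially withheld; the slide just lumps the graphics and chipset lanes into one box and draws the NVMe flex lanes separately.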
It's less about noise than durability. I've had two MBs die on me prematurely in 30+ years, and both were due to the little cooling fans dying. Unless you are buying a top-of-the-line $1000 MB, those fans are garbage compared to what's used on GPUs and CPUs.
Oi, are you still using Zimbabwe dollaroos? ;-) But yeah, other than the creative pricing I am totally with you in regard to those little teeny fans...
If I'm interested in CPUs in this price range, I'm also considering the following units: 2700, 2600X, 2600, 1600AF, 3600.
While I realize that the Intel 10-series isn't available yet, a low-end current 9-series i5 and a higher-end 9-series i3 would have also been relevant.
I realize that this was under a short deadline, but at least a couple of comparisons in that range for maybe a few tests would have helped.
For my money, the base 2700 is very hard to beat in this price range. It would only ever loose in things that are strictly single core or strictly AVX2, which are very case specific, and would wipe the floor with the 3300x in anything multi core sensitive, judging by the 2600 tests alone. It can usually be had for within $10 of the msrp of the 3300x.
...because they are playing it loose with the spelling of lose. Also, double O's possess a certain elegance, sophistication and general badassery. They are also deadly. Ooh, and keep your girl away from them, especially one particular Double O.
When some people say lose, they put all the emphasis on the o, so it sounds longer, so they think one is not enough. Ask them to spell loose straight after, and you get to see some good old gears start clunking into place.
Because I was typing it on mobile, didn’t proofread before I hit submit, and the spell checker didn’t flag it as being wrong because it doesn’t know context.
It’s my fault, my mistake, and I normally strive to do a better job with my spelling in general. Thank you for pointing out my mistake so that I can be more cognizant of my future errors.
In all honesty, just poking fun, and genuinely curious, because I see this mistake made daily all over the place: Facebook, comments, even articles by professional journalists and a work email or two. I find it curious that even people who speak American English natively still make this mistake.
Well, autocorrect is one answer - and the other is the paradoxical relationship between the long "oo" sound in lose and the shorter "oo" sound in loose. It's hard to argue that the spelling shouldn't be the other way around, although I have no doubt people would still trip over it even then.
Idle power draw is atrocious. How can it be this high?
It's not even that I'm worried about the unnecessary electricity use or noise (which could make an analogous APU a lot less interesting for HTPC and NAS).
I'm just really shocked by the interconnect using 16W when the cores it joins are hardly doing anything. Does anyone know what the I/O die is doing? Is there a 10W ASIC mining bitcoin or what?
Hush! You're spilling the beans here (: Actually, if AMD had a highly efficient ASIC mining chip with good hash rates, I'd consider buying some. Same goes for Intel.
Actually Intel is a major FPGA maker, so you can get one of those. It's not that hard to find an open-source coin miner (even on GitHub).
The comment stands, though. I googled a bit and there's no clear explanation for the high idle uncore power. And 8-core mobile Zen 2 chips use maybe 3W at idle. It's not like their IF is a lot slower or has less to do.
This makes me wonder if we're even going to see desktop/server 35W chips? Not to mention it would be nice if they offered TDP down of 25W...
Suddenly, I'm a lot less interested in an AMD-powered home server or NAS (and BTW: EPYC embedded lineup is still Zen-only).
If they do make desktop 35W chips, they'll probably be based on the integrated APU die. I suspect the increased idle power is due either to off-die IF link to the IO chiplet needing more power than IF within a die, or perhaps the (14nm) IO chiplet itself having higher power usage.
I'm OK with this kind of uncore under load (it's how Zen works). And I don't really mind high idle in workstation CPUs. It's an acceptable compromise.
I just assumed that they'd adjust this for low-core-count CPUs, since these often go into home PCs used casually - spending a lot of time at idle or low load. And under a cheap, slim cooler there will be a difference between 5 and 16W.
AMD will have to fix this in the APUs if they want to take on low-power segments (NAS, HTPC, tiny office desktops).
AFAIK Zen 2 APUs will use the chiplet layout, not the monolithic approach from the mobile lineup. Hence, OEMs will probably use mobile chips anyway. DIY customers may have a problem.
We've seen updates addressing issues with previous Zen CPUs. It's possible this is a miss on their part, or they just didn't have the time to tweak it before release.
Thanks for detailing the two new AMD CPUs. Any news on the new desktop APUs though? I'm hearing rumors of up to 8 cores but the GPUs on them will be worse than the previous generation.
They'll be based on Renoir. So 8 cores, 16 threads, with 8MB L3.
In mobile, Renoir's GPU has outperformed the predecessor, despite having fewer CUs, because of improved clocks. I'd say it's likely desktop Renoir will outperform the predecessor in graphics at the same price point, but not dramatically.
Yes, but it was trivially easy to run the 3400G GPU at 1600 MHz and run the RAM at 3400/3600 speeds. Assuming that the GPU of the “4400” APU gets 8 CUs at about 2000 MHz, it will have less total processing power (rough numbers below). Assuming that it can't typically run the RAM much faster than 4000 speeds, it won't have much extra bandwidth. My best guesstimate is that it performs marginally better than the 3400G in GPU-limited tasks, purely for having better RAM support and less processor-memory contention due to the larger L3. However, games are rarely entirely GPU-limited, and having the much-improved Zen 2 cores will make things markedly better.
I base a lot of that on the benchmarks of the 3500u vs the 4500u, which are very roughly comparable in resources. The 4500u is consistently faster, though not by much.
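For what it's worth, here's the peak-compute math behind that guess (assuming Vega-style CUs at 64 shaders each and 2 FLOPs/clock via FMA; the 8 CU / ~2.0 GHz desktop Renoir configuration is rumor, not a confirmed spec):

    def peak_gflops(cus, mhz, shaders_per_cu=64, flops_per_clock=2):
        # Peak FP32 = shader count x FMA throughput x clock
        return cus * shaders_per_cu * flops_per_clock * mhz / 1000

    print(peak_gflops(11, 1400))  # 3400G stock:            ~1971 GFLOPS
    print(peak_gflops(11, 1600))  # 3400G easy overclock:   ~2253 GFLOPS
    print(peak_gflops(8, 2000))   # rumored desktop Renoir: ~2048 GFLOPS

So against an easily overclocked 3400G, the rumored part comes out slightly behind on raw shader throughput, which is why I expect its gains to come from RAM support, the bigger L3, and the Zen 2 cores instead.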
I'm expecting much the same as what you outlined here. A significant improvement over the 3400G in CPU performance and gaming for stock configurations, but with limited gains over overclocked systems.
Hmmm ... the 3300X is doing better than I thought it would. Would appreciate some benchmarks with games that benefit from more cores/threads. Great article. I find the part about the difference between 3100 and 3300X particularly interesting (I had wondered about the difference between the two CPUs to warrant the price difference).
The 3300X is a fully functional 4 cores on one CCX; the 3100 is 2 cores on each of two (otherwise partially defective) CCXes. Thus, the 3100 needs to use the interconnect a lot, which slows it down a bit.
About the test setup: no PCIe 4.0 graphics card, no PCIe 4.0 NVMe SSD. You are handicapping these CPUs by not letting them take full advantage of their features. If an older or lesser CPU cannot support these features, well, then it deserves to score lower for it. You did use DDR4-3200 RAM - thanks for that.
Users with a $99 CPU are going to use a PCIe 4.0 SSD? Really? How do I keep the storage element consistent between tests then, to make sure I'm actually testing the CPU? How do I keep that storage constant for CPUs from 10 years ago?
can't wait for a water block equipped X570 for $800 and the R3-3100 to get the best OC's possible with muh PCI-e 4.0 storage......!!!! :) (Who cares if PCI-e 4.0 drives sometimes fare 1-3% worse than the 970 EVO in some real world comparisons!)
Maybe it's because after buying a PCIe 4-capable MB and a PCIe 4 SSD, I wouldn't have any money left to buy a CPU for more than $100? Kidding, of course; this challenge makes no sense. That aside, it would be interesting to see what kind of CPU can actually make good use of PCIe 4-capable MBs and fast storage.
Thanks, Ian! If possible, please add some performance numbers for the current i3 and i5. Right now, AMD owns the below-$200 space for desktop CPUs. Also, data from other websites that had the i3-9100 on hand shows that the 3100, a.k.a. AMD's leftover dies, outperforms Intel's offerings here. Really hope Intel steps up, and soon. I'm hoping to buy something later this year, so whoever gives me the most bang for my buck gets my money.
Unfortunately, we never got any of those. I'm currently stretched six ways from Sunday. I pulled an all-nighter just to even get to this point in the review process. As much as people would love me to just bench CPUs all day every day, even in lockdown I've got these CPUs, EPYC, motherboards, Xeons, and laptops to test, as well as news coverage and all the behind-the-scenes stuff no one ever sees. Writing isn't a quick process, either.
Appreciate the reply! I think the fact that you never got those current-gen i3s and i5s is not on you, but on Intel. If they want their stuff reviewed, they know they need to send some samples. Unless, of course, they're afraid of the test results. Which they might just be.
I just noticed that in the "AMD 500 SERIES CHIPSET PROCESSOR SUPPORT" chart, 4000-series/Zen 2-based desktop APUs are not represented. An oversight, or is AMD trying to say something?
There's got to be something strange going on with the 7700k system. In several benchmarks the 6700k outperforms the 7700k, even though the only difference between them is the 7700k is clocked higher. Under no circumstances should the 6700k outperform the 7700k.
Were the Skylake and Kaby Lake systems tested with different motherboards or with different BIOS revisions? It's possible some security patch was active on one system but not on the other.
You should have benched it against a Xeon E-2174G. At least that's the most modern 4C/8T CPU Intel sells right now. But I look forward to seeing how it does against the i3-10320, to see if Intel still has the IPC/clockspeed crown or not.
The performance of the i3-10320 will be very similar to the 7700K. The i3 has clock speeds of 3.8/4.6 and the 7700K is 4.2/4.5. That means that in single-threaded work the 10320 will be slightly faster, but in heavily threaded workloads the 7700K will probably be faster due to the higher base clock. This is known because both CPUs are on the Skylake architecture and will have the same IPC. Therefore we can infer what the 10320 will do based on what we see the 7700K doing in this review.
I think it'd be nice to see generation-over-generation improvements on Intel's vs AMD's side of things; you guys used to do that every time a new generation came out. It'd be nice to see how my 4th-gen Intel chip compares to a new gen now.
Well, maxipadking, if this site pisses you off so much, and all you do is whine about how bad the reviews are and how they test low-end garbage CPUs, why do you even bother coming here? Is it just to be a biased Intel shill? Go back to the site that praises your god Intel.
Look, it's the resident no-life AMD shill in his natural habitat - offering absolutely nothing to the conversation and just sniping. He will also follow you to other forums - he's a creepy little guy.
I am enjoying how "I got caught shitposting in multiple forums" went through Deicidium's flamebot troll filter and came out as "watch out, this guy will stalk you".
It's crazy how Skylake is still the fastest for gaming. Beats having to spend and spend on endless minor upgrades with AMD... and still be slower in gaming, ROFL.
And maybe that matters to man-children who never leave the basement. For the rest of the adults, price/performance matters more than a few extra fps. Even more so now, with a lot of people being laid off due to the pandemic and trying to save money.
I just recently rebuilt an i5 760 for a friend, and you're absolutely right: it is still a pretty quick CPU for basic tasks for most users. However, it's super slow compared to even Zen 1. Part of that is the low clocks and lacking boost. My Ryzen 1600 non-AF runs circles around it, even single-threaded.
It is impressive that a low-end AMD chip soundly beat the ex-flagship 7700K. The 3300X is certainly the better option to go for between the two entry-level Ryzens.
Over the last couple of years it's been requested a few times. I finally got around to sorting out a benchmark for it and adding it into our automated script. Seems to work almost flawlessly on any system I'm testing it on. That big gen test can take 1hr+ though.
Just curious: if I follow the "Bench" link, it shows Intel at the top of the stack on the opening page, yet when I choose to look at the actual results with the drop-down, the results change. The opening page is from a benchmark the Threadripper has not even been tested on; the whole industry recognises that it's probably the single fastest chip out there for the HEDT platform, yet your opening page shows Intel at the top and no 2020 results. Once again, this looks like careful manipulation of the results: the casual viewer dropping onto the page just sees the top 4 out of 6 positions taken by Intel, with TR2 mixed in and no mention of TR3. Not a very fair page - it gives a poor and possibly misleading impression to folks who know no better and instantly get the impression Intel sells the highest-performing CPUs, which we know is not the case anymore.
And that's what I did: clicked the link in the article and ended up with a page showing Intel as having all the top spots, which we all know is no longer the case. That was my point - the opening summary page should reflect the results, not the results of 2 years ago. It's now month 5 of 2020, not 2019; the "latest" results should be 2020.
This 3300X is something: it beats the six-core 2600. In some reviews it is equal to the 3600 in games, while slightly behind in rendering tasks. I had already decided that six cores is the minimum for me since the 1600, but this...
It's pretty good for what it is, but for a cheap PC, Intel has graphics. For a cheap gaming PC it's a bargain now, but it probably won't age well with 8-core consoles coming out this year. If you can afford the 3700X (or better), that should last 8+ years for gaming.
Ian, sorry to say this, but you must find another organisation. Anandtech is just the ghost of what it was. You need at least what every YouTuber has to conduct a decent set of benchmarks. You need to buy CPUs, video cards, etc. for decent testbeds when they're not sampled to you. I'm sick of seeing obscure outfits with every CPU and GPU possible, while a real expert is using a 1080, etc.
I misread the paragraph below it, but in general it's weird for AMD to put out a diagram quite that misleading. The ASRock AB350 was ~$120 when I bought it and is supported by ASRock for the 3900X - surely a decent percentage of boards can support most Zen 2 processors, barring power constraints for the 16-core, if a cheap budget build can?
Not true. The AM4 socket will support all Ryzen chips; however, not all features are available on all boards, such as Gen 4, as this is a specific development that was not available when the 1000 series launched. Also, the limitation is in the power system of the board, not in AMD's specs.
"CHIPSET FEATURES: Note that not all processors are supported on every chipset, and support may require a BIOS upgrade. See your motherboard manufacturer’s website for compatibility"
I have a 3000-series chip running in my A320 media PC in my lounge; I updated the BIOS and it works fine. However, I suspect that if I tried a 3900 it would not have the power circuitry to support it. The other issue is that the BIOS chips in some of the older boards cannot store enough information to allow all the chips to be used, so strictly speaking the issue is with the board supplier.
I'm still stuck on the i5-6600K which I built back in 2016. Thought it would serve me well for many years to come given the state of Intel and AMD at that point in time, and that my previous i5-2400 lasted me a good number of years while still being competitive. Now barely four years later it's obsoleted by a 100 dollar CPU lol.
It's far from obsolete, even if it's regularly beaten. I'm still using my Sandy-E processor when I'm unopposed to simultaneously running a space heater -- it's just a question of whether you need the latest and greatest.
Actually, looking at the performance of these 4-core chips, I can't wait to see an APU with them. Even a 4-core APU will be great for everyday usage, without a graphics card. I just hope they give the 4-core version a decent graphics option, rather than a Vega 6.
Please stop running tests that appeal to less than 5% of your audience (and I think I'm being generous here). Crysis on CPU? Who cares? What does it prove I can do today? Dwarf Fortress?? WTF? Quit wasting your time and ours. AI ETH tests? What for (farms do this now)? How many tests do you need that show NOTHING to any of us?
People should get the point. You become irrelevant at some point if you keep posting crap nobody cares to read. Ask Tom's Hardware :) Oh, wait, you guys are Tom's. ;)
How about testing 20 games at 1080p, where everyone plays? :) Is it too difficult to ask a few pros to make a script for Photoshop/Premiere/AE to test under AMD/NV (CUDA vs. OpenCL, or whatever is faster on AMD)? It is almost like you guys seek benchmarks that nobody could possibly find useful IRL.
"provide a good idea on how instruction streams are interpreted by different microarchitectures." Your PHD project tells me how these cpus will run in WHICH PRO APP? Why not just test a PRO APP IRL? Bah...fake news. Not sure why, AMD wins everything right now. Why hunt for fake tests that mean nothing? How many people use Agisoft instead of PhotoshopCC for $10 a month?
Still ripping at crap modes nobody would actually use. Again tells us nothing about what we REALLY do usually. Only a retard uses FAST settings in handbrake for anything but a 15fps training vid.
"We are currently in the middle of revisiting our CPU gaming benchmarks" and upgrading to 2080ti. Can't happen soon enough, please make sure you test games that sell over 1mil ON PC or don't bother. If the sell poorly or are poorly rated, there is no point in testing them. Test what people PLAY, at settings people really use. 720p low shows what to a person who will NEVER play below 1080p? Oh wait, I just described 99% of your audience, as I'm quite sure they haven't played 720p in ages. So much wasted testing. Stop testing 4k and test more 1080p/1440p (1440p still almost useless, wake me at 10%).
"Some of these new benchmarks provide obvious talking points, others are just a bit of fun. Most of them are so new we’ve only run them on a few processors so far. It will be interesting to hear your feedback!"
Please quit wasting your time. It feels like all your benchmarks are "for fun," as I'm not much smarter after coming here. Off to a site that tests a dozen games and some real-world stuff some of us actually use (TechPowerUp, for example... games galore, 10 REAL games tested). THIS is how you give a well-rounded idea of CPU/GPU perf. YOU TEST REAL STUFF, instead of your PhD crap or Agisoft junk. People use Adobe, and play games that SELL. This isn't complicated, people.
Might as well jump off the roof with your CPU and tell us how fast you hit the ground. Just as useless as your benchmarks. Are they benchmarks if nobody uses them? Or is it just more "fun" crap tests that tell us nothing useful? If you are NOT helping me make a more informed decision (useful info) about buying the reviewed product, you have failed. A good review is chock-full of useful info related to how we actually use the product, not a bunch of crap nobody cares about or uses IRL.
https://store.steampowered.com/app/975370/Dwarf_Fo... The devs make 3K a month from it. This is not exactly played by the world if it pulls down $35K a year. Why even bother testing this crap? Are we all going to go back to pixel-crap graphics tomorrow? Heck no. Wake up. Those games (and the shite monitors we had then) are why I needed Lasik... ROFL.
"Only a retard uses" And that's about where I realised you weren't really making a comment so much as farting into a piece of voice recognition software.
Anandtech, can we please have an upvote/downvote system in the comments section? Seems to work very well at Arstechnica in drowning out the trolls. Thanks.
I imagine this has been mentioned elsewhere before, but why does Zen fare so badly, specifically, in 3DPM v1? Additionally, Geekbench 4's MT test is shown twice, once in place of 3DPM v1's MT test.
Good to see my old 4790k featured. I can see what I would get for my money if I upgraded. And judging by these results it isn’t really worth the money to upgrade it. Thank god really because I’ve been burned by AMD so many times that if I can I will avoid them. But right now you really can’t avoid them! Good thing I don’t need a CPU!
A nice 30xx series Nvidia card will be a good replacement for my RX580 when they release. Hey, might even be able to use my 4K monitor in 4K!
I can get i7 4770/4790 systems for $120. AMD's going to have to do better than +10% performance to get me to pay triple the price over Intel's old stock.
PeterCollier - Thursday, May 7, 2020 - link
Ugh, how much did Intel pay you for this article?kulareddy - Thursday, May 7, 2020 - link
How much did AMD pay you for this comment?callmebob - Thursday, May 7, 2020 - link
They paid him with old new stock of a dozen 1st gen Bulldozer CPUs.Just having the AMD logo on them will make him blissfully happy. No need to waste good products on him ;-)
kulareddy - Thursday, May 7, 2020 - link
👏👏👏PeterCollier - Saturday, May 9, 2020 - link
This makes no sense.Teckk - Thursday, May 7, 2020 - link
I genuinely want to understand why you think so?You really believe this article puts Intel's chips in a good light?
Would've preferred to see a set of different/recent processors in comparison but your comment is confusing lol.
Teckk - Thursday, May 7, 2020 - link
Arghhh .. meant as a reply to @PeterCollierkulareddy - Thursday, May 7, 2020 - link
👍PeterCollier - Friday, May 8, 2020 - link
Yestwtech - Friday, May 8, 2020 - link
I guess one point might be that in most of the comparisons, there are no higher-end AMD CPUs included. So you see that Intel's higher-end processors are better for gaming, but not that there are AMD options as well further up the chain.Even so, I think Intel holds the gaming FPS crown anyway for the moment, with their new 10900k (which isn't on this chart). That 5.3 boost clock should be pretty good for achieving maximal framerates.
Other than losing to Intel at max-FPS gaming though, AMD dominates all segments from a price/performance, raw performance, and power efficiency performance. Server, mobile, workstation, gaming, etc.
PeterCollier - Saturday, May 9, 2020 - link
Where are the AMD APUs?PeterCollier - Friday, May 8, 2020 - link
0The_Assimilator - Thursday, May 7, 2020 - link
Please shut up.b0rnslippy - Thursday, May 7, 2020 - link
Why? not all of them are blind Ian supporters. Just saying it like we seeing it.Teckk - Thursday, May 7, 2020 - link
You think this article is biased towards Intel?mrvco - Friday, May 8, 2020 - link
The narrative seems forced in lieue of current and price competitive offerings from Intel. Hard to blame AT though, must publish. Regardless, AMD is absolutely ballin'.brunis.dk - Saturday, May 9, 2020 - link
Unbelievable.. this chip is a quarter of the price and is a fucking steel and totally embarrasses Intel's best .. and he thinks its Intel biased, what the actual F?Spunjji - Monday, May 11, 2020 - link
Yeah, this "logic" isn't working out at all.I'm hard pressed to tell whether these comments are from actual dyed-in-the-wool AMD fanboys or a few Intel nuggets doing their best impersonations of how they think one would behave. 🤔
PeterCollier - Monday, May 11, 2020 - link
Socrates said it's the mark of an informed mind to entertain a thought without rejecting it.It's the mark of an uninformed mind to be unable to entertain an opposing viewpoint and instead dismiss it as trolling.
FreckledTrout - Monday, May 11, 2020 - link
Those trolls in 400BC were a pain writting stuff in stone and all.Spunjji - Tuesday, May 12, 2020 - link
It's the mark of an uninformed troll to misquote Socrates in support of obvious trolling.Unashamed_unoriginal_username_x86 - Thursday, May 7, 2020 - link
"AMD's new processor is better than Intel's recent flagship at a third of the price, and Intel's i3 chips are barely available so it will be great for budget gaming""HOW MUCH IS INTEL PAYING YOU"
What, should he have manipulated the benchmarks?
Or should he have unprofessionally craptalked Intel because his identity revolves around an alliterative antagonistic absolutist dogmatic dichotomy between fooking TECH companies?
Scootiep7 - Thursday, May 7, 2020 - link
DUDE! Not so many big ass words this early in the morning! My 4th cup of coffee hasn't kicked in yet.Mil0 - Friday, May 8, 2020 - link
Verily this vichyssoise of verbiage veers most verbose, so let me simply add that it's my very good honor to meet you, and you may call me M.boozed - Thursday, May 7, 2020 - link
"Blind Ian supporters" like this is some kind of celebrity feud? Grow up.deathBOB - Thursday, May 7, 2020 - link
Don't be rude. I for one think it's nice that people with serious intellectual disabilities can participate here.Richlet - Friday, May 8, 2020 - link
As long as we don't go back to the days of dailytech, lol. I think all those guys currently work for the current POTUS.sarafino - Friday, May 8, 2020 - link
Come on dude, don't turn the comment section into a Reddit-style off topic political flame war.PeterCollier - Saturday, May 9, 2020 - link
I just hope Gary Johnson runs. Can't stand Trump or Biden. Not voting for women molestors.PeterCollier - Saturday, May 9, 2020 - link
Intellectual disabilities? This is the level of discussion here?PeterCollier - Friday, May 8, 2020 - link
I'm expressing my informed opinion, what's your issue?Korguz - Friday, May 8, 2020 - link
informed opinion ?? yea rightDeicidium369 - Friday, May 8, 2020 - link
Don't mind Korguz - he never offers anything to the conversation... the is a hard core AMD shill who thinks he makes his point by calling other people Intel shills. He is a bit creepy - he will follow you on multiple forums...Just ignore him - he has nothing.
Korguz - Friday, May 8, 2020 - link
and you can also ignore Deicidium369, he cant get is OWN personal facts straight, and also gives false information on another site, so anything he says are lies, and BS calls me a hardcore amd shill, but yet, says NOTHING positive about amd, but praises intel like a god, go figure" He is a bit creepy - he will follow you on multiple forums... " FYI there are others that know about your other post that are full of mis information, and call you out on it to, and your response, just like here, name calling, insults, and condescending remarks, this is why you were banned from tom's, was it not ?
Korguz - Friday, May 8, 2020 - link
" Just ignore him - he has nothing. " oh i do, do " ?take a look that these 2 pics here :
in the 1st pic, he states he retired @ 28, 20 years ago( year 2000 ), then a few days later claims he started, and built up a complete woodshop business 40 years ago, which using his " retired at 28 " BS, as a reference, would of made him EIGHT years old. so either he is a lier, or he cant get his facts straight. the only reason to listen to him, is cause his posts are comedy gold
i wonder what BS he will type for this now
Korguz - Friday, May 8, 2020 - link
https://imgur.com/a/s9Ift1pPeterCollier - Saturday, May 9, 2020 - link
Thanks, I'll make a mental note to ignore him.Spunjji - Monday, May 11, 2020 - link
This comment is absolutely hilarious. Korguz mocks PeterCollier for accusing Ian and AnandTech of anti-Intel bias, and Deicidium369 chimes in saying it's because... *drumroll* Korguz is a "hard core AMD shill". Even better is that PeterCollier goes on to agree.Peter and Deicidium make a great pairing - two sides to the same demented coin.
PeterCollier - Monday, May 11, 2020 - link
Your obsession with certain posters is concerning.Spunjji - Tuesday, May 12, 2020 - link
🥱Spunjji - Monday, May 11, 2020 - link
The issue is that it's not informed. It's codswallop.PeterCollier - Monday, May 11, 2020 - link
You're talking about the article.Spunjji - Tuesday, May 12, 2020 - link
🤡psychobriggsy - Thursday, May 7, 2020 - link
Considering it shows the $120 AMD offering comprehensively beating the old i7-7700K, and says the current Intel budget offerings will be slower, and recommends the AMD processors, I find this comment rather brain dead.WaltC - Thursday, May 7, 2020 - link
One question I had is why AT chose to use the 2600 instead of the 3600...? Makes no sense to me, as the 3600 runs at 65W and the 3600X runs at 95W--just like the 2600--only the 3600 is appreciably faster--but costs the same! 3600 is MIA. No question but that the review benchmarks clearly demonstrate the superiority of the AMD offerings, but we already knew that. I see the omission here--deliberate--of the 3600--while including $425 Intel 6c/12t offerings--as surely an apology for Intel's inability to compete. Such is not needed, really. Apologizing in subtle ways for Intel is, I think, a pretty poor way to write a review on CPUs Intel cannot at the present time compete with--the 3100/3300. Getting right down to it--there was no need to include *any* 6c/12t CPUs here, right? Should have been comparisons only with Intel/AMD 4c/8t cpus, exclusively, imo. Selection of CPUS for *this review* didn't make any objective sense that I could see--beyond the obvious, of course (at least you didn't forget and leave the 3100/3300 out...;))evilspoons - Thursday, May 7, 2020 - link
I'm guessing the omission of the 3600X has something to do with, at the time I read this, they hadn't even finished all the benchmarks for the 3100. You know, the one in the headline. I don't think it's a conspiracy, just a time constraint.crimson117 - Thursday, May 7, 2020 - link
They didn't rerun the 2600 for this, they used existing benchmarks.They haven't ever benchmarked the 3600 previously, so it's not listed here. They do have the 3700X, however, which is essentially the same performance as a 3600 (except in heavily threaded workloads): https://www.anandtech.com/bench/product/2520?vs=25...
MDD1963 - Thursday, May 7, 2020 - link
I saw other testers on Youtube use the 3600, and, the 3300X was VERY surprisingly close to it's performance...; the 3300X's clearly quite strong threads and lack of inter -CCX -RAM latency issues are reaping benefits!BenSkywalker - Thursday, May 7, 2020 - link
The choices they use to compare are utterly bizarre. A three and a half year old Intel i7 and last generation Ryzen parts....?Legitimately, this review is useless if you are shopping *today*, not just from a team red versus team blue, but where this processor sits in today's market, no clue after reading this. One of my friends was looking for a budget gaming build and I was looking at a 3200G/B450 setup, how does this compare? Instead let's assume people have a time machine and are cross shopping two gen old Ryzen and three green old Intel parts....?
The charts aren't bad, they are terrible. Have an old i7 in there for reference, ok, put current Ryzen 3 and i3 inn there and if you don't have enough time *only* include them.
rabidpeach - Friday, May 8, 2020 - link
bro, they try to make a point with the reviews. if you want this comparison you use the cpubench feature of this website and compare any chip they tested on any of the tests they have. it's an actual feature not a bug. the point of this article and tests is to show entry level amd 100 price point is as powerful as 3 year old flagship-ish intel for the mainstream. it shows against the zen and zen+ hexacores that it catches up to them in many situations despite lacking in cores. this shows you amd is not just throwing cores at intel anymore. they have ipc too! ok any more spoon-feeding? would you prefer a spork?BenSkywalker - Friday, May 8, 2020 - link
You just called them AMD shills. They went into this review to prove how great AMD is, that is not journalism, that is not a review, that is a marketing campaign.Literally zero need to use the CPU bench tool they have, literally every other site I've checked has a useful, much better review, although it doesn't hit the level of marketing you are looking for.
Spunjji - Monday, May 11, 2020 - link
"Openly stating the obvious conclusion that your empirical testing led you to is a marketing caompaign" is exactly the sort of anti-intellectual, brain-dead take I have come to expect from you, Ben.He didn't call them AMD shills - you did, and all for daring to have an opinion.
BenSkywalker - Friday, May 15, 2020 - link
"They try to make a point with their review"That is to the letter the opposite of objectivity, that is precisely what shilling is, and I'm not the one that said it. Is English not your first language? You truly shouldn't being up intellect if you don't comprehend the words being used.
Deicidium369 - Friday, May 8, 2020 - link
Is it any different than the AMD fluff like "Which is the best CPU", "Which is the best Workstation CPU", etc...?
Spunjji - Monday, May 11, 2020 - link
How was that article "AMD fluff"?
Oh, because Intel's offerings got panned. How sad. I guess that means Anandtech has a bias that tracks roughly with which parts are the best at any given time. It's almost like... objectivity 🤔
PeterCollier - Monday, May 11, 2020 - link
I agree. Ian should take some classes from Andrei on chart creation and proper benchmarking.
Fataliity - Friday, May 8, 2020 - link
The first page talks about which you should buy: the 3100/3300 versions, or a 2600, or a 1600AF?
And then the benchmarks compare them.
He's comparing what you can buy in the price range; I thought it was easy to understand.
BenSkywalker - Friday, May 8, 2020 - link
The world does not revolve around what amdownzjoo.com has as a recommended processor. Even if we were limiting ourselves to that, what about the 3200G? Every other site I found handled their reviews of this product much better.
Spunjji - Wednesday, May 13, 2020 - link
What *about* the 3200G? It's an older, slower CPU. If you're going to add a dGPU, it's pointless. If you're not, you're still better off waiting for the 4000 series.
The desperate scraping for even a semblance of a point in your posts is positively painful.
BenSkywalker - Friday, May 15, 2020 - link
Wanting to see how a $99 CPU compares to a $99 and a $129 CPU is pointless..... You have a special way of viewing things.
PeterCollier - Saturday, May 9, 2020 - link
I'm curious what happened to HSA.
PeterCollier - Friday, May 8, 2020 - link
It beats a 3 year old processor, congratulations. Let me cut the cake.
Korguz - Friday, May 8, 2020 - link
looks like someone didn't read the article.
Deicidium369 - Friday, May 8, 2020 - link
Try adding something to the conversation rather than sniping and stalking.
Korguz - Friday, May 8, 2020 - link
you 1st there, Deicidium369.
Spunjji - Monday, May 11, 2020 - link
Hmmm... looks like most of your posts in this thread are responding to Korguz, they're primarily focused on being critical of him, and they add nothing to the discussion.
If you were projecting any harder you'd burn a hole through the screen.
[email protected] - Thursday, May 7, 2020 - link
Not sure if you're just trolling but which part of this review is biased to you? It seems factual and well written to me.
boozed - Thursday, May 7, 2020 - link
The problem with the internet is that it's difficult for people to tell when you're joking.
Threska - Friday, May 8, 2020 - link
Netscape should have introduced the <joke></joke> tag.
MDD1963 - Saturday, May 9, 2020 - link
Yes, I'm sure Intel was quite pleased and eager to have both their famous 7700K and 8086K equaled or surpassed in most games by a $120 CPU...; they insisted and 'bribed' the powers that be that their former flagships be included! :)
Spunjji - Monday, May 11, 2020 - link
What a worthless comment. Sure, they included high-end Intel CPUs in the gaming sections - but the review isn't *about* high-end CPUs, it's about what you get for the money with these specific AMD CPUs. Comparing AMD's budget gaming CPU to the best available to see how little you lose is a valid comparison to make.
jjjag - Tuesday, May 12, 2020 - link
Anandtech can no longer write a simple factual article about any processor. Even this article, which is supposed to be a simple article about a new low-cost processor, mysteriously uses the word "Bonanza" in the title. It also takes multiple jabs at Intel in the body, even though it serves no purpose to the actual content. Every Anand article is now an opinion piece instead of responsible reporting.
Spunjji - Tuesday, May 12, 2020 - link
"I hate content with flavour. I want lists of graphs with no words."
Good for you. Off you go to userbenchmark, for worthless, context-free information that's appropriately biased towards your preferred team.
rdgoodri - Friday, May 15, 2020 - link
It's pretty positive for AMD; I don't catch your angle here.
Meteor2 - Tuesday, August 4, 2020 - link
This article absolutely rips into Intel, and rightly so.
Your comment is bizarre.
PeterCollier - Thursday, May 7, 2020 - link
And what's the point of these new benchmarks? I prefer PCMark and Userbench. Basically no one is using their new CPU to simulate the neurons of a sea slug, for example. Utterly irrelevant to real-life usage.
Mansoor - Thursday, May 7, 2020 - link
The purpose of a benchmark is to produce repeatable and reliable numbers. Just "doing real-life stuff" is not repeatable and will generate different numbers for everyone. If you have a specific use case in mind, you can observe relevant or related benchmarks.
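As a concrete illustration of that repeatability point, here is a minimal sketch in Python of how a fixed-workload benchmark behaves; the workload is an invented stand-in, not one of AnandTech's actual tests:

```python
import statistics
import time

def workload():
    # Deterministic, CPU-bound stand-in task: same input, same work, every run.
    return sum(i * i for i in range(2_000_000))

def bench(runs=5):
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        times.append(time.perf_counter() - start)
    # Reporting the median damps outliers from background noise,
    # which is what makes results comparable between two readers.
    return statistics.median(times)

print(f"median of 5 runs: {bench():.3f} s")
```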
PeterCollier - Friday, May 8, 2020 - link
None of my use cases mesh with any of the lousily selected benchmarks in this review.
Korguz - Friday, May 8, 2020 - link
then why are you here reading this article?
PeterCollier - Saturday, May 9, 2020 - link
I read articles from all sources, including the silicon equivalent of Faux News. I find it a good practice to read from sources that you disagree with, or worse, that purposely mislead you, because it's important not to create an echo chamber for one's self.
Spunjji - Tuesday, May 12, 2020 - link
Ah, we have an "enlightened centrist" here. No take is too worthless, malformed or ignorant for him. Anything less than subjecting yourself to the dribblings of fools and disinformation artists is an "echo chamber". Such rational, many smart.
Ian Cutress - Friday, May 8, 2020 - link
You do realise that not all of the benchmarks have to conform to your use case?
There are two roads to take:
1) Out of ABCXYZ, Benchmarks XYZ are relevant to me. That's good.
2) Out of ABCXYZ, only Benchmarks XYZ are relevant to me. Why did you even bother testing ABC?
Not 100% of benchmarks have to be relevant to you. Plenty of other folks have requested these.
Spunjji - Monday, May 11, 2020 - link
Famrnuke - Thursday, May 7, 2020 - link
I'm sorry, WHAT? Userbench? I'll assume you're joking, because that's a god-awful "benchmark". PCMark is fine, but they gave many gaming results, so actually they've done better work than a lazy PCMark result.
They did 7zip/WinRAR, image editing, video encoding, and browser tests as well.
Exactly what would using PCMark and Userbench add?
PeterCollier - Friday, May 8, 2020 - link
PCMark writing is a good test of system responsiveness.
paulemannsen - Saturday, May 9, 2020 - link
Userbenchmark is so bad, it now gets ridiculed constantly and has become the number 1 meme in the community. PCMark measures, like you said yourself, SYSTEM responsiveness, so it's understandable if it's not the top priority here. Furthermore, if you aren't completely braindead you can extrapolate system behaviour from a CPU test/benchmark. Basically everything you spew here is demonstrating your total ignorance and lack of knowledge of everything. You should be utterly ashamed of yourself.
PeterCollier - Saturday, May 9, 2020 - link
Mind mentioning some facts when disparaging Userbenchmark?
Spunjji - Monday, May 11, 2020 - link
Facts! Okay:
It has a known Intel bias, it doesn't actually do anything that isn't covered by the tests shown here, and even if it did, *some of us would still be happier with these real-world application benchmarks*.
PeterCollier - Monday, May 11, 2020 - link
Got some links?
Spunjji - Tuesday, May 12, 2020 - link
Sure, here's one that's relevant to you: http://wondermark.com/1k62/
Spunjji - Monday, May 11, 2020 - link
You missed out on what Userbenchmark would add. Luckily most of us know the answer is "nothing".
I'm now pretty convinced that you're a Deicidium sockpuppet.
Spunjji - Monday, May 11, 2020 - link
Oh for crying out loud. The first two posts in the comments wasted on a feckless simpleton blubbering that this site doesn't do the exact same things every other site does.
Guess what laddoo - that's the point.
twizzlebizzle22 - Thursday, May 7, 2020 - link
Bit disappointed with the CPU selection used in this article.
Thought it might be interesting to see how it fares against the 1600AF or R5 3600 - different CPUs which people might be considering, to see if they're worth the extra.
amrnuke - Thursday, May 7, 2020 - link
Exactly. I'd have loved to see it up against the i3-9100F and 9400F. Comparison to the 3600 and 1600AF would be very welcome given the 1600AF's price range, and 3300X vs 3600 would be very interesting from a gaming standpoint.
DanNeely - Thursday, May 7, 2020 - link
None of those CPUs are in Bench, so Ian doesn't have any of them to test. The original 1600 is present for the older benchmarks, but not the newer ones. I wouldn't expect many CPUs to be run on the new ones until all of them are finished and ready to go, in order to minimize the amount of time spent dis/reassembling test systems.
flyingpants265 - Thursday, May 7, 2020 - link
Bench is terrible; it's missing way too many options. The most common problem I run into is when I'm trying to compare an older card (7870 or GTX 660 or something) to a newer card, and they are just missing from the list. It just needs to be expanded and improved.
0ldman79 - Thursday, May 7, 2020 - link
That's because the data sets don't overlap.
They were run with different CPUs, OSes, etc...
I'd like for there to be a checkbox that says "these aren't scientific, you can compare incomplete, incorrect data at your own risk" etc, but there's a reason they aren't in the same listing; they're not the same.
Zizy - Friday, May 8, 2020 - link
The problem is that such data is often very misleading. By far the most important issue: there are many drivers touting >10% speedup in game X. You might as well guess which GPU is faster if they end up very close over many generations. Heck, simple "the new GPU has X% more CU and Y% higher frequency vs the old one" is likely to lead to a better estimate than trying to correct for all that mess.
regsEx - Thursday, May 7, 2020 - link
The worst part of the test is the RAM. That set of 3200C14 memory is very expensive: $160. That makes the entire build pointless. For that money you can get a 32 GB kit with a 6-core Core i5-10400.
b0rnslippy - Thursday, May 7, 2020 - link
Oh, Ian doesn't have them in Bench... How convenient for Ian. Ahem, "Dr" Ian, sorry.
destorofall - Thursday, May 7, 2020 - link
you sound butthurt
0ldman79 - Thursday, May 7, 2020 - link
Heaven forbid his data set of God knows how many CPUs doesn't include the one you want to see...
Damn, you really should demand a refund.
LMonty - Thursday, May 7, 2020 - link
You should really file a complaint, buddy. Gotta fight for your rights. ;P
jimbo2779 - Sunday, May 10, 2020 - link
What has happened to the comments section here? Can we go back to just ignoring the ignoramuses? It often means they just go away.
psychobriggsy - Thursday, May 7, 2020 - link
It was mentioned that Intel didn't even send these CPUs out for review, and that they're hard to obtain because Intel isn't making many of them.
However, a few more data points would be nice. I think Ian needs to set up a system test datacentre like Phoronix so the rebuilding is kept to a minimum!
twizzlebizzle22 - Thursday, May 7, 2020 - link
AMD must have sent the 7700K or specified its use; I've noticed every review using that specific CPU. AMD is aiming for used-market upgraders, it seems.
amrnuke - Thursday, May 7, 2020 - link
I believe that's the last Intel chip that was 4C/8T as well, right? Seems a fair comparison, I guess, if AMD really think that's the market.
Anyway, TechPowerUp went ahead and lined up the 3300X against a bunch of other relevant chips (https://www.techpowerup.com/review/amd-ryzen-3-330... It's 1% slower than the 3600 at 720P gaming, 16.5% slower than the 9900K at 720P gaming.
CPU tests show the 4C/8T 3300X holding up well to the 6C/6T 8600K and 9400F. It pretty well trounces the 9100F.
The 3100 beats the 9100F by 14% in CPU tests.
schujj07 - Thursday, May 7, 2020 - link
720p gaming isn't even relevant. If these were iGPU tests then sure, but even a GTX 1050 can do better than 720p gaming.
supdawgwtfd - Thursday, May 7, 2020 - link
Are you stupid?
To test CPU performance you run at a lower resolution to ensure the CPU is the bottleneck.
Your comment is not relevant.
schujj07 - Saturday, May 9, 2020 - link
Hence why most review sites use 1080p. 720p benchmarking on modern hardware is akin to Quake 3 benchmarking at 640x480 back in 2000: all you end up seeing are crazy high numbers that don't mean anything. And we see it all the time that CPU A is faster at 720p but then slower at 1080p.
paulemannsen - Saturday, May 9, 2020 - link
@schujj07 Interesting. Your claim sounds totally alien to me, so can you show us some examples where a CPU is significantly slower in 1080p than in 720p when the GPU isn't the bottleneck, pls?
schujj07 - Sunday, May 10, 2020 - link
Just look at this review; there are a couple of examples of this at 720p and 1080p Ultra.
Spunjji - Monday, May 11, 2020 - link
@superdawgwtfd - If the resolution is too low then you artificially amplify the differences between CPUs. Meanwhile at 1080p you're testing a resolution people will actually use for high-frame-rate displays, and a decent GPU is still not going to be the primary limit at that resolution.
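The disagreement above reduces to a simple bottleneck model: the observed frame rate is roughly the lower of what the CPU can feed and what the GPU can draw, and dropping the resolution raises the GPU ceiling. A sketch with invented numbers:

```python
def observed_fps(cpu_ceiling, gpu_ceiling):
    # Whichever component saturates first sets the frame rate.
    return min(cpu_ceiling, gpu_ceiling)

cpu_a, cpu_b = 160, 190          # hypothetical CPU frame-rate ceilings
gpu_1080p, gpu_720p = 150, 300   # hypothetical GPU ceilings at each resolution

# GPU-bound at 1080p: both CPUs show 150 fps and look identical.
print(observed_fps(cpu_a, gpu_1080p), observed_fps(cpu_b, gpu_1080p))
# CPU-bound at 720p: the gap between the CPUs becomes visible (160 vs 190).
print(observed_fps(cpu_a, gpu_720p), observed_fps(cpu_b, gpu_720p))
```

Drop the resolution far enough and the measured gap is real, but it can overstate what anyone actually gaming at 1080p would see - which is the crux of the argument here.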
Fataliity - Friday, May 8, 2020 - link
Also, a 7700K should be similar to the new 10th gens with the same number of cores. It's the same arch/node; just the frequency changes (and I think the low-end new ones are the same or slightly lower).
Ian Cutress - Friday, May 8, 2020 - link
7700K was tested last year on the same driver sets. It's been in Bench for a while.
schujj07 - Thursday, May 7, 2020 - link
The 9100F is 4c/4t with a 3.6/4.2 clock. The 7700K is 4c/8t with a 4.2/4.5 clock. Since the 7th and 9th gen are both Skylake, they will have identical IPC. Based on that, we know that the 9100F will perform worse than the 7700K, which makes that inclusion pretty pointless. Not to mention that Ian said he never got review samples of the 9th gen i3s. In a lot of the benchmarks we see the R5 1600 & 2600, and the 1600AF will be right between those 2 CPUs in performance. The inclusion of the 4790K and 8086K is nice, as they show comparisons from the top 2014 CPU and 2018 CPU. When it comes to single threaded applications, a stock 8086K will be as fast as a stock 9900K due to having the same boost and IPC. Therefore we are able to extrapolate a lot of data from this whole thing.
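That extrapolation can be written out as a quick back-of-the-envelope: at identical IPC, expected performance scales roughly with clock speed. A minimal sketch using the clocks quoted above (the scaling assumption is the comment's, not a measurement):

```python
def relative_perf(clock_a_ghz, clock_b_ghz):
    # With identical IPC, the performance ratio is roughly the clock ratio.
    return clock_a_ghz / clock_b_ghz

# Single-thread at boost: i3-9100F (4.2 GHz) vs i7-7700K (4.5 GHz)
print(f"9100F single-thread: ~{relative_perf(4.2, 4.5):.0%} of a 7700K")
# All-core at base: 3.6 GHz vs 4.2 GHz, before counting the 7700K's SMT advantage
print(f"9100F multi-thread:  ~{relative_perf(3.6, 4.2):.0%} of a 7700K, at best")
```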
Spunjji - Monday, May 11, 2020 - link
You made a succession of excellent points here. Alas, I feel some people would rather use their brain for trolling than for processing the information they claim to want in the course of said trolling.
crimson117 - Thursday, May 7, 2020 - link
1600AF performance is identical to the 2600, so just use that.
3600 is an unfortunate omission.
schujj07 - Thursday, May 7, 2020 - link
Due to the clock differences between the 2 CPUs that is false. The 1600AF will fall between the 1600 & 2600 in performance.
crimson117 - Thursday, May 7, 2020 - link
You're right, not identical, but like 95% the performance at worst and often exactly the same in practice (especially gaming above 1080p): https://www.techspot.com/review/1977-amd-ryzen-160...
paulemannsen - Saturday, May 9, 2020 - link
Try Hardware Unboxed; they have exactly what you want. Their verdict though is the same as Anandtech's.
flyingpants265 - Thursday, May 7, 2020 - link
Came here to post this. This site has been a joke for a long while now, but this is crazy. I'm reading GamersNexus for real benchmarking charts.
Korguz - Thursday, May 7, 2020 - link
then why do you keep coming here?
callmebob - Thursday, May 7, 2020 - link
You don't come to anywhere. You are not going anywhere.
Your life sucks, and will forever suck. It still has a purpose, though: to serve as a warning to others.
Deicidium369 - Friday, May 8, 2020 - link
He is a stalker - if you are posting on Toms or WCC, he is stalking you... he has nothing to offer, just calling people Intel shills, all the while being an AMD shill.
This forum needs an ignore function. He needs to just go clean the basement.
Lord of the Bored - Saturday, May 9, 2020 - link
But thank god we have a stalker Intel shill to balance him out. Your service to the community is appreciated.
Korguz - Saturday, May 9, 2020 - link
ahh, so reading multiple websites, where someone is stupid enough to use the same name on two of those sites, is considered stalking? what's the matter, Deicidium369, you have nothing else? you can't prove any of your BS, so you have to, once again, resort to BS replies where you just insult, call people names, and act condescending? BTW, did you get your personal facts straight?
drothgery - Thursday, May 7, 2020 - link
The right Intel comparables are the Comet Lake i3s, but they're not available yet, so they've got to hash something together...
eastcoast_pete - Thursday, May 7, 2020 - link
Except that these chips (3100, 3300) are available, so one can buy them now; the Comet Lake i3s aren't. If Intel wanted the Comet Lake i3 to be included, they could have shipped a review sample to Ian. I don't believe he would have refused it.
Spunjji - Monday, May 11, 2020 - link
This x1000. I will never understand people who fault a reviewer for releasing a review that doesn't contain chips that simply are not available.
PeterCollier - Monday, May 11, 2020 - link
Did you really feel the need to add 40 comments a week after the article was published? And on a Monday? Please tell us which company, Intel or AMD, you work for.
Spunjji - Tuesday, May 12, 2020 - link
Yes, reading article comments five days "late" definitely indicates that I work for a tech company, and not that I only visit this place once or twice a week during my lunch break. But hey, baiting trolls is what I do for amusement. What's your excuse for trolling in the first place?
kepstin - Friday, May 8, 2020 - link
I suspect that the 2600 was put into the charts as a stand-in for the 1600AF. The 1600AF should perform very close to the 2600 in most benchmarks - just a touch slower due to reduced clock speeds.
rabidpeach - Friday, May 8, 2020 - link
1600af is represented by the 2600; the 2600 is just slightly better than it. this is not a problem. also amazon is out of them. the only benchmarks these processors win are high-core-count ones. in daily life the $100 3100 will be faster, so why hobble yourself with the smaller cache and 12nm?
DanNeely - Thursday, May 7, 2020 - link
Something I've been wondering about for a while: when the first Ryzen announcements were made, I read articles saying that the inclusion of some SATA/USB on the CPU itself would allow for cheaper entry-level mobos/systems that used the CPU as an SoC without a chipset at all. However, I don't recall ever seeing anything built that way. I'm inclined to think it is possible but that no one has ever done it, because for a low-end system without a discrete GPU the CPU appears to have enough IO to cover all the bases. Were the initial reports wrong? Is it something that's only shown up in cheap OEM systems but never DIY boards?
neblogai - Thursday, May 7, 2020 - link
The DeskMini A300 is made this way - although it has no dGPU slot and requires an APU. This also allows the PC to idle at just 8-10W, compared to ~20W(?) for PCs that use motherboards with chipsets.
DanNeely - Friday, May 8, 2020 - link
Just one system by a single vendor. Kinda disappointing IMO, since the CPU has enough connectivity for a basic no-frills system and would've been a reasonable option for a budget mITX board.
Spunjji - Monday, May 11, 2020 - link
Agreed. I'd love to build a system in my existing USFF ITX case using something like an ITX version of the DeskMini A300 board, but for some reason nobody's doing it. I'm genuinely clueless as to what "some reason" might be, too, as the A300 proves the concept just fine.
Slash3 - Thursday, May 7, 2020 - link
ASRock's DeskMini A300 series systems are the closest configuration to this, lacking a secondary chipset.
notb - Thursday, May 7, 2020 - link
A lot of focus in this text is about how the 3300X ($120) compares to the 7700K ($329, 3 years ago).
Wouldn't it be nice to see how the 3300X compares to high-end original Ryzen? ;)
The choice of CPUs is really weird. 8086K? 4790K? Really?
1600X, 1700, 1700X, 1800X - all probably beaten in most games and synthetic single-thread. In a lot of software as well (since 7700K used to beat 1800X occasionally).
1800X: $499
Ian Cutress - Thursday, May 7, 2020 - link
4790K, 6700K, 7700K:
All Intel quad cores, showing generational differences as to where the 3300X and 3100 fit in.
1600X, 1700, 1700X, 1800X are all in our benchmark database, Bench.
It's practically listed on almost every page.
notb - Thursday, May 7, 2020 - link
And that's obviously great. But with that approach you could just write "3100 and 3300X added to Bench", right? :)
I have nothing against the factual layer of this article. Results are as expected and they look consistent.
But it's essentially a story how an entry-level $120 CPU from company A beats a not-so-ancient flagship from company B.
So I'm merely wondering why you decided to write it like this, instead of comparing to a wider choice of expensive CPUs from 2017. Because in many of your results the 3300X beats 1st gen Ryzens that were even more expensive than the 7700K.
Or you could include older 4C/8T Ryzens (1500X) - showing how much faster Zen2 is.
Instead you've included the older 6-core Ryzens, which are neither similar in core count nor in MSRP.
Ian Cutress - Friday, May 8, 2020 - link
2600/1600 AF is ~$85 at retail (where you can find it), and judging by the comments, VERY popular. That's why this was included.
Deicidium369 - Friday, May 8, 2020 - link
Just say AMD GOOD! INTEL BAD! That's all they are looking for.
eastcoast_pete - Thursday, May 7, 2020 - link
Some other sites have, and yes, the 3300 gives most of the 1st generation Ryzens a run for their money.
Irata - Thursday, May 7, 2020 - link
This is a highly impressive little CPU for the money.
I particularly liked the 3300X's good showing. If this is at least in part due to it using only one CCX, it should bode well for Zen 3, which should have an eight-core CCX.
Look at some tests where Ryzen did not do so well wrt their Intel counterparts, like Kraken and Octane - the 3300X now does very well. It even scores slightly better than the 3700X.
wr3zzz - Thursday, May 7, 2020 - link
Does the B550 MB need active cooling? I can't tell from the pic.
callmebob - Thursday, May 7, 2020 - link
Look at the spec graphics. Note the only difference to the old B450 is pretty much that it provides PCIe 3.0 lanes instead of PCIe 2.0.
Now, when was the last time you saw a PCIe 3.0-based chipset hub needing active cooling?
As an aside, while I am kinda glad the B550 is finally coming, I am also a bit disappointed in seeing AMD (and their design/manufacturing partners) needing the better part of a year just to manage a bump from PCIe 2.0 to PCIe 3.0. PCIe 3.0 has been in the market for around eight years now; there is no excuse for AMD taking this long to figure out this s*it.
Fritzkier - Thursday, May 7, 2020 - link
Because their PCIe 3 and 4 lanes are provided by the CPU, though. Or maybe there's an advantage to PCIe lanes provided by the chipset?
callmebob - Thursday, May 7, 2020 - link
Haha, do you even know _how many_ PCIe lanes the CPU provides? Wager a guess: is it enough for more than a single x16 slot?
callmebob - Thursday, May 7, 2020 - link
Haha to myself.
AMD's B550 slide tricked me for a moment, as it makes it appear as if the CPU only has 20 PCIe lanes total. Which is of course bollocks; Ryzen has 24 PCIe lanes total (20 usable + 4 chipset link).
Does it mean AMD artificially only allows 16 of the 20 CPU PCIe lanes to be used on B550 motherboards? Really? I am confused whether that is a mistake in the slide, or if that will be the actual reality. I hope, for AMD's sake, it is the former...
DanNeely - Thursday, May 7, 2020 - link
If you're talking about the "The New AMD B550 chipset" slides, the problem is they're poorly designed and you've misread them. On the left side of the first one you've got a box with 20 PCIe lanes: 16 for the graphics and 4 for the chipset. Below that you've got a box with what is either 4 lanes for a single x4 4.0 SSD, 2 sets of 2 lanes for a pair of x2 4.0 SSDs, or an x2 PCIe link and 2 SATA ports. Below that, in the list of text, it has 16 lanes and 8 lanes as the first two items.
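As a sanity check on that reading of the slide, the lane budget can be tallied directly; a small sketch, with the allocations as described above:

```python
# Ryzen 3000 CPU lane budget as described in the comment above.
lanes = {
    "x16 graphics slot": 16,
    "storage flex (one x4 SSD, two x2 SSDs, or x2 + 2 SATA)": 4,
    "chipset downlink": 4,
}

total = sum(lanes.values())
usable = total - lanes["chipset downlink"]
print(f"total CPU lanes: {total}, usable for devices: {usable}")  # 24 total, 20 usable
```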
Makaveli - Thursday, May 7, 2020 - link
The cooling fans on the X570 are silent; I've never heard mine once in the 6 months I've been using it. I wouldn't worry about it.
wr3zzz - Thursday, May 7, 2020 - link
It's less about noise than durability. I've had two motherboards die on me prematurely in 30+ years, and both were due to the little cooling fans dying. Unless you are buying a top-of-the-line $1000 motherboard, those fans are garbage compared to what's used on GPUs and CPUs.
callmebob - Friday, May 8, 2020 - link
> Unless you are buying top of the line $1000 MB
Oi, are you still using Zimbabwe dollaroos? ;-)
But yeah, other than the creative pricing I am totally with you in regard to those little teeny fans...
lightningz71 - Thursday, May 7, 2020 - link
If I'm interested in CPUs in this price range, I'm also considering the following units:
2700
2600X
2600
1600AF
3600
While I realize that the Intel 10-series isn't available yet, a low-end current 9-series i5 and a higher-end 9-series i3 would have also been relevant.
I realize that this was under a short deadline, but at least a couple of comparisons in that range for maybe a few tests would have helped.
For my money, the base 2700 is very hard to beat in this price range. It would only ever loose in things that are strictly single core or strictly AVX2, which are very case specific, and would wipe the floor with the 3300x in anything multi core sensitive, judging by the 2600 tests alone. It can usually be had for within $10 of the msrp of the 3300x.
The 3300x is interesting at $99. The 3100 at $80
Holliday75 - Thursday, May 7, 2020 - link
Why does everyone spell lose with two "O's"?
Namisecond - Thursday, May 7, 2020 - link
Two not mutually exclusive possibilities:
1. English is not their native language
2. They failed at English.
callmebob - Thursday, May 7, 2020 - link
...because they are playing it loose with the spelling of lose.
Also, Double O's possess a certain elegance, sophistication and general badassery. They are also deadly. Ooh, and keep your girl away from them, especially one particular Double O.
Ian Cutress - Thursday, May 7, 2020 - link
When some people say lose, they put all the emphasis on the o, so it sounds longer, so they think one is not enough. Ask them to spell loose straight after, and you get to see some good old gears start clunking into place.
lightningz71 - Friday, May 8, 2020 - link
Because I was typing it on mobile, didn't proofread before I hit submit, and the spell checker didn't flag it as being wrong because it doesn't know context.
It's my fault, my mistake, and I normally strive to do a better job with my spelling in general. Thank you for pointing out my mistake so that I can be more cognizant of my future errors.
Holliday75 - Saturday, May 9, 2020 - link
Now I feel like a d*ck for pointing it out.
In all honesty, just poking fun and genuinely curious, because I see this mistake made daily all over the place: Facebook, comments, even articles by professional journalists and a work email or two. I find it curious when people who speak American English natively still make this mistake.
Spunjji - Monday, May 11, 2020 - link
Well, Autocorrect is one answer - and the other is the paradoxical relationship between the long "oo" sound in lose and the shorter "oo" sound in loose. It's hard to argue that the spelling shouldn't be the other way around, although I have no doubt people would still trip over it even then.
notb - Thursday, May 7, 2020 - link
Idle power draw is atrocious. How can it be this high?
It's not even that I'm worried about the unnecessary electricity use or noise (which could make an analogous APU a lot less interesting for HTPC and NAS).
I'm just really shocked by the interconnect using 16W when the cores it joins are hardly doing anything.
Does anyone know what the I/O die is doing? Is there a 10W ASIC mining bitcoin or what?
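For a sense of scale, here is a back-of-the-envelope split using the 16W uncore figure quoted above and an assumed, purely illustrative half-watt per power-gated core:

```python
uncore_idle_w = 16.0   # IF/IO-die idle draw quoted in the comment above
core_idle_w = 0.5      # assumed per power-gated core; illustrative, not measured
cores = 4

package_idle = uncore_idle_w + cores * core_idle_w
print(f"~{package_idle:.0f} W package idle, "
      f"{uncore_idle_w / package_idle:.0%} of it spent in the uncore")
```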
eastcoast_pete - Thursday, May 7, 2020 - link
Hush! You're spilling the beans here (:
Actually, if AMD had a highly efficient ASIC mining chip with good hash rates, I'd consider buying some. Same goes for Intel.
notb - Friday, May 8, 2020 - link
Actually, Intel is a major FPGA maker, so you can get one of those. It's not that hard to find an open-source coin miner (even on GitHub).
The comment stands, though. I googled a bit and there's no clear explanation for the high idle uncore.
And 8-core mobile Zen2 chips use maybe 3W in idle. It's not like their IF is a lot slower or has less to do.
This makes me wonder whether we're even going to see desktop/server 35W chips. Not to mention it would be nice if they offered a TDP-down of 25W...
Suddenly, I'm a lot less interested in an AMD-powered home server or NAS (and BTW: EPYC embedded lineup is still Zen-only).
kepstin - Friday, May 8, 2020 - link
If they do make desktop 35W chips, they'll probably be based on the integrated APU die. I suspect the increased idle power is due either to the off-die IF link to the IO chiplet needing more power than IF within a die, or perhaps to the (14nm) IO chiplet itself having higher power usage.
notb - Friday, May 8, 2020 - link
I'm OK with this kind of uncore under load (it's how Zen works).
And I don't really mind high idle in workstation CPUs. It's an acceptable compromise.
I just assumed that they'll adjust this for low-core CPUs, since these often go into home PCs used casually - spending a lot of time at idle / low. And under a cheap, slim cooler there will be a difference between 5 and 16W.
AMD will have to fix this in the APUs if they want to take on low-power segments (NAS, HTPC, tiny office desktops).
AFAIK Zen 2 APUs will use the chiplet layout, not the monolithic approach from the mobile lineup. Hence, OEMs will probably use mobile chips anyway. DIY customers may have a problem.
Holliday75 - Saturday, May 9, 2020 - link
We've seen updates addressing issues with previous Zen CPUs. It's possible it was a miss on their part, or they just didn't have the time to tweak it before release.
Namisecond - Thursday, May 7, 2020 - link
Thanks for detailing the two new AMD CPUs. Any news on the new desktop APUs though? I'm hearing rumors of up to 8 cores but the GPUs on them will be worse than the previous generation.
psychobriggsy - Thursday, May 7, 2020 - link
They'll be based on Renoir. So 8 cores, 16 threads, with 8MB L3.
In mobile, Renoir's GPU has outperformed its predecessor, despite having fewer CUs, because of improved clocks. I'd say it's likely desktop Renoir will outperform its predecessor in graphics at the same price point, but not dramatically.
Namisecond - Thursday, May 7, 2020 - link
So, if we go by mobile Renoir, a max of 8 CU graphics?
lightningz71 - Friday, May 8, 2020 - link
Yes, but it was trivially easy to run the 3400G GPU at 1600 MHz and run the RAM at 3400/3600 speeds. Assuming that the GPU of the "4400" APU gets 8 CUs at about 2000 MHz, it will have less total processing power. Assuming that it can't typically run the RAM much faster than 4000 speeds, it won't have much extra bandwidth. My best guesstimate is that it performs marginally better than the 3400G in GPU-limited tasks, purely for having better RAM support and less processor memory contention due to the larger L3. However, games are rarely entirely GPU-limited, and having the much improved Zen 2 cores will make things markedly better.
I base a lot of that on the benchmarks of the 3500U vs the 4500U, which are very roughly comparable in resources. The 4500U is consistently faster, though not by much.
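The guesstimate above can be made concrete with the usual GCN/Vega peak-throughput formula (CUs × 64 shaders × 2 FLOPs per clock × frequency). A sketch; the 1.6 GHz figure is the overclock mentioned above, and the 8 CU / 2.0 GHz Renoir numbers are the commenter's assumption, not confirmed specs:

```python
def peak_fp32_tflops(cus, clock_ghz):
    # GCN/Vega: 64 shaders per CU, 2 FLOPs per clock per shader (FMA).
    return cus * 64 * 2 * clock_ghz / 1000

print(f"3400G, 11 CU @ 1.6 GHz (OC):        {peak_fp32_tflops(11, 1.6):.2f} TFLOPS")  # ~2.25
print(f"desktop Renoir, 8 CU @ 2.0 GHz (?): {peak_fp32_tflops(8, 2.0):.2f} TFLOPS")   # ~2.05
```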
Spunjji - Monday, May 11, 2020 - link
I'm expecting much the same as what you outlined here. A significant improvement over the 3400G in CPU performance and gaming for stock configurations, but with limited gains over overclocked systems.
Koenig168 - Thursday, May 7, 2020 - link
Hmmm ... the 3300X is doing better than I thought it would. Would appreciate some benchmarks with games that benefit from more cores/threads. Great article. I find the part about the difference between the 3100 and 3300X particularly interesting (I had wondered what the difference between the two CPUs was to warrant the price difference).
EdgeOfDetroit - Thursday, May 7, 2020 - link
Do those games actually exist?
eastcoast_pete - Friday, May 8, 2020 - link
The 3300 is a fully functional 4 cores on one die, the 3100 is 2 cores on two (otherwise defective) die. Thus, the 3100 needs to use the interconnect a lot, which slows it down a bit.
extide - Saturday, May 9, 2020 - link
No, it's still a single die, just spread across both CCXes in that die instead of just one.
extide - Saturday, May 9, 2020 - link
(But otherwise, yes, you're correct, because even within a single CCD the cores must travel out to the IO die and back to get to the other CCX.)
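A toy model makes the 4+0 versus 2+2 distinction concrete: in the 3100's 2+2 layout, four of the six possible core pairs have to take the trip through the IO die. The latency figures below are illustrative assumptions, not measurements:

```python
INTRA_CCX_NS = 30  # assumed core-to-core latency within a CCX
INTER_CCX_NS = 80  # assumed latency for the round trip via the IO die

def avg_pair_latency(ccx_sizes):
    cores = sum(ccx_sizes)
    intra = sum(n * (n - 1) // 2 for n in ccx_sizes)  # pairs inside each CCX
    pairs = cores * (cores - 1) // 2                  # all core pairs
    inter = pairs - intra                             # pairs straddling CCXes
    return (intra * INTRA_CCX_NS + inter * INTER_CCX_NS) / pairs

print(f"3300X (4+0): {avg_pair_latency([4]):.0f} ns average")     # 30 ns
print(f"3100  (2+2): {avg_pair_latency([2, 2]):.0f} ns average")  # ~63 ns
```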
Deicidium369 - Friday, May 8, 2020 - link
Suggest 3 or 4.
ksec - Thursday, May 7, 2020 - link
It was only yesterday I asked on the forum what is happening with Intel's 7nm CPUs. We know Tiger Lake is coming, then there is Alder Lake. And that is it.
Again, despite all these, AMD needs to "sell" better. The results from their quarterly report are nowhere near good enough.
outsideloop - Thursday, May 7, 2020 - link
Hardware Unboxed includes the 9th Gen i3 and i5 parts against these new Ryzens in their testing.
CrystalCowboy - Thursday, May 7, 2020 - link
About the test setup: no PCIe 4.0 graphics cards, no PCIe 4.0 NVMe SSD. You are handicapping these CPUs by not letting them take full advantage of their features. If an older or lesser CPU cannot support these features, well then it deserves to score lower for it. You did use DDR4-3200 RAM; thanks for that.
Ian Cutress - Thursday, May 7, 2020 - link
Users with a $99 CPU are going to use a PCIe 4.0 SSD? Really?
How do I keep the storage element consistent between tests then, to make sure I'm actually testing the CPU? How do I keep that storage constant for CPUs from 10 years ago?
Makaveli - Thursday, May 7, 2020 - link
Yup Ian, that complaint is ridiculous; almost no one is putting PCIe 4.0 storage in a budget build.
MDD1963 - Thursday, May 7, 2020 - link
Can't wait for a water-block-equipped X570 for $800 and the R3 3100 to get the best OCs possible with muh PCIe 4.0 storage......!!!! :) (Who cares if PCIe 4.0 drives sometimes fare 1-3% worse than the 970 EVO in some real-world comparisons!)
eastcoast_pete - Friday, May 8, 2020 - link
Maybe it's because after buying a PCIe 4 capable MB and a PCIe 4 SSD, I wouldn't have any money left to buy a CPU for more than $100? Kidding, of course; this challenge makes no sense.
That aside, it would be interesting to see what kind of CPU can actually make good use of PCIe 4 capable MBs and fast storage.
Deicidium369 - Friday, May 8, 2020 - link
$500 car w/ $10,000 rims
MDD1963 - Thursday, May 7, 2020 - link
Yes, PCIe 4.0 SSDs would have helped *so much* with... gaming frame rates... <exaggerated overtly obvious eye roll> :)
eastcoast_pete - Thursday, May 7, 2020 - link
Thanks Ian! If possible, please add some performance numbers for the current i3 and i5. Right now, AMD owns the below-$200 space for desktop CPUs. Also, data from other websites that had an i3-9100 on hand shows that the 3100, a.k.a. AMD's leftover dies, is outperforming Intel's offerings here.
Really hope Intel steps up, and soon. I'm hoping to buy something later this year, so whoever gives me the most bang for my buck gets my money.
Ian Cutress - Thursday, May 7, 2020 - link
Unfortunately, we never got any of those. I'm currently stretched six ways from Sunday. Pulled an all-nighter just to even get to this point in the review process. As much as people would love me to just bench CPUs all day every day, even in lockdown I've got these CPUs, EPYC, motherboards, Xeons, and laptops to test, as well as news coverage and all the behind-the-scenes stuff no-one ever sees. Writing isn't a quick process, either.
destorofall - Thursday, May 7, 2020 - link
Surely Ian you can just use the AI writer now :)
Lord of the Bored - Thursday, May 7, 2020 - link
The AI Writer read the comments and now just fanboy-flames.
eastcoast_pete - Thursday, May 7, 2020 - link
Appreciate the reply! I think the fact that you never got those current-gen i3s and i5s is not on you, but on Intel. If they want their stuff reviewed, they know they need to send some samples. Unless, of course, they're afraid of the test results. Which they might just be.
Namisecond - Thursday, May 7, 2020 - link
I just noticed that in the "AMD 500 SERIES CHIPSET PROCESSOR SUPPORT" chart, 4000-series/Zen 2-based desktop APUs are not represented. An oversight? Or is AMD trying to say something?
qwertymac93 - Thursday, May 7, 2020 - link
There's got to be something strange going on with the 7700K system. In several benchmarks the 6700K outperforms the 7700K, even though the only difference between them is that the 7700K is clocked higher. Under no circumstances should the 6700K outperform the 7700K.
Were the Skylake and Kaby Lake systems tested with different motherboards or with different BIOS revisions? It's possible some security patch was active on one system but not on another.
EdgeOfDetroit - Thursday, May 7, 2020 - link
You should have benched it against a Xeon E-2174G. At least that's the most modern 4C8T CPU Intel sells right now. But I look forward to seeing how it does against the i3-10320, to see if Intel still has the IPC-clockspeed crown or not.
schujj07 - Friday, May 8, 2020 - link
The performance of the i3-10320 will be very similar to the 7700K. The i3 has clock speeds of 3.8/4.6 and the 7700K is 4.2/4.5. That means that in single-threaded work the 10320 will be slightly faster, but in heavily threaded workloads the 7700K will probably be faster due to the higher base clock. This is known because both CPUs are on the Skylake architecture and will have the same IPC. Therefore we can infer what the 10320 will do based on what we see the 7700K doing in this review.
Sushisamurai - Thursday, May 7, 2020 - link
I think it'd be nice to see generation-to-generation improvements on Intel's vs AMD's side of things; you guys used to do that every time a new generation came out. It'd be nice to see how far my 4th gen Intel chip has gone vs a new gen now.
Maxiking - Thursday, May 7, 2020 - link
Again with your garbage reviews. Apparently all the CPUs in the world bottleneck a GTX 1080 @ 1080p except the 3300X.
Do you even think when checking the results? This thing happens constantly, especially when you test low-end garbage AMD CPUs.
Korguz - Thursday, May 7, 2020 - link
well maxipadking, if this site pisses you off so much, and all you do is whine about how bad the reviews are and how they test low-end garbage CPUs, why do you even bother coming here? is it just to be a biased Intel shill?? go back to the site that praises your god Intel.
Deicidium369 - Friday, May 8, 2020 - link
Look, it's the resident no-life AMD shill in his natural habitat - offering absolutely nothing to the conversation and just sniping... he will also follow you to other forums - he's a creepy little guy.
Korguz - Friday, May 8, 2020 - link
you're funny man, go look in the mirror, and work on getting your OWN personal facts straight.
Spunjji - Monday, May 11, 2020 - link
I am enjoying how "I got caught shitposting in multiple forums" went through Deicidium's flamebot troll filter and came out as "watch out, this guy will stalk you".
dwade123 - Thursday, May 7, 2020 - link
It's crazy how Skylake is still the fastest for gaming. Beats having to spend and spend on endless minor upgrades with AMD... and still be slower in gaming ROFL.
Makaveli - Thursday, May 7, 2020 - link
And maybe that matters to man-children who never leave the basement. For the rest of the adults, price/performance matters more than a few extra fps. Even more so now, with a lot of people being laid off due to the pandemic and trying to save money.
stardude82 - Thursday, May 7, 2020 - link
Heck, for most people a 10-year-old Lynnfield is good enough.
rUmX - Saturday, May 9, 2020 - link
I just recently rebuilt an i5-760 for a friend, and you're absolutely right. It is still a pretty quick CPU for most users for basic tasks. However, it's super slow compared to even Zen 1. Part of that is the low clocks and lacking boost. My Ryzen 1600 non-AF runs circles around it even single-threaded.
Spunjji - Monday, May 11, 2020 - link
When the best reach you can manage is "My favourite company's abject failure to improve performance is a plus, actually."
ExarKun333 - Thursday, May 7, 2020 - link
This is AMD 90nm all over again.
Spunjji - Monday, May 11, 2020 - link
Those were the days!
stardude82 - Thursday, May 7, 2020 - link
What I get from this is that my 9-year-old Sandy Bridge i7 is probably good enough for modern games when not playing with a $1000 video card.
Spunjji - Monday, May 11, 2020 - link
Quite correct, if it's overclocked. There are a few exceptions, though, and there will be more over the next few years. Sandy was a beast.
stardude82 - Thursday, May 7, 2020 - link
Hey, Ian, for my edification, what happens if you turn off HT/SMT for one of the quad-cores? Do modern games tank?
boozed - Thursday, May 7, 2020 - link
It's been a while since I've seen game benchmarks at 800x600!
watzupken - Thursday, May 7, 2020 - link
It is impressive that a low-end AMD chip soundly beats the ex-flagship 7700K. The 3300X is certainly the better option of the two entry-level Ryzens.
Alexvrb - Thursday, May 7, 2020 - link
I have to chuckle a bit when I see Dwarf Fortress as a "new" benchmark. I doubt it uses any modern processor to its fullest.
Ian Cutress - Friday, May 8, 2020 - link
Over the last couple of years it's been requested a few times. I finally got around to sorting out a benchmark for it and adding it into our automated script. Seems to work almost flawlessly on any system I'm testing it on. That big gen test can take 1hr+ though.
Deicidium369 - Friday, May 8, 2020 - link
Dwarf Fortress is the gaming equivalent of dentists - pure sadism.
Spunjji - Monday, May 11, 2020 - link
The results seem to indicate that it prefers clock speed over either threads or IPC.
alufan - Friday, May 8, 2020 - link
Just curious: if I follow the "Bench" link, it shows Intel at the top of the stack on the opening page, yet when I choose to look at the actual results with the drop-down, the results change. The opening page is from a benchmark the Threadripper has not even been tested on; the whole industry recognises that it's probably the single fastest chip out there for the HEDT platform, yet your opening page shows Intel at the top and no 2020 results. Once again this looks like careful manipulation of the results: the casual viewer dropping onto the page just sees the top 4 out of 6 positions taken by Intel, with TR2 mixed in and no mention of TR3. Not a very fair page; it gives a poor and possibly misleading impression to folks who know no better and instantly get the idea that Intel sells the highest-performing CPUs, which we know is not the case anymore.
qwertymac93 - Friday, May 8, 2020 - link
There are two versions of "Bench" for CPUs: "CPU" & "CPU 2019". You need the 2019 one to see the more recent results.
alufan - Saturday, May 9, 2020 - link
And that's what I did: clicked the link in the article and ended up with a page showing Intel as having all the top spots, which we all know is no longer the case... That was my point: the opening summary page should reflect the results, not the results of 2 years ago. It's now month 5 of 2020, not 2019; "latest" results should be 2020.
zodiacfml - Friday, May 8, 2020 - link
This 3300X is something: it beats the six-core 2600. In some reviews it is equal to the 3600 in games while slightly behind in rendering tasks. I had already decided that a six-core is the minimum for me since the 1600, but this...
andrewaggb - Friday, May 8, 2020 - link
It's pretty good for what it is, but for a cheap PC, Intel has graphics. For a cheap gaming PC it's a bargain now, but it probably won't age well with 8-core consoles coming out this year. If you can afford the 3700X (or better), that should last 8+ years for gaming.
Tchamber - Friday, May 8, 2020 - link
The Ryzen 3 3200G has integrated graphics, too.
ahenriquedsj - Saturday, May 9, 2020 - link
Good job.
Mugur - Saturday, May 9, 2020 - link
Ian, sorry to say this, but you must find another organisation. Anandtech is just the ghost of what it was. You need at least what every YouTuber has to conduct a decent set of benchmarks. You need to buy CPUs, video cards, etc. for decent testbeds when they're not sampled to you. I'm sick of seeing obscure outfits with every CPU and GPU possible, while a real expert is using a 1080, etc.
Rudde - Saturday, May 9, 2020 - link
Will you update the conclusion to include the 3100?
lmcd - Saturday, May 9, 2020 - link
That board compatibility diagram must be flawed, because my B350 board from ASRock has validated support for everything in the 3XXX series.
Death666Angel - Sunday, May 10, 2020 - link
No. Official AMD support and motherboard manufacturer support are two different things. As stated in the article.
lmcd - Sunday, May 10, 2020 - link
I misread the paragraph below it, but in general it's weird for AMD to put out a diagram quite that misleading. The ASRock AB350 was ~$120 when I bought it and is ASRock-supported for the 3900X; surely a decent percentage of boards can support most Zen 2 processors, barring power constraints for the 16-core, if a cheap budget build can?
alufan - Monday, May 11, 2020 - link
Not true; the AM4 socket will support all Ryzen chips. However, not all features are available on all boards, such as Gen 4, as this is a specific development that was not available when the 1-series launched. Also, the limitation is in the power system of the board, not in AMD's specs:
"CHIPSET FEATURES: Note that not all processors are supported on every chipset, and support may require a BIOS upgrade. See your motherboard manufacturer's website for compatibility"
I have a 3-series running in my A320 media PC in my lounge; I updated the BIOS and it works fine. However, I suspect if I tried a 3900 it would not have the power circuitry to support it. The other issue is that the BIOS chips on some of the older boards cannot store enough information to allow all the chips to be used, so strictly speaking the issue is with the board supplier.
trenzterra - Sunday, May 10, 2020 - link
I'm still stuck on the i5-6600K which I built back in 2016. Thought it would serve me well for many years to come given the state of Intel and AMD at that point in time, and that my previous i5-2400 lasted me a good number of years while still being competitive. Now barely four years later it's obsoleted by a 100 dollar CPU lol.
lmcd - Sunday, May 10, 2020 - link
It's far from obsolete, even if it's regularly beaten. I'm still using my Sandy-E processor when I'm unopposed to simultaneously running a space heater -- it's just a question of whether you need the latest and greatest.
watzupken - Sunday, May 10, 2020 - link
Actually, looking at the performance of these 4-core chips, I can't wait to see an APU with it. Even a 4-core APU will be great for everyday usage without a graphics card. I just hope they give the 4-core version a decent graphics option, rather than a Vega 6.
TexasBard79 - Monday, May 11, 2020 - link
A very good review, quite in line with the others. Ryzen 3 3300X is a nasty game-changer.
TheJian - Tuesday, May 12, 2020 - link
Please stop running tests that appeal to less than 5% of your audience (and I think I'm being generous here). Crysis on CPU? Who cares? What does it prove I can do today? Dwarf Fortress?? WTF? Quit wasting your time and ours. AI ETH tests? What for (farms do this now)? How many tests do you need that show NOTHING to any of us?
People should get the point. You are irrelevant at some point if you keep posting crap nobody cares to read. Ask Tom's Hardware :) Oh, wait, you guys are Tom's. ;)
How about testing 20 games at 1080p, where everyone plays? :) Is it too difficult to ask a few pros to make a script for Photoshop/Premiere/AE to test under AMD/NV (CUDA vs. OpenCL or whatever is faster on AMD)? It is almost like you guys seek benchmarks that nobody could possibly find useful IRL.
"provide a good idea on how instruction streams are interpreted by different microarchitectures."
Your PhD project tells me how these CPUs will run in WHICH PRO APP? Why not just test a PRO APP IRL? Bah... fake news. Not sure why; AMD wins everything right now. Why hunt for fake tests that mean nothing? How many people use Agisoft instead of Photoshop CC for $10 a month?
Still ripping at crap modes nobody would actually use. Again, it tells us nothing about what we REALLY do usually. Only a retard uses FAST settings in Handbrake for anything but a 15fps training vid.
"We are currently in the middle of revisiting our CPU gaming benchmarks" and upgrading to a 2080 Ti. Can't happen soon enough. Please make sure you test games that sell over 1 mil ON PC, or don't bother. If they sell poorly or are poorly rated, there is no point in testing them. Test what people PLAY, at settings people really use. 720p low shows what to a person who will NEVER play below 1080p? Oh wait, I just described 99% of your audience, as I'm quite sure they haven't played 720p in ages. So much wasted testing. Stop testing 4K and test more 1080p/1440p (1440p still almost useless; wake me at 10%).
"Some of these new benchmarks provide obvious talking points, others are just a bit of fun. Most of them are so new we've only run them on a few processors so far. It will be interesting to hear your feedback!"
Please quit wasting your time. It feels like all your benchmarks are "for fun", as I'm not much smarter after coming here. Off to a site that tests a dozen games and some real-world stuff some of us actually use (TechPowerUp for example... games galore, 10 REAL games tested). THIS is how you give a well-rounded idea of CPU/GPU perf. YOU TEST REAL STUFF, instead of your PhD crap or Agisoft junk. People use Adobe, and play games that SELL. This isn't complicated, people.
Might as well jump off the roof with your CPU and tell us how fast you hit the ground. Just as useless as your benchmarks. Are they benchmarks if nobody uses them? Or is it just more "fun" crap tests that tell us nothing useful? If you are NOT helping me make a more informed decision (useful info) about buying the reviewed product, you have failed. A good review is chock full of useful info related to how we actually use the product, not a bunch of crap nobody cares about or uses IRL.
https://store.steampowered.com/app/975370/Dwarf_Fo...
The devs make $3K a month from it. This is not exactly played by the world if it pulls down $35K a year. Why even bother testing this crap? Are we all going to go back to pixel-crap graphics tomorrow? Heck no. Wake up. Those games (and the shite monitors we had then) are why I needed LASIK... ROFL.
Spunjji - Tuesday, May 12, 2020 - link
"Only a retard uses"
And that's about where I realised you weren't really making a comment so much as farting into a piece of voice recognition software.
Meteor2 - Tuesday, August 4, 2020 - link
I wonder if even one single person ever read that comment.
LMonty - Tuesday, May 12, 2020 - link
Anandtech, can we please have an upvote/downvote system in the comments section? It seems to work very well at Ars Technica in drowning out the trolls. Thanks.
Spunjji - Wednesday, May 13, 2020 - link
Seconded.
silverblue - Tuesday, May 12, 2020 - link
I imagine this has been mentioned elsewhere before, but why does Zen fare so badly, specifically, in 3DPM v1? Additionally, Geekbench 4's MT test is shown twice, once in place of 3DPM v1's MT test.
Scubasausage - Thursday, May 14, 2020 - link
Good to see my old 4790k featured. I can see what I would get for my money if I upgraded. And judging by these results it isn't really worth the money to upgrade it. Thank god really because I've been burned by AMD so many times that if I can I will avoid them. But right now you really can't avoid them! Good thing I don't need a CPU!
A nice 30xx series Nvidia card will be a good replacement for my RX580 when they release. Hey, might even be able to use my 4K monitor in 4K!
Duto - Friday, May 15, 2020 - link
I love the free market!
ReallyBigMistake - Sunday, May 17, 2020 - link
How much did Cyrix/Centaur/VIA pay you for this article? (I forgot to mention Rise lol)
DominionSeraph - Sunday, May 17, 2020 - link
I can get i7 4770/4790 systems for $120. AMD's going to have to do better than +10% performance to get me to pay triple the price over Intel's old stock.
indiandablu - Wednesday, June 10, 2020 - link
pcgpus - Friday, July 10, 2020 - link
Nice review. The 3300X looks very nice, but the 3100 is not that cool.
If you want to compare this article with results from other sites, you can follow this link:
https://warmbit.blogspot.com/2020/07/amd-ryzen-3-3...
There are results from 15 sites across 38 games!
After the page loads, please pick your language via Google Translate (right side of the page).