Ummm, don't trust AMD marketing slides too much. AMD draws less only because the all-core setup is unable to run at a high clock; they can barely go a little over 4 GHz all-core. An Intel all-core setup can go near 5 GHz for a short period (or longer if the cooling setup allows it). So in the end there is not the big performance-per-watt advantage they are claiming, especially because they stick to Cinebench and don't show other benchmarks to support their numbers. Bet that on other workloads Intel is better than AMD in efficiency. The long story of benches......
If you can handle the heat, Intel CPUs can run very high. My i9-7900X is running at 4.8 GHz all-core, albeit quite hot under load, but that's still a high frequency for all cores, with some offset for AVX-512. I believe that with direct-die cooling the results might be even better.
Right now the 3700X has near-identical performance to a 9900K - they are typically within 5% of each other - and the 3700X draws 1/2 to 1/3 the power of the 9900K. This is with both running at stock. That means the Ryzen has far better efficiency than the Intel.
Intel is trustworthy? The 9900K has a 95W TDP, but out of the box, without any overclocking, it runs a 4.7 GHz all-core turbo, drawing anywhere from 150-190W depending on the motherboard.
How many times do you need to be told? Intel CPUs run turbo out of the box by default, without any messing with the BIOS. What's the point of having a TDP rated at a speed your processor never runs at?
airdrifting, looks like you still need to be told: Intel's TDP is at BASE clock, WITHOUT turbo, and it WILL use more power even when it "runs turbo out of the box by default".
What's the point of a TDP at a "base clock" that the processor never runs at? Please allow me to put this in a language even you can understand: you brag you can last 30 minutes when in fact you only last 20 seconds, and when girls call out your BS you claim "oh, my 30-minute record was set when I was given Superman powers." Now here comes the question: do you last 20 seconds or 30 minutes in reality?
" What's the point of TDP at a "base clock" that the processor never runs at " cause thats where intel gets its TDP spec from, which IS from the BASE clock, Anandtech even did an article on this here : https://www.anandtech.com/show/13544/why-intel-pro... i would suggest you read this. then maybe you would actually understand.
Second: GROW UP. The fact that you now resort to insults further shows you know you are wrong, and you have to resort to a VERY childish analogy.
And I quote from the AnandTech article linked above: " For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value. "
More like you are not capable of reading... and it proves you are wrong. Is your pride too high to admit it? The quote from the article alone proves you are wrong. 'Nuff said.
The point is simply to be able to have "95W" shown next to the CPU in a benchmark slide while the performance result is based on "far beyond 95W" clock speeds.
This was not the case with Intel's pre-Ryzen CPUs like the 7700K, btw. Go figure why this has changed.
And? They never claimed the processors could do turbo within the TDP. It's based on the base clock and obviously any turbo will use more power. That's how they've always rated TDP. It's been known for years. Turbo has always been a "bonus". If the board can supply enough power and the temps are low enough, then it'd clock higher. Simple as that.
So based on this, AMD could claim their processors are all 1 watt - based of course on a 200 MHz super-low C-state - and it would still be true that they consume 1 watt at that speed.
The fact that it will never run at this power or speed is irrelevant; they could take the Intel system to its extreme and claim the performance crown with a 1-watt processor - everything above 200 MHz is just a bonus!
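Setting the rhetoric aside, anyone who wants actual numbers rather than slide-deck TDPs can measure package power directly. A minimal sketch, assuming Linux with the powercap/RAPL driver loaded, root access, and that the package zone lives at intel-rapl:0 (the path varies by system): sample the energy counter twice while an all-core load runs and you get average watts to compare against the rated TDP.

```python
# Minimal sketch: average CPU package power over an interval via Linux RAPL.
# Assumptions: Intel CPU, powercap driver loaded, run as root; the zone path
# below ("intel-rapl:0") is typical but varies by system.
import time

ZONE = "/sys/class/powercap/intel-rapl:0"

def read_energy_uj() -> int:
    with open(f"{ZONE}/energy_uj") as f:
        return int(f.read())

def average_watts(seconds: float = 5.0) -> float:
    start = read_energy_uj()
    time.sleep(seconds)
    end = read_energy_uj()
    # Note: the counter wraps at max_energy_range_uj; ignored here for brevity.
    return (end - start) / 1e6 / seconds

if __name__ == "__main__":
    # Run this while an all-core load is active and compare the result
    # against the 95 W rated TDP being argued about above.
    print(f"Average package power: {average_watts():.1f} W")
```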
Whoa! Hold your horses. Do you trust Intel marketing more? Intel's TDP is at the non-boosted clock. Boost all cores and exceed TDP and you're essentially voiding the warranty.
Why are there always idiots like you spreading false information without even owning the CPU or knowing the basics of current-gen hardware? The boost is done automatically without you doing anything in the BIOS; all Intel CPUs essentially run boost speed right out of the box. So you are saying everyone voided their warranty by doing nothing at all?
Why do you keep insulting others? Turbo does not void any warranty. Boards are not supposed to boost beyond the Intel-specced clocks.
However, there is a non-spec OEM implemented feature on some boards, usually called MCE (multi-core enhancement). This option would cause the board to boost all-core clocks to the single-core boost clock which does break the spec. Realistically, it shouldn't be enabled out-of-the-box but some board makers apparently don't care. Your beef should be with them, not intel.
eddman, why does he keep insulting others? Because he is a child, and when he's proven wrong or can't prove what he says, this is his only recourse. Maybe it makes him feel better...
Intel's TDP is a useless figure. Given the nature of competition, AMD is going to increasingly follow suit with their own less-than-useful TDP. With that being said: As of today, Intel's high end chips eat far more power at stock settings, "base" TDP be damned.
Well, let's just get this straight: Intel processors have to clock high just to match a lower-clocked AMD processor. Ryzen 3000 series vs. Intel 9000 series is proof of that.
Second, you don't seem to realize that overclocking does not improve performance per watt. The gains yielded by OCing a 9900K to 5.1 GHz, for example, are around 3%, while power consumption increases by 30%. Simple math tells me performance per watt decreases.
Let's be frank here: Intel is just barely hanging onto the best-gaming-CPU crown right now, but in every other category they lose. It is not even remotely surprising if they lose a battle of core count, which is what HEDT is, as AMD's architecture is designed to scale.
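To put rough numbers on the "simple math" about overclocking the 9900K above, treating the ~3% performance gain and ~30% power increase as given (they're the commenter's figures, not measurements):

```python
# Relative performance-per-watt after the 5.1 GHz overclock, using the
# rough figures from the comment above (assumptions, not measurements).
perf_gain = 1.03    # ~3% more performance
power_gain = 1.30   # ~30% more power draw

print(f"{perf_gain / power_gain:.2f}x perf/W")  # ~0.79x, i.e. roughly 21% worse
```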
Their performance/W comparison is based on results vs. power use, so it's perfectly valid. It doesn't really matter what clock speed each CPU ran at if both were at their best (stock).
As for only Cinebench being shown - yes, that is indeed only one case. I wish they had included more.
I remember some reviews mentioning that the power-hog chipset on the AMD side pretty much nullified any advantage gained on the CPU alone. I.e., while the CPU was more efficient, the platform as a whole ended up less so... Is that still an issue?
Because you can run the third generation Ryzen processors on first or second generation motherboards, the issue of high motherboard power draw can be avoided, and you only lose PCIe 4.0 speeds. I am running my Ryzen 3900X on an Asus ROG Crosshair VI Hero motherboard, no problems at all, and no need for an active cooler for the chipset.
The difference between these CPUs is over 100W, not 15. Simple math tells me the chipset is not going to make up the difference even if it were running at its max of 15W.
In addition, no one said you had to use the X570 chipset with this processor. If you don't need PCIe 4.0, go with a cheaper motherboard. If you do, AMD is the only choice right now.
OMG, folks still don't get it: the AMD chip does as well as, or in many cases better than, Intel at a lower clock and lower voltage. That means it uses its power more efficiently to do virtually the same work, or in some cases more of it. What's so difficult about that to understand?
Given that a core count race seems to be the new MHz race, what type of casual everyday programs can we expect to take advantage of 16/32 or more cores? Are there any game engines that can meaningfully use 32 cores? I can see browsers taking advantage of high core counts trivially, by being able to remain performant with many tabs of JavaScript-heavy pages open. What else could potentially use 32 or more cores?
You mean other than Chrome and Electron apps on the system ?!? On a serious note, can't think of anything other than Virtual Machines, Image/Video Editing and Debugging in IDEs but I don't know if that falls under 'casual everyday programs'. But again they could be for the person buying a 16-core processor for casual everyday use :)
Audio also prefers that those cores are all on the same die, otherwise you can't get full utilization of the CPU at low buffer settings. Scan Pro Audio has found that the 3900X starts having dropouts at 70% utilization. I actually returned a 3900X because they're not that great for low latency audio production.
Is that right? I'd always understood that single core perf is extremely important in audio, not only for latency but because if you have a complex I/O chain (instrument running through multiple plugins) it has to stay on the same core. In Live if I have an instrument input that runs through a gate and compressor, then into another track that is running plugins (effects/loopers), that all has to be processed by a single core according to Ableton documentation. So although other instruments can be running on other cores, the single core perf is still a potential bottleneck (and frequently is!)
(The reason I use the example of routing audio through different tracks is for monitoring and looping at multiple points of the chain.)
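For what it's worth, the single-core bottleneck described above falls out of simple arithmetic: at a given buffer size, the whole serial chain has to finish on one core before the buffer deadline, however many other cores sit idle. A small illustration with made-up plugin costs (all numbers hypothetical):

```python
# Why a serial instrument -> gate -> compressor -> FX chain is bound to one core.
buffer_samples = 128
sample_rate = 48_000
deadline_ms = buffer_samples / sample_rate * 1000   # ~2.67 ms per buffer

# Hypothetical per-buffer cost of each plugin in the chain (ms on one core).
chain_ms = [0.9, 0.6, 0.8, 0.7]   # instrument, gate, compressor, looper

serial_ms = sum(chain_ms)  # the chain runs in order, so the costs add up
print(f"deadline {deadline_ms:.2f} ms, chain needs {serial_ms:.2f} ms -> "
      f"{'dropout risk' if serial_ms > deadline_ms else 'ok'}")
# Extra cores only help by running other tracks' chains in parallel.
```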
I can't speak to all DAWs, it depends on which one you are talking about specifically. But I've seen good results with FL Studio with more cores and threads versus improving single threaded perf.
While not really needing that many cores by a long stretch, I am amazed at how some antivirus software insists on analyzing all the drives on my PC serially. I get that when there is only one drive it would be I/O bound, but when I have multiple drives I don't see a reason they could not scan them in parallel, especially when I do an on-demand scan and don't expect to be doing anything else with the PC.
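Parallelizing an on-demand scan across independent drives is conceptually simple; here's a rough sketch of the idea, with a hashing placeholder standing in for whatever a real AV engine actually does (the drive letters and the scan_file helper are purely illustrative):

```python
# Sketch: scan several independent drives concurrently instead of one by one.
import hashlib
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def scan_file(path: Path) -> bool:
    """Stand-in for a real scanner: hash the file; a real engine would check
    the content against a signature database here."""
    try:
        hashlib.sha256(path.read_bytes()).hexdigest()
    except OSError:
        pass
    return False  # nothing is ever flagged in this sketch

def scan_drive(root: str) -> int:
    return sum(1 for p in Path(root).rglob("*") if p.is_file() and scan_file(p))

if __name__ == "__main__":
    drives = ["C:\\", "D:\\", "E:\\"]  # hypothetical: one worker per physical drive
    with ThreadPoolExecutor(max_workers=len(drives)) as pool:
        for root, flagged in zip(drives, pool.map(scan_drive, drives)):
            print(root, flagged, "flagged files")
```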
Who gives a shit about AV in 2019? If you're dumb enough to download and run pornhubvideo.jpg.mkv.exe then you deserve to get buttcoined or ransomware'd.
Congratulations! You're part of the 1% club who know how to spot BS! Good on you!
For the rest of the population, those not hooked on forums like this, anti-virus programs are still needed. They help with many things other than downloaded Trojans; the easiest way most people get hit is actually phishing links. At home I taught my mother how to spot BS links in emails, and I have a Pi-hole VM to stop most ads, including malvertising, so there hasn't been a virus in our house - but I still bought a year of Bit-Defender for my mother's PC (and I may install it on my own PC just to help clean it).
That's why I said "considerable part of buyers". I am sure that for professionals it is more than worth it.
For someone like me, however, it would definitely be a "want" CPU - as far as my needs / use cases go, a 3700x or even 3600 would be perfectly fine, however I want a 3950x because it is what it is.
The cool thing is that I can get a lower end Ryzen 3000 now (with a good main board) and upgrade to the 3950x later when it is offered at EOL prices.
One thing I like about the 3950X is that it's blurring the lines of what HEDT is, in a good way. Availability of this many cores on a mainstream desktop platform is a great incentive for developers to look for ways to use that power.
As Irata points out - your common-or-garden end user can purchase a 4, 6 or 8-core system and then eventually upgrade to 3950X at a (potentially much) later date when more software benefits. There's already been an uptick in software like games using meaningfully more cores since Zen first released; I'd anticipate that trend continuing, albeit acknowledging the difficulty of multi-core scaling for many tasks means the trend will likely slow down.
Yep, anything above 6 cores is gonna sit around doing nothing most of the time for ordinary users. And even for other users, multi-core is very underutilized... they have run out of ideas to make us upgrade. Datacentres need it and HPC needs it, but home users? No way... E-peen?
TR is mainly for content creation and other things where more cores is beneficial. If games and browsers are your thing, just get a Zen 3 or Intel equiv. No point in spending on this.
A game like Kingdom Come: Deliverance shows that performance can drag with a lower number of CPU cores, but I don't know how well it scales up. I would expect that similar games with a lot of AI controlled NPCs would see a big benefit from additional CPU cores if the game is designed to use them.
TR 1950X user here. In general use, it's great at multitasking. You can be doing a lot of things at once and the system doesn't choke because there are always more cores. And the rare case (certain kinds of content creation) where a single program can utilize all cores, it's ludicrously faster than the 6-core Ivy Bridge "Extreme" I had prior. TR + fast storage + 32GB RAM is a dream machine for PC desktop, IMO.
As far as gaming goes, benchmarks tell the story. Game engines that can multi-thread draw calls running content that is mostly limited by draw calls see the most benefit, but even then it won't beat a fast Intel; TR will keep pace at best. Off-the-shelf game engines like Unity & Unreal run all of their game logic & world/physics simulation on one or two threads, so simulation games using those engines like Cities Skylines and Kerbal Space Program are ultimately CPU frequency-limited because the bottleneck lies in their gamesim/physics thread.
In 2017, 64GB seemed like overkill for a 16-core system. Even now, I never top out, even while running Unreal Engine (editor), 3DS Max, and Ableton Live simultaneously... Of course, YMMV. 16 cores will soon enough be merely a top-end Ryzen, while 24 cores and up is a decidedly different class of computer, and I suppose 64GB is appropriate for that.
I have a machine with 1024 cores and 2TB of RAM (no joke). I think the point of the comment above was that you don't *need* 64GB just because you need a TR.
That is why these new CPUs are a slippery slope for people. At the end of the day, even the hardcore gamer really has no incentive to upgrade past a CPU like Intel's 2600K, which was so popular. AMD is trying to put "business" and "consumer" CPUs in different categories, but every year it seems kind of comical in most regards. Not saying that they don't have an advantage, but like you said, performance is getting to the point where you will not actually care about MHz or core count, and more about the feature set of the CPU.
Case in point: "3.5 GHz base frequency and a 4.7 GHz single core boost frequency; the overall all-core turbo frequency will be dependent on the motherboard used, the quality of the silicon, and the turbo in play."
The last CPU I had with a "boost" was an Intel 486DX that went from 33 MHz to 66 MHz "boosted"... even at that time I remember the controversy of it all. What I'm saying is, when you get to a point where you have so many cores and are itching for a measly 100 MHz or so, it's time to focus on integrating new tech into the CPU.
Maybe your memory is foggy, but the "turbo" on the 486 was a misnomer. It didn't make the CPU faster; 66 MHz was the base clock. It halved the multiplier to run the CPU at 33 MHz for games that didn't use the RTC. Many DOS games didn't support polling system time from the RTC because a lot of 386 and earlier systems didn't have one, so they kept time relative to CPU cycles as a workaround. I remember Liero did this, and anything faster than a 16 MHz 386 made the game actually run faster, to the point of unplayability. Underclocking the 486 via the turbo button helped, although on some games not enough.
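The timing hack being described is easy to illustrate: instead of pacing frames against a clock, those games burned a fixed amount of busy-work per frame, so a faster CPU meant a faster game. A toy sketch of the two approaches (the iteration count is arbitrary, purely illustrative):

```python
import time

ITERATIONS_PER_FRAME = 2_000_000  # tuned for one particular CPU, like old DOS games

def cycle_timed_frame():
    # "Timing" by busy work: a faster CPU finishes sooner, so the game speeds up.
    x = 0
    for _ in range(ITERATIONS_PER_FRAME):
        x += 1

def clock_timed_frame(target_s: float = 1 / 60):
    # Timing against a real clock: pace stays fixed no matter how fast the CPU is.
    start = time.perf_counter()
    while time.perf_counter() - start < target_s:
        pass
```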
True. They just misnamed the button "turbo" :) Even back in the 286 days I had a turbo button; it of course worked as you described, albeit at lower frequencies.
Platform features like PCIe version, SATA version, NVMe - things that weren't even a sparkle in the eyes of engineers back in the Sandy Bridge days. Sure, if you only game, there's not too much of a difference, but there is a good reason to say no to Sandy Bridge-era CPUs in modern times. Hell, I wanna upgrade my 4790K, which is 3 years younger than Sandy Bridge.
The hardcore gamer has a lot of reasons to upgrade from Sandy Bridge: CPUs are faster, have more features, and newer parts work better with newer OSes and such.
You have a point: for the average person, CPU performance is pretty much a commodity. It took a while, but we have hit a point where people can reasonably afford more compute power than they need. However, for people who do real work, like video editing, and don't have server farms, these advances have been huge, as they can never have too much compute.
This hasn't been true for around 2-3 years now, depending how you measure. Sandy had a long life (mainly thanks to its extremely conservative clock rates at stock) but it's way outside what I'd recommend to anyone looking to game even moderately seriously now.
Almost nothing. These things are for work. A reasonably fast quad core (anything Ivy Bridge or newer, really) is fine for almost any casual programs. Going up to eight cores makes sense for gaming since the new consoles will be eight core.
The consoles have been 8 core since 2013 dude. PS4 and xbone are 8 core. 8 slow cores, which should have prompted swift acceleration of multi threaded game engines.
For half of this console generation Intel was in the lead and they kept milking core counts. Until 3-4 years ago they were selling dual-core CPUs as i7s on mobile. It wasn't until AMD came up and basically showered everyone with 4- and 6-core CPUs for half the price that Intel dropped the BS and started offering real 6-core CPUs in the lower-tier consumer market, real 4-core CPUs in mobile, etc. I blame only Intel.
And we saw slow progress toward multi-threaded games throughout the generation. There are way more games today that can take advantage of 4+ cores than there were in 2013. It takes time to adapt game engines and not every kind of game will even benefit from more cores. All I'm suggesting is that if you play games you have some reason to go beyond four cores.
Jaguar uses complete cores, albeit "small" ones in terms of area - in design and performance terms they're somewhere between the old K8 Athlon 64 and K10 Athlon II processors.
I think the confusion comes in because the console implementation of Jaguar has 8 cores split across 2 "modules", which is the same terminology used for 'dozer but referring to a different thing:
Bulldozer module = 2 cores with shared FP resources
Jaguar module = 4 independent cores, like a CCX in Zen
8 cores at 1.6 GHz (PS4, as the slowest) is at best the same as 4 cores at 3.2 GHz, assuming everything else is equal with perfect MT. Plus those cat cores were essentially half as capable as the current stuff, normalized by clock. Therefore, the consoles have about the same as 2 proper desktop cores - the lowest-end CPUs you can buy. Anyway, there are many games that use more than 4 cores these days, especially stuff coming out now that Intel has started offering more cores and AMD has competitive if not superior chips.
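Spelling out that normalization with the commenter's own assumptions (1.6 GHz Jaguar cores at roughly half the per-clock throughput of a modern 3.2 GHz desktop core):

```python
# Rough throughput normalization using the comment's assumptions, not measurements.
console      = 8 * 1.6 * 0.5   # 8 Jaguar cores, 1.6 GHz, ~half the per-clock throughput
desktop_core = 1 * 3.2 * 1.0   # one modern desktop core at 3.2 GHz
print(console / desktop_core)  # ~2.0 desktop-core equivalents
```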
Last I checked $2000 CPUs generally weren't for "casual everyday programs". Not really $750 ones either.
Performance hungry productivity applications can on the other hand make use of 16 quite commonly, though 32 is still a stretch. Then again there's some value to a workstation that's fully usable even when running a compile, render, or other multi-hour heavy workload.
People were saying the same thing 3 years ago about the 8 core Zen 1 CPUs and yet here we are, a majority of new games coming out utilizing 8 cores. Give it another 3 years, I wouldn't be surprised to see if that doubles again.
Agreed. What's more likely is that we'll see games / engines that depend on 4+ cores becoming commonplace, with maybe an outlier or two that can squeeze marginal gains from 8+.
With the Ryzen 7 release in 2017, and then the release of the i9-9900K, no one questions that we are now in the era where games and programs should at least be able to scale to 8 cores/16 threads. Once you actually have a properly multi-threaded design, it becomes simple to use more and more threads, and if you have fewer cores, no problem - the scheduler will just distribute the threads across the CPU cores you have.
You don't really target a given number of cores; you either go for a multi-threaded design, or you don't. Let those who have a higher-end processor get the advantage of more cores/threads - it doesn't HURT those with lower-tier chips.
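That's essentially the argument for sizing a worker pool by the machine instead of hard-coding a core count. A minimal sketch of the pattern (the simulate_chunk workload is a stand-in):

```python
# Sketch: split work into many small independent tasks and let a pool sized by
# the machine schedule them; the same code runs on 4 cores or 16.
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk_id: int) -> int:
    """Stand-in for one independent slice of game/world simulation work."""
    return sum(i * i for i in range(100_000)) + chunk_id

if __name__ == "__main__":
    tasks = range(64)  # far more tasks than cores; granularity drives scaling
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate_chunk, tasks))
    print(len(results), "chunks done on", os.cpu_count(), "logical CPUs")
```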
Absolute dominance, I love it. Crazy that I can use my 3700x now (which is already incredibly fast) and buy a used 3950x in a couple of years for an upgrade with double the cores (or just get a zen 3 chip).
Well, AMD is trying out the Intel seat and it likes it. We can see that the 32-core part is now more expensive than the previous-gen 32-core part: $2,000 vs $1,800.
While this is more than many had expected, the only Intel CPU/platform that comes remotely close to the TR3 platform is the 28C Xeon W-3175X, which costs $3,000 and requires a separate, very expensive mainboard.
The 32-core TR2 had a very awkward memory architecture where not all the dies had direct access to memory, and on many workloads it performed no better than a 16-core. If you wanted a "normal" 32-core CPU you'd have to buy EPYC server chips, which cost a lot more for much lower speeds. So while you can't read it out of the spec sheet, the TR3 is actually a much more capable product.
So the 3000G is basically a CS:GO cruncher on a budget 👍 Also it's good they're taking their time with the 3950X cause imho distant but realistic deadlines > watery "soon™" > short but unfulfilled deadline. Sadly AMD seems to have been thru all 3 stages at this point...
Not so sure about that. I would probably have upgraded to the 3950X if it had been there on the initial launch day, but now it feels like it makes more sense to wait for Ryzen 4000/Zen 3 - it's only another 6 months. I upgraded from a 2700X to a 3700X to tide me over in the meantime.
It'll probably be more like 10 months as Ryzen release dates have been slipping back a bit with each new generation. I would expect Ryzen 4000 to be available mid to late Q3 2020.
I’m sort of wishing they announced a full line, even if they are only launching two in November. I have no idea if I want to wait and see what else is coming or buy now.
$55 --> $49 with a slight performance boost? Now can I please get a mini-ITX board to make the most powerful smart TV? I wonder why we haven't been able to get super-small AM4 boards for so long.
There are ITX boards out there for Ryzen, but considering that the processors themselves have tended to be more powerful, putting them into a small system can be problematic. The new 7nm processors solve some of those problems, though I wish AMD would have released 7nm APUs by now.
What are you talking about? A short trip to "geizhals.de" shows 16 mini-ITX AM4 motherboards available: 3 X570, 2 X470, 3 X370, 4 B450, 3 B350 and 1 A320. There is even a mini-STX motherboard, if you want to buy a whole barebones PC (DeskMini A300). And a smart TV has the smart stuff integrated - you are talking about an HTPC.
I really wish all the PCIe lanes were available at lower core counts. It's not hard to have a lot of I/O needs without needing a stupid number of cores. It really sucks when you go to install an expansion card or NVMe drive only to find out that it's going to drop your GPU lanes from x16 to x8.
My understanding is that Ryzen actually has 32 PCIe lanes available on the processor, at least in the 1xxx and 2xxx series. They only exposed 24 of them (16 GPU, 4 NVMe, 4 to the chipset) for some reason. Limitation of the socket?
It is obviously limited by the AM4 socket, though not arbitrarily - it was a conscious design choice. Some of them are used internally by the SoC for SATA, USB, etc. Actually, if you take a look at Raven Ridge and Picasso, all 32 lanes are indeed used: 8 to the discrete GPU slot, 4 for NVMe, 4 to the chipset and 16 for the integrated GPU, SATA, USB, etc.
Kind of irked that there is a new socket for TR. The Epyc side didn't need a new socket for PCIe 4.0 and that invoked a few changes too (like an additional PCIe lane for OoBM). At the very least, it would have been nice to have had the 3rd gen TR parts work in earlier motherboards for an upgrade path even if that meant that the IO was PCIe 3.0 only and the chipset link was effectively 4x PCIe 3.0. The centralized IO chip is likely a huge reason for the generational performance leap on top of the Zen2 improvements. That's something original socket TR4 owners probably want.
What are the benefits you got from your motherboard after upgrading from a Z170 to a Z390 motherboard? Let's look at the Asus Z170-P and the Asus Z390-P:
- Both: 4 DDR4 DIMM slots, x16 and x4 GPU slots
- Z170 has 2 PCIe x1 & 2 PCI slots vs Z390's 4 PCIe x1 slots
- Z390 has 2 NVMe slots (x2/SATA + x4) and a Wi-Fi M.2 vs Z170's 1 NVMe x4 slot
- Z170 has USB-C on the I/O and Z390 does not, but Z390 has more USB ports in general (other Z170 boards have more USB) and Z390 has USB 3.1 (10 Gbps)
- Z390 has RGB headers and a water pump header
That is not a lot of features to gain for a product that launched in 2018 vs one from 2015. Not many people need that many USB 3.0 ports. At one point in time, every motherboard upgrade would bring with it better USB and SATA speeds that actually mattered, better fan controls and better DDR speeds, but it's slowed down a lot. I could mod my Haswell board's BIOS to allow NVMe booting from a 15€ add-in card; the board had USB 3.0, SATA III and good fan controls. When it comes to pure features, my X570 doesn't really offer anything more. And I'm sure a lot of people would have liked to upgrade just their 7700K to an 8700K, getting an automatic 50% increase in a lot of tasks, and keep their own board. If AMD does indeed keep the AM4 platform for 4 generations with Zen 3 next year, that will be a great accomplishment for them, the consumer and the environment.
We've really been conditioned to think that by Intel not allowing it and doing little generation to generation. Back in the AM3 days I upgraded from a 2-core to a 4-core and then a 6-core. Some friends bought a first-gen Ryzen CPU and have now upgraded to gen 3, and they are happy because PCIe 4.0 is basically the only thing they miss out on. I also upgraded in the Athlon 64 days - I think I took a Socket 939 motherboard from a single core to a dual core, but I'm not too sure anymore. With Intel, there was no way to upgrade from a 2700K (Sandy Bridge) to a 4770K (Haswell) without a motherboard change, and realistically there was no great motherboard feature upgrade in that time: Z68 boards already had USB 3.0, SATA III and DDR3. The big jump came with the *Lake architectures that enabled NVMe slots and DDR4. And then Intel allowed another generation of 4-core CPUs on that platform before breaking compatibility when the core counts increased. It is always interesting when people argue against good features for seemingly no reason other than to defend a multi-billion-dollar corporation.
The answer to your question can be found on this site. The short answer is: yes. The long answer is: some tasks benefit from more memory bandwidth, but only if they're bound by that limitation in the first place. They're the sort of tasks you'd only do on a HEDT platform in the first place.
Nice write up! It's interesting to watch Intel drop further and further behind. 2020 should be a blowout year for AMD. Surprising that anyone is still buying Intel's old, massively security-hole-punched architectures, made on the old 14nm process, to boot. Processing performance, not MHz, has been the name-of-the-game since 1999. Hard to believe that in 2019 there are people who still don't understand what that means...;) I always thought it was rather elementary.
Umm, considering the holes were fixed, and the 14nm process means nothing to end users, they are doing fine. Intel still makes great CPUs; it makes sense for lots of people. The upgrade path to AMD is not exactly cheap if you already have an Intel CPU... you've got to get a new mobo/RAM as an added cost of going AMD.
MHz is still king for flat game performance as well.
Sounds like you are just uninformed about how things actually work in the real world.
" Mhz is still king for flat game performance as well. " not really.. ipc is also just as important.and right now, clock for clock, amd wins there. for the most part, ryzen is pretty close to intel even with the clock speed disadvantage... imagine if ryzen were at the same clocks as intel...
Yup, because we should all base CPU buying decisions on which CPU offers an extra frame or two, when most games on almost all mid-tier CPUs can do 75 to 100 fps.
14nm is ancient and Intel is milking it, thanks to fangirls who were happily buying dual-core i7s on mobile until AMD came in, kicked them in the rear, and finally made 6-core CPUs the new low-tier norm.
Don't be harsh... according to them, human eyes can only see 4 cores and the PC is for gaming only. All those multi-billion-dollar games were apparently developed exclusively on i7 procs.
"Umm, considering holes was fixed," -- are you sure? To me it looks like Intel still sell more CPUs with holes not fixed (in silicon) than those fixed. Honestly Intel message about this is all big mess...
Upgrading to a new Intel CPU also requires buying a new motherboard, and you'll only need to change RAM if you're going from DDR3 to DDR4 or your DDR4 is old and slow.
Basically, AMD is a better value option and AM4 still has a potential upgrade path to Zen 3. Sounds like your "uninformed" comment is projection.
I'm really curious how the 3960X and 3970X might game. Sounds like architecturally with the single I/O die, they shouldn't suffer from the NUMA issues of the 2970WX or 2990WX. They're (essentially or precisely?) the same chiplets as the 3900X, but have a greater IHS surface area. Their advertised boost is competitive with 3900X, it'll be a matter of what they can hold as a steady state. They also have a larger cache.
It'd be great if we could get all the benefits of the extra cores, PCIe lanes, and memory channels, without having to significantly sacrifice gaming performance.
Adjustable cTDP is a godsend, but only if it is run-time adjustable (not just a fixed setting in the BIOS as with some older APUs) and preferably with a command line and an API: that allows you to really tune the hardware between energy-efficient batch processing and fast interactive response times, especially when you can similarly take cores offline at run-time to make sure you hit the high-frequency points consistently with the remaining ones (and re-enable them later).
That would then turn most of the hundreds of fixed allocation SKUs Intel is selling into "CPU as code".
Of course these cTDP limits would have to be very closely observed (or be adjustable), so you can control power vs. temperature limits, depending on whether you want to allocate PSU overcapacity to increase boost speeds while there is thermal headroom, or whether you want a PSU that is energy-optimized but has few reserves.
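For what it's worth, pieces of that kind of run-time control already exist on Linux through standard sysfs interfaces: CPU hotplug is generic, while the writable power-limit node shown below is the Intel RAPL/powercap one and only stands in for whatever knob an AMD part would expose. Treat this as a sketch of the idea, not a supported API:

```python
# Sketch: run-time "cTDP-style" control from user space on Linux.
# Assumptions: root access; CPU hotplug via sysfs (generic); the power-limit
# node shown is the Intel RAPL/powercap path and is only a placeholder for
# whatever equivalent an AMD platform might expose.
import os
from pathlib import Path

CPU_ROOT = Path("/sys/devices/system/cpu")

def set_core_online(cpu: int, online: bool) -> None:
    """Take a logical CPU offline (or bring it back) at run time."""
    # cpu0 typically has no 'online' node and cannot be offlined.
    (CPU_ROOT / f"cpu{cpu}" / "online").write_text("1" if online else "0")

def set_package_power_limit_uw(microwatts: int, zone: str = "intel-rapl:0") -> None:
    """Write a long-term package power limit via the powercap framework."""
    limit = Path("/sys/class/powercap") / zone / "constraint_0_power_limit_uw"
    limit.write_text(str(microwatts))

if __name__ == "__main__":
    # Example policy: shed half the logical CPUs to favour boost residency,
    # then cap the package at 65 W for quiet overnight batch work.
    n = os.cpu_count() or 1
    for cpu in range(n // 2, n):
        set_core_online(cpu, False)
    set_package_power_limit_uw(65_000_000)
```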
Why are the TRX40 motherboards all the same? Why aren't different roles being played out with all of those PCIe lanes the motherboard manufacturers are being given? Please make a motherboard for those of us who need lots of PCIe lanes and aren't just gamers.
With the TRX40 boards I see 4 PCIe slots total on almost every single board. Some have a 5th slot, but it's an x1.
I have been waiting for something like the X299 WS SAGE/10G to come out on the AMD side for more than a year, but apparently that day will never come.
The slide with the Threadripper prices reads "New TRX40 Platform With 88 PCIe® 4.0". 88 lanes? Does the chipset use a mux, or does the CPU have all those lanes?
What's not to understand? It talks about the TRX40 platform, which is the chipset and the CPU as a whole, and the article shows quite nicely what can be configured how. The CPU has 48 + 8 + 4 + 4 (= 64) and the chipset has 8 + 8 + 4 + 4 (= 24), which is the 88 number they come up with. 16 of those are already in use for the chipset communication, so 72 can be made available to the user however the motherboard manufacturer chooses (as seen on the "AMD TRX40 Platform" slide). This isn't rocket science.
Very disappointed by two things: (1) lack of backward compatibility with X399; (2) no clear upgrade route for a 16c/32t setup. From the 1950X ($999) one could go to the 2950X ($899), but now there's a 50% premium to go to the first TRX40 offering (3960X, $1,399). Going to a 3950X + X570 is not a solution when you need the PCIe lanes provided by TR4. @Ian: did you hear anything about a future 16c TR 3000, which would obviously overlap with the 3950X? (Intel has done so in the past, mixing up HEDT and mainstream.)
Do you mean Intel has had overlapping HEDT and mainstream CPUs as in they had the same core counts? Sure, AMD has had that as well; the 8C TR is a thing, after all. Or do you mean Intel has used the same name for HEDT and mainstream CPUs before? Because a 16-core TR3 would fit into the 3950X naming scheme that is now taken up by the AM4 equivalent, and 3955X would look a bit messy to me. :D
I've no doubt there'll be a 16-core TR3; they'd get paid very well for a 4+4+4+4 dud-chiplet combo, but they probably want to clear out the first wave of people who won't wait. I'm thinking 2-3 months out, like February or so - that's just me looking into the crystal ball, though.
There's an interview with an AMD senior technical marketing manager on PCWorld: https://www.pcworld.com/article/3305945/watch-the-... ; the absence of a 16c part was one of the first questions. He answered that the absence of 16c is deliberate, in order to have a clear boundary between mainstream and HEDT. So the prospect of having 16c on TRX40 in a few months is nil atm.
A GT 1030 is a point of failure, heat, cost, and an occupied PCIe slot. Intel produces perfectly good graphics for my needs (and for 500 users at my place of work) - for "free". AMD NEEDS to produce processors with 6 to 8 cores, decent computing power and a very, very, VERY simple GPU.
I'm happy with iGPU levels of performance; as it stands, AMD does not have a product for me, and it's sad.
A B450-based motherboard, a Ryzen 5 2600 or 3600 CPU, and an Nvidia GT 730 GPU make for a great, silent office computer, and give you triple-monitor support to boot. Or an AMD Radeon Pro WX 3100 or 3200, which gives you even better multi-monitor support. Both are fanless. Add NVMe and 16 GB of RAM and you have a great little silent workstation.
While it would be nice to have more than 4 cores in an APU, it will be another year or so before that's available from AMD. Really hoping Zen2 chiplet design leads to 4-, 6-, and 8-core APUs.
First, that TDP bullshit needs to go out the window from marketing when you are pushing for an AIO out of the box.
~175W vs the 9900K's 210W - both can be handled by the same PSU, the same chassis, and a goddamn NH-D15. What is this whining and shiny BS about TDP on a desktop? cTDP, yuck. Having reduced performance is bullshit, especially on such a high-core-count CPU; this is not BGA MacBook Pro soldered garbage we are talking about, nor a BIOS-neutered machine.
Now the price: $600 over the 3700X? For a super-binned processor at $800-ish? Why such a high price? Also, their 3800X doesn't make any sense; no one recommends it, and GN also ignores it. Yeah, AMD just wants to squeeze out all the margins; one can understand that due to 7nm costs plus AMD's state of CapEx vs Intel. Still, their mediocre perf improvements on top of their multi-SKU confusion are bad, not to forget the insanely expensive X570 chipset on 12nm.
I was in the market this year and wanted to wait for TR plus a Z390 future. It's a shame the Z390/9900K platform is dead in favor of LGA 12xx now - a big kick in the nuts that even the CPU is outdated. With AMD I was ready for a $700 Xtreme purchase too, but going to OCN and seeing Gigabyte boards having BIOS issues, and all of AM4 playing hit and miss with their beta-testing bullshit BIOS patches, is hell. To make it worse, DRAM is hit with speed/stability issues plus overall vCore; even Trident Z is a huge pain of trial and error across fclk, mclk, uclk and DRAM settings just to get proper performance - forget OC and that PBO/XFR2 marketing BS. And the Ryzen Master requirement.
The Intel Z390 Dark was my other choice, but now there's a dead platform on one side and an unstable one with useless Gen 4 on the other... For now - yeah, maybe in the future it's good - but a chipset fan again, and everything on OCN shows high RPM for it.
TRX40 again has the same chipset fan BS, with an insane base price at $1,500+, an AIO on top, and mega-expensive mobos at over $600... with zero backwards compatibility, AND once DDR5 hits in 2021 it's dead. The $2,000 CPU goes outdated.
In HEDT, Intel won this time - 7xxx to 10xxx on a stable platform. A 2080 Ti won't max out Gen 3 x16... and in mainstream, AMD is better but still expensive, with a hugely fragmented lineup, and unstable.
Quantumz0d, wait... you are calling AMD expensive?? Have you not seen, or do you not remember, the prices Intel was charging for its CPUs before Zen? Or the fact that going from the 9xxx series to 10xxx saw what some might call massive price drops? The top chip for 10xxx is $1k less than the 9xxx chips.
It's still 2019. DDR5 is easily 1.5-2 years away for the average consumer. TR3 looks like a very solid platform for many HEDT users at a very respectable price. I think you're overreacting. I have an 1800X. I was going to buy the 3950X, but if the perf isn't that much better I may just get the 3700X and keep everything else in the computer the same. Waiting for Ryzen 4000 may be a good idea for you, but man, that's a long wait.
This reads suspiciously like someone trying - at length - to post-hoc rationalize their pre-existing decision not to buy any AMD products. You don't need to justify that irrational desire to anyone else; just go with it! The attempt to do so ends up making you look more biased because you had to pick feeble, irrelevant and/or hypocritical reasons.
A correction may be needed - the 3000G seems to be a Raven2 die (released for the embedded market this spring, and now also going into laptops as the 3200U), not a Picasso die. AMD.com lists the 3000G as 14nm (RR, Raven2), not 12nm (Picasso). Also, Robert Hallock confirms it is 14nm here: https://youtu.be/NRSUYBjlBXw?t=4072
I am planning to buy the Ryzen 3950X post-launch, but I am planning to get an X570 motherboard early. I just want to know: will the X570 board work readily when I install the 3950X? Since AMD has said a BIOS update is required, will I at least get a display so I can update the BIOS? Or will manufacturers update the BIOS and re-launch the same motherboards, in which case I may have to wait... not sure how this will go?
3950X support has been baked in to all X570 boards since July (earlier, actually, if you include eval systems). The new updates will simply flash the most current microcode and AGESA optimizations. No worries if you plan to buy a board before receiving a 3950X CPU.
Well, now we know why the 3950X was delayed in availability. It is strongly recommended to use a large, nearly top-end, AIO water cooler with it, and it also needs the most recent BIOS updates to work. Still, it's an impressive chip even if it did take them a little extra time to get it right.
If they had released the 3950X already, I would have bought it, but with Zen 3 coming in Q2 I think I will buy a Ryzen 4000 laptop first and then replace my desktop, hoping that someone will make a premium content-creator ITX board for a 16-core CPU. I mean 2x NVMe, 10GbE and Thunderbolt.
A CPU package, potentially containing at least 2 chips (CPU and/or GPU), with DDR5 SODIMM slots on all 4 sides of the package to minimize physical wire length and thus latency and power, each slot on its own channel. A 256-bit configuration makes a decent built-in GPU possible, and that is what I specifically want - an 8-core CPU chip (with a large shared L3 cache) and a GPU chip capable of saturating the 256-bit DDR5 memory (but other configurations are possible).
This package would only need power, PCIe and Thunderbolt 3/USB4 lines coming out of it (including carrying video and audio). Relatively small and simple connector with large contact separation as most contacts are for memory in today's CPUs. Simple MB with just a few layers. Cheap!
No chipset on the motherboard. No network, no Wifi, no separate sound (USB sound adapters are fine), no SATA controllers... Give me slots to install just what I want, and replace when I want (of course other MB models can contain stuff embedded).
1 PCIe 4.0 (5.0 in the future) x16 slot and two x8 slots (if any of those are populated, it is OK for the x16 slot to work in x8 mode). At minimum two more x4 PCIe slots and 2 x4 NVMe slots.
Give me graphics cards containing the same chip as in the CPU above, with 4 SODIMM DDR5 slots each. Able to supplement the GPU in the CPU module, not replace it! Up to 3 external GPU cards for 4x performance of the embedded one.
The true modular approach a desktop system is supposed to be.
Teckk - Thursday, November 7, 2019 - link
What is the peak power consumption for Core i9-10940X and 3950X given their TDPs 165 and 105W?deil - Thursday, November 7, 2019 - link
360W and 130W respectively if we should look at how they treated TDP in the pastdeil - Thursday, November 7, 2019 - link
I was not far from the truth:https://cdn.mos.cms.futurecdn.net/Y2BXYaZKaWsuFSWW...
300 intel
and
180 AMD
Teckk - Thursday, November 7, 2019 - link
Wow ! :|Gondalf - Thursday, November 7, 2019 - link
Ummm don't trust much in AMD marketing slides.AMD draw less only because the all core setup is unable to run at high clock, they barely can go a little over 4Ghz all core. Intel all core setup can go near 5Ghz for a short period (or longer if cooling setup allow this)..
So at he end there is not this high peformance watt advantage they are saying, expecially because they are stuck to CB and do not show others benches to support their numbers.
Bet on other workloads Intel is better than AMD in efficence.
The long story of benches......
Eliadbu - Thursday, November 7, 2019 - link
If you can handle the heat Intel CPU can run very high my i9 7900x is running at 4.8ghz all coees albeit quite hot under load but still high frequency for all core with some offset for AVX 512. I believe that with direct die cooling results might be even better.schujj07 - Thursday, November 7, 2019 - link
Right now the 3700X has near identical performance to a 9900K, they are within 5% of each other typically, and the 3700X draws 1/2 - 1/3 the power of the 9900K. This is when they are both running stock performance. That means that the Ryzen has far better efficiency than the Intel.airdrifting - Thursday, November 7, 2019 - link
Intel is trust worthy? 9900K has 95W TDP, but out of box without any overclocking it runs 4.7GHz all core turbo drawing over anywhere from 150-190W depending on motherboard.eddman - Thursday, November 7, 2019 - link
How many times it needs to be pointed out; intel's TDP does not cover turbo, certainly not a sustained one.airdrifting - Thursday, November 7, 2019 - link
How many times you need to be told, Intel CPU runs turbo out of the box by default without any mess with BIOS? What's the point of having a TDP at a speed that your processor never runs at?Korguz - Thursday, November 7, 2019 - link
airdrifting looks like you still need to be told, intels TDP is at BASE clock, WITH OUT turbo, and WILL use more power even with " runs turbo out of the box by default "airdrifting - Friday, November 8, 2019 - link
What's the point of TDP at a "base clock" that the processor never runs at? Please allow me to put this in a language even you can understand: You brag you can last 30 minutes when in fact you only last 20 seconds, when girls call out you bs you claim "oh my 30 minute record was done when I was given superman power." Now here comes the question: Do you last 20 seconds or 30 minutes in reality?Korguz - Friday, November 8, 2019 - link
" What's the point of TDP at a "base clock" that the processor never runs at " cause thats where intel gets its TDP spec from, which IS from the BASE clock, Anandtech even did an article on this here : https://www.anandtech.com/show/13544/why-intel-pro... i would suggest you read this. then maybe you would actually understand.2nd GROW UP. the fact you now resort to insults, further shows you know you are wrong, and you have to resort to a VERY childish analogy.
Korguz - Friday, November 8, 2019 - link
and i quote from the above link to the Anandtech article :" For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value. "
airdrifting - Friday, November 8, 2019 - link
Yeah, I see you are not capable of reading. Done.Korguz - Friday, November 8, 2019 - link
more like you are not capable of reading... and proves you are wrong.. your pride to high to admit it ?? the quote from the article, alone, proves you are wrong.. nuff said..Irata - Friday, November 8, 2019 - link
The point is simply to be able to have "95W" shown next to the CPU in a benchmark slide while the performance result is based on "far beyond 95W" clock speeds.This was not the case with Intel's pre Ryzen CPU like the 7700k btw. Go figure why this has changed.
eddman - Friday, November 8, 2019 - link
And? They never claimed the processors could do turbo within the TDP. It's based on the base clock and obviously any turbo will use more power. That's how they've always rated TDP. It's been known for years. Turbo have always been a "bonus". If the board can supply enough power and the temps are low enough, then it'd clock higher. Simple as that.eddman - Friday, November 8, 2019 - link
@Ian Will the website ever upgrade to a better commenting system?I was replying to airdrifting.
Father Time - Sunday, November 24, 2019 - link
So based on this, AMD could claim their processors are all 1 Watt - based on course on a 200mhz super-low C-State - but still true that they consume 1 Watt at this speed.The fact it will never run at this power or speed is irrelevant, they could use the Intel system to the extreme to claim the performance crown with a 1 Watt processor - everything above 200mhz is just a bonus!
cubesdating - Thursday, December 5, 2019 - link
https://hungrysingles.comhttps://cubesdating.com
https://imhookup.com
https://rompmilf.com
https://dateonlinegirls.com
https://rompgirls.com
https://coitusgirls.com
https://coitusdating.com
https://nichelgbt.com
https://hungrymingles.com
M O B - Thursday, November 7, 2019 - link
Have you read any of the Zen3 reviews? AMD is much more power efficient than Intel. Period.zangheiv - Thursday, November 7, 2019 - link
Woah! hold your horses. Do you trust Intel marketing more? Intel TDP is at non-boosted clock. Boost all cores and exceed TDP and you're voiding warranty essentially.airdrifting - Thursday, November 7, 2019 - link
Why there are always idiots like you spreading false information without even owning the CPU or know the basics of current gen hardware? The boost is done automatically without you doing anything in BIOS, all Intel CPU essentially run boost speed right out of the box, so you are saying everyone voided their warranty for doing nothing at all?eddman - Friday, November 8, 2019 - link
Why do you keep insulting others? Turbo does not void any warranty. Boards are not supposed to boost beyond the intel specced clocks.However, there is a non-spec OEM implemented feature on some boards, usually called MCE (multi-core enhancement). This option would cause the board to boost all-core clocks to the single-core boost clock which does break the spec. Realistically, it shouldn't be enabled out-of-the-box but some board makers apparently don't care. Your beef should be with them, not intel.
Korguz - Friday, November 8, 2019 - link
eddman, why does he keep insulting others ?? cause he is a child, and when proven wrong, or cant prove what he says, this is his only recourse.. maybe it makes himself feel better...Alexvrb - Friday, November 8, 2019 - link
Intel's TDP is a useless figure. Given the nature of competition, AMD is going to increasingly follow suit with their own less-than-useful TDP. With that being said: As of today, Intel's high end chips eat far more power at stock settings, "base" TDP be damned.Lcs006 - Wednesday, November 13, 2019 - link
airdrifting - You are right, if it makes easier to deal with "alternatively gifted".evernessince - Thursday, November 7, 2019 - link
Well let's just get this straight, Intel processors have to clock high just to match a lower clocked AMD processor. Ryzen 3000 series vs Intel 9000 series is proof of that.Second, you don't seem to realize that overclocking does not improve performance per watt. The gains yielded by OCing a 9900K for example to 5.1 GHz are around 3% while power consumption increase by 30%. Simple math tells me performance per watt decreases.
Let's be frank here, Intel is just barely hanging onto the best gaming CPU crown right now but in every other category they loose. It is not even remotely surprising if they loose in a battle of core count, which is what HEDT is, as AMD's architecture is designed to scale.
Korguz - Thursday, November 7, 2019 - link
evernessince imagine how much worse it would be for intel if ryzen 3000 matched intel's current clocks.....tamalero - Saturday, November 9, 2019 - link
Its Pentium 4 vs Athlon X2 all over again.urmom - Thursday, November 7, 2019 - link
IDK, I runn all core on my 3900x @4.4Ghz.Irata - Friday, November 8, 2019 - link
Their performance / W comparison is based on results vs. power use, so perfectly valid. Does not really matter at what clock speed each CPU ran if it was at their best (stock).As for there only being CB shown - yes, that is indeed only one case. Wish they had included more.
HollyDOL - Friday, November 8, 2019 - link
I remember seeing some reviews mentioning power hog chipset on AMD side pretty much nullifying any advantage they gain on pure CPU. Ie. while cpu was more efficient platform itself in the end was less... Is that still an issue?Targon - Friday, November 8, 2019 - link
Because you can run the third generation Ryzen processors on first or second generation motherboards, the issue of high motherboard power draw can be avoided, and you only lose PCIe 4.0 speeds. I am running my Ryzen 3900X on an Asus ROG Crosshair VI Hero motherboard, no problems at all, and no need for an active cooler for the chipset.evernessince - Sunday, November 10, 2019 - link
@HollyDOLThe difference between these CPUs is over a 100w, not 15. Simple math tells me the chipset is not going to make up the difference even if it was running at it's max of 15w.
In addition, no one said you had to use the X570 chipset with this processor. If you don't need PCIe 4.0, go with a cheaper motherboard. If you do AMD is the only choice right now .
yannigr2 - Friday, November 8, 2019 - link
People started understanding that frequency is not everything when Pentium 4 was still selling. More than 15 years have passed since then.Oliseo - Sunday, November 10, 2019 - link
AMD has better IPC than Intel. Intel needs those high clock speeds to keep up.Performanc per watt, Intel are getting obliterated.
(i9 9900k owner, I have no dog in any fight, but i like to stick to facts)
alufan - Monday, November 11, 2019 - link
omg still folks dont get it the AMD chip does as well as or in many cases better at a lower clock and lower voltage that means it uses its power more efficiently to do virtually or in some cases more of the same work whats so difficult about that to understand?unixguru88 - Thursday, November 7, 2019 - link
Given that a core count race seems to be the new Mhz race, what type of casual everyday programs can we expect to take advantage of 16/32 or more cores? Are there any game engines that can meaningfully use 32 cores? I can seet browsers taking advantage of high core counts trivially by being able to remain performant with many tabs of JavaScript heavy pages open. What else could potentially use 32 or more cores?Teckk - Thursday, November 7, 2019 - link
You mean other than Chrome and Electron apps on the system ?!?On a serious note, can't think of anything other than Virtual Machines, Image/Video Editing and Debugging in IDEs but I don't know if that falls under 'casual everyday programs'. But again they could be for the person buying a 16-core processor for casual everyday use :)
DPUser - Thursday, November 7, 2019 - link
Audio loves multiple cores... lots of parallel processes happening in a big multi-track, multi-plugin, multi-virtual instrument mix.shreduhsoreus - Thursday, November 7, 2019 - link
Audio also prefers that those cores are all on the same die, otherwise you can't get full utilization of the CPU at low buffer settings. Scan Pro Audio has found that the 3900X starts having dropouts at 70% utilization. I actually returned a 3900X because they're not that great for low latency audio production.MattMe - Thursday, November 7, 2019 - link
Is that right? I'd always understood that single core perf is extremely important in audio, not only for latency but because if you have a complex I/O chain (instrument running through multiple plugins) it has to stay on the same core.In Live if I have an instrument input that runs through a gate and compressor, then into another track that is running plugins (effects/loopers), that all has to be processed by a single core according to Ableton documentation. So although other instruments can be running on other cores, the single core perf is still a potential bottleneck (and frequently is!)
(The reason I use the example of routing audio through different tracks is for monitoring and looping at multiple points of the chain.)
zmatt - Thursday, November 7, 2019 - link
I can't speak to all DAWs, it depends on which one you are talking about specifically. But I've seen good results with FL Studio with more cores and threads versus improving single threaded perf.valinor89 - Thursday, November 7, 2019 - link
While not really needing that many cores by a long stretch I am amazed at how some Antivirus software insist on analizing all the drives on my PC serially. I get that when there is only one drive it would be I/O bound but when I have multiple drives I don't see a reason they could not scan them in paral·lel. specially when I do a Scan on demand and don't expect to be doing anything else with the PC.timecop1818 - Friday, November 8, 2019 - link
Who gives a shit about AV in 2019? If you're dumb enough to download and run pornhubvideo.jpg.mkv.exe then you deserve to get buttcoined or randomware'dOliseo - Friday, November 8, 2019 - link
lol. It's cute you thinking that's how you get viruses in 2019 and not by compromised websites instead.Xyler94 - Friday, November 8, 2019 - link
Congratulation! You're part of a 1% club who knows how to detect BS stuff! Good on you!!For the rest of the population, those not hooked to forums like this, Anti-Virus programs are still needed. They help with many other things than downloading Trojan viruses. The easiest way to get most is actually phishing links. Even though at home I taught my mother how to look for BS links in emails, and I have a Pi-Hole VM to stop most ads, including mal-advertisement, there hasn't been a virus in our house, but I still bought a year of Bit-Defender for my mother's PC (And I may install it onto my own PC just to help clean it)
Irata - Thursday, November 7, 2019 - link
3950X is probably more of a "want" than a "need" CPU for a considerable part of buyers.I'd say the plus part of having many cores is that you can have several things run in parallel without having to worry much about performance.
evernessince - Thursday, November 7, 2019 - link
It's a HEDT processor, it's designed for professionals. If it speeds up your work then that for many is worth a lot of money.Irata - Friday, November 8, 2019 - link
That's why I said "considerable part of buyers". I am sure that for professionals it is more than worth it.
For someone like me, however, it would definitely be a "want" CPU - as far as my needs / use cases go, a 3700x or even 3600 would be perfectly fine, however I want a 3950x because it is what it is.
The cool thing is that I can get a lower end Ryzen 3000 now (with a good main board) and upgrade to the 3950x later when it is offered at EOL prices.
Spunjji - Friday, November 8, 2019 - link
One thing I like about the 3950X is that it's blurring the lines of what HEDT is, in a good way. Availability of this many cores on a mainstream desktop platform is a great incentive for developers to look for ways to use that power.
As Irata points out - your common-or-garden end user can purchase a 4, 6 or 8-core system and then eventually upgrade to 3950X at a (potentially much) later date when more software benefits. There's already been an uptick in software like games using meaningfully more cores since Zen first released; I'd anticipate that trend continuing, albeit acknowledging the difficulty of multi-core scaling for many tasks means the trend will likely slow down.
MASSAMKULABOX - Monday, November 11, 2019 - link
yep anything above 6 cores is gonna sit around doing nothing most of the time, for ordinary users. And even for other users multi-cores are very under-utilized... they have run out of ideas to make us upgrade, datacentres need it and HPC need it but home users? no way .. E-peen ?
b.ritesManch - Thursday, November 7, 2019 - link
TR is mainly for content creation and other things where more cores are beneficial. If games and browsers are your thing, just get a Zen 3 or Intel equiv. No point in spending on this.
haukionkannel - Friday, November 8, 2019 - link
Yep... 3600 or 3700 are for gaming. Anything above is more for content creation!
Targon - Thursday, November 7, 2019 - link
A game like Kingdom Come: Deliverance shows that performance can drag with a lower number of CPU cores, but I don't know how well it scales up. I would expect that similar games with a lot of AI controlled NPCs would see a big benefit from additional CPU cores if the game is designed to use them.
SeannyB - Thursday, November 7, 2019 - link
TR 1950X user here. In general use, it's great at multitasking. You can be doing a lot of things at once and the system doesn't choke because there are always more cores. And the rare case (certain kinds of content creation) where a single program can utilize all cores, it's ludicrously faster than the 6-core Ivy Bridge "Extreme" I had prior. TR + fast storage + 32GB RAM is a dream machine for PC desktop, IMO.
As far as gaming goes, benchmarks tell the story. Game engines that can multi-thread draw calls running content that is mostly limited by draw calls see the most benefit, but even then it won't beat a fast Intel; TR will keep pace at best. Off-the-shelf game engines like Unity & Unreal run all of their game logic & world/physics simulation on one or two threads, so simulation games using those engines like Cities Skylines and Kerbal Space Program are ultimately CPU frequency-limited because the bottleneck lies in their gamesim/physics thread.
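(A quick back-of-the-envelope Amdahl's-law check in Python of why a mostly serial gamesim/physics thread caps the benefit of extra cores; the 60% parallel fraction is an assumed figure for illustration, not a measurement of any engine.)

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    # Amdahl's law: total speedup is limited by the serial part of the work,
    # here the single-threaded simulation/physics step.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (4, 8, 16, 32):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
# 4 -> 1.82x, 8 -> 2.11x, 16 -> 2.29x, 32 -> 2.39x: past ~8 cores the extra
# hardware barely helps; only a faster serial thread (higher clocks/IPC) does.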
surt - Thursday, November 7, 2019 - link
If you have reason to buy TR, you should really be buying no less than 64G to pair it with.
SeannyB - Thursday, November 7, 2019 - link
In 2017, 64GB seemed like overkill for a 16-core system. Even now, I never top out even as I'm running Unreal Engine (editor), 3DS Max, Ableton Live even simultaneously... Of course, YMMV. 16-core soon enough will merely be a top-end Ryzen, and 24-core & up is a decidedly different class of computer, and I suppose 64GB is appropriate for that.
extide - Thursday, November 7, 2019 - link
Depends on your workloads. Both my desktop and work laptop have 64GB and I want to move my desktop at home to 128GB soon.
close - Thursday, November 7, 2019 - link
I have a machine with 1024 cores and 2TB of RAM (no joke). I think the point of the comment above was that you don't *need* 64GB just because you need a TR.
voodoobunny - Thursday, November 7, 2019 - link
... what do you *do*, and how can we do that too?
imaheadcase - Thursday, November 7, 2019 - link
That is why these new CPUs are a slippery slope for people. At the end of the day even the hardcore gamer really has no incentive to upgrade past a CPU like the 2600K from Intel that was so popular. AMD is trying to put the "business" and "consumer" CPUs in different categories, but every year it seems kind of comical in most regards. Not saying that they don't have an advantage, but like you said the performance is getting to the point that you will not actually care about MHz or core count, and more about the feature set of the CPU.
Case in point:
"3.5 GHz base frequency and a 4.7 GHz single core boost frequency; the overall all-core turbo frequency will be dependent on the motherboard used, the quality of the silicon, and the turbo in play."
The last CPU I had with boost was a 486DX Intel CPU that went from 33MHz to 66MHz boosted.. even at that time I remember the controversy of it all. What I'm saying is, when you get to a point where you have so many cores and are itching for a measly 100MHz or so, it's time to focus on integrating new tech into the CPU.
zmatt - Thursday, November 7, 2019 - link
Maybe your memory is foggy, but the turbo on the 486 was a misnomer. It didn't make the CPU faster. 66MHz was the base clock. It halved the multiplier to run the CPU at 33MHz for games that didn't use the RTC. Many DOS games didn't support polling system time from the RTC because a lot of 386 and earlier systems didn't have one. So they kept time relative to CPU cycles as a workaround. I remember Liero did this, and anything faster than a 16MHz 386 made the game actually run faster, to the point of unplayability. Underclocking the 486 via the turbo button helped, although on some games not enough.
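(A tiny Python sketch of the two timing approaches described above: a loop that advances the game a fixed step per pass speeds up with the CPU, while one that scales by measured wall-clock time does not. Purely illustrative, not how any particular DOS game was written.)

import time

def cycle_tied_update(iterations):
    # Old approach: a fixed step per loop pass, no clock involved.
    # A faster CPU runs the loop more often, so the game itself runs faster.
    position = 0.0
    for _ in range(iterations):
        position += 1.0
    return position

def wall_clock_update(duration_s, units_per_second=60.0):
    # Robust approach: scale movement by measured elapsed time, so game speed
    # is independent of how fast the CPU can spin the loop.
    position = 0.0
    last = time.perf_counter()
    deadline = last + duration_s
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        position += units_per_second * (now - last)
        last = now
    return position

FreckledTrout - Thursday, November 7, 2019 - link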
True. They all just misnamed the button turbo :) Even back in the 286 days I had a turbo button; it of course worked as you described, albeit at lower frequencies.
Xyler94 - Thursday, November 7, 2019 - link
There's more to a PC than raw CPU power.
Platform features like PCIe version, SATA version, NVMe, some things that weren't even a sparkle in the eyes of engineers back in the Sandybridge days. Sure, if you only game, there's not too much of a difference, but there is a good case to say no to Sandybridge era CPUs in modern times. Hell, I wanna upgrade my 4790k, which is 3 years younger than Sandy Bridge.
The hardcore gamer has a lot of reasons to upgrade from Sandy: CPUs are faster, have more features, and newer stuff works better with newer OSs and such.
Threska - Thursday, November 7, 2019 - link
Instructions the earlier CPU didn't have. That's the reason I had to upgrade: the DRM needed it.
FreckledTrout - Thursday, November 7, 2019 - link
You have a point. For the average person, CPU performance is pretty much a commodity. It took a while, but we have hit a point where people can reasonably afford more compute power than they need. However, for people who do real work, say video editing, and don't have server farms, these advances have been huge, as they can never have too much compute.
Spunjji - Friday, November 8, 2019 - link
This hasn't been true for around 2-3 years now, depending how you measure. Sandy had a long life (mainly thanks to its extremely conservative clock rates at stock) but it's way outside what I'd recommend to anyone looking to game even moderately seriously now.
cfenton - Thursday, November 7, 2019 - link
Almost nothing. These things are for work. A reasonably fast quad core (anything Ivy Bridge or newer, really) is fine for almost any casual programs. Going up to eight cores makes sense for gaming since the new consoles will be eight core.
TheinsanegamerN - Thursday, November 7, 2019 - link
The consoles have been 8 core since 2013 dude. PS4 and xbone are 8 core. 8 slow cores, which should have prompted swift acceleration of multi threaded game engines. Yet here we are.
milkywayer - Thursday, November 7, 2019 - link
For half of this console generation Intel was in the lead and they kept milking the core count. Until 3-4 years ago they were selling dual-core CPUs as i7s on mobile. It wasn't until AMD came along and basically showered everyone with 4 and 6 core CPUs for half the price that Intel dropped the BS and started offering real 6-core CPUs in the lower-tier consumer market and real 4-core CPUs in mobile etc. I blame only Intel.
cfenton - Thursday, November 7, 2019 - link
And we saw slow progress toward multi-threaded games throughout the generation. There are way more games today that can take advantage of 4+ cores than there were in 2013. It takes time to adapt game engines and not every kind of game will even benefit from more cores. All I'm suggesting is that if you play games you have some reason to go beyond four cores.
evernessince - Thursday, November 7, 2019 - link
Consoles were based off Jaguar, which really had 8 half cores that shared execution units. So really, 4 cores.
scineram - Friday, November 8, 2019 - link
No.
Spunjji - Friday, November 8, 2019 - link
You're confusing Jaguar with Bulldozer.
Jaguar uses complete cores, albeit "small" ones in terms of area - in design and performance terms they're somewhere between the old K8 Athlon 64 and K10 Athlon II processors.
I think the confusion comes in because the console implementation of Jaguar has 8 cores split across 2 "modules" which is the same terminology used for 'dozer, but referring to a different thing:
Bulldozer module = 2 cores with shared FP resources
Jaguar module = 4 independent cores, like a CCX in Zen
Zizy - Friday, November 8, 2019 - link
8 cores at 1.6GHz (PS4 as the slowest) is at best the same as 4 cores at 3.2GHz, assuming everything else equal and perfect MT. Plus those cat cores were essentially half as capable as the current stuff, normalized by clock. Therefore, consoles have about the same as 2 proper desktop cores - the lowest end CPUs you can buy.
Anyway, there are many games that use more than 4 cores these days. Especially stuff coming out now, when Intel has also started offering more cores and AMD has competitive if not superior chips.
nevcairiel - Thursday, November 7, 2019 - link
Casual everyday users absolutely do not need such CPUs.
DigitalFreak - Thursday, November 7, 2019 - link
Exactly right. Core count is the new MHz race for the uninformed.
Valantar - Thursday, November 7, 2019 - link
Last I checked, $2000 CPUs generally weren't for "casual everyday programs". Not really $750 ones either.
Performance hungry productivity applications can on the other hand make use of 16 quite commonly, though 32 is still a stretch. Then again there's some value to a workstation that's fully usable even when running a compile, render, or other multi-hour heavy workload.
evernessince - Thursday, November 7, 2019 - link
People were saying the same thing 3 years ago about the 8 core Zen 1 CPUs, and yet here we are, with a majority of new games coming out utilizing 8 cores. Give it another 3 years; I wouldn't be surprised if that doubles again.
Oliseo - Friday, November 8, 2019 - link
Enough with the hyperbole already. Games will NOT be using 16 cores in another 3 years.
I know it suits your argument and all, but get real.
Spunjji - Friday, November 8, 2019 - link
Agreed. What's more likely is that we'll see games / engines that depend on 4+ cores becoming commonplace, with maybe an outlier or two that can squeeze marginal gains from 8+.
Targon - Friday, November 8, 2019 - link
With the Ryzen 7 release in 2017, and then the release of the i9-9900K, no one questions that we are now in the era where games and programs should at least be able to scale to 8 cores/16 threads. Once you actually have a properly multi-threaded design, it becomes simple to use more and more threads, and if you have fewer cores, no problem, because the scheduler will just assign the threads to the CPU cores that are there.
You don't really target a given number of cores, you either go for a multi-threaded design or you don't. Let those who have a higher-end processor get the advantage of more cores/threads; it doesn't HURT those with lower-tier chips.
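(A minimal sketch in Python of the "don't target a core count" idea: size the worker pool from whatever the machine reports and let the OS scheduler spread the work. The task itself is a made-up stand-in for any CPU-bound work item.)

from concurrent.futures import ProcessPoolExecutor
import os

def heavy_task(chunk):
    # Stand-in for any CPU-bound unit of work (AI, physics, encoding, ...).
    return sum(i * i for i in chunk)

def run_parallel(chunks):
    # No core count is hard-coded: a 4-core part gets 4 workers, a 16-core
    # part gets 16, and the scheduler places them on the available cores.
    workers = os.cpu_count() or 1
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(heavy_task, chunks))

if __name__ == "__main__":
    work = [range(n * 100_000, (n + 1) * 100_000) for n in range(32)]
    print(run_parallel(work)[:2])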
jaju123 - Thursday, November 7, 2019 - link
Absolute dominance, I love it. Crazy that I can use my 3700x now (which is already incredibly fast) and buy a used 3950x in a couple of years for an upgrade with double the cores (or just get a zen 3 chip).
yeeeeman - Thursday, November 7, 2019 - link
Well, AMD is trying out the Intel seat and it likes it. We can see that the 32 core part is now more expensive than the previous gen 32 core part, $2000 vs $1800.
Irata - Thursday, November 7, 2019 - link
While this is more than many had expected, the only Intel CPU / platform that comes remotely close to the TR3 platform is the 28C Xeon W-3175X, which costs $3,000 and requires a separate, very expensive mainboard.
TheinsanegamerN - Thursday, November 7, 2019 - link
It also brings gen 4 PCIe, which ain't cheap.
Kjella - Thursday, November 7, 2019 - link
The 32 core TR2 had a very awkward memory architecture where not all the cores had direct access to memory, and on many workloads it performed no better than a 16 core. If you wanted a "normal" 32 core CPU you'd have to buy EPYC server chips, which cost a lot more for much lower speeds. So while you can't read it out of that spec sheet, the TR3 is actually a much more capable product.
evernessince - Thursday, November 7, 2019 - link
A slight increase in price doesn't make it Intel. You also have to consider the included features like PCIe 4.0 and a monstrous amount of lanes.
Compare that to Intel, which increased prices while providing the same core count, feature set, and extremely small IPC increases.
RavenRampkin - Thursday, November 7, 2019 - link
So the 3000G is basically a CS:GO cruncher on a budget 👍
Also it's good they're taking their time with the 3950X cause imho distant but realistic deadlines > watery "soon™" > short but unfulfilled deadlines. Sadly AMD seems to have been thru all 3 stages at this point...
Spoelie - Thursday, November 7, 2019 - link
Not so sure about that, I would've probably upgraded to a 3950X if it was there on the initial launch day, but now it feels it makes more sense to wait for Ryzen 4000/Zen 3 - it's only another 6 months. I upgraded from a 2700X to a 3700X to tide me over in the meantime.
SquarePeg - Thursday, November 7, 2019 - link
It'll probably be more like 10 months as Ryzen release dates have been slipping back a bit with each new generation. I would expect Ryzen 4000 to be available mid to late Q3 2020.
wishgranter - Thursday, November 7, 2019 - link
Gigabyte MoBos: https://www.gigabyte.com/Motherboard/Socket-sTRX4
Marlin1975 - Thursday, November 7, 2019 - link
So no news on the B550 chipset? :(
haukionkannel - Friday, November 8, 2019 - link
Next year... as it was always expected from the x570 release.
sor - Thursday, November 7, 2019 - link
I'm sort of wishing they'd announced a full line, even if they are only launching two in November. I have no idea if I want to wait and see what else is coming or buy now.
deil - Thursday, November 7, 2019 - link
$55 --> $49 with a slight performance boost?
Now can I please get a micro-ITX board to make the most powerful smart TV?
I wonder why we haven't been able to get super small AM4 boards for so long.
Targon - Thursday, November 7, 2019 - link
There are ITX boards out there for Ryzen, but considering that the processors themselves have tended to be more powerful, putting them into a small system can be problematic. The new 7nm processors solve some of those problems, though I wish AMD would have released 7nm APUs by now.
Death666Angel - Thursday, November 7, 2019 - link
What are you talking about? A short trip to "geizhals.de" shows 16 mini ITX AM4 motherboards available: 3 X570, 2 X470, 3 X370, 4 B450, 3 B350 and 1 A320. There is even a mini-STX motherboard, if you want to buy a whole barebones PC (DeskMini A300).
And a smart TV has the smart stuff integrated. You are talking about an HTPC.
Midwayman - Thursday, November 7, 2019 - link
Midwayman - Thursday, November 7, 2019 - link
I really wish all the PCIe lanes were available at lower core counts. It's not hard to have a lot of I/O needs without needing a stupid amount of cores. It really sucks when you go to install an expansion card or NVMe drive only to find out that it's going to drop your GPU lanes from 16x to 8x.
DigitalFreak - Thursday, November 7, 2019 - link
My understanding is that Ryzen actually has 32 PCIe lanes available on the processor, at least in the 1xxx and 2xxx series. They only exposed 24 of them (16 GPU, 4 NVMe, 4 to the chipset) for some reason. Limitation of the socket?
John_M - Friday, November 8, 2019 - link
It is obviously limited by the AM4 socket, though not arbitrarily - it was a conscious design choice. Some of them are used internally by the SoC for SATA, USB, etc. Actually, if you take a look at Raven Ridge and Picasso, all 32 lanes are indeed used: 8 to the discrete GPU slot, 4 for NVMe, 4 to the chipset and 16 to the integrated GPU, SATA, USB, etc.
Kevin G - Thursday, November 7, 2019 - link
Kind of irked that there is a new socket for TR. The Epyc side didn't need a new socket for PCIe 4.0, and that involved a few changes too (like an additional PCIe lane for OoBM). At the very least, it would have been nice to have had the 3rd gen TR parts work in earlier motherboards for an upgrade path, even if that meant that the IO was PCIe 3.0 only and the chipset link was effectively 4x PCIe 3.0. The centralized IO chip is likely a huge reason for the generational performance leap on top of the Zen2 improvements. That's something original socket TR4 owners probably want.
DigitalFreak - Thursday, November 7, 2019 - link
Realistically, how many people actually upgrade a CPU without upgrading the motherboard? That argument is getting old.
TheinsanegamerN - Thursday, November 7, 2019 - link
so old. I have never upgraded a CPU only, because motherboard features have always been just as important.
Death666Angel - Thursday, November 7, 2019 - link
What are the benefits you got from your motherboard after upgrading from a Z170 to a Z390 motherboard?
Let's look at the Asus Z170-P and the Asus Z390-P:
4 DDR4 DIMM slots
x16 and x4 GPU slots
Z170 has 2 PCIe x1 & 2 PCI slots vs Z390 4 PCIe x1 slots
Z390 has 2 NVME slots (x2/SATA + x4) and wifi M.2 vs Z170 has 1 NVME x4 slot
Z170 has USB C on the IO vs Z390 not but Z390 has more USB ports in general (other Z170 have more USB) and Z390 has USB 3.1 (10 Gbps)
Z390 has RGB and a water pump header
That is not a lot of features you gain for a product that launched in 2015 vs 2018. Not many people need many USB 3.0 ports. At one point in time every motherboard upgrade would bring with it better USB and SATA speeds that actually mattered, better fan controls and better DDR speeds. But it's slowed down a lot. I could mod my Haswell board BIOS to allow NVME booting from a 15€ add in card, the board had USB 3.0, SATA III, good fan controls. When it comes to pure features, my X570 doesn't really offer anything more. And I'm sure a lot of people would have liked to upgrade just their 7700K to a 8700K, getting an automatic 50% increase in a lot of tasks and kept their own board. If AMD does indeed keep their AM4 platform for 4 generations with Zen 3 next year that will be a great accomplishment for them, the consumer and the environment.
Threska - Thursday, November 7, 2019 - link
I did, but then not everyone throws away the car every time they buy new tires.
Death666Angel - Thursday, November 7, 2019 - link
We've really been conditioned to think that by Intel not allowing it and doing little generation to generation. I've upgraded in the AM3 days from a 2 core to a 4 core and then a 6 core. Some friends bought a first gen Ryzen CPU and now upgraded to Gen 3, and they are happy because PCIe 4.0 is basically the only thing they miss out on. I also upgraded in the Athlon 64 days; I think I upgraded a socket 939 motherboard from a single core to a dual core, but I'm not too sure anymore.
With Intel, there was no way to upgrade from a 2700K SB to a 4770K Haswell without a motherboard change. Realistically, there was no great motherboard feature upgrade in that time. Z68 boards already had USB 3.0, SATA III and DDR3. The big jump came with the *Lake architecture that enabled NVMe slots and DDR4. And then Intel allowed another generation of 4 core CPUs on that platform before breaking compatibility when the core counts increased.
It is always interesting when people argue against good features for seemingly no reason other than to defend a multi billion dollar corporation.
Total Meltdowner - Thursday, November 7, 2019 - link
I've done it like twice. That's pretty good, though. May do it again here with the 3700x if the price drops a bit. Will replace my 1800x.
proflogic - Thursday, November 7, 2019 - link
That moving line for HEDT is kind of lame. Now it costs even more to get more memory channels and/or PCIe lanes.
Are core counts increasing to nonsense levels on AM4? Can they even be served well with the amount of I/O and memory throughput available to them?
Spunjji - Friday, November 8, 2019 - link
Not really. A little more, but not a lot more.
The answer to your question can be found on this site. The short answer is: yes.
The long answer is: some tasks benefit from more memory bandwidth, but only if they're bound by that limitation in the first place. They're the sort of tasks you'd only do on a HEDT platform in the first place.
WaltC - Thursday, November 7, 2019 - link
Nice write up! It's interesting to watch Intel drop further and further behind. 2020 should be a blowout year for AMD. Surprising that anyone is still buying Intel's old, massively security-hole-punched architectures, made on the old 14nm process, to boot. Processing performance, not MHz, has been the name-of-the-game since 1999. Hard to believe that in 2019 there are people who still don't understand what that means...;) I always thought it was rather elementary.
imaheadcase - Thursday, November 7, 2019 - link
Umm, considering the holes were fixed, and the 14nm process means nothing to end users, they are doing fine. Intel is still a great CPU; it makes sense for lots of people. The upgrade path to AMD is not exactly cheap if you already have an Intel CPU.. you've got to get a new mobo/RAM as an added cost of going AMD.
MHz is still king for flat game performance as well.
Sounds like you are just uninformed about how things actually work in the real world.
Korguz - Thursday, November 7, 2019 - link
" Mhz is still king for flat game performance as well. " not really.. ipc is also just as important.and right now, clock for clock, amd wins there. for the most part, ryzen is pretty close to intel even with the clock speed disadvantage... imagine if ryzen were at the same clocks as intel...milkywayer - Thursday, November 7, 2019 - link
Yup, because we should base all CPU buying decisions on what CPU offers an extra frame or two, when most games on almost all mid-tier CPUs can do 75 to 100fps.
14nm is ancient and Intel is milking it thanks to fangirls who were happily buying dual core i7s on mobile until AMD came in and kicked them in the rear and finally made 6 core CPUs the new low tier norm.
Count Rushmore - Thursday, November 7, 2019 - link
Don't be harsh... according to them, human eyes can only see 4 cores & the PC is for gaming only. All those multi-billion-$ games are apparently developed exclusively on i7 procs
kgardas - Thursday, November 7, 2019 - link
"Umm, considering holes was fixed," -- are you sure? To me it looks like Intel still sell more CPUs with holes not fixed (in silicon) than those fixed. Honestly Intel message about this is all big mess...haukionkannel - Friday, November 8, 2019 - link
It is all about pricing! Intel has huge R&D and their CPUs are really good, if the price is OK!
Spunjji - Friday, November 8, 2019 - link
Upgrading to a new Intel CPU also requires buying a new motherboard, and you'll only need to change RAM if you're going from DDR3 to DDR4 or your DDR4 is old and slow.
Basically, AMD is a better value option and AM4 still has a potential upgrade path to Zen 3. Sounds like your "uninformed" comment is projection.
martinbrice - Thursday, November 7, 2019 - link
I'm really curious how the 3960X and 3970X might game. Sounds like architecturally, with the single I/O die, they shouldn't suffer from the NUMA issues of the 2970WX or 2990WX. They're (essentially or precisely?) the same chiplets as the 3900X, but have a greater IHS surface area. Their advertised boost is competitive with the 3900X; it'll be a matter of what they can hold as a steady state. They also have a larger cache.
It'd be great if we could get all the benefits of the extra cores, PCIe lanes, and memory channels, without having to significantly sacrifice gaming performance.
Total Meltdowner - Thursday, November 7, 2019 - link
I was wondering the same thing. I game on an 1800x just fine at 1440p... This TR3 boosts 500MHz higher with about the same base clock...
jospoortvliet - Sunday, November 10, 2019 - link
... and much better IPC. I would expect the TR's to beat your 1800x easily.
abufrejoval - Thursday, November 7, 2019 - link
Adjustable cTDP is a godsend, but only if it is run-time adjustable (not just a fixed setting in the BIOS as with some older APUs) and preferably with a command line and API: that allows you to really tune the hardware between energy efficient batch processing and fast interactive response times, especially when you can take cores offline similarly at run-time to make sure you hit high-frequency points consistently with the remaining ones (and re-enable them later).
That would then turn most of the hundreds of fixed-allocation SKUs Intel is selling into "CPU as code".
Of course these cTDP limits would have to be very closely observed (or be adjustable), so you can control power vs. temperature limits, depending on whether you want to allocate PSU overcapacity to increase boost speeds while there is thermal capacity, or whether you want a PSU that is energy optimized but has few reserves.
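(A minimal sketch of run-time tuning on Linux, assuming the powercap/RAPL and CPU-hotplug sysfs interfaces are exposed; the exact paths, their availability and the AMD equivalents vary by platform, so treat the names below as illustrative only.)

from pathlib import Path

# Long-term package power limit, in microwatts (powercap/RAPL interface).
RAPL_LIMIT = Path("/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw")

def set_package_power_limit(watts: float) -> None:
    # Write the long-term package power limit; requires root.
    RAPL_LIMIT.write_text(str(int(watts * 1_000_000)))

def set_core_online(core: int, online: bool) -> None:
    # Take a core offline or bring it back at run time
    # (core 0 usually cannot be offlined).
    Path(f"/sys/devices/system/cpu/cpu{core}/online").write_text("1" if online else "0")

# e.g. drop to a 65 W envelope and park half the cores for overnight batch
# work, then restore everything for low-latency interactive use:
# set_package_power_limit(65); [set_core_online(c, False) for c in range(8, 16)]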
M O B - Thursday, November 7, 2019 - link
Why are the TRX40 motherboards all the same? Why aren't there different roles being played out with all of those PCIe lanes the motherboard manufacturers are being given? Please make a motherboard for those of us who need lots of PCIe lanes and aren't just gamers.
With the TRX40 boards I see 4 PCIe slots total on almost every single board. Some have a 5th slot, but it's an x1.
I have been waiting for something like the X299 WS SAGE/10G to come out on the AMD side for more than a year, but apparently that day will never come.
Hammer_Man - Thursday, November 7, 2019 - link
The slide with the Threadripper prices reads "New TRX40 Platform With 88 PCIe® 4.0".
88 lanes?
Does the chipset use a MUX or does the CPU have all those lanes?
DigitalFreak - Thursday, November 7, 2019 - link
They're probably playing the Intel game of adding the PCIe lanes on the CPU and the chipset together.
Death666Angel - Thursday, November 7, 2019 - link
What's not to understand? It talks about the TRX40 platform, which is the chipset and the CPU as a whole, and the article shows quite nicely what can be configured how. The CPU has 48 + 8 + 4 + 4 (=64) and the chipset has 8 + 8 + 4 + 4 (=24), which is the 88 number they come up with. 16 of those are already in use for the chipset communication, so 72 can be made available to use however the motherboard manufacturers choose (as seen on the "AMD TRX40 Platform" slide). This isn't rocket science.
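(The arithmetic from the comment above, written out as a trivial Python check; the groupings are just the ones listed there.)

cpu_lanes     = 48 + 8 + 4 + 4   # 64 lanes from the CPU
chipset_lanes = 8 + 8 + 4 + 4    # 24 lanes from the TRX40 chipset
total         = cpu_lanes + chipset_lanes   # 88 "platform" lanes
usable        = total - 16       # 16 are tied up in the CPU-chipset link
print(cpu_lanes, chipset_lanes, total, usable)   # 64 24 88 72

pkv - Thursday, November 7, 2019 - link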
Very disappointed by two things:
(1) lack of retrocompatibility with x399
(2) no clear upgrade route for a setup with 16c/32t; from the 1950x ($999) one could go to the 2950x ($899); but now there's a 50% premium to go to the first TRX40 offering (3960x, $1399).
Going to 3950x+ x570 is not a solution when you need the pcie lanes provided by TR4.
@Ian: did you hear anything about a future 16c TR 3000, which would overlap with the 3950x obviously ? (Intel has done so in the past, mixing up HEDT and mainstream)
Death666Angel - Thursday, November 7, 2019 - link
Do you mean Intel has had overlapping HEDT and mainstream CPUs as in they had the same core counts? Sure, AMD had that as well, 8C TR is a thing after all. Or do you mean Intel had the same name for HEDT and mainstream CPUs before? Because a 16 core TR3 would fit in the 3950X naming scheme that is now taken up by the AM4 equivalent. And 3955X would look a bit messy to me. :D
pkv - Friday, November 8, 2019 - link
meant the former; similarly powerful cpus, one for mainstream, the other for HEDT.
Kjella - Thursday, November 7, 2019 - link
I've no doubt there'll be a 16 core TR3, they'd get paid very well for a 4+4+4+4 dud chip combo, but they probably want to clear the first wave of people that won't wait first. I'm thinking 2-3 months out, like February or so, that's just me looking into the crystal ball though.
pkv - Friday, November 8, 2019 - link
there's an interview with an AMD senior technical marketing manager in PCWorld https://www.pcworld.com/article/3305945/watch-the-... ; the absence of a 16c part was one of the first questions. He answered that the absence of 16c is deliberate, in order to have a clear boundary between mainstream and HEDT. So the prospects of having 16c on TRX40 in a few months are nil atm.
AbRASiON - Thursday, November 7, 2019 - link
AMD *STILL* continues to ignore business desktops and home performance enthusiasts who don't game.
Where are the higher performance processors with extremely basic graphics? Where's the 3000G with 6 cores?
Some people just want a 6 core Ryzen but a very very simple GPU for basic Windows tasks / video.
Intel can do it with the 8400.
Death666Angel - Thursday, November 7, 2019 - link
When Zen2 enters APUs you will likely get your wish.
But what is wrong with just getting a 1030 GT?
AbRASiON - Thursday, November 7, 2019 - link
A 1030 GT is a point of failure, heat, cost, and a PCIe slot.
Intel produces perfectly good graphics for my needs (and 500 users at my place of work) - for "free".
AMD NEED to produce processors with 6 to 8 cores, decent computing power and a very very VERY simple GPU.
I'm happy with iGPU levels, as it stands, AMD do not have a product for me, it's sad.
Korguz - Thursday, November 7, 2019 - link
as Death666Angel said.. wait till amd migrates the zen 2 core over to their APU's. and you will be able to get what you are looking for
phoenix_rizzen - Friday, November 8, 2019 - link
B450-based motherboard, Ryzen 5 2600 or 3600 CPU, and an Nvidia 730 GPU makes for a great, silent office computer. And gives you triple-monitor support to boot. Or an AMD Radeon Pro Wx 3100 or 3200, which gives you even better multi-monitor support. Both are fanless. Add NVMe and 16 GB of RAM and you have a great, little, silent workstation.While it would be nice to have more than 4 cores in an APU, it will be another year or so before that's available from AMD. Really hoping Zen2 chiplet design leads to 4-, 6-, and 8-core APUs.
scineram - Friday, November 8, 2019 - link
No.
Quantumz0d - Thursday, November 7, 2019 - link
First, that TDP bullshit needs to go out of the window from marketing when you are pushing for an AIO OOTB.
~175W vs the 9900K's 210W - both can be handled by the same PSU, the same chassis and a goddamn DH15. What is this whining and shiny BS about TDP on a desktop, cTDP, yuck. Having reduced performance is bullshit, esp on such a high core count CPU; this is not BGA MacBook Pro soldered garbage we are talking about, nor a BIOS-chastised machine.
Now the price, $600 over the 3700X? For a superbinned processor at $800ish? Why such a high price. Also, their 3800X doesn't make any sense, no one recommends it, GN also ignores it. Yeah, AMD just wants to squeeze out all the margins; one can understand that due to 7nm costs plus AMD's state of CapEx vs Intel. Still, their mediocre perf improvements on top of their multi-SKU confusion are bad, not to forget the insanely expensive X570 chipset on 12nm.
I was in the market this year and I wanted to wait for TR plus a Z390 future. It's a shame the Z390/9900K platform is dead for LGA 12xx now, a big kick in the nuts that even the CPU is outdated. With AMD I was ready for a $700 Xtreme purchase too, but going to OCN and seeing GB boards having BIOS issues, and all of AM4 playing hit and miss with their beta-testing bullshit BIOS patches, is hell. To make it worse, DRAM is hit with speed stability plus overall vCore issues; even Trident Z at fclk, mclk, uclk plus DRAM is a huge pain of trial and error just to get proper perf, forget OC and that PBO, XFR2 marketing BS. And the Ryzen Master requirement.
The Intel Z390 Dark was my other choice; now it's a dead platform on one side and an unstable one with useless Gen 4 on the other.. For now, yeah, maybe in the future it's good, but that chipset fan again - all of OCN shows high RPM for that.
TR4 again, the same chipset fan BS, with an insane price at $1500+ base, an AIO on top, mega expensive mobos at over $600.. With zero backwards compatibility, AND once DDR5 hits in 2021 it's dead. The $2000 CPU is then outdated.
HEDT: Intel won this time, 7xxx to 10xxx and a stable platform. Gen 3 x16 - a 2080 Ti won't max it out.. and mainstream AMD is better but still expensive, with a hugely fragmented lineup, and unstable.
Will wait for Comet Lake and Zen 4000.
Quantumz0d - Thursday, November 7, 2019 - link
Higher Vcore & higher DRAM voltage*
Korguz - Thursday, November 7, 2019 - link
Quantumz0d, wait.. you are calling AMD expensive?? have you not seen, or do you not remember, the prices intel was charging for its cpus before Zen?? or the fact that going from the 9xxx series to 10xxx saw what some might call massive price drops?? the top chip for 10xxx is 1k less than the 9xxx chips.
Death666Angel - Thursday, November 7, 2019 - link
Anandtech really needs an ignore feature in the comments.
Spunjji - Friday, November 8, 2019 - link
I'd not be seeing fully 50% of them by now. It would be nice.
Slash3 - Saturday, November 9, 2019 - link
Very much so.
Total Meltdowner - Thursday, November 7, 2019 - link
It's still 2019. DDR5 is easily 1.5-2 years away for the average consumer.
TR3 looks like a very solid platform for many HEDT users at a very respectable price.
I think you're overreacting.
I have an 1800x. I was going to buy the 3950x, but if the perf isn't that much better I may just get the 3700x and keep everything else the same in the computer.
Waiting for ryzen 4k may be a good idea for you but man that's a long wait.
Spunjji - Friday, November 8, 2019 - link
This reads suspiciously like someone trying - at length - to post-hoc rationalize their pre-existing decision not to buy any AMD products. You don't need to justify that irrational desire to anyone else; just go with it! The attempt to do so ends up making you look more biased because you had to pick feeble, irrelevant and/or hypocritical reasons.
Sychonut - Thursday, November 7, 2019 - link
Eagerly looking forward to Intel's 14+++++++.
PixyMisa - Thursday, November 7, 2019 - link
So the TRX40 is basically a fully-enabled X570? (That is, a Ryzen 3000 I/O die.)
Total Meltdowner - Thursday, November 7, 2019 - link
My next system may very well be a Threadripper one... wow.
wow&wow - Friday, November 8, 2019 - link
How to turn off the annoying video at bottom-right?
John_M - Friday, November 8, 2019 - link
I'm not seeing it so Pihole must be working.
neblogai - Friday, November 8, 2019 - link
A correction may be needed - the 3000G seems to be a Raven2 die (released for the embedded market this spring, and now also going into laptops as the 3200U), not a Picasso die. AMD.com lists the 3000G as 14nm (RR, Raven2), not 12nm (Picasso). And also, Robert Hallock confirms it is 14nm here: https://youtu.be/NRSUYBjlBXw?t=4072
DoomsDayCJ - Friday, November 8, 2019 - link
I don't think the 3000G is Zen+. AMD lists it as a 14nm part, and that's Zen, not Zen+ (12nm), right?
https://www.amd.com/en/products/apu/amd-athlon-300...
rya - Friday, November 8, 2019 - link
do we know if the 3000G is actually 12nm Zen+? AMD's website says it's 14nm: < https://www.amd.com/en/products/apu/amd-athlon-300... >
For reference, here's the 3400G, which you will see says 12nm FinFET: < https://www.amd.com/en/products/apu/amd-ryzen-5-34... >
I realize for $49 you can't have it all, but I'm still hoping it's 12nm so people can squeeze a little extra out of it and enjoy even lower power consumption.
john.doe.maniac77 - Saturday, November 9, 2019 - link
I am planning to buy the Ryzen 3950X post launch, but I am planning to get an X570 motherboard early. I just want to know: will the X570 board work readily when I install the 3950X? Since AMD has said a BIOS update is required, will I be getting a display at least to update the BIOS? (OR) will manufacturers update the BIOS and re-launch the same motherboard, for which I may have to wait... not sure how this will go?
Slash3 - Saturday, November 9, 2019 - link
3950X support has been baked into all X570 boards since July (earlier, actually, if you include eval systems). The new updates will simply flash the most current microcode and AGESA optimizations. No worries if you plan to buy a board before receiving a 3950X CPU.
dwade123 - Saturday, November 9, 2019 - link
Overpriced.
Korguz - Monday, November 11, 2019 - link
go look at intels cpus if you want overpriced :-)
quadibloc - Monday, November 11, 2019 - link
Well, now we know why the 3950X was delayed in availability. It is strongly recommended to use a large, nearly top-end AIO water cooler with it, and it also needs the most recent BIOS updates to work. Still, it's an impressive chip even if it did take them a little extra time to get it right.
umano - Monday, November 11, 2019 - link
If they released the 3950x I already had bought it, but with Zen 3 coming in Q2 I think I will buy a Ryzen 4000 laptop first and then I will replace my desktop, hoping that someone will make a premium content creator ITX board for a 16-core CPU. I mean 2x NVMe, 10GbE and Thunderbolt.
umano - Monday, November 11, 2019 - link
" If they had released the 3950x on september"peevee - Friday, November 15, 2019 - link
AMD (or Intel), here is what I want:
A CPU package, potentially containing at least 2 chips (CPU or GPU), having DDR5 SODIMM slots on the 4 sides of the package to minimize physical wire length and thus latency and power, each slot on its own channel. A 256-bit configuration makes a decent built-in GPU possible, and that is what I specifically want - an 8-core CPU chip (with large shared L3 cache) and a GPU chip capable of saturating the 256-bit DDR5 memory (but other configurations are possible).
This package would only need power, PCIe and Thunderbolt 3/USB4 lines coming out of it (including carrying video and audio). Relatively small and simple connector with large contact separation as most contacts are for memory in today's CPUs. Simple MB with just a few layers. Cheap!
No chipset on the motherboard. No network, no Wifi, no separate sound (USB sound adapters are fine), no SATA controllers... Give me slots to install just what I want, and replace when I want (of course other MB models can contain stuff embedded).
1 PCIe4 (5 in the future) x16, two x8 (if any are populated, it is OK for the x16 slot to work in x8 mode).
Min two more x4 PCIe slots and 2 x4 NVMe slots.
Give me graphics cards containing the same chip as in the CPU above, with 4 SODIMM DDR5 slots each. Able to supplement the GPU in the CPU module, not replace it! Up to 3 external GPU cards for 4x performance of the embedded one.
The true modular approach a desktop system is supposed to be.