"that's not true, the 6900 XT is $1000, AMD's top card was $400 before"
I think you should clarify which "top card" you are referring to here. The problem was AMD was never competitive in the high end graphics for quite a number of years. With them finally back in the high end market where the RX 6900 XT beats or matches a RTX 3090 in some cases, I don't think it is feasible to price it at 400 bucks. Even with slightly cut down RX 6800 XT is snapped up with a too good to be true MSRP of 649 USD.
There is less incentive to not let Nvidia set the prices for the PC gaming platform and push Polaris-forever (plus overpriced half-baked 'high-end' stuff for the 'red team faithful' — and miners more than anything) strategy.
Nvidia gets to be the 'bad guy' by setting higher and higher prices, while AMD just meekly follows. Meanwhile, gamers never add 2 + 2 to see all of the facets of how they're being squeezed because they refuse to demand adequate market competition.
Believe me, if AMD could easily take the GPU crown, they definitely would. I'm sure they'd love to be raking in the kind of revenue on GPUs that Nvidia is making.
And AMD has been burned several times by mining bubbles. It would be complete folly for them to count on mining to keep prices high. And the only times substandard AMD-powered graphics cards have sold for significant prices is when demand for *all* GPUs has outstripped supply.
There's a 3rd reason it would be plain dumb for AMD to count on a supply shortage, and it starts with an "I". There's no telling how much production Intel has in the pipeline. With the launch of their gaming cards and a slight deflation of the crypto bubble, graphics card prices could virtually crash, overnight.
1. Scalpers didn't craft the GPUs to be attractive for mining.
2. Scalpers didn't refuse to directly sell GPUs to gamers in quantity.
3. Scalpers didn't refuse to implement what is long overdue: motherboards that have sockets for GPUs and VRAM (or just GPUs with HBM2), not just CPUs — rather than continuing to push the antique ATX form factor with its absurdly inefficient GPU cooling solutions.
> Scalpers didn't craft the GPUs to be attractive for mining.
GPUs are good at mining, period. It's actually pretty hard to make a GPU that's good at gaming and NOT good at mining.
> Scalpers didn't refuse to directly sell GPUs to gamers in quantity.
I don't know what you're talking about, but anyone buying in quantity is probably a miner -- not a gamer.
> Scalpers didn't refuse to implement what is long overdue: motherboards that have sockets for GPUs
LOL. They do. It's called a PCIe slot.
Seriously, lay off the crack pipe. You're not even making sense, here.
GDDR has to be soldered onto the same PCB as the GPU, and right next to it. That's one of the reasons it can be so much faster than normal DDR memory. And once you have a PCB with a GPU and some GDDR, it makes sense to put the display connectors on there, so that you can add ports and upgrade to newer display standards when you upgrade your GPU. Also, put the VRM on it, so the cheapest motherboard doesn't need 320 W VRMs for a GPU few users would ever pair with it. And voilà! We have a recipe for the modern graphics card!
Even better: putting GPUs in PCIe slots makes it easy to scale to multi-GPU setups.
> the antique ATX form factor with its absurdly inefficient GPU cooling solutions.
How would putting the GPU on the motherboard make them any easier to cool? Keep in mind that GPU cooling solutions also need to cool their GDDR and VRMs.
There are different sides to this. The Radeon 5700XT was a 40CU(Compute Unit) card, which is the same size as the 6700XT. The 6800 is a 60CU card, 6800XT being 72CU, and 6900XT is 80CU. From that point of view, the proper comparison for price would be MSRP of 5700XT to 6700XT. As I recall, the 5700XT did launch with a MSRP of $400.
Now, the tariffs are a part of the MSRP, since any products made in China and imported into the USA are subject to that higher price. Even globally where the tariffs don't apply, AMD would have to incorporate it into the MSRP or lose a good amount of profitability from selling the reference models.
It is difficult to pin down exactly how many units of GPUs that AMD is selling, since AMD could theoretically send out more GPUs than the video card makers are able to turn into video cards due to other shortages, and it would still mean more GPU revenue for AMD.
Very good to see AMD reach to good position. I'm however not impressed on how their AM4 socket is with X570 and Vermeer. Many issues still happen to people all over and lack of documentation on their technologies for public disclosure is another thing, where the end user has to scour over the reddit and other forums to get those numbers and voltage readings with tons of software to tune even to DRAM. Their EPYC got all the R&D and tons of polish I bet. Since it moves more along with their custom silicon which is Console market. AMD should put more onto consumer side. I was going to get AM4 with Zen 3 but damn when I see the issues of USB and AGESA I'm very much hesitant.
Since Xilinx is not fully integrated, I think we have to wait on that front, that would be really top class breakthrough esp when the FPGA and x86 compute combines.
I haven't had a single issue with my x570 board---and its not a high-end one either, just a cheapo ASUS Tuf model.
The only issue I had was that I simply couldn't reach 1800/3600, but that ended up being a limitation of the IF prior to Zen 3. As soon as I replaced the 3900X with a 5950X, it was fine at 1800/3600.
You're making a mountain out of a molehill... -_- ... The USB issues were quickly identified & fixed, and that kind of stuff ALWAYS happens with new CPU releases (it only affected Ryzen 5000), both AMD AND Intel. By and large, X570 works great and is a WAY more powerful and flexible platform than Intel's.
No not fixed, stop saying it's not an issue, here's an AMD official r/AMD thread I give you and see latest posts for people on 1.2.0.2, you can simply search usb as well, there's OCN threads of mobos and etc where some people still report issues.
Just because you do not have this BS problem doesn't mean it's not an issue, "it's quickly identified and fixed" what the fuck ? No they took a long time and people at r/AMD were bashing the folks who reported the issues, you want to talk more ? WHEA BSOD problems and RMAing CPUs also happened with Ryzen 5000 series platform. This RMA bs didn't happen with Ryzen 3000 series.
Then the issue root cause was not even mentioned they simply said it's going to fix and no it didn't and does not. It's random and no pattern makes it even worse.
"AND Intel" massive bs claim you are making it, Intel Z490 / Z390 and / Z590 or even Z270 didn't have USB crap out issues hell I don't even see them on Ivy and Haswell or the 6th to 8th gen. Bonus is didn't see people doing RMA for CPUs because of WHEA BSOD.
I know Intel Rocketlake is a bust since backport only helped Intel not consumers due to 2C4T loss and insanely High power consumption. The only advantage it has over CML is PCIe 4 which is not useful as of today, but in the future it might. It's not wayyy more powerful as you make it BUT it's wayyy more efficient. 10C20C CML processor is still very fast in gaming and emulation performance, and best part is it's cheap esp 10850K, since 5900X is only viable processor that can beat 10900K but it's not in stock and expensive. 5950X doesn't scale so great in gaming and it's even more expensive and no stock either. AND the RMA for these dual CCD processors are higher than Single CCD 5800X and 5600X (which is overpriced and not in stock). Now that is probably fixed by new batch of CPUs in circulation.
I know that Die size on RKL is huge, it doesn't do anything except for heat dissipation. Then there's that useless iGPU on the die wasting that die space. They could have added eDRAM if they really wanted to push for gaming crown, they didn't do any of that. Probably because it's super short lived and last 14nm silicon since all Xeon is now on 10nm from IceLake.
They dropped 2 cores not just because of die size but the heat output and power consumption as well, if they added those 2C4T the power guzzling would reach to 400W+ This whole RKL doesn't help anyone except Intel now they have a working product that has a backport done from 10nm design to 14nm and it works, which adds it to their experience.
the thing s that Igpu isnt useless its not that horrible fore troubleshooting and it can sorta play some games, its better than not having a gpu please sit down before idk something
> there's that useless iGPU on the die wasting that die space.
It's not a waste, especially when dGPUs are virtually unobtainable. However, a lot of market segments only use the iGPU. I have a web/office PC at home with no dGPU, and my corporate PC for my job also has no dGPU.
> all Xeon is now on 10nm from IceLake.
Only Xeon Scalable Series (i.e. the big server CPUs). E-series Xeon are rumored to be moving to Rocket Lake. W-series Xeons remain on Cascade Lake (at least the HEDT-cousins with 4-channel memory).
Ice Lake and Rocket Lake are both stop-gap solutions, to be replaced after well under a year on the market. However, they do give us PCIe 4.0 support, which is at least advancing something. Rocket Lake's iGPU is also a meaningful step up from their previous Gen9 units, for those stuck using it.
Also, RAM tuning on Zen 3 is freaking idiot proof! It'll run at stupid high speeds & low timings with next to no voltage tweaking needed AT ALL. Where the hell are you reading all this crap?? O_o
What is your DRAM speed and timings ? Also I know with Zen 3 DRAM latency doesn't matter that much due to the improved cache layout.
Claiming such with absolutely no numbers is as useful as a toilet paper. DRAM tuning on Zen needs Zentimings software and the undocumented bullshit terms and their numbers, Intel has Datasheets for LLC and other points while AMD has none, period. The only data points for AMD is go to the OCN and Reddit and find those bs numbers that people do and run the CTR and try to tune with tons of OCCT and other load / stress testing, want to get CO stable ? Run P95 core testing esp this is even more important due to the previous RMA / WHEA issues. This is all public knowledge from OCN and Reddit.
Where as having nebulous lines and typical defense on AMD is cheesy at max and worse second.
why would you tune DRAM further than default settings when the ryzen CPU already beats all the Intel stuff by default? easy plug and play memory fit just buy 3xxx ish memory. The time when massive gains from memory and cpu tuning were possible are long gone. gains are still possible if you want. https://www.youtube.com/watch?v=9IY_KlkQK1Q
what about the 10900 vs 5900 review? https://www.techspot.com/review/2132-amd-ryzen-590... its faster in any general task you can imagine by at least 20%, gaming is on par (mainly because of old less cpu demanding games in test) but it consume 25% less power. There is still a cpu upgrade on its way (warhol) and motherboards with PCI-e4 are already there for a LOOONG time with normal pricing. no brainer choice vs a dead intel platform.
but as usual on any AMD post you are smoking some fine INTEL wheat....
I don't even have new Intel PC here. I'm in the market to buy, what to buy and what to expect. Down to the LAN port Intel problems to the AMD USB issues.
Intel LGA1200 is dead so is AM4, Warhol is a rumor, how can anyone expect a rumor to be proper future for an AM4 platform ? Did you read AT review on any of the Intel CPUs ? They mention both platforms are dead end only thing is AM4 has a 16C32T option while Intel is dead at 8C on Gen 4 and 10C on Gen 3 PCIe.
I mentioned above 5900X is the processor which beats 10900K and where I would spend money on, and that is expected of 2C4T more, if it's not then Zen 3 would have been a failure.
DRAM tuning is not dead on either of the platforms, G1/G2 on RKL has a hard cap on G1 vs CML no issues and Zen 3 parts have solid DRAM tuning too, which is even that 2000Mhz FCLK tuning, for those few lucky IMCs on the CPUs. And both the platforms help with improving performance, depends on the user budget and needs.
And why would you run 3600 on that DR with poor spec 18trc kit ? X570 doesn't even have 2 DIMM mobos like Dark and Apex, a 3600C15 SR Bdie would be easy on Ryzen 5000 over that DRAM ah you don't even know and think that is a solid spec DRAM. Got it.
1. ARM will not be sold to Nvidia 2. ARM custom silicon can be a major security concern in the data center 3. Data center ecosystem are closed and compatitbility can be an issue even from one x86 uarch to another. ie Intel to AMD) 4. AMD/Xilinx could reshape the norm in term of custom enterprise silicon. FPGA, micro controllers, DSP... with CPUs on SOCs, while ARM is just ARM.
> ARM custom silicon can be a major security concern in the data center
ARM in the datacenter is already a fact. Amazon is scaling it up in AWS at a stunning pace. Microsoft reportedly has an ARM server chip in the works, as does Google.
> AMD/Xilinx could reshape the norm in term of custom enterprise silicon. FPGA, micro controllers, DSP... with CPUs on SOCs, while ARM is just ARM.
ARM's new interconnect fabric can do this, as well. ARM's entire business model is helping customers craft customized solutions.
I'm happy to see AMD in this position. Back when they were down, I never failed to believe in them and waited for the day they'd knock Intel. Well done! Keep up the fantastic work.
It is pointless to make such wish. It is like saying you should have bought bitcoin... or Nvidia... or Tesla... or Gamestop... or BB... or whatever... just move on.
The crypto bubble is gonna crash. Crypto won't completely go away, but I wouldn't be buying into bitcoin, right now.
Tesla is priced beyond perfection. Reality will eventually catch up.
Gamestop never made a whole lot of sense. For every time there's a Gamestop scenario, there are probably 100 companies that didn't come back. If you're into gambling, go ahead and bet on a Gamestop. Professional investors usually stay well clear of stuff like that.
They should really buy a machine learning ASIC company. If they're serious about being a player in that space, then they can't just keep trying to use their GPUs (which have indeed come a long way, considering the MI100's new Matrix cores, but still...).
You may have missed the whole Xilinx purchase then. FPGA combined with AMD CPU and GPU technologies is going to surprise people with the synergies that come from it. Unlike Intel, which only saw adding FPGA as a way to diversify, AMD seems to understand the idea of combining multiple products that used to be in their own area and the resulting product be better than just the sum of its parts.
Don't be surprised if not only machine learning will come of the Xilinx merger, but other things that most people couldn't understand until the fusion of the technologies has been around for a few years.
FPGAs are useful for certain embedded usecases and as datacenter "glue", but they're no substitute for the power and efficiency of a purpose-built ASIC. FPGAs can't outperform Nvidia GPUs on machine learning, much less dedicated ASICS. They can't even deliver better perf/W on modern networks, either!
The reason for this is simple. When you have something like a tensor core, it does the bulk of the computation with the efficiency of a dedicated ASIC, due to the amount of hardwired logic it contains. A FPGA with, discrete multipliers and adders that have to get linked together over longer distances, just can't compete on timing or power-efficiency. You can even stuff it with DSP cores and still watch them lag behind what Nvidia's tensor cores can do.
And where ASICS really come into their own is that they ditch the graphics hardware, coherent caches, and 64-bit floating-point hardware of GPUs, often replacing it with more on-chip storage and more dataflow-optimized architectures.
If AMD doesn't get into the machine learning ASIC game, they're going to be stuck on the sidelines. The MI100 is a good chip, but it still devotes too much silicon to HPC.
I'm not saying they should run out and buy Cerebras, but it's not far off from where their heads should be at.
These are good numbers for AMD, and I'm not trying to be too gloomy here, maybe just pointing out an aberration. But AMD's Semi-custom gross this last quarter vs 20q1 is a billion dollars more. If most of that is due to the consoles, and they've shipped most of their console chips already, will that revenue dry up and return to lower numbers?
Their client and business revenue went from 20Q1's $1.4B to 21Q1's $2.1B, what a 25% increase? Considering the chip shortage, COVID, their existing momentum, release of Zen3, etc, I guess I'm not super impressed by that percentage increase, and thought it would be a lot more, especially for a small company on the verge of becoming big.
I think you yourself cited the reason it's not greater: chip shortage. AMD can't suddenly get more wafers than it already ordered and it can't just jack up prices -- it has to honor existing supply contracts and doesn't want to create bad blood with OEMs by squeezing them upon renewal.
Well I mentioned COVID as a positive for AMD; people bought more electronic gizmos and stayed inside, etc. But yeah it definitely seems to have impacted supplies too.
What I really want to know is how good AMD would do if they weren't supply-limited. Or severely supply limited. How much sales to OEMs, big companies, end users, etc, are they leaving on the table because they can't make enough product?
It applies to everyone a little bit, sure. But AMD was always going to run into a massive bottleneck in 2020 with the new consoles, new Radeon, new mobile Ryzen (4000 parts), and new Zen3 all being produced on TSMC 7nm. This was a logjam coming well before COVID hit.
Ian recently did a piece where he predicted AMD needed to buy ~60k wafers from TSMC just to produce the PS5 CPUs alone. Add X-Box SOCs into that and that's like well over 100K wafers. If the demand for those dry up and they can divert those wafers to Ryzen/Radeon silicon, it will be very exciting to see how it affects the market vis a vis Intel and NVidia product costs.
I kind of agree; they have arguably the best product in the market, but can't produce it. Imagine if Apple touted their latest and greatest iPhone with world-class silicon, but no one could buy it. Say what you will about Apple and Intel, but I never worry about not being able to buy their products.
> they have arguably the best product in the market, but can't produce it.
Uhhh... yes they sure can! What they're struggling with is keeping up with demand. That's different, even if it feels the same to a would-be customer.
> Say what you will about Apple and Intel, but I never worry about not being able to buy their products.
Have you been under a rock? Over the past few years, Intel has had massive availability problems off-and-on, and from top-to-bottom of their product stack! At one point, Intel's Cascade Lake Xeons were so backordered that they actually told their sales channel to recommend customers buy AMD EPYC!! That's insane!
By comparison, Apple's availability hiccups are more minor and limited, but their products have gotten backordered at launch, on several occasions. I'm not an Apple fan, so it could be the case that they've gotten those issues all sorted out, but it's also the case that a new iPhone model is no longer a must-have, the way they once were.
A few of us last year dumped a lot of their 401K, RMD's and hobled together available cash laying around into AMD. Cashing in big time also thanks to the many AMD fans stretched around the many tech-sites. Happy times and virtually doubling our money in about 8-months time. Now I would hate to see AMD stock going over $100 a share. Which I believe with Alder Lake coming to this 2021 theater soon will be a hard reach given the lingering AMD product maturity, technical and badly executed production problems. Wall Street already calling it unsurmountable AMD headwinds. With this I wonder if Intel in their 2022 4th quarter earnings report will afford us another grab at quick cash? Or for me a low milage cash-grab for a Mercedes AMG GT and another Breitling Super Chronomat? Hope that my ship will come in again!
Enjoy feeling like a genius, because you won't be right every time. I'm glad the "few of you" didn't lose big chunks of your 401(k)'s.
As you can probably guess, I've lost money buying individual stocks. However, the main thing that taught me is that I don't have the patience or passion to develop a genuine aptitude for it. And while aptitude is no guarantee of success, the lack of it is a recipe for failure.
Not difficult when you're backed by Sony and Microsoft (console scam) and barely have to compete in one of your two major markets (GPUs) due to a situation of extremely inadequate market competition (only two players in the non-APU bracket).
I know this is going to sound crazy, but maybe you could simply *tell* us why you're upset. Then, we wouldn't make assumptions and might even show a little empathy.
Both are overclockable with Precision Boost 2 technology on the whole core and XFR 2 maximizes the maximum clock ceiling with well-cooled systems. These 2 APU lines are targeted at mainstream users with very affordable prices, do not need to invest in additional discrete graphics cards but can still play well with many online games at medium - high graphics. Many Ryzen Mobile laptops have been launched such as Acer Nitro 5, ASUS K series, Dell Inspiron 7000, 5000 series with AMD versions, Lenovo Ideapad 330s, ThinkPad A285... These machines are not very high, they seem to be only sold by brands in certain markets. https://bubbleshooter.io/
'It's nice to see that both AMD and Nvidia can profit so well by not selling GPUs to PC gamers.'
But mention to gamers that the only way things will get better for them is if they choose to help themselves, by putting their money into a company that actually wants to sell GPUs to PC gamers — and you get excuses seventeen miles high, including claims like it's easier to create the Google empire from scratch than to create a GPU company that's actually favorable toward PC gaming.
'AMD will have more money to buy wafers to make consoles to compete directly against the PC gaming platform.'
And, really... I should always take the time to mention Nvidia pushing the Switch — another parasitic redundant x86 walled garden. It's just not as obviously terrible as what AMD is doing, since the Switch, at least, has the weak excuse of being based on a somewhat more novel form factor (not that the PC gaming platform can't do everything it does — plus ship with non-defective joysticks).
Hruska did an article about how all three 'consoles' use the same low-grade drift-prone plastic mechanism. This is one of the joyful benefits to consumes for them paying to keep the console scam going. Who doesn't want to buy products like the Switch that have expensive joysticks that are defective because the companies selling them have redundant parasitic walled software gardens to make it impossible for competitors to produce better-quality products (like what would be available if the 'three consoles' were part of the legitimate PC gaming platform, which should be based on Linux and OpenGL + Vulkan — not Windows)?
Sony and MS are using their own non-x86 CPUs and their own non-AMD GPUs in those 'consoles' -- CPUs and GPUs that are made from 200mm wafers on old nodes. So, yes... your rebuttal has been 100% effective.
You confuse a rebuttal with a request for evidence. I don't know for a fact that AMD isn't buying the wafers. People seem to believe they are, but I wonder if that's merely an assumption or do we have supporting evidence?
As for your points, what I think you're not taking into account is that MS and Sony are engaging AMD as a design contractor or an IP house, much like ARM. As such, both MS and Sony have ownership rights to the IP for their console APUs. MS famously got burned when one of their previous suppliers (forget if it was Intel or Nvidia) decided to stop production on one of the XBox chips before MS was ready to stop selling the console.
Finally, both MS and Sony are existing customers of TSMC (and other fabs) and have the size and supply-chain management experience to deal with TSMC, themselves. It's entirely plausible that's exactly what they're doing.
75 Comments
Nehemoth - Tuesday, April 27, 2021 - link
Nothing about Xilinx?
Ryan Smith - Tuesday, April 27, 2021 - link
Nope. The deal hasn't closed yet.
mdriftmeyer - Tuesday, April 27, 2021 - link
Xilinx popped in after-hours trading, approaching $139/share. Same day: full-scale production of their flagship hardware for 5G networks.
https://www.xilinx.com/news/press/2021/xilinx-anno...
Xilinx will release their Q4 earnings via press release on May 4th, according to the Investor Relations section of their corporate site.
blppt - Tuesday, April 27, 2021 - link
Well deserved. Zen 3 is an incredible product---hopefully they'll be ramping up production soon.
The 6900 XT is also a far better performer than I was expecting---I was thinking 2080 Ti equivalent, and got something that trades blows with the 3090 instead.
Alistair - Tuesday, April 27, 2021 - link
Although with graphics cards costing more than 50 percent more than a year ago, I guess that means AMD's volume is actually dropping.
bernstein - Tuesday, April 27, 2021 - link
It's not AMD that's charging more; it's board partners, vendors & scalpers.
Alistair - Tuesday, April 27, 2021 - link
That's not true: the 6900 XT is $1000, and AMD's top card was $400 before.
mdriftmeyer - Tuesday, April 27, 2021 - link
The RX 5700XT was never $400.
mode_13h - Wednesday, April 28, 2021 - link
Also, it wasn't their flagship card - Radeon VII still was. That sold for $700.
watzupken - Tuesday, April 27, 2021 - link
"that's not true, the 6900 XT is $1000, AMD's top card was $400 before"I think you should clarify which "top card" you are referring to here. The problem was AMD was never competitive in the high end graphics for quite a number of years. With them finally back in the high end market where the RX 6900 XT beats or matches a RTX 3090 in some cases, I don't think it is feasible to price it at 400 bucks. Even with slightly cut down RX 6800 XT is snapped up with a too good to be true MSRP of 649 USD.
pogsnet - Thursday, May 6, 2021 - link
I bought my 6700xt days ago for $1000.
Qasar - Monday, May 10, 2021 - link
Given the way the world is, with demand as it is vs. supply, you probably paid $200-300 too much.
Linustechtips12#6900xt - Thursday, April 29, 2021 - link
Keep in mind they hadn't had a 3090-class competitor for, like, A LONG FREAKING TIME.
Oxford Guy - Wednesday, May 5, 2021 - link
Because they're pushing the console scam.
There is less incentive not to let Nvidia set the prices for the PC gaming platform while they push a Polaris-forever strategy (plus overpriced, half-baked 'high-end' stuff for the 'red team faithful', and miners more than anything).
Nvidia gets to be the 'bad guy' by setting higher and higher prices, while AMD just meekly follows. Meanwhile, gamers never put 2 + 2 together to see all the facets of how they're being squeezed, because they refuse to demand adequate market competition.
mode_13h - Wednesday, May 5, 2021 - link
LOL. Never figured you for a tin foil hat type.
Believe me, if AMD could easily take the GPU crown, they definitely would. I'm sure they'd love to be raking in the kind of revenue on GPUs that Nvidia is making.
And AMD has been burned several times by mining bubbles. It would be complete folly for them to count on mining to keep prices high. And the only times substandard AMD-powered graphics cards have sold for significant prices is when demand for *all* GPUs has outstripped supply.
There's a 3rd reason it would be plain dumb for AMD to count on a supply shortage, and it starts with an "I". There's no telling how much production Intel has in the pipeline. With the launch of their gaming cards and a slight deflation of the crypto bubble, graphics card prices could virtually crash, overnight.
Oxford Guy - Wednesday, May 5, 2021 - link
1. Scalpers didn't craft the GPUs to be attractive for mining.
2. Scalpers didn't refuse to directly sell GPUs to gamers in quantity.
3. Scalpers didn't refuse to implement what is long overdue: motherboards that have sockets for GPUs and VRAM (or just GPUs with HBM2), not just CPUs — rather than continuing to push the antique ATX form factor with its absurdly inefficient GPU cooling solutions.
mode_13h - Wednesday, May 5, 2021 - link
> Scalpers didn't craft the GPUs to be attractive for mining.
GPUs are good at mining, period. It's actually pretty hard to make a GPU that's good at gaming and NOT good at mining.
> Scalpers didn't refuse to directly sell GPUs to gamers in quantity.
I don't know what you're talking about, but anyone buying in quantity is probably a miner -- not a gamer.
> Scalpers didn't refuse to implement what is long overdue: motherboards that have sockets for GPUs
LOL. They do. It's called a PCIe slot.
Seriously, lay off the crack pipe. You're not even making sense, here.
GDDR has to be soldered onto the same PCB as the GPU, and right next to it. That's one of the reasons it can be so much faster than normal DDR memory. And once you have a PCB with a GPU and some GDDR, it makes sense to put the display connectors on there, so that you can add ports and upgrade to newer display standards when you upgrade your GPU. Also, put the VRM on it, so the cheapest motherboard doesn't need 320 W VRMs for a GPU few users would ever pair with it. And voilà! We have a recipe for the modern graphics card!
Even better: putting GPUs in PCIe slots makes it easy to scale to multi-GPU setups.
> the antique ATX form factor with its absurdly inefficient GPU cooling solutions.
How would putting the GPU on the motherboard make them any easier to cool? Keep in mind that GPU cooling solutions also need to cool their GDDR and VRMs.
Targon - Wednesday, April 28, 2021 - link
There are different sides to this. The Radeon 5700 XT was a 40 CU (Compute Unit) card, which is the same CU count as the 6700 XT. The 6800 is a 60 CU card, the 6800 XT is 72 CU, and the 6900 XT is 80 CU. From that point of view, the proper price comparison would be the MSRP of the 5700 XT against the 6700 XT (see the sketch at the end of this comment). As I recall, the 5700 XT did launch with an MSRP of $400.
Now, the tariffs are part of the MSRP, since any products made in China and imported into the USA are subject to that higher price. Even in regions where the tariffs don't apply, AMD would have to incorporate them into the MSRP or lose a good amount of profitability on the reference models.
It is difficult to pin down exactly how many GPUs AMD is selling, since AMD could theoretically ship more GPUs than the video card makers can turn into video cards (due to other component shortages), and it would still mean more GPU revenue for AMD.
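On the 5700 XT-to-6700 XT comparison, a quick price-per-CU sketch makes it concrete. The CU counts are from the comment above; the MSRPs are launch prices as I recall them ($399 for the 5700 XT, $479 for the 6700 XT, $579 for the 6800), so treat them as assumptions rather than authoritative figures:

```python
# Launch MSRP per Compute Unit, Navi 1x vs. Navi 2x.
# CU counts are from the comment above; MSRPs are assumed launch
# prices and may not match every region or retailer.
cards = {
    "RX 5700 XT": (40, 399),
    "RX 6700 XT": (40, 479),
    "RX 6800":    (60, 579),
    "RX 6800 XT": (72, 649),
    "RX 6900 XT": (80, 999),
}

for name, (cus, msrp) in cards.items():
    print(f"{name}: {cus} CUs at ${msrp} -> ${msrp / cus:.2f}/CU")
```

On those numbers, the 6700 XT costs about 20% more per CU than the 5700 XT did, a much smaller jump than the raw $400-vs-$1000 framing suggests.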
Spunjji - Wednesday, April 28, 2021 - link
Only if they're manufacturing fewer cards now than then - as they seem to be selling everything they make, which I don't think was true then?
Silver5urfer - Tuesday, April 27, 2021 - link
Very good to see AMD reach a good position. I'm, however, not impressed with how their AM4 socket is doing with X570 and Vermeer. Many issues still happen to people all over, and the lack of public documentation on their technologies is another thing: the end user has to scour Reddit and other forums to get the numbers and voltage readings, with tons of software, just to tune DRAM. Their EPYC got all the R&D and tons of polish, I bet, since it moves more in step with their custom silicon, i.e. the console market. AMD should put more into the consumer side. I was going to get AM4 with Zen 3, but damn, when I see the USB and AGESA issues, I'm very hesitant.
Since Xilinx is not fully integrated, I think we have to wait on that front; it could be a really top-class breakthrough, especially when the FPGA and x86 compute combine.
blppt - Tuesday, April 27, 2021 - link
I haven't had a single issue with my X570 board---and it's not a high-end one either, just a cheapo ASUS TUF model.
The only issue I had was that I simply couldn't reach 1800/3600, but that ended up being a limitation of the IF prior to Zen 3. As soon as I replaced the 3900X with a 5950X, it was fine at 1800/3600.
Cooe - Tuesday, April 27, 2021 - link
You're making a mountain out of a molehill... -_- ... The USB issues were quickly identified & fixed, and that kind of stuff ALWAYS happens with new CPU releases (it only affected Ryzen 5000), both AMD AND Intel. By and large, X570 works great and is a WAY more powerful and flexible platform than Intel's.
Silver5urfer - Tuesday, April 27, 2021 - link
No, not fixed. Stop saying it's not an issue. Here's an official r/AMD thread; see the latest posts from people on 1.2.0.2. You can simply search "usb" as well, and there are OCN threads for various mobos where some people still report issues.
https://old.reddit.com/r/Amd/comments/m2wqkf/updat...
Just because you don't have this BS problem doesn't mean it's not an issue. "Quickly identified and fixed"? What the fuck? No, they took a long time, and people at r/AMD were bashing the folks who reported the issues. Want to talk more? WHEA BSOD problems and CPU RMAs also happened on the Ryzen 5000 platform. That RMA BS didn't happen with the Ryzen 3000 series.
Then the root cause of the issue wasn't even mentioned; they simply said it would be fixed, and no, it didn't get fixed and still isn't. It's random, and having no pattern makes it even worse.
"AND Intel" is a massive BS claim you're making. Intel Z490 / Z390 / Z590 or even Z270 didn't have USB crap-out issues; hell, I don't even see them on Ivy and Haswell or the 6th to 8th gen. Bonus: I didn't see people RMAing CPUs because of WHEA BSODs.
I know Intel Rocket Lake is a bust, since the backport only helped Intel, not consumers, due to the 2C4T loss and insanely high power consumption. The only advantage it has over CML is PCIe 4, which is not useful as of today, though in the future it might be. It's not wayyy more powerful as you make it out to be, BUT it's wayyy more efficient. A 10C20T CML processor is still very fast in gaming and emulation performance, and the best part is it's cheap, especially the 10850K, since the 5900X is the only viable processor that can beat the 10900K, but it's not in stock and expensive. The 5950X doesn't scale so well in gaming and is even more expensive, with no stock either. AND the RMA rate for these dual-CCD processors is higher than for the single-CCD 5800X and 5600X (which is overpriced and not in stock). That is probably fixed by the new batch of CPUs in circulation now.
Smell This - Wednesday, April 28, 2021 - link
Y - A - W - N
mode_13h - Wednesday, April 28, 2021 - link
> Intel Rocketlake is a bust since backport only helped Intel not consumers due to 2C4T loss
Did it ever occur to you to fact-check this?
10-core Comet Lake die size: 206.1 mm2
8-core Rocket Lake die size: 276.0 mm2
Why did you *think* they dropped 2 cores?
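Dividing those figures out makes the point (a rough sketch: it naively treats the whole die as core area, ignoring the iGPU and uncore):

```python
# Naive area-per-core comparison from the published die sizes.
# Divides total die area by core count, ignoring iGPU and uncore,
# so treat the result as illustrative only.
dies = {"Comet Lake": (206.1, 10), "Rocket Lake": (276.0, 8)}

for name, (area_mm2, cores) in dies.items():
    print(f"{name}: {area_mm2} mm^2 / {cores} cores = "
          f"{area_mm2 / cores:.1f} mm^2 per core")
```

Roughly 20.6 mm^2 per core vs. 34.5 mm^2 per core: the backported cores are so much bigger that two of them had to go.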
Silver5urfer - Thursday, April 29, 2021 - link
I know the die size on RKL is huge; it doesn't do anything except dissipate heat. Then there's that useless iGPU on the die wasting die space. They could have added eDRAM if they really wanted to push for the gaming crown, but they didn't do any of that. Probably because it's super short-lived and the last 14nm silicon, since all Xeons are now on 10nm from Ice Lake.
They dropped 2 cores not just because of die size but because of the heat output and power consumption as well; if they had added those 2C4T, the power guzzling would have reached 400W+. This whole RKL doesn't help anyone except Intel: now they have a working product backported from a 10nm design to 14nm, and it works, which adds to their experience.
Linustechtips12#6900xt - Thursday, April 29, 2021 - link
The thing is that the iGPU isn't useless. It's not that horrible for troubleshooting, and it can sorta play some games; it's better than not having a GPU. Please sit down before, idk, something.
> there's that useless iGPU on the die wasting that die space.
It's not a waste, especially when dGPUs are virtually unobtainable. Besides, a lot of market segments only use the iGPU. I have a web/office PC at home with no dGPU, and my corporate PC for my job also has no dGPU.
> all Xeon is now on 10nm from IceLake.
Only the Xeon Scalable series (i.e. the big server CPUs). E-series Xeons are rumored to be moving to Rocket Lake. W-series Xeons remain on Cascade Lake (at least the HEDT cousins with 4-channel memory).
Ice Lake and Rocket Lake are both stop-gap solutions, to be replaced after well under a year on the market. However, they do give us PCIe 4.0 support, which is at least advancing something. Rocket Lake's iGPU is also a meaningful step up from their previous Gen9 units, for those stuck using it.
Linustechtips12#6900xt - Thursday, April 29, 2021 - link
That is not appropriate language, young man.
Cooe - Tuesday, April 27, 2021 - link
Also, RAM tuning on Zen 3 is freaking idiot proof! It'll run at stupid high speeds & low timings with next to no voltage tweaking needed AT ALL. Where the hell are you reading all this crap?? O_oSilver5urfer - Wednesday, April 28, 2021 - link
What are your DRAM speed and timings? Also, I know that with Zen 3, DRAM latency doesn't matter that much due to the improved cache layout.
Claiming such with absolutely no numbers is as useful as toilet paper. DRAM tuning on Zen needs the ZenTimings software and the undocumented BS terms and their numbers; Intel has datasheets for LLC and other points, while AMD has none, period. The only data points for AMD are on OCN and Reddit, where you find the BS numbers people post, run CTR, and try to tune with tons of OCCT and other load/stress testing. Want to get CO stable? Run P95 per-core testing; this is even more important given the previous RMA/WHEA issues. This is all public knowledge from OCN and Reddit.
Whereas offering nebulous lines and the typical defense of AMD is cheesy at best, and worse besides.
duploxxx - Wednesday, April 28, 2021 - link
Why would you tune DRAM beyond default settings when the Ryzen CPU already beats all the Intel stuff by default? Easy plug-and-play memory fit: just buy 3xxx-ish memory. The time when massive gains from memory and CPU tuning were possible is long gone. Gains are still possible if you want:
https://www.youtube.com/watch?v=9IY_KlkQK1Q
running x570 - 3700x - 2*16GB Crucial @3600 16-18-18-38-1T
What about the 10900 vs 5900 review?
https://www.techspot.com/review/2132-amd-ryzen-590...
It's faster in any general task you can imagine by at least 20%; gaming is on par (mainly because of the old, less CPU-demanding games in the test suite), but it consumes 25% less power. There is still a CPU upgrade on its way (Warhol), and motherboards with PCIe 4 have been available for a LOOONG time at normal pricing.
A no-brainer choice vs. a dead Intel platform.
But as usual on any AMD post, you are smoking some fine INTEL wheat...
Silver5urfer - Wednesday, April 28, 2021 - link
I don't even have a new Intel PC here. I'm in the market to buy, weighing what to buy and what to expect, down to the Intel LAN port problems and the AMD USB issues.
Intel LGA1200 is dead and so is AM4. Warhol is a rumor; how can anyone expect a rumor to be a proper future for the AM4 platform? Did you read the AT review of any of the Intel CPUs? They mention both platforms are dead ends; the only difference is AM4 has a 16C32T option, while Intel tops out at 8C on PCIe Gen 4 and 10C on Gen 3.
I mentioned above that the 5900X is the processor that beats the 10900K and where I would spend my money, and that is expected with 2C4T more; if it didn't, Zen 3 would have been a failure.
DRAM tuning is not dead on either platform. Gear 1 on RKL has a hard cap (vs. CML, which has no such issues), and Zen 3 parts have solid DRAM tuning too, even that 2000 MHz FCLK tuning for the few lucky IMCs. On both platforms it helps improve performance, depending on the user's budget and needs.
And why would you run 3600 on that dual-rank kit with its poor-spec 18 tRCD? X570 doesn't even have 2-DIMM mobos like the Dark and Apex. A 3600C15 single-rank B-die kit would easily beat that DRAM on Ryzen 5000. Ah, you don't even know, and you think that's solid-spec DRAM. Got it.
Spunjji - Wednesday, April 28, 2021 - link
"DRAM tuning on Zen needs Zentimings software"Oh noes, doing unnecessary tweaking is made easy with free software, how terrible
Spunjji - Wednesday, April 28, 2021 - link
"the end user has to scour over the reddit and other forums to get those numbers and voltage readings with tons of software to tune even to DRAM"This has always been the case for RAM tuning, what are you even on about?
eva02langley - Tuesday, April 27, 2021 - link
Don't bid against AMD, you are going to get burned...
Unashamed_unoriginal_username_x86 - Wednesday, April 28, 2021 - link
What if I'm bidding on Arm? (Not that I could outbid Nvidia, of course.)
eva02langley - Wednesday, April 28, 2021 - link
1. ARM will not be sold to Nvidia.
2. ARM custom silicon can be a major security concern in the data center.
3. Data center ecosystems are closed, and compatibility can be an issue even from one x86 uarch to another (i.e. Intel to AMD).
4. AMD/Xilinx could reshape the norm in terms of custom enterprise silicon: FPGAs, microcontrollers, DSPs... with CPUs on SoCs, while ARM is just ARM.
mode_13h - Wednesday, April 28, 2021 - link
> ARM custom silicon can be a major security concern in the data center
ARM in the datacenter is already a fact. Amazon is scaling it up in AWS at a stunning pace. Microsoft reportedly has an ARM server chip in the works, as does Google.
> AMD/Xilinx could reshape the norm in terms of custom enterprise silicon: FPGAs, microcontrollers, DSPs... with CPUs on SoCs, while ARM is just ARM.
ARM's new interconnect fabric can do this, as well. ARM's entire business model is helping customers craft customized solutions.
GeoffreyA - Wednesday, April 28, 2021 - link
I'm happy to see AMD in this position. Back when they were down, I never failed to believe in them and waited for the day they'd knock Intel. Well done! Keep up the fantastic work.
Spunjji - Wednesday, April 28, 2021 - link
I just wish I'd been able to afford to buy AMD stock back in the 'dozer days!
eva02langley - Wednesday, April 28, 2021 - link
It is pointless to make such a wish. It is like saying you should have bought bitcoin... or Nvidia... or Tesla... or Gamestop... or BB... or whatever... just move on.
mode_13h - Wednesday, April 28, 2021 - link
> bitcoin, Tesla, Gamestop
The crypto bubble is gonna crash. Crypto won't completely go away, but I wouldn't be buying into bitcoin, right now.
Tesla is priced beyond perfection. Reality will eventually catch up.
Gamestop never made a whole lot of sense. For every time there's a Gamestop scenario, there are probably 100 companies that didn't come back. If you're into gambling, go ahead and bet on a Gamestop. Professional investors usually stay well clear of stuff like that.
mode_13h - Wednesday, April 28, 2021 - link
I wish I'd thought to, but Intel still seemed almost unstoppable, back then. For AMD, the best part of their product portfolio was their GPUs.
mode_13h - Wednesday, April 28, 2021 - link
They should really buy a machine learning ASIC company. If they're serious about being a player in that space, then they can't just keep trying to use their GPUs (which have indeed come a long way, considering the MI100's new Matrix cores, but still...).
Targon - Wednesday, April 28, 2021 - link
You may have missed the whole Xilinx purchase, then. FPGAs combined with AMD CPU and GPU technologies are going to surprise people with the synergies that come from them. Unlike Intel, which only saw adding FPGAs as a way to diversify, AMD seems to understand the idea of combining multiple products that used to stay in their own areas so that the resulting product is better than just the sum of its parts.
Don't be surprised if not only machine learning comes out of the Xilinx merger, but also other things that most people won't understand until the fusion of the technologies has been around for a few years.
mode_13h - Wednesday, April 28, 2021 - link
Thanks, but I most certainly did not miss it.
FPGAs are useful for certain embedded use cases and as datacenter "glue", but they're no substitute for the power and efficiency of a purpose-built ASIC. FPGAs can't outperform Nvidia GPUs on machine learning, much less dedicated ASICs. They can't even deliver better perf/W on modern networks!
The reason for this is simple. When you have something like a tensor core, it does the bulk of the computation with the efficiency of a dedicated ASIC, due to the amount of hardwired logic it contains. An FPGA, with discrete multipliers and adders that have to be linked together over longer distances, just can't compete on timing or power efficiency. You can even stuff it with DSP cores and still watch them lag behind what Nvidia's tensor cores can do.
And where ASICs really come into their own is that they ditch the graphics hardware, coherent caches, and 64-bit floating-point hardware of GPUs, often replacing them with more on-chip storage and more dataflow-optimized architectures.
If AMD doesn't get into the machine learning ASIC game, they're going to be stuck on the sidelines. The MI100 is a good chip, but it still devotes too much silicon to HPC.
I'm not saying they should run out and buy Cerebras, but it's not far off from where their heads should be at.
Farfolomew - Wednesday, April 28, 2021 - link
These are good numbers for AMD, and I'm not trying to be too gloomy here, maybe just pointing out an aberration. But AMD's semi-custom gross this last quarter is a billion dollars more than in 20Q1. If most of that is due to the consoles, and they've shipped most of their console chips already, will that revenue dry up and return to lower numbers?
Their client and business revenue went from 20Q1's $1.4B to 21Q1's $2.1B; what is that, a 50% increase? Considering the chip shortage, COVID, their existing momentum, the release of Zen 3, etc., I guess I'm not super impressed by that percentage increase; I thought it would be a lot more, especially for a small company on the verge of becoming big.
mode_13h - Wednesday, April 28, 2021 - link
I think you yourself cited the reason it's not greater: chip shortage. AMD can't suddenly get more wafers than it already ordered and it can't just jack up prices -- it has to honor existing supply contracts and doesn't want to create bad blood with OEMs by squeezing them upon renewal.
Farfolomew - Thursday, April 29, 2021 - link
Well, I mentioned COVID as a positive for AMD; people bought more electronic gizmos and stayed inside, etc. But yeah, it definitely seems to have impacted supplies too.
What I really want to know is how well AMD would do if they weren't supply-limited, or severely supply-limited. How many sales to OEMs, big companies, end users, etc., are they leaving on the table because they can't make enough product?
mode_13h - Thursday, April 29, 2021 - link
It's a good question, but it applies to almost everyone, right now!
Farfolomew - Friday, April 30, 2021 - link
It applies to everyone a little bit, sure. But AMD was always going to run into a massive bottleneck in 2020, with the new consoles, new Radeon, new mobile Ryzen (4000 parts), and new Zen 3 all being produced on TSMC 7nm. This logjam was coming well before COVID hit.
Ian recently did a piece in which he estimated AMD needed to buy ~60k wafers from TSMC just to produce the PS5 chips alone. Add the Xbox SoCs and that's well over 100k wafers. If demand for those dries up and they can divert those wafers to Ryzen/Radeon silicon, it will be very exciting to see how it affects the market vis-à-vis Intel and Nvidia product costs.
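For a sense of scale, here's a back-of-envelope sketch of what ~100k wafers could yield. Every input below (die size, defect density, the simple Poisson yield model) is an assumption for illustration, not a reported figure:

```python
import math

# Rough console-APU output from an assumed wafer count.
WAFER_DIAMETER_MM = 300       # standard 300 mm wafer
DIE_AREA_MM2 = 300.0          # assumed console APU die size
DEFECTS_PER_MM2 = 0.001       # assumed 7nm-class defect density
WAFERS = 100_000              # wafer count from the comment above

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
gross_dice = wafer_area / DIE_AREA_MM2                   # ignores edge loss
yield_rate = math.exp(-DEFECTS_PER_MM2 * DIE_AREA_MM2)   # Poisson yield model
good_per_wafer = gross_dice * yield_rate

print(f"~{good_per_wafer:.0f} good dice/wafer -> "
      f"~{good_per_wafer * WAFERS / 1e6:.1f}M console APUs")
```

Under those assumptions you get on the order of 170 good dice per wafer, or roughly 17 million console APUs, which shows just how much 7nm capacity the consoles were soaking up.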
mode_13h - Saturday, May 1, 2021 - link
Is it AMD buying them, or Sony & MS? Either way, I doubt it comes out of some "AMD allocation".
yeeeeman - Thursday, April 29, 2021 - link
AMD is in a weird situation where they are too small for such a big success.
Linustechtips12#6900xt - Thursday, April 29, 2021 - link
Nah, it's just circumstantial, or however you spell that.
mode_13h - Thursday, April 29, 2021 - link
If they re-invest it wisely and make smart acquisitions, they'll do fine.
Farfolomew - Friday, April 30, 2021 - link
I kind of agree; they have arguably the best product in the market, but can't produce it. Imagine if Apple touted their latest and greatest iPhone with world-class silicon, but no one could buy it. Say what you will about Apple and Intel, but I never worry about not being able to buy their products.
mode_13h - Saturday, May 1, 2021 - link
> they have arguably the best product in the market, but can't produce it.
Uhhh... yes they sure can! What they're struggling with is keeping up with demand. That's different, even if it feels the same to a would-be customer.
> Say what you will about Apple and Intel, but I never worry about not being able to buy their products.
Have you been under a rock? Over the past few years, Intel has had massive availability problems off-and-on, and from top-to-bottom of their product stack! At one point, Intel's Cascade Lake Xeons were so backordered that they actually told their sales channel to recommend customers buy AMD EPYC!! That's insane!
By comparison, Apple's availability hiccups are more minor and limited, but their products have gotten backordered at launch, on several occasions. I'm not an Apple fan, so it could be the case that they've gotten those issues all sorted out, but it's also the case that a new iPhone model is no longer a must-have, the way they once were.
Tom Sunday - Tuesday, May 11, 2021 - link
A few of us last year dumped a lot of our 401Ks, RMDs, and cobbled-together cash lying around into AMD. We cashed in big time, thanks in part to the many AMD fans spread around the many tech sites. Happy times, virtually doubling our money in about 8 months. Now I would hate to see AMD stock going over $100 a share, which I believe will be a hard reach, with Alder Lake coming to the 2021 theater soon, given the lingering AMD product maturity, technical, and badly executed production problems. Wall Street is already calling them insurmountable AMD headwinds. With this, I wonder if Intel's 2022 fourth-quarter earnings report will afford us another grab at quick cash? Or, for me, a low-mileage cash grab for a Mercedes AMG GT and another Breitling Super Chronomat? Hope that my ship will come in again!
mode_13h - Tuesday, May 11, 2021 - link
Enjoy feeling like a genius, because you won't be right every time. I'm glad the "few of you" didn't lose big chunks of your 401(k)'s.
As you can probably guess, I've lost money buying individual stocks. However, the main thing that taught me is that I don't have the patience or passion to develop a genuine aptitude for it. And while aptitude is no guarantee of success, the lack of it is a recipe for failure.
Oxford Guy - Wednesday, May 5, 2021 - link
Not difficult when you're backed by Sony and Microsoft (console scam) and barely have to compete in one of your two major markets (GPUs) due to a situation of extremely inadequate market competition (only two players in the non-APU bracket).
mode_13h - Wednesday, May 5, 2021 - link
Sounds like someone is butt-hurt cuz his PS5 is still back-ordered.
Oxford Guy - Thursday, May 6, 2021 - link
You may want to read my comments prior to concocting strange claims.
mode_13h - Thursday, May 6, 2021 - link
I know this is going to sound crazy, but maybe you could simply *tell* us why you're upset. Then, we wouldn't make assumptions and might even show a little empathy.
Qasar - Monday, May 10, 2021 - link
Or better yet, post proof of this console scam BS.
britney36 - Sunday, May 2, 2021 - link
Both are overclockable with Precision Boost 2 technology on the whole core and XFR 2 maximizes the maximum clock ceiling with well-cooled systems. These 2 APU lines are targeted at mainstream users with very affordable prices, do not need to invest in additional discrete graphics cards but can still play well with many online games at medium - high graphics. Many Ryzen Mobile laptops have been launched such as Acer Nitro 5, ASUS K series, Dell Inspiron 7000, 5000 series with AMD versions, Lenovo Ideapad 330s, ThinkPad A285... These machines are not very high, they seem to be only sold by brands in certain markets.
https://bubbleshooter.io/
mode_13h - Monday, May 3, 2021 - link
Spammer.
Oxford Guy - Wednesday, May 5, 2021 - link
Hooray. AMD will have more money to buy wafers to make consoles to compete directly against the PC gaming platform.
It's nice to see that both AMD and Nvidia can profit so well by not selling GPUs to PC gamers.
Oxford Guy - Wednesday, May 5, 2021 - link
'It's nice to see that both AMD and Nvidia can profit so well by not selling GPUs to PC gamers.'
But mention to gamers that the only way things will get better for them is if they choose to help themselves, by putting their money into a company that actually wants to sell GPUs to PC gamers, and you get excuses seventeen miles high, including claims that it would be easier to create the Google empire from scratch than to create a GPU company that's actually favorable toward PC gaming.
Oxford Guy - Wednesday, May 5, 2021 - link
'AMD will have more money to buy wafers to make consoles to compete directly against the PC gaming platform.'
And, really... I should always take the time to mention Nvidia pushing the Switch, another parasitic, redundant walled garden. It's just not as obviously terrible as what AMD is doing, since the Switch, at least, has the weak excuse of being based on a somewhat more novel form factor (not that the PC gaming platform can't do everything it does, plus ship with non-defective joysticks).
Oxford Guy - Wednesday, May 5, 2021 - link
'non-defective joysticks'
Hruska did an article about how all three 'consoles' use the same low-grade, drift-prone plastic joystick mechanism. This is one of the joyful benefits consumers get for paying to keep the console scam going. Who doesn't want to buy products like the Switch, with expensive joysticks that are defective, because the companies selling them have redundant, parasitic walled software gardens that make it impossible for competitors to produce better-quality products (like what would be available if the 'three consoles' were part of the legitimate PC gaming platform, which should be based on Linux and OpenGL + Vulkan, not Windows)?
mode_13h - Wednesday, May 5, 2021 - link
Please show me evidence that AMD is buying console wafers, rather than Sony and MS.
Oxford Guy - Thursday, May 6, 2021 - link
Sony and MS are using their own non-x86 CPUs and their own non-AMD GPUs in those 'consoles' -- CPUs and GPUs that are made from 200mm wafers on old nodes. So, yes... your rebuttal has been 100% effective.
mode_13h - Thursday, May 6, 2021 - link
You confuse a rebuttal with a request for evidence. I don't know for a fact that AMD isn't buying the wafers. People seem to believe they are, but I wonder if that's merely an assumption or whether we have supporting evidence.
As for your points, what I think you're not taking into account is that MS and Sony are engaging AMD as a design contractor or an IP house, much like ARM. As such, both MS and Sony have ownership rights to the IP for their console APUs. MS famously got burned when one of their previous suppliers (forget if it was Intel or Nvidia) decided to stop production on one of the XBox chips before MS was ready to stop selling the console.
Finally, both MS and Sony are existing customers of TSMC (and other fabs) and have the size and supply-chain management experience to deal with TSMC, themselves. It's entirely plausible that's exactly what they're doing.
fyipc - Sunday, May 9, 2021 - link
AMD Ryzen 9 5900X (Zen 3) CPU Review
https://www.fyipc.com/cpu/24816/amd-ryzen-9-5900x-...