65 Comments
romrunning - Monday, February 11, 2019 - link
It may just be me, but all of the links on the "Pages In This Review" list at the bottom of the main page simply return me to the main page.
romrunning - Monday, February 11, 2019 - link
But the drop-down to the specific page works as expected.
evilspoons - Monday, February 11, 2019 - link
It's definitely not just you. I spent a few tries wondering what I was doing wrong, and re-read the start of the article, until I tried the drop-down menu instead of the links.
Ian Cutress - Monday, February 11, 2019 - link
That's my fault, as the hyperlinks need to be manually added. I had messed up the part of the URL after the /show/13945. It should be fixed now.
Kevin G - Monday, February 11, 2019 - link
I noticed this as well.
IGTrading - Monday, February 11, 2019 - link
Thank you, Ian, for a good review.
I completely agree with the conclusion that the 2300X makes perfect sense, but the 2500X is harder to place in the picture...
On the other hand, despite the 2400G and the 2500X having the same TDP, if I look at the full-load power consumption graph I can clearly see that the latter has a very generous thermal limit, compared with the 2400G, where the thermal envelope seems to be very strictly limited.
Meaning OEMs will probably be able to use the 2500X for cheaper gaming systems where auto-overclocking is used as a feature, and AMD will thus be able to offer something better for a lower price.
This also allows AMD to push AM4 harder in the market, giving itself the opportunity to offer future upgrades to AM4 buyers.
So the 2500X will show considerably better performance than the 2400G despite the similar configuration (minus the iGPU), while not cannibalizing the 2600 or the 2400G.
If AMD manages to sell more 2500X chips through OEMs, it also builds a future upgrade market for itself, unlike Intel, which will likely push buyers into purchasing new machines.
dromoxen - Monday, February 11, 2019 - link
People buying these CPUs are not the sort to be upgrading the CPU; to most, the computer is a closed box and is upgraded as a whole. I do wonder where all these cores are going. I mean, it's great to have 4, 6, or 8 cores with another 8 hyperthreads, but who is using all that power? Let's make 4 cores the absolute limit, unless you have a government permit to purchase more.
GreenReaper - Monday, February 11, 2019 - link
Browsers have been getting a lot better at using multiple cores, and websites surely do enough in the background nowadays to justify the effort.
RadiclDreamer - Tuesday, February 12, 2019 - link
Why would there be any limit on how many cores? What's it to you that I want to transcode movies faster, or multitask more, or anything else? And a government permit to have more? That's just insane.
kaidenshi - Tuesday, February 12, 2019 - link
He's trolling like he always does. Anything to get under someone's skin enough to get a reaction out of them.
Le Québécois - Monday, February 11, 2019 - link
Ian, any reason why, more often than not, you seem to "skip" 1440p in your benchmarks? It's only present for a few games.
Considering the GTX 1080, your best card, is always the bottleneck at 4K, as your numbers show, wouldn't it make more sense to focus more on 1440p instead?
Especially considering it's the "best" resolution on the market if you are looking for high pixel density yet still want to run your games at playable levels of fps.
Ian Cutress - Monday, February 11, 2019 - link
Some benchmarks are run at 1440p. Some go up to 8K. It's a mix. There's what, 10 games there? Not all of them have to conform to the same testing settings.
Le Québécois - Tuesday, February 12, 2019 - link
Sorry for the confusion. I can clearly see we've got very different settings in that mix. I guess a more direct question would be: why do it this way and not with a more standardized series of tests?
A follow-up question would also be: why 8K? You are already GPU-limited at 4K, so your 8K results are not going to give any relevant information about those CPUs.
Sorry, I don't mean to criticize, I simply wish to understand your thought process.
MrSpadge - Monday, February 11, 2019 - link
What exactly do you want to see there that you can't see at 1080p? Differences between CPUs are going to be muddied due to approaching the GPU limit, and that's it.
Le Québécois - Tuesday, February 12, 2019 - link
Well, at 1080p you can definitely see the difference between them, and exactly like you said, at 4K it's all the same because of the GPU limitation. 1440p seems more relevant than 4K considering this. This is, after all, a CPU review, and most of the 4K results could be summed up as "they all perform within a few %".
neblogai - Monday, February 11, 2019 - link
End of page 19: the R5 2600 is really 65W TDP, not 95W.
Ian Cutress - Monday, February 11, 2019 - link
Doh, a typo in all my graphs too. Should be updated.
imaheadcase - Monday, February 11, 2019 - link
I'm on my phone on AT and can truly see how terrible the ads are now. AT is straight up letting scam ads be served now because it's desperate for revenue. 😂
PeachNCream - Monday, February 11, 2019 - link
Is there a point in even mentioning that, given how little control they now have over advertising? Just fire up the ad blocker or visit another site and let the new owners figure it out the hard way.
StevoLincolnite - Tuesday, February 12, 2019 - link
AnandTech had malware/viruses infect its userbase years ago via crappy adverts.
That was the moment I got an ad blocker, and that is the moment from which I will never turn it off again.
Daeros - Monday, February 11, 2019 - link
Your Intel bias is showing again, Ian. You've pitted a very nice selection of midrange processors from AMD against some very nice, almost-double-the-price chips from Intel. If you're going to include the i5-8400 and i5-8600K, why not the R5 2600X or R7 2700? They're price-point competitors. But then, Intel wouldn't be at the top of the charts in almost any of the tests, would they?
Ian Cutress - Monday, February 11, 2019 - link
All the data is in Bench for those parts. I mention repeatedly (as I did in our buyer's guide) that Intel doesn't really have anything competitive from 8th/9th Gen in the $120-$200 range. I put in some parts that at least offer thread parity, as explained on page one of this review, if you read that far. But then again, Intel's 8th Gen chips are priced well above their usual price right now.
Subsequently, your data bias is showing. It's not about being at the absolute top of the graph. It never has been. It's about competing with what's around you, with some context on either side from major competitors. If you want to compare higher-priced parts against higher-priced parts, then there's either a benchmark database to look at, or the corresponding reviews for those chips.
Quite apart from all of which, most of my analysis compares the AMD parts to other AMD parts, because they're not sold at retail, and considers where they would fit in if they were. That's one of the major points of this review.
c4v3man - Monday, February 11, 2019 - link
Is AnandTech trying to acquire an Intel i3-8100 processor for testing? This would seem to be a fairly natural comparison point for these processors at its $117 customer pricing level. Granted, you can approximate the results from the i3-8350K and assume it's roughly 10% slower, but having actual numbers would be preferred over manual re-calculations.
HStewart - Monday, February 11, 2019 - link
What about the i5-8400T? According to ARK it's priced at $179, which is in the price range you stated.
https://www.intel.com/content/www/us/en/products/p...
The big difference is that it does not have Hyper-Threading; being 6 cores without Hyper-Threading, it could be a serious competitor to the Ryzen 5 2500X, though it does have a lower max frequency than the normal 8400.
Korguz - Tuesday, February 12, 2019 - link
HStewart... that price could be an Intel suggested price, or the tray price...
HStewart - Tuesday, February 12, 2019 - link
It is the price on Amazon, and it's selling out.
https://www.amazon.com/Intel-CM8068403358913-Core-...
MattMe - Tuesday, February 12, 2019 - link
@Ian - Whilst not quite as militant as some other forum users, I do agree that the testing and comparisons you have used here are not the most appropriate or useful. A similarly priced Intel CPU like the i3 would demonstrate competitive value in the marketplace. If we are including the more expensive Intel CPUs (because of their similar thread count, which I understand) then the graphs should have the equivalently priced AMD alternatives, again to help consumers understand the value proposition from both sides.
Regarding the games/GPU options, I feel the testing you have carried out is useful, and although it's unlikely these CPUs would be paired with such a high-end GPU, we are at least ruling out the GPU being the limiting factor until reaching 4K, where your graphs demonstrate that the CPU is no longer the bottleneck. Without doubling the number of tests and data presented in the articles, I feel you've presented the most useful benchmarks and information. You'll never please everyone, I suppose.
Overall I think this is another fantastic write-up and appreciate the effort you put into the research and testing, but I can understand some people's frustrations when it comes to the comparisons you have chosen to demonstrate.
mikato - Thursday, April 4, 2019 - link
Well said"If we are including the more expensive Intel CPUs (because of their similar thread count, which I understand) then the graphs should have the equivalently priced AMD alternatives, again to help consumers understand the value proposition from both sides."
Phynaz - Monday, February 11, 2019 - link
Typical AMD - Hot and Slow
formulaLS - Monday, February 11, 2019 - link
Typical Phynaz: quit the forums and said he wouldn't be coming back, and ended up flat-out lying about it. Grow up, dude.
Korguz - Monday, February 11, 2019 - link
Phynaz: better than the typical Intel... overpriced, and not much gained.
MDD1963 - Monday, February 11, 2019 - link
How many folks with GTX 1080s would be using either of these CPUs tested (even if they were for sale)? :)
Allan_Hundeboll - Tuesday, February 12, 2019 - link
Gamers on a budget.
mikato - Thursday, April 4, 2019 - link
Someone who decided to start gaming, or changed to a game that required more graphics power, so they bought a graphics card. Maybe a kid whose parents bought a computer, or a hand-me-down computer. Even I have been gaming and building computers a long time, and I have upgraded the graphics card on my computers several times around the middle of a system's lifetime (I keep them pretty long). I have friends that play WoW and needed to upgrade. There are plenty of situations.
Have you seen gaming benchmarks with low-end CPUs vs high-end CPUs when both have the same high-end graphics card?
Ethnipod - Monday, February 11, 2019 - link
Wrong power consumption test... the Ryzen 5 2500X gets DDR4-2933 (1.3 V) vs Coffee Lake's DDR4-2667 (1.2 V).
(For the Ryzen 5 2500X, either clock the RAM frequency up to 3200 or downclock it to 2667.)
Thank you, and sorry for my English.
pajuk - Monday, February 11, 2019 - link
Another Intel-biased review. Why didn't you put in the price of the i5-8600K like in other reviews? Maybe because you know that it costs the same as the AMD 2700? Tired of LIARS.
Korguz - Monday, February 11, 2019 - link
Prove it... post some links.
Irata - Tuesday, February 12, 2019 - link
The statement by pajuk is actually not correct: the i5-8600K costs $288.96 boxed and the Ryzen 7 2700 $229.99, both at Newegg, so it's not the same price but about $60 more.
pajuk - Tuesday, February 12, 2019 - link
https://www.pcdiga.com/processador-amd-ryzen-7-270...
https://www.pcdiga.com/processador-intel-core-i5-8...
pajuk - Tuesday, February 12, 2019 - link
You help me even more; these LIARS at AnandTech are as bad as Tom's Hardware.
Korguz - Tuesday, February 12, 2019 - link
Keep in mind, pajuk, the prices AnandTech quotes are in US dollars, I think.
Karthick7 - Monday, February 11, 2019 - link
This has eventually encouraged a lot of others <a href="https://hosting-india.in/best-java-hosting-india/&... WordPress Hosting India</a> to look forward to starting their own WordPress websites.
Ej24 - Tuesday, February 12, 2019 - link
Really would have liked to have seen more Intel 4c/4t and 4c/8t CPUs for comparison, like the 4690K, 6700K, or 7700K. I'm curious how my 4790K stacks up to AMD's Zen+ 4c/8t CPUs, but from the others tested it's hard to say.
Rudde - Tuesday, February 12, 2019 - link
Visit Bench?
BlackSwan - Tuesday, February 12, 2019 - link
This OEM version is already available for retail purchase here in Russia:
https://www.regard.ru/catalog/tovar304279.htm?ymcl...
BlackSwan - Tuesday, February 12, 2019 - link
https://www.regard.ru/catalog/tovar304288.htm
2700E
The_Assimilator - Tuesday, February 12, 2019 - link
AMD's CPU naming scheme is a bit of a mess now. It used to be that Ryzen 7 = 8c/16t, 5 = 6c/12t, 3 = 4c/4t, but now we have 4c/8t parts mucking up the 5s. IMO they should reorder their lineup by core and thread counts by moving the current Ryzen 3 to 1, and the 4c/8t CPUs from Ryzen 5 to 3.
End result: Ryzen 7 = 2700/X, Ryzen 5 = 2600/X, Ryzen 3 = 2500X/2400G, Ryzen 1 = 2300X/2200G.
silverblue - Tuesday, February 12, 2019 - link
Not really, given that the Ryzen 5 1400 and Ryzen 5 1500X are 4C/8T parts from just after the initial Ryzen launch, so in essence it was messed up to begin with. Also, if we're splitting hairs, Intel used to have HT in its i3 and i7 CPUs...
Smell This - Tuesday, February 12, 2019 - link
Yeah. Chipzilla's naming scheme and product stack is The Greatest... (rolling eyes)
The_Assimilator - Tuesday, February 12, 2019 - link
We don't talk about Intel's lineup and naming scheme... or lack thereof. Down that path lies madness.
Smell This - Tuesday, February 12, 2019 - link
Thanks, y'all!
"Where possible, we will extend out testing to include faster memory modules either at the same time as the review or a later date."
____ _____ _____ _____ _______
It would be sweet with some OC action, too.
Most impressive is the jump from the Ryzen 5 1500X to the 2500X, and from the Ryzen 3 1300X to the 2300X --- roughly 10% either way. Good work, AMD.
I guess this is the difference between Zen and Zen+. With 7nm Zen++ arriving soon, and Zen+++ next year, the CPU times they are a-changin'...
mr_yogi - Tuesday, February 12, 2019 - link
Love the inclusion of the i5-2500K, great job.
Valantar - Tuesday, February 12, 2019 - link
These are both available from Norwegian retailers, though prices are... not good. The 2500X costs as much as the 2600X, and the 2300X is barely cheaper than the 2600 (though admittedly the 2600 is _really_ cheap).
urbanman2004 - Tuesday, February 12, 2019 - link
Reviewing CPUs that'll never reach the mainstream open market. Smart idea, AnandTech 😉
mikato - Thursday, April 4, 2019 - link
Are you saying it doesn't count if they are sold in prebuilt systems?
tygrus - Tuesday, February 12, 2019 - link
AMD APUs (2400G & 2300G) try to keep at least 50% of the power budget for the GPU, so the CPU load graph doesn't show the whole picture. It shows the power used by all chips under CPU load, not CPU+GPU load. While having a mixture of CPU-only and CPU+GPU chips present means you want to focus on the CPU, the reader needs to be reminded that the CPU+GPU load will be higher.
I wish AMD had an option for a 95 W TDP APU to compete with Intel's models, with more CPU cores/headroom and 25% more GPU to use that 95 W+ peak.
azrael- - Wednesday, February 13, 2019 - link
One reason to favor the 2300X and 2500X over the 'G' series CPUs is that Pinnacle Ridge supports ECC whereas Raven Ridge does not.
Icehawk - Wednesday, February 13, 2019 - link
I'm begging here: can you please, please, please show us your config settings for the HEVC encoding? You get rates that are 6x+ faster than I can achieve; my overclocked 8700K gets ~45 fps with the 1080p Fast 3500 settings, using all else as default in HandBrake. I'd really love to hit the numbers you get with just an i5. Help!
Ian Cutress - Friday, February 15, 2019 - link
Check page 3?
https://www.anandtech.com/show/13945/the-amd-ryzen...
xrror - Wednesday, February 13, 2019 - link
One additional saving for OEMs: they won't need to populate the motherboard components for integrated video on systems shipped with these.
No need for DisplayPort/HDMI/VGA connectors and the associated filtering bits, so that saves a bit more on the total BOM for the OEM.
shticktical - Thursday, February 14, 2019 - link
Ian, why don't you include idle power consumption? And not just in this article, but in all CPU reviews? Systems usually spend 95% of their time idling. That would be very interesting to know and compare. Thank you!
Ian Cutress - Friday, February 15, 2019 - link
Idle power consumption varies wildly when you're talking about zero CPU load, and can be influenced more by the motherboard/system than by the CPU itself. You can check the data in Bench regardless:
https://www.anandtech.com/bench/CPU-2019/2183
RSAUser - Sunday, February 17, 2019 - link
Still disappointed that AnandTech cannot use a simple bar chart plugin like CanvasJS in 2019.
I would love the ability to add information on hover, like the current market price or MSRP, or make it so it fetches the price from the Amazon or Newegg API.
This is especially relevant in these charts, where you're adding Intel processors that are completely out of the price league of the CPUs being reviewed and you're not adding the current price to the chart.
Using a JS library to render the HTML would make those charts mobile-responsive.
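As a rough illustration of the suggestion, a minimal CanvasJS sketch of a hover-enabled bar chart might look something like the code below. The part names, scores, and prices are placeholders rather than figures from the review, the "chartContainer" element and the exact tooltip template are assumptions, and pulling live prices would still require a separate retailer-API lookup that isn't shown here.

```typescript
// Hypothetical sketch: a CanvasJS bar chart with extra info shown on hover.
// Assumes the CanvasJS script is already loaded and exposes a global `CanvasJS`,
// and that the page contains a <div id="chartContainer"></div> to render into.
declare const CanvasJS: any;

// Placeholder data points; `price` is a custom field, not real market data.
const dataPoints = [
  { label: "Ryzen 5 2500X", y: 100, price: "n/a" },
  { label: "Ryzen 3 2300X", y: 90, price: "n/a" },
  { label: "Core i5-8400", y: 105, price: "n/a" },
];

const chart = new CanvasJS.Chart("chartContainer", {
  title: { text: "Example CPU benchmark (placeholder data)" },
  axisY: { title: "Relative score" },
  data: [
    {
      type: "bar", // horizontal bars, similar in layout to the review's charts
      // Custom dataPoint fields such as `price` can be referenced in the tooltip template.
      toolTipContent: "{label}: {y} points (street price: {price})",
      dataPoints,
    },
  ],
});

chart.render(); // the chart reflows with its container, which helps on mobile
```

Whether that is worth trading the site's current static chart images for is a separate question, since client-side rendering adds a script dependency for every reader and still needs a data source for pricing.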
fadsarmy - Monday, June 10, 2019 - link
How does a 65 W CPU (Ryzen 5 2600) draw 77.97 W?