There have been problems in the past where "ghost-silicon" that's disabled still consumes some amount of wattage and still makes it more difficult to overclock past the usual threshold. I don't really know if there would be any overclocking benefits here at all.
Also, while the pricing for this is ridiculous, I think Intel is rightly scared that people would just buy this non-IGP chip instead, as the big majority of gamers will have dedicated GPUs anyway.
Even if you have a dGPU you can use Quick Sync. It's incredibly valuable for a variety of tasks. I've replaced Xeons with i7s just to enable Quick Sync on workstations.
This isn't all that ridiculous. Intel wants to get rid of chips with defective IGPs. Unless a review site shows an OC advantage (and there won't be one, as many posters have already indicated, since the IGP can be disabled in the BIOS and unused silicon sits in deep sleep states), Intel will not sell any of them at prices equivalent to their IGP-enabled offerings.
So they are just wasting their time - bad marketing.
Actually, if you think about it: how many people use integrated graphics on desktop chips, and how much extra power does it use on the system? The only benefit I see in integrated graphics is that if your graphics card goes bad, you still have the integrated one.
Are you really losing features if a person does not use them?
Premiere Pro supports hardware-accelerated H.264 encoding via the integrated GPUs in Intel chips. In some cases using the Intel GPU can halve your export times. Some mini builds may also prefer to have an iGPU and utilize a Thunderbolt 3 GPU dock for its modularity/portability.
There's not a whole lot of pros for the Intel iGPU, but there are some.
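For anyone wanting to try this outside Premiere, ffmpeg exposes the same Quick Sync hardware encoder as `h264_qsv`. A minimal sketch, assuming an Intel iGPU with working drivers (the file names are placeholders):

```shell
# Hardware H.264 encode on the Intel iGPU via Quick Sync.
ffmpeg -hwaccel qsv -i input.mov -c:v h264_qsv -global_quality 23 output.mp4

# Software x264 encode of the same clip, for comparing export times.
ffmpeg -i input.mov -c:v libx264 -crf 23 output_sw.mp4
```

Running both on the same source clip is an easy way to see the export-time difference being described.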
The resale value drops, as you could otherwise sell it to the many people not wanting power-hungry gamer GPUs for their office PC. Granted, most of them have laptops nowadays, but that market is still there.
This is just the clear expression that Intel doesn't care what it costs them to make something. They only care what they can get for it. And they will always take as much money as they can.
Intel, your bad core values show clearly. Next time take off at least a few dollars. At least a token gesture. We are fucking sick of evil companies.
This is just the clear expression that you don't understand how all for-profit companies in the world work. If you are under the illusion that anything you purchase is priced according to its production cost, I suggest you broaden your horizons. While this practice sucks for the consumer, it's hardly an "evil" Intel thing. Every product sold is sold at the highest price the market will bear. The price will lower when the sales figures require it to.
Given that the planetary ecosystem is collapsing due to human "development", and especially the "massively consume and waste" model, I have to disagree.
As for corporate morality... in short: They are amoral by definition. That is in the conventional sense of the word, not "lacking", which the prefix "a" can also mean. They are designed to fleece the masses to enrich the few, which includes all aspects of the masses' resources (including the people themselves and all aspects of the/their environment) – at least those they can lay their hands on (which is pretty much everything, as technology expands its reach).
What few realize is that the corporation is, by definition, an evil conspiracy against the people. That may sound silly but it's a fact. It's also a fact that pale-skinned children went on strike to demand a 55 hour workweek instead of a 60+ hour workweek in the US. Just because the exploitation is more hidden from our citizens today doesn't mean it's not still around. It also doesn't have to be that drastically terrible to constitute exploitation.
Your model of the world that says every company must push the least consumer-friendly model is terribly flawed. Enjoy being stupid so that your mind can rationalize its own flawed makeup. I know plenty of companies that pass savings on to consumers. I know plenty that are taking over their respective fields while doing it. The only reason AMD is catching up with Intel is because Intel was happy to stagnate and abuse consumers for 8 years instead of trying to push the value they can give. Now even the long-held crown of desktop gaming will be challenged by AMD, according to their recent announcements. I expect that to be a best-case scenario, but AMD has been fairly accurate on these things lately, so the gap will be extremely small. Keep rationalizing your fucked up ways while dinosaurs like you continue to die off.
With your name calling and cursing, would you be able to cite one of these companies you reference (preferably a publicly traded one worth a few dollars, to compare apples to apples; I'm not looking for a reference to your friend's Etsy store)? I'm not sure how you draw a line between a company's actual business model and your gross misunderstandings to me "being stupid so that your mind can rationalize its own flawed makeup". I can tell by the emotional response that you're already defensive, so I know that learning isn't going to happen at this point. You suffer from a very closed-minded perspective which completely negates the need for future product development. I'm not sure how many multi-billion-dollar research facilities you have managed the production of, but I'm guessing selling your current products for bottom dollar hasn't left you with much capital to entertain this opportunity. That's a great perspective though: ignore future growth and development potential, ignore securities and unforeseen liabilities, just view each pricing model in a vacuum so simple even Opencg can understand it. Of course Intel has been stagnant. Of course I'm not happy we don't have less expensive, higher-performing chips. Point me in the direction where I can purchase a higher-performing chip in the desktop and pro markets? I'll give them my money instead.
Is it at all likely that the deactivated IGP will allow for higher OC due to better thermal? No IGP also means losing features like Quick Sync encode and video decode support (no Netflix 4K HDR), correct?
Sounds like a terrible chip if there's no additional OC headroom while also losing features.
When you deactivate in bios it is still active circuitry, but traffic is not directed to it. Supposedly with these chips it is fused off entirely which may make a bit more difference. But that is giving up quite a few features for the potential of a few hundred MHz on chips that can typically hit 5GHz+ anyways. I really would have liked to see this happen at a lower price point... and would like Netflix to allow those of us with older but still very capable computers to be able to view 4k HDR content.
>When you deactivate in bios it is still active circuitry, but traffic is not directed to it.
This doesn't make any sense at all. Modern CPUs implement clock gating and deep sleep states, which means this is either not true or there is a major design flaw in every Intel CPU with an integrated GPU.
A few hundred MHz? I think you vastly underestimate how low idle power consumption can be with components in sleep states. My guess is 0 MHz gained, although individual results may vary. It isn't like a disabled IGP is burning away at full TDP.
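The textbook CMOS dynamic power equation P ≈ αCV²f backs this up: a clock-gated block (f = 0) burns no switching power at all, and power gating removes most of the remaining leakage. A toy sketch with purely illustrative numbers (these are not measured Intel figures):

```python
def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    """Classic CMOS switching-power estimate: P = alpha * C * V^2 * f."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

# Illustrative (made-up) numbers for an iGPU block.
running = dynamic_power(0.2, 5e-9, 1.0, 1.1e9)  # clocked at 1.1 GHz
gated = dynamic_power(0.2, 5e-9, 1.0, 0.0)      # clock-gated: f = 0

print(f"running: {running:.2f} W, clock-gated: {gated:.2f} W")
```

Leakage is the only term left once the clock stops, and cutting that is exactly what the deep sleep states are for.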
Netflix 4K works on older CPUs if you have a newer graphics card (NVIDIA Pascal with a minimum of 3GB, or AMD Polaris or Vega). It also requires Win10 with the latest updates and using Edge (confusingly, the Netflix app does not allow 4K).
Duopolies don't constitute a proper level of "capitalist competition". Cartels are the worst-case scenario. Monopolies are second. Duopolies are third. None of them constitute adequate competition.
Why is cartel worse than monopoly? This makes no sense. Cartel = many companies deciding together to screw everyone while making an illusion they don't. Monopoly = single company deciding to screw everyone while making an illusion they don't. By any normal reasonable position, having a single company with that power is MUCH worse than having many.
Duopoly is the "minimum energy" state of capitalism, due to anti-monopoly regulations.
And an "edit" button.
Edit: Just to be clear, I think the Intel IGP is very capable at what it is meant to do - displaying stuff and transcoding video; their multimedia block is state of the art. It can't game, though...
Agreed. It's time for Intel to remove the GPU from the most performant desktop processors. If you are building a performance desktop ... you are not interested in the Intel GPU.
Not true. Many millions of desktops are purchased with only an iGPU because they are used for fairly mundane office tasks. Go check a few of the major OEMs and note how few business class desktops include dedicated graphics. My clerks, data entry clowns, and peon programmers don't need anything but the cheapest hardware to throw an image on a screen or two.
Personally, I do not see where the iGPU is of much use in any of the higher end desktop CPU for most customers. I doubt too many run their 9xxx series CPU with the iGPU rather than a higher end external GPU, so there is really very little (if any) added benefit to it.
For lower-tier CPUs (i3, i5) which are run in office-type systems, having the iGPU makes sense, as it saves money by removing the need for an external GPU.
As for pricing - if the iGPU block is completely removed (i.e. smaller chip), then it should be cheaper as manufacturing cost is lower. If it is simply fused off, then the cost is the same.
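That cost argument can be made concrete with a standard yield model. Here is a sketch using a crude gross-die count and the Poisson yield approximation Y = exp(-A·D0); the wafer cost, die areas, and defect density below are illustrative assumptions, not Intel's actual numbers:

```python
import math

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, defects_per_mm2):
    """Cost per *good* die: wafer cost spread over yielding dies.

    Uses a simple gross-die estimate and the Poisson yield model
    Y = exp(-A * D0). Real foundry math is more involved.
    """
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    # Crude gross-die count; ignores edge loss and scribe lines.
    gross_dies = wafer_area / die_area_mm2
    yield_rate = math.exp(-die_area_mm2 * defects_per_mm2)
    return wafer_cost / (gross_dies * yield_rate)

# Illustrative: a 180 mm^2 die with iGPU vs a hypothetical ~130 mm^2 one without it.
with_igp = cost_per_good_die(9000, 300, 180, 0.001)
without_igp = cost_per_good_die(9000, 300, 130, 0.001)
print(f"with iGPU: ${with_igp:.2f}, without: ${without_igp:.2f}")
```

A physically smaller die is cheaper per good unit on both counts (more dies per wafer and higher yield), whereas a fused-off die costs exactly the same to make, which is the point being made above.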
The 9700K is literally the same chip as the 9900K, only with HT disabled. The other high-end chips are the same silicon as well, only with cores fused off.
No one expects to pay the 9900K price for their 9700K or 9600, as those lack features the first chip has. While the IGP may not be /as important/ as additional cores or HT for most users, it still provides functionality above nothing at all.
I.e., it's still product segmentation and should be priced accordingly, even if it's a smaller cost reduction than some of the other features.
You are completely missing the boat on all of this. These are chips they most likely couldn't sell as their full-fledged brothers, so they are repackaging them and trying to sell them for the same price? Why would they charge the same for what are, in essence, scrap processors?
Further, there are a lot of people who want fast cores plus QuickSync, which is built into the iGPU, for content creation. I am one of them. I can use the processing cores to help with my day-to-day stuff, and the iGPU can do some hardware encoding/transcoding (albeit less effective than software). These F chips are worthless to me. This is a lot of the reason I don't buy AMD's high-end processors.
By that logic, if they didn't sell those "scrap" processors, the price of all the others would be higher. Lower supply = higher price (unless AMD fills the void). But I agree they should be priced a bit lower, which is likely to actually happen at retail, because individual CPU buyers will simply choose the obviously better one if there's no price difference. We'll find out soon enough...
Every generation would have had these failed-iGPU processors, and they didn't sell them. They clearly set their prices assuming these processors would NOT be sold. It is only because of the CPU shortage that they are selling these bad processors, and doing so without any sort of discount.
The balance of supply, demand and competition is how prices get set. Buy whatever you want. Nobody is holding a gun to your head. And let's see what the actual retail prices settle to... This article is click-bait. A dog whistle to Intel haters. Go get an AMD CPU...
This reminds me of a story about an old great-uncle (never knew him personally). He was a salesman and had a small shop, similar to a pawn shop. He had a set of crystal glasses. A lady comes in and wants one glass. She sees one of them has a small chip... "Hey, this glass has a small chip. Can you sell this one to me for 50 cents instead of a dollar?" Uncle picks it up and inspects it... "You're right! This is not acceptable," and he turns around, tosses the chipped glass in the garbage, shattering it, and then raises the price on the rest of the glasses... "Would you like any of the remaining ones?" And she did...
These chips are not just chips with a good iGPU that they are choosing to fuse off. These are chips that would go in the recycle bin because they could not be sold as K chips due to the faulty iGPU. It is a good marketing move to get money for something instead of throwing it away. However, the perceived value is less because there is less function available, so pricing them the same is shameful. Then again, maybe there are not that many and they don't care how many they sell. They will probably sell some. It is the "build it and they will come" philosophy.
@Ian: Thanks for mentioning the debate about whether your comparison between the AMD Athlon 200GE ($55 SEP) and the Intel Pentium G5400 was fair.
I find it interesting that you acknowledge the fact that the comparison may not have been correct from the start, since you are comparing MSRP / street prices to the purchase price per 1,000 units (to which seller mark-up needs to be added).
In this respect alone, it may have been prudent to check actual retail prices before writing the article and adjust the CPU selection accordingly (e.g. replace the G5400 with a 2C/2T Celeron that does have the same retail price as the Athlon 200GE).
Still curious in which region(s) exactly the G5400 hits the Intel 1,000-unit price point (for stores that actually have it in stock)? Judging by the comments in your comparison, it does not appear to be North America, Europe or parts of Asia.
"it may have been prudent to check actual retail prices."
Bingo. Same goes for this article as well. It's just click-bait. Retail prices will be higher, but will differentiate according to demand once end users can buy them.
It is somewhat irritating that the information about the lack of reliable relationship between Intel's 'tray price' and the actual average street price is acknowledged here... But not in the article where it would have been relevant. Why so? It is rather common to update existing articles when a new, relevant bit of information shows up, so why wasn't that done?
And, like you, I really wonder where in the world is the G5400 available at 1ku prices. I can't help noting that this was already asked in the "G5400 vs 200GE" article comments, that Ian obviously read them, and didn't seem to be willing to answer, or even acknowledge the question.
So they are milking the fact that they have shortages and they likely don't have that many dies with defective GPUs.
Might be an interesting Q1 in DIY, as GPU and memory prices are likely driving sales well above normal seasonal patterns. It would be wiser to wait for Zen 2, but it doesn't seem that folks are doing that just yet.
While it is a very divisive decision on Intel's part, and while the words I want to use are not allowed, the worst part here is that Intel's pricing strategy makes sense to me on a business level. The intrinsic value of the processor is in its performance, core count and differentiation factors (slim as they may be - higher per-core IPC etc.), and from a business point of view they could potentially harm the perceived value of the 9th Gen processors by having two equally performing chips at two differing price levels. People that paid full price for the full-fat chip would feel done in, as they did not use the integrated graphics anyway. At the same time, users would feel they deserve the discount, as their chip has fused-off graphics cores they will never use. Again, the businessman's point of view is that everybody forgets that even if the graphics section of the chip doesn't work, Intel pays per wafer and has to maximise return; they do not get a discount when cores or graphics don't work as intended, they just have to deal with the yields. So while pricing the products the same may be a prudent business move to protect equity, branding value and product perception, it will not win them any friends or awards. So again, it is divisive to say the least, but it is a smart business move on their part.
However, I am a consumer first and foremost, and !#@)*!&^@# Intel, you are disabling things on the actual processor so that you can harvest dies, meaning the processor has something less, and if you are harvesting dies to save money, at least charge us less for a slightly crippled CPU. That is just blatantly fleecing the consumer, and I hope the Ryzen 3 launch has some epic stuff to make you cringe.
Intel has margins no one else in the industry can even dream of. And yields on a now 4-year-old process should be superb, so all the ranting about brand perception and equity and whatnot, while having merit, doesn't change the fact that Intel is just capitalizing on the CPU shortage and selling crap they wouldn't have dared to sell in other times, like the cockroaches they are.
If Intel was wise, they would have increased the turbo by a notch if they were keeping the prices the same. At stock base and turbo clocks, the benefit of the F suffix can be obtained on non-F chips by disabling the IGP in firmware. The main benefit is simply more time at higher turbo ratios under load. That difference isn't going to be that great as single and dual core turbos can be maintained on the non-F chips for extended periods of time given a good air cooler. All core turbo is where the real benefit would be had but the all core turbo doesn't have as high of a boost compared to the base clock. The gains will be small, especially if the IGP is disabled on the non-F chips.
The other variable is of course cooling so the differences between the F and non-F chips might diverge as cooling becomes more aggressive. However, once you get into custom water loops chances are the users will also be overclocking it and generally savvy enough to disable the IGP since they more than likely have a discrete GPU too. Worthy of testing but I wouldn't expect much difference between the two chips.
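The "more time at higher turbo ratios" intuition can be sketched with a simplified version of Intel's power-limit scheme: the chip may draw up to PL2 until an exponentially weighted average of package power reaches PL1, with time constant tau. Under a constant draw P the average follows P·(1 - exp(-t/tau)), which gives a closed-form turbo duration. All numbers below are illustrative assumptions, not published specs for these SKUs:

```python
import math

def turbo_duration(pl1_w, pl2_w, tau_s):
    """Seconds a constant PL2 draw can be sustained before the moving
    average of package power reaches PL1 (simplified PL1/PL2/tau model)."""
    if pl2_w <= pl1_w:
        return math.inf  # draw never exceeds the sustained limit
    return -tau_s * math.log(1 - pl1_w / pl2_w)

# Illustrative: 95 W PL1, 120 W PL2, tau = 28 s,
# and a hypothetical ~2 W saved if the fused-off iGPU draws nothing.
base = turbo_duration(95, 120, 28)
saved = turbo_duration(95, 118, 28)
print(f"turbo time: {base:.1f}s vs {saved:.1f}s with 2 W less package draw")
```

Under these assumed numbers, a couple of watts buys only a couple of extra seconds at turbo, which supports the "don't expect much difference" conclusion.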
Isn't the more interesting news here the 9400 and 9350KF? While the 9400 is only a slight speed bump over the 8400, it costs the same as the 8400 and is a line that larger OEMs (such as Dell) actually sell in bulk in the $600 price range for the system. Before this, you could only get the 9th generation from Dell by going the ultra-expensive Alienware or XPS Special Edition routes. As for the 9350KF, it allows for a 15% turbo boost over the 8350K for just $5. Seems like a nice choice to have. As for charging the same for a feature that people who read tech articles generally don't use, I don't think the extreme complaints are justified ("worthless", "ridiculous", etc.).
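For reference, that 15% figure presumably comes from comparing the i3-8350K's fixed 4.0 GHz clock (8th-gen i3s had no turbo) against the i3-9350KF's 4.6 GHz max turbo. A quick check of the arithmetic:

```python
# Commonly listed clocks: i3-8350K runs at a fixed 4.0 GHz (no turbo);
# i3-9350KF adds a 4.6 GHz max turbo on the same 4.0 GHz base.
clock_8350k = 4.0
turbo_9350kf = 4.6

uplift_pct = (turbo_9350kf / clock_8350k - 1) * 100
print(f"peak clock uplift: {uplift_pct:.0f}%")
```

Note this is a peak-clock comparison; sustained all-core clocks under load would be the fairer benchmark.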
So basically Intel is going to try to sell these CPUs, with fewer features, for the same price as the fully featured CPUs. Granted, all they did was fuse off the iGPU so it is not usable, which most buyers do not use anyway except maybe for testing. Yes, there are a lot of people who use the iGPU, but those are mostly on non-K SKUs in more mainstream systems that do not require a dGPU installed.
I am also fairly sure these will not overclock any better, mainly because they are 100% the same CPUs as the ones with the iGPU, except the iGPU has been fused off. They might run a tiny bit cooler if the iGPU is not getting powered up at all, but that will be about the only plus, and even then it might drop the temps 2-3 degrees at most.
I just want to say that there is 100% no way these chips can run cooler or overclock better than their iGPU-equipped counterparts running with the iGPU disabled. The iGPU has its own power plane that is separate from everything else. When you disable the iGPU in the BIOS you are fully powering down the iGPU. It will draw no power at all.
Do not buy a KF series in hopes that overclocking or power draw will improve. It won't and physically they can't.
If the iGPU is disabled by Intel, wouldn't they do this in hardware? Say, by cutting the power to it? If that's the case, then the iGPU wouldn't use any power whatsoever... while disabling it in the BIOS could still use minimal power.
That's OK. When the Ryzen 3000 chips and the new Threadrippers launch, it's going to be fun to watch what Intel does next. I just hope AMD prices them low.
Maybe this is a workaround for supply constraints--anyone building a box with discrete graphics values the IGP near $0 anyway, and this way they can get their orders filled? Still weird!
The suggestion that users are "losing features" is a false framing. You still have the option to buy the SKU with the features that are not present on these SKUs. Really simple stuff.
Here's what you get with a KF SKU that makes it superior to a K SKU: 1) lower power consumption, 2) lower heat output, 3) higher overclock potential.
All for the same price. If you don't need an IGP or QuickSync the KF SKUs are superior.
When a company releases a part that part competes against the rest of the stack.
So, if Corporation X releases Widget A, which contains feature A and feature B and then Corporation X releases Widget B, which contains feature A and not feature B, consumers are losing features if Widget B's pricing is basically the same as what Widget A's was (or still is, in situations where there is limited availability of Widget A).
> I just want to say that there is 100% no way these chips can run cooler or overclock better than their iGPU counterparts with the iGPU disabled. The iGPU has its own power plane that is separate from everything else. When you disable the iGPU in BIOS you are fully powering down the iGPU. It will draw no power at all.
>
> Do not buy a KF series in hopes that overclocking or power draw will improve. It won't and physically they can't.
I wonder if the turbo states will be any different? Base clock and max turbo don't tell the whole story. If the F models can keep more cores at higher states it might make sense.
Leave it to Intel to turn junk (CPU dies with a faulty iGPU) into cash. While they officially don't even use the thermal headroom that the disabled (faulty?) iGPU should provide, the only conclusion left is that these "F" chips also allow Intel to uprate the binning of these chips. Reduce, reuse, recycle, and then, sell them at full price!
Honestly, it is just ridiculous.

+1

Unless there's some benefit for overclocking, then I'm struggling to see ANY advantage here.
No GPU, no Quick Sync; no encoding speed/quality.

Exactly sharath, even if there's a tiny overclock advantage the performance hit is huge... Guess my next CPU is going to be a Zen 2 at this rate!

If you don't need the features the iGPU provides, and use a dGPU... then that is a moot point...

You can use the dGPU though, why bother?

I don't understand why people still buy Intel... for almost any reason... OK, if your code is 40% faster due to some architectural particularity... OK. But other than that??

Even gamers... Those that went AMD can use Zen, Zen 2 and so on. Those that chose Intel... often need to change the motherboard as well.
The move is not ridiculous... selling it at the SAME PRICE is!

The "F" is for FAIL. I'll leave it to you to figure out whether that applies to the silicon or the marketing.
The IGP can be deactivated in the BIOS, so this isn't really a "benefit"; they're just pre-making a decision for you, then charging you for it.
When you deactivate in bios it is still active circuitry, but traffic is not directed to it. Supposedly with these chips it is fused off entirely which may make a bit more difference. But that is giving up quite a few features for the potential of a few hundred MHz on chips that can typically hit 5GHz+ anyways. I really would have liked to see this happen at a lower price point... and would like Netflix to allow those of us with older but still very capable computers to be able to view 4k HDR content.Sahrin - Wednesday, January 16, 2019 - link
>When you deactivate in bios it is still active circuitry, but traffic is not directed to it.This doesn't make any sense at all. Modern CPU's implement clock gating and deep sleep states which mean this is either not true or that there is a major design flaw in every Intel CPU with an integrated GPU.
namechamps - Wednesday, January 16, 2019 - link
Few hundred MHz. I think you vastly underestimate how low idle power consumption can be with components in sleep states. My guess is 0 MHz although individual results may vary. It isn't like a disable IGP is burning away at full tdp.blu3dragon - Wednesday, January 16, 2019 - link
Netflix 4K works on older CPUs if you have a newer graphics card (NVIDIA Pascal with a minimum of 3GB, or AMD Polaris or Vega). It also requires Win10 with the latest updates and using Edge (confusingly, the Netflix app does not allow 4K).
Sahrin - Wednesday, January 16, 2019 - link
The only reason this company is relevant is because you guys keep giving them your money.
AshlayW - Wednesday, January 16, 2019 - link
+1
UltraWide - Wednesday, January 16, 2019 - link
I guess we could have an AMD monopoly? hehe
Sahrin - Wednesday, January 16, 2019 - link
No, that would be just as bad. But Intel isn't getting 50%, they're getting 90%.
Oxford Guy - Wednesday, January 16, 2019 - link
Duopolies don't constitute a proper level of "capitalist competition". Cartels are the worst-case scenario. Monopolies are second. Duopolies are third. None of them constitute adequate competition.
Zizy - Thursday, January 17, 2019 - link
Why is a cartel worse than a monopoly? This makes no sense. Cartel = many companies deciding together to screw everyone while maintaining an illusion they don't. Monopoly = a single company deciding to screw everyone while maintaining an illusion they don't. By any normal, reasonable position, having a single company with that power is MUCH worse than having many. Duopoly is the "minimum energy" state of capitalism, due to anti-monopoly regulations.
Gigaplex - Thursday, January 17, 2019 - link
They'd still be relevant even if it was turned around and Intel only had 10% market share.
TristanSDX - Wednesday, January 16, 2019 - link
Intel confirmed that their graphics is worthless.
Dodozoid - Wednesday, January 16, 2019 - link
Anandtech, please add an "upvote" button.
Dodozoid - Wednesday, January 16, 2019 - link
And an "edit" button.
edit: Just to be clear, I think the Intel IGP is very capable in what it is meant to do - displaying stuff and transcoding video; their multimedia block is state of the art. It can't game, though...
Irata - Wednesday, January 16, 2019 - link
An edit button would be nice - I've made too many typos and copy/paste errors in my posts already, so that would save some embarrassment :)
TEAMSWITCHER - Wednesday, January 16, 2019 - link
Agreed. It's time for Intel to remove the GPU from the most performant desktop processors. If you are building a performance desktop... you are not interested in the Intel GPU.
Kvaern1 - Wednesday, January 16, 2019 - link
Intel graphics is more than good enough for the majority of the PCs in the world. Fact.
HStewart - Wednesday, January 16, 2019 - link
That is true - but these are not mobile chips but desktop chips, and most desktop users buy an external graphics card anyway.
PeachNCream - Wednesday, January 16, 2019 - link
Not true. Many millions of desktops are purchased with only an iGPU because they are used for fairly mundane office tasks. Go check a few of the major OEMs and note how few business class desktops include dedicated graphics. My clerks, data entry clowns, and peon programmers don't need anything but the cheapest hardware to throw an image on a screen or two.
Gigaplex - Thursday, January 17, 2019 - link
Most businesses seem to be moving to laptops instead of desktops when they don't need massive horsepower.
eva02langley - Thursday, January 17, 2019 - link
Like the cheapest piece of trash of a car would be... it is not an argument.
CaedenV - Wednesday, January 16, 2019 - link
Well... Intel really "F"d that one up.
Ironchef3500 - Wednesday, January 16, 2019 - link
Wish this was surprising.
ratbert1 - Thursday, January 17, 2019 - link
I see what you did there.
Irata - Wednesday, January 16, 2019 - link
Personally, I do not see where the iGPU is of much use in any of the higher end desktop CPUs for most customers. I doubt too many run their 9xxx series CPU with the iGPU rather than a higher end external GPU, so there is really very little (if any) added benefit to it.
For lower tier CPUs (i3, i5) which are run in office type systems, having the iGPU makes sense, as it saves money by removing the need for an external GPU.
As for pricing - if the iGPU block is completely removed (i.e. smaller chip), then it should be cheaper as manufacturing cost is lower. If it is simply fused off, then the cost is the same.
Exodite - Wednesday, January 16, 2019 - link
The cost for Intel is irrelevant though.
The 9700K is literally the same chip as the 9900K, only with HT disabled. The other high-end chips are the same silicon as well, only with cores fused off.
No one expects to pay the 9900K price for their 9700K or 9600, as those lack features the first chip has. While the IGP may not be /as important/ as additional cores or HT for most users, it still provides functionality above nothing at all.
I.e. it's still product segmentation and should be priced accordingly, even if it's a smaller cost reduction than some of the other features.
beersy - Wednesday, January 16, 2019 - link
You are completely missing the boat on all of this. These are chips they most likely couldn't sell as their full-fledged brothers, so they are repackaging them and trying to sell them for the same price? Why would they charge the same for what are, in essence, scrap processors?
Further, there are a lot of people who want fast cores and QuickSync, which is built into the iGPU, for content creation. I am one of these. I can use the processing cores to help with my day to day stuff, and the iGPU can do some (albeit less effective than software) hardware encoding/transcoding. These are worthless to me. This is a lot of the reason I don't buy AMD high end processors.
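To make the Quick Sync point above concrete, here is a minimal sketch of the kind of hardware-accelerated transcode a content creator would lose on an F-series chip. It assumes an ffmpeg build with the `h264_qsv` encoder; the file names are placeholders, and on a CPU without a working iGPU the encoder simply fails to initialize.

```python
import shutil
import subprocess

def qsv_transcode_cmd(src: str, dst: str, bitrate: str = "6M") -> list[str]:
    """Build an ffmpeg command that encodes H.264 via Intel Quick Sync.

    h264_qsv offloads encoding to the iGPU's fixed-function media block;
    it errors out at runtime if no iGPU is present or it is fused off.
    """
    return [
        "ffmpeg", "-y",
        "-i", src,            # input file (placeholder name)
        "-c:v", "h264_qsv",   # Quick Sync H.264 encoder
        "-b:v", bitrate,      # target video bitrate
        "-c:a", "copy",       # pass audio through untouched
        dst,
    ]

cmd = qsv_transcode_cmd("input.mp4", "output.mp4")
print(" ".join(cmd))

# Only attempt the transcode if ffmpeg is actually installed.
if shutil.which("ffmpeg"):
    subprocess.run(cmd, check=False)
```

On a K chip the same command falls back to nothing special - the point is that the encoder choice, not the rest of the pipeline, is what the fused-off iGPU takes away.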
woggs - Wednesday, January 16, 2019 - link
By that logic, if they didn't sell those "scrap" processors, the price of all the others would be higher. Lower supply = higher price (unless AMD fills the void). But I agree they should be priced a bit lower, which is likely to actually happen at retail, because individual CPU buyers will only choose the obviously better one if there's no price difference. We'll find out soon enough...
beersy - Wednesday, January 16, 2019 - link
Every generation would have had these failed-iGPU processors, and they didn't sell them. They clearly set their prices around NOT selling these processors. It is only because they have the CPU shortage that they are selling these bad processors, and doing so without any sort of discount.
woggs - Thursday, January 17, 2019 - link
The balance of supply, demand and competition is how prices get set. Buy whatever you want. Nobody is holding a gun to your head. And let's see what the actual retail prices settle to... This article is click-bait. A dog whistle to Intel haters. Go get an AMD CPU...
woggs - Thursday, January 17, 2019 - link
This reminds me of a story about an old great uncle (never knew him personally). He was a salesman and had a small shop similar to a pawn shop. He had a set of crystal glasses. A lady comes in and wants one glass. She sees one of them has a small chip... "Hey, this glass has a small chip. Can you sell this one to me for 50 cents instead of a dollar?" Uncle picks it up and inspects it... "You're right! This is not acceptable," and he turns around, tosses the chipped glass in the garbage, shattering it, and then raises the price on the rest of the glasses... "Would you like any of the remaining ones?" And she did...
ratbert1 - Thursday, January 17, 2019 - link
These chips are not just chips with a good iGPU that they are choosing to fuse off. These are chips that would go in the recycle bin because they could not be sold as K chips due to the faulty iGPU. It is a good marketing move to get money for something instead of throwing it away. However, the perceived value is less because there is less function available, so pricing them the same is shameful.
However, maybe there are not that many and they don't care how many they sell. They will probably sell some. It is the "Build it and they will come" philosophy.
Teckk - Wednesday, January 16, 2019 - link
Will the Turbo still be the same as on the processors with graphics? Won't there be some headroom, or is it just for overclocking?
jordanclock - Wednesday, January 16, 2019 - link
There is zero expectation of higher overclocking. These are expected to be entirely the same, but with a function removed.
Irata - Wednesday, January 16, 2019 - link
@Ian: Thanks for mentioning the debate about whether your comparison between the AMD Athlon 200GE ($55 SEP) and the Intel Pentium G5400 was fair.
I find it interesting that you acknowledge the fact that the comparison may not have been correct from the start, since you are comparing MSRP / street prices to the purchase price per 1,000 units (to which seller mark-up needs to be added).
In this respect alone, it may have been prudent to check actual retail prices before writing the article and adjust the CPU selection accordingly (e.g. replace the G5400 with a 2C/2T Celeron that does have the same retail price as the Athlon GE).
Still curious in which region(s) exactly the G5400 hits the Intel 1,000 unit price point (for stores that actually have it in stock)? Judging by the comments in your comparison, it does not appear to be North America, Europe or parts of Asia.
Care to share which these may be ?
woggs - Wednesday, January 16, 2019 - link
"it may have been prudent to check actual retail prices."
Bingo. Same goes for this article as well. It's just click bait. Retail prices will be higher but will differentiate according to demand once end users can buy them.
kkilobyte - Thursday, January 17, 2019 - link
It is somewhat irritating that the information about the lack of a reliable relationship between Intel's 'tray price' and the actual average street price is acknowledged here... but not in the article, where it would have been relevant. Why so? It is rather common to update existing articles when a new, relevant bit of information shows up, so why wasn't that done?
And, like you, I really wonder where in the world the G5400 is available at 1ku prices. I can't help noting that this was already asked in the "G5400 vs 200GE" article comments, that Ian obviously read them, and that he didn't seem willing to answer, or even acknowledge, the question.
jjj - Wednesday, January 16, 2019 - link
So they are milking the fact that they have shortages, and they likely don't have that many dies with defective GPUs. Might be an interesting Q1 in DIY, as GPU and memory prices are likely driving sales well above normal seasonal patterns. Would be wiser to wait for Zen 2, but it doesn't seem that folks are doing that just yet.
HStewart - Wednesday, January 16, 2019 - link
If you are waiting for Zen 2, just wait for Ice Lake with Sunny Cove.
jjj - Wednesday, January 16, 2019 - link
That's 1.5-2 years away on desktop, not a realistic waiting period at all.
Spunjji - Thursday, January 17, 2019 - link
Why? A longer wait for a CPU from a company that continues to dick its customers at every opportunity? No, thank you.
WickedMONK3Y - Wednesday, January 16, 2019 - link
While it is a very divisive decision on Intel's part, and while the words I want to use are not allowed, the worst part here is that Intel's pricing strategy makes sense to me on a business level. The intrinsic value of the processor is in its performance, core count and differentiation factors (slim as they may be - higher per-core IPC etc.), and from a business point of view they could potentially harm the perceived value of the 9th Gen processors by having 2 equally performing chips at 2 differing price levels. People that paid full buck for the full-phat chip would feel done in, as they did not use the integrated graphics. At the same time, users would feel they deserve the discount, as their chip has fused-off graphics cores they will never use. Again, the businessman's point of view is that everybody forgets that even if the graphics section of the chip doesn't work, Intel pays per wafer and has to maximise return; they do not get a discount when cores or graphics don't work as intended, they just have to deal with the yields. So while pricing the products the same may be a prudent business move to protect equity, branding value and product perception, it will not win them any friends or awards. So again, it is divisive to say the least, but it is a smart business move on their part.
However, I am a consumer first and foremost, and !#@)*!&^@# Intel, you are disabling things on the actual processor so that you can harvest dies, meaning the processor has something less, and if you are harvesting dies to save money, at least charge us less for a slightly crippled CPU. That is just blatantly fleecing the consumer, and I hope the Ryzen 3 launch has some epic stuff to make you cringe.
cpkennit83 - Wednesday, January 16, 2019 - link
Intel has margins no one else in the industry can even dream of, and yields on a now 4-year-old process should be superb, so all the rant about brand perception and equity and whatnot, while having merit, doesn't change the fact that Intel is just capitalizing on the CPU shortage and selling crap they wouldn't have dared to sell in other times, like the cockroaches they are.
AshlayW - Wednesday, January 16, 2019 - link
Aaaand this is why I won't buy Intel products.
A5 - Wednesday, January 16, 2019 - link
If this is your ethical bar for boycotting a company, you must not buy literally anything, ever.
Kevin G - Wednesday, January 16, 2019 - link
If Intel was wise, they would have increased the turbo by a notch if they were keeping the prices the same. At stock base and turbo clocks, the benefit of the F suffix can be obtained on non-F chips by disabling the IGP in firmware. The main benefit is simply more time at higher turbo ratios under load. That difference isn't going to be that great, as single and dual core turbos can be maintained on the non-F chips for extended periods of time given a good air cooler. All-core turbo is where the real benefit would be had, but the all-core turbo doesn't have as high of a boost compared to the base clock. The gains will be small, especially if the IGP is disabled on the non-F chips.
The other variable is of course cooling, so the differences between the F and non-F chips might diverge as cooling becomes more aggressive. However, once you get into custom water loops, chances are the users will also be overclocking and are generally savvy enough to disable the IGP, since they more than likely have a discrete GPU too. Worthy of testing, but I wouldn't expect much difference between the two chips.
dullard - Wednesday, January 16, 2019 - link
Isn't the more interesting news here the 9400 and 9350KF? While the 9400 is only a slight speed bump over the 8400, it costs the same as the 8400 and is a line that larger OEMs (such as Dell) actually sell in bulk in the $600 price range for the system. Before this, you could only get the 9th generation from Dell by going the ultra-expensive Alienware or XPS Special Edition routes. As for the 9350KF, it allows for a 15% turbo boost over the 8350K for just $5. Seems like a nice choice to have. As for charging the same for a feature the people who read tech articles generally don't use, I don't think the extreme complaints are justified (worthless, ridiculous, etc.).
cpy - Wednesday, January 16, 2019 - link
I'm just glad we don't have to pay extra for removing the iGPU.
DigitalFreak - Wednesday, January 16, 2019 - link
If Intel were a telecom: $488 + a $20 die fusion fee.
casperes1996 - Wednesday, January 16, 2019 - link
Why would an OEM pick these instead of the non-F model then?
willis936 - Wednesday, January 16, 2019 - link
To pay respects.
rocky12345 - Wednesday, January 16, 2019 - link
So basically Intel is going to try to sell these CPUs with fewer features for the same price as the fully featured CPUs. Granted, all they did was fuse off the iGPU so it is not usable, which most do not use anyway except for maybe testing. Yes, there are a lot that use the iGPU, but those are mostly the non-K SKUs in more mainstream systems that do not require a dGPU installed.
I also am fairly sure these will not overclock any better, mainly because they are 100% the same CPUs as the ones with an iGPU, except the iGPU has been fused off to not work. They might run a tiny bit cooler if the iGPU is not getting powered up at all, but that will be about the only plus, and even then it might drop the temps 2-3 degrees at most.
Khenglish - Wednesday, January 16, 2019 - link
I just want to say that there is 100% no way these chips can run cooler or overclock better than their iGPU counterparts with the iGPU disabled. The iGPU has its own power plane that is separate from everything else. When you disable the iGPU in the BIOS, you are fully powering down the iGPU. It will draw no power at all.
Do not buy a KF series in hopes that overclocking or power draw will improve. It won't, and physically it can't.
Qasar - Wednesday, January 16, 2019 - link
If the iGPU is disabled by Intel, wouldn't they do this in hardware? Say, cutting the power to it? If that's the case, then the iGPU wouldn't use any power whatsoever... while disabling it in the BIOS could still use minimal power.
Spunjji - Thursday, January 17, 2019 - link
"Minimal power" in that context translates to "such a tiny fraction of overall power that it will make no difference to overclocking".
Seriously, I'll happily wait for measurements to prove me right, but this will not appear to operate any differently than the standard CPUs.
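For anyone who does want measurements rather than guesses, a rough sketch of sampling Intel's RAPL energy counters on Linux. The sysfs path is an assumption: the graphics (iGPU) domain's numbering varies by platform, so the sibling `name` file should be checked before trusting the reading.

```python
import time
from pathlib import Path

def average_watts(energy_uj_start: int, energy_uj_end: int, seconds: float) -> float:
    """Convert two RAPL energy readings (microjoules) into average watts."""
    return (energy_uj_end - energy_uj_start) / 1e6 / seconds

# Hypothetical path for the iGPU RAPL subdomain; confirm which domain this
# is by reading /sys/class/powercap/intel-rapl:0:1/name first.
GFX = Path("/sys/class/powercap/intel-rapl:0:1/energy_uj")

if GFX.exists():
    e0 = int(GFX.read_text())
    time.sleep(1.0)
    e1 = int(GFX.read_text())
    print(f"iGPU domain: {average_watts(e0, e1, 1.0):.3f} W")
else:
    print("RAPL graphics domain not exposed on this system")
```

Comparing that number with the iGPU enabled vs. disabled in the BIOS would settle the "ghost silicon" question directly; the commenters above predict it reads essentially zero either way.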
SharpEars - Wednesday, January 16, 2019 - link
That's OK. When the Ryzen 3000 chips and the new Threadrippers launch, it's going to be fun to watch what Intel does next. I just hope AMD prices them low.
The_Assimilator - Thursday, January 17, 2019 - link
Amen!
twotwotwo - Wednesday, January 16, 2019 - link
Maybe this is a workaround for supply constraints - anyone building a box with discrete graphics values the IGP near $0 anyway, and this way they can get their orders filled? Still weird!
The_Assimilator - Thursday, January 17, 2019 - link
It's definitely due to supply constraints, but that still doesn't justify charging the same amount of money for less product.
JTWrenn - Wednesday, January 16, 2019 - link
Translation: they are the same chip and we turned it off... oh, and suck it.
techguymaxc - Wednesday, January 16, 2019 - link
The suggestion that users are "losing features" is a false economy. You still have the option to buy the SKU with the features that are not present on these SKUs. Really simple stuff.
Here's what you get with a KF SKU that makes it superior to a K SKU:
1) lower power consumption
2) lower heat output
3) higher overclock potential
All for the same price. If you don't need an IGP or QuickSync the KF SKUs are superior.
Oxford Guy - Wednesday, January 16, 2019 - link
When a company releases a part, that part competes against the rest of the stack.
So, if Corporation X releases Widget A, which contains feature A and feature B, and then Corporation X releases Widget B, which contains feature A but not feature B, consumers are losing features if Widget B's pricing is basically the same as what Widget A's was (or still is, in situations where there is limited availability of Widget A).
piroroadkill - Thursday, January 17, 2019 - link
Just to quote someone else:
"I just want to say that there is 100% no way these chips can run cooler or overclock better than their iGPU counterparts with the iGPU disabled. The iGPU has its own power plane that is separate from everything else. When you disable the iGPU in BIOS you are fully powering down the iGPU. It will draw no power at all.
Do not buy a KF series in hopes that overclocking or power draw will improve. It won't and physically they can't."
ct909 - Thursday, January 17, 2019 - link
They'll get away with it if the non-F chips are in short supply and the integrated graphics are not required.
Darcey R. Epperly - Thursday, January 17, 2019 - link
Very true. Get an F, or pay a premium (not Intel) market price.
TheWereCat - Thursday, January 17, 2019 - link
RIP 4K Netflix on PC, then.
Midwayman - Thursday, January 17, 2019 - link
I wonder if the turbo states will be any different? Base clock and max turbo don't tell the whole story. If the F models can keep more cores at higher states, it might make sense.
eastcoast_pete - Thursday, January 17, 2019 - link
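Turbo residency is also something anyone can check rather than speculate about. A rough sketch that samples per-core clocks from /proc/cpuinfo on Linux; run it repeatedly under an all-core load on an F and a non-F chip and compare. The `cpu MHz` field is the standard /proc/cpuinfo label, but load generation and sampling cadence are left to the reader.

```python
from pathlib import Path

def core_mhz(cpuinfo_text: str) -> list[float]:
    """Extract the current clock of every core from /proc/cpuinfo text."""
    return [
        float(line.split(":")[1])
        for line in cpuinfo_text.splitlines()
        if line.startswith("cpu MHz")
    ]

info = Path("/proc/cpuinfo")
if info.exists():
    clocks = core_mhz(info.read_text())
    if clocks:  # field is absent on some platforms/containers
        print(f"{len(clocks)} cores, "
              f"min {min(clocks):.0f} MHz, max {max(clocks):.0f} MHz")
```

If the F parts really held more cores at higher turbo bins, the min/max spread under sustained load would show it; the consensus in this thread is that it won't.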
Leave it to Intel to turn junk (CPU dies with a faulty iGPU) into cash. While they officially don't even use the thermal headroom that the disabled (faulty?) iGPU should provide, the only conclusion left is that these "F" chips also allow Intel to uprate the binning of these chips. Reduce, reuse, recycle, and then sell them at full price!
Sahrin - Monday, April 20, 2020 - link
Isn’t this Intel tacitly admitting their IGP’ s are worthless?