22 Comments

  • woggs - Tuesday, January 29, 2019 - link

    Will people really be stupid enough to pay more for less? If so... Then... Um? Bravo?? :/
  • Sttm - Tuesday, January 29, 2019 - link

    Yeah it does not really make sense unless there is some sort of overclocking advantage I am unaware of.
  • woggs - Tuesday, January 29, 2019 - link

    It would be part of the marketing campaign if so.
  • ImSpartacus - Wednesday, January 30, 2019 - link

    There is sometimes a tiny advantage to having "dark silicon" to suck up heat and reduce thermal density, making it easier to cool.

    But the difference between an idling GPU and a fused off GPU is minimal, so my bet is that Intel is just looking to make their supply go further.
  • twotwotwo - Tuesday, January 29, 2019 - link

    Think they're betting that people will pay the same for less if it's the only way they can get a chip *now* (because of the constrained supply) and they weren't going to use the IGP much anyway (because they already want a graphics card). Still just weird to charge the same for them, but "it's the only variant in stock" is a legit reason to get one.
  • Lord of the Bored - Tuesday, January 29, 2019 - link

    Sounds like a legit reason to go AMD, to me.
    Pity Intel and AMD use different interfaces these days and we can't Am486 it up anymore, or do a repeat of the Socket 7 omniplatform.
  • Samus - Wednesday, January 30, 2019 - link

    Even paying the SAME for less is ridiculous. Without the iGPU you are undoubtedly losing features, even if you use a dGPU, such as Quicksync.
  • maroon1 - Friday, February 1, 2019 - link

    If your dGPU breaks or you sell it, you still have the iGPU as a spare.
  • cosmotic - Tuesday, January 29, 2019 - link

    Is there a thermal advantage to the chips with the graphics fused off? Obviously it's nice for Intel that they can sell an otherwise unsellable chip when there are defects on the graphics portion of the die... but shouldn't it be cheaper?
  • DigitalFreak - Tuesday, January 29, 2019 - link

    I guess that's something Anandtech needs to test. :-)
  • woggs - Tuesday, January 29, 2019 - link

    Unless Intel makes this claim directly and provides samples, testing it is a waste of time. There is no such claim that I'm aware of. And if it were true, Intel would charge a premium and put it in all the marketing, but they are not. So it seems B&H is counting on some customer stupidity.
  • Lord of the Bored - Tuesday, January 29, 2019 - link

    Intel's charging the same price for less. That's basically charging a premium.
  • Samus - Wednesday, January 30, 2019 - link

    Physically improbable because the iGPU is still essentially deactivated (until you use a feature of it - like Quicksync) when using a dGPU. I'll believe it when I see it, but until then, I don't see how fusing off a feature that would otherwise be inactive on a system with a dGPU will have any meaningful benefit.
  • Khenglish - Friday, February 1, 2019 - link

    Nothing prevents someone from powering off the iGPU of a fully functional K-series chip in the BIOS. The iGPU has its own power plane, separate from the rest of the CPU, and it is fully powered down if disabled in the BIOS. Then you get the same thing as a KF, but always have the option of turning on the iGPU for more display outputs or dGPU debugging... for the same price.

    The iGPU burns a few hundred mW at idle even if left on, which is negligible.
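The "few hundred mW" idle figure mentioned above is easy to put in perspective with some back-of-the-envelope arithmetic. A minimal sketch, assuming an illustrative 0.3 W idle draw, a machine running 24/7, and $0.12/kWh electricity (all three numbers are assumptions, not measurements):

```python
# Back-of-the-envelope cost of leaving the iGPU powered but idle.
# Assumptions (illustrative, not measured): 0.3 W idle draw,
# machine on 24/7, electricity at $0.12 per kWh.
IDLE_DRAW_W = 0.3
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.12

energy_kwh = IDLE_DRAW_W * HOURS_PER_YEAR / 1000  # Wh -> kWh
cost_per_year = energy_kwh * PRICE_PER_KWH

print(f"{energy_kwh:.2f} kWh/year, about ${cost_per_year:.2f}/year")
```

Under those assumptions the idle iGPU costs on the order of a few dozen cents per year, which supports the "negligible" characterization.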
  • paul sss - Wednesday, January 30, 2019 - link

    Soon they will release a program that takes advantage of that iGPU for other processing and this will be an even greater dud. You can already use the iGPU in Premiere Pro.
  • Samus - Wednesday, January 30, 2019 - link

    That's the worst part of this. They are really screwing over customers (and developers) who use iGPU-based extensions, because now there will be a metric fuckton of chips in the wild belonging to a product line that has historically been compatible with these extensions.

    Until now, it was understood that the only Intel CPUs NOT supporting, for example, QSV were Xeons and very early Sandy Bridge-based Pentiums and Celerons. Almost every CPU Intel has made since Ivy Bridge has had QSV extensions, and they have added industry-leading hardware decoding across the board since Haswell.

    Nice way to roll it backwards, Intel.
  • abrowne1993 - Wednesday, January 30, 2019 - link

    lol fuck you too, Intel
  • piroroadkill - Wednesday, January 30, 2019 - link

    The best part is going to be idiots buying this because it will "overclock better", even though it's been explained several times why that technically won't be the case. Also, why would these even be good bins? If there's a chance the GPU doesn't work on these because of a defect, then there's nothing whatsoever to suggest the CPU will be a great specimen.
  • tezcan - Saturday, February 2, 2019 - link

    I'm pretty sure it is the exact same silicon, except the KF has a faulty GPU in it, hence the F in the name KF. If the GPU is not in use, it is most likely fully powered down by Intel's power management, which is not a hard thing for any IC to do these days, let alone a CPU that does this nonstop. I am an enthusiast but would still prefer the free GPU in case that rig becomes a server or something one day, or I need to pull out my GPU.
  • eastcoast_pete - Sunday, February 3, 2019 - link

    In addition to the obvious lunacy of paying more for less (lose the iGPU? That'll cost you), this whole F-series (disabled iGPU) from Intel brings something else to mind: for a number of years, GPU-assisted computing was seen as the "next big thing", and, for dGPUs and certain applications, it actually makes sense and even happened. A professional high-end graphics card (Nvidia Titan and others) is now more likely to be used for number crunching for deep learning and other computing challenges. Of course, the iGPU on Intel's dies is nowhere near that speed and capability, but is it really so incapable of helping with floating-point tasks? Really disappointing that this circuitry is not used beyond showing the Windows desktop.
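The question above about the iGPU's floating-point capability can be roughed out from its execution-unit count. A sketch under stated assumptions: a Gen9.5 UHD 630 with 24 EUs, 8 FP32 lanes per EU, an FMA counted as 2 ops per lane per cycle, and a boost clock of roughly 1.2 GHz (the clock and per-EU lane count are assumptions about this generation; sustained clocks are lower in practice):

```python
# Rough peak-FP32 estimate for an Intel Gen9.5 iGPU (e.g. UHD 630).
# Assumptions: 24 EUs, 8 FP32 lanes per EU, FMA = 2 ops/lane/cycle,
# ~1.2 GHz boost clock. Real sustained clocks are lower.
EUS = 24
FP32_LANES_PER_EU = 8
OPS_PER_LANE_PER_CYCLE = 2  # fused multiply-add counts as 2 ops
CLOCK_HZ = 1.2e9

peak_gflops = EUS * FP32_LANES_PER_EU * OPS_PER_LANE_PER_CYCLE * CLOCK_HZ / 1e9
print(f"~{peak_gflops:.0f} GFLOPS peak FP32")
```

That lands in the hundreds of GFLOPS, a small fraction of a Titan-class card's FP32 throughput, which is why the iGPU is more interesting for fixed-function work like Quick Sync than for raw number crunching.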
  • Machinus - Monday, February 11, 2019 - link

    So are we really supposed to believe the integrated GPU has a total TDP of 0W?
  • Drpoomanchu - Tuesday, February 12, 2019 - link

    I think they failed to mention that the KF version comes with a CPU cooler. Probably the $50 difference.
