43 Comments
boozed - Thursday, September 17, 2015 - link
So where's this mythical 95W "Broadwell K"?

Mark_gb - Thursday, September 17, 2015 - link
I think it melted... :p

boozed - Thursday, September 17, 2015 - link
Whoops, that's the Skylake roadmap.

AndrewJacksonZA - Friday, September 18, 2015 - link
"Whoops, that's the Skylake roadmap." - boozedYour username made me snicker at your post. :-)
HeyImHJ - Friday, September 18, 2015 - link
Can they respond just as quickly if you ask them when the Skylake i7-6700K will be available in North America? We're more than halfway through September and stock hasn't improved in the slightest.

HollyDOL - Friday, September 18, 2015 - link
It's the same for central Europe: when they are available, it's just a few pieces (both Broadwell and Skylake) and they quickly become unavailable again (Skylake much faster than Broadwell). That's not even speaking of the rest of the Skylake LGA-1151 models, which aren't even listed in shops as "out of stock/accepting preorders", etc.

MrSpadge - Friday, September 18, 2015 - link
Well, the i7-6700 is available in central Europe, 60€ cheaper than the cheapest available 6700K: http://geizhals.de/intel-core-i7-6700-bx80662i7670...
silverblue - Friday, September 18, 2015 - link
"Delivering Innovation For Each Segments" nice proof reading there, Intel. /trollSo, Intel presumably doesn't believe there is a need for Skylake-C; either Broadwell-C is perfectly adequate (and why not), or the move to Skylake-C is more trouble/cost than it's worth.
prisonerX - Friday, September 18, 2015 - link
Intel knows no one wants to buy them, since they're not actually any faster.

The only performance Intel is interested in improving is marketing performance. If people understood that Intel makes only a small number of different chips and then disables (fuses off) functionality to sell to different markets, they'd be ashamed of giving Intel their money.
nightbringer57 - Friday, September 18, 2015 - link
And what CPU/SoC manufacturer exactly doesn't do that?

There are myriad reasons why doing it that way is the better solution. Not that Intel's marketing practices are perfect, but binning is an important part of lowering the overall production cost....
Gigaplex - Friday, September 18, 2015 - link
The mobile SoCs (Qualcomm, Apple etc.) don't appear to do so.

nightbringer57 - Friday, September 18, 2015 - link
Well, mobile SoCs are a different market, where each unit is made of lots of different IPs and where you don't have the same kind of fragmentation of the offering. But still, there is at least some level of binning in the production.

But look at the Snapdragon 800 series, meaning the 800/801. Amongst the really complicated offering, you can find some models whose only difference is maximum clock speed.
Though it is true it is less obvious than for common CPUs/GPUs.
But still, what's wrong with using binning techniques to keep lower-end prices down?
Gigaplex - Friday, September 18, 2015 - link
You didn't specify that the mobile market was excluded from the question, and in fact you implied it was included by mentioning the term SoC.

Re: Snapdragon 800 vs 801 - the 800 isn't a binned 801; the 801 was released quite a bit after the 800 and replaced it in the production line as yields improved. The 801 also has eMMC 5 support plus DSDA, whereas the 800 only supports eMMC 4.5 with no DSDA. There actually was some binning within the 801 SoC that I wasn't previously aware of (some were 2.3GHz and some were 2.5GHz), so you do have a point there.
I never said there was anything wrong with binning, but artificially disabling certain features (virtualisation in particular) purely for marketing reasons really messed things up for Windows XP Mode around the Windows 7 release. I'm still bitter about paying extra for a laptop specifically to get virtualisation support, only to find that the chip I got was the only one in the advanced P line that didn't get the feature - something Intel didn't bother documenting until I emailed them about it.
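For anyone wanting to avoid the same surprise: here's a minimal sketch for checking the VT-x flag yourself, assuming Linux with GCC (CPUID leaf 1, ECX bit 5 is the architectural VMX flag; the little program itself is just an illustration, not anything Intel ships):

/* Check whether the CPU advertises VT-x (VMX) support. */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;  /* CPUID leaf 1 not available */
    /* Leaf 1, ECX bit 5 = VMX (VT-x) */
    printf("VT-x (VMX): %s\n", (ecx & (1u << 5)) ? "supported" : "not supported");
    return 0;
}

Note the flag only tells you the silicon supports VT-x; firmware can still lock it off (via the IA32_FEATURE_CONTROL MSR), and VT-d is advertised separately through the ACPI DMAR table, not by this bit.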
0razor1 - Friday, September 18, 2015 - link
I don't think binning had much to do with it; it was more some manufacturers choosing to run lower clocks (thus lower volts and lower power) to shave off the heat produced. This showed in the M8.

Low-cost manufacturers don't just put 2.5GHz 801s in their phones unless they cost the same. Some chose to down-clock and undervolt - I think that's all there is to it.
nightbringer57 - Friday, September 18, 2015 - link
That was my point for SoCs - not meaning (though I was not clear) that the 801 is a binned 800, just that some models in the series were binned models.

The unclear CPU naming thing is pretty irritating, of course. But my point is: if they want to sell non-VT CPUs, the fact that the feature is disabled/cut out for binning reasons doesn't incur a loss on your side compared to the case where they would simply produce cores without the VT unit. Actually, I think you can (more or less) trust them with the fact that they want to cut production costs as much as possible. The problem lies with the fact that they still choose to sell lower-end processors without those functionalities, thus creating an artificial advantage for their higher end, no matter how they produce them.
nils_ - Sunday, September 27, 2015 - link
I thought they only disable the IOMMU (VT-d), not the normal virtualisation (VT-x), on the K series - and Skylake even has it on the K series. Still, losing the IOMMU is really dangerous, because it can be used to prevent attacks over Thunderbolt / USB 3, where any device can use DMA to access memory.
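As a minimal sketch for checking whether an IOMMU is actually active, assuming a Linux kernel recent enough to expose /sys/class/iommu (any entry there means an IOMMU unit is registered; the program is just an illustration):

/* Report whether the kernel has registered any IOMMU. */
#include <stdio.h>
#include <string.h>
#include <dirent.h>

int main(void) {
    DIR *d = opendir("/sys/class/iommu");
    struct dirent *e;
    int found = 0;
    if (d) {
        while ((e = readdir(d)) != NULL) {
            if (strcmp(e->d_name, ".") != 0 && strcmp(e->d_name, "..") != 0)
                found = 1;  /* e.g. dmar0 on Intel VT-d systems */
        }
        closedir(d);
    }
    printf("IOMMU: %s\n", found ? "active" : "not visible (disabled, or kernel too old)");
    return 0;
}

On Intel systems you typically also need intel_iommu=on on the kernel command line for it to show up at all.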
prisonerX - Monday, September 21, 2015 - link

It's not simply binning, which reflects the natural performance variation in chips; it's destroying working silicon which the customer is paying for, and then making that customer pay a premium for it.

And that's just one of Intel's repugnant practices.
Christopher1 - Friday, September 25, 2015 - link
I think that you are missing that people are not happy when they find they have paid a large amount of money for a chip that COULD, if Intel did not artificially limit functionality, do everything that a slightly more expensive chip can do.

yannigr2 - Friday, September 18, 2015 - link
Broadwell looked to me, from the beginning, more like a backup plan in case something went wrong with the Skylake CPUs than a line of products that could stay in the market alongside Skylake. 14nm AND a new design was a big risk even for Intel. Skylake looks OK, so Broadwell is not really a necessity for Intel.

This is what AMD should have done when introducing Bulldozer: shrink Thuban to 32nm and, after realizing that Bulldozer was a failure, continue improving the Phenom II design and abandon Bulldozer.
silverblue - Friday, September 18, 2015 - link
I doubt they had the money to develop THREE cores (K10-derivative, Bulldozer, Bobcat) at the same time. Regardless, K10.5 would've needed a significant overhaul to include support for new ISAs, presumably to the point that a new architecture would've been needed anyway.

DanNeely - Friday, September 18, 2015 - link
Ummm, what? Broadwell, as a die shrink of Haswell, has been the Tick half of the Tick-Tock strategy Intel's used since 2007.

Oxford Guy - Saturday, September 19, 2015 - link
For gamers, a failure. But my 8-core FX is doing extremely nicely for the workloads it's being used for, given that it cost $133.75 with an 8-phase motherboard. There is more to computing than gaming performance, although the chips are even looking better there now that games are beginning to have modern engines that can load more than a few cores.

lmcd - Monday, September 21, 2015 - link
8 cores of what? Sorry, but the AMD fanboy post is unnecessary here -- it's been a long time since Richland cores meant much of anything. I guess the FX is a good value compared to, say, Nehalem.

Broadwell is excellent. It did well in benchmarks, and the Crystalwell parts are very intriguing. A Skylake Crystalwell would've been perfect for a couple of my friends, who wanted good iGPUs while waiting for Pascal.
gearhead99 - Monday, September 28, 2015 - link
... yeah, that's 8 ECC- and IOMMU-capable cores in 2012, btw... it's not always as simple as being a fanboy. What Oxford Guy said is just a fact: more games are using more cores, and so what? My games are just fine since adding a 960 GPU. The system has been rock-solid stable for years. I can't come up with a justification to upgrade the CPU as much as I'd like to. An 8-core FX is just fine for many, many workloads; don't kid yourself about how much that extra whatever-% matters in real life to non-gamers.

jrs77 - Friday, September 18, 2015 - link
I've got an i7-5775C up and running for over a month now, and it was the best decision I could've made. The CPU is reasonably fast at stock, and the Iris Pro makes a dedicated GPU unnecessary for my needs.

I use this CPU for my graphics workstation running Adobe CS, 3ds Max + V-Ray and some video stuff. The Iris Pro is totally fine accelerating these programs, and I can even play stuff like Borderlands or EVE Online in 1080p without problems.
The best thing is that my new rig is almost inaudible and fits into a very small Cooltek/Jonsbo U1 case.
ImSpartacus - Friday, September 18, 2015 - link
Neat, I've always been fascinated by how capable integrated graphics can be for modest use cases. Yours is a particularly interesting one, since you need the pricey CPU but not necessarily a remarkable GPU.

Oxford Guy - Saturday, September 19, 2015 - link
Price/performance is likely a problem with Iris Pro, though, eh?

lmcd - Monday, September 21, 2015 - link
Actually, Intel didn't exactly price the parts out of the park. Surprising, but interesting and appealing for the use case he described. I personally found it interesting as a stop-gap while waiting for Pascal.

jrs77 - Monday, September 21, 2015 - link
Price/performance is actually spot on. I tried using the HD 4600 of an i7-4770, but it didn't work that well. Panning and zooming big images in Photoshop wasn't smooth, and the 3D preview in V-Ray RT was very slow.

The Iris Pro 6200 is comparable in performance to an entry-level card like the GT 740 or R7 240, which both cost some $80. If you add those $80 to the i7-4770, you'll end up at the same price as the i7-5775C, but now with a much smaller package and lower power consumption.
I could've built an even smaller system, the size of a Mac Mini, by using the i7-5775C with its 65W TDP, but then it wouldn't have been as quiet.
watzupken - Friday, September 18, 2015 - link
Then the question is: where are the chips? I actually think the socketed Broadwell seems like a good alternative to the Skylake chips, with its eDRAM.

BMNify - Friday, September 18, 2015 - link
Ryan, is Intel shipping the Skylake mobile H 4+2 part? Dell is waiting for Skylake-H to refresh the XPS 15 with the Infinity display, and they don't need the 4+4e part as they use a discrete Nvidia GPU in that laptop.

DanNeely - Friday, September 18, 2015 - link
Glad to see Dell's expanding the Infinity display line; are they planning to bring it to any other models beyond the XPS 15 yet? Pity they're still using a neckbeard/nosehair camera to keep the top bezel as small as possible, though.

BMNify - Friday, September 18, 2015 - link
You can expect Infinity displays in one or two Dell enterprise Precision laptops, but those will be restricted to 13- and 15-inch displays; similarly, the consumer side will get the XPS 13 and 15. I don't care about camera placement as I never use it, and I prefer the ultrathin bezel at the top to keep device dimensions as small as possible.
"95W Broadwell" is Xeon E3 1285v4.And Xeons E3v4 are almost mythical, only journalists got their hands on them.
atbennett - Friday, September 18, 2015 - link
The ITworld story has since been corrected. It is, in fact, the Skylake-C and not the Broadwell-C that has been discontinued.

http://www.itworld.com/article/2984695/hardware/in...
Brian_R170 - Friday, September 18, 2015 - link
I guess I should feel happy to have found an i7-5775C at a good price for my small form-factor build. I was wondering if buying Broadwell with Iris Pro would be a good idea if a Skylake desktop with Iris Pro arrived 3 months later. Now it looks like that's not going to happen.

The unfortunate problem that I ran into is that motherboards are less compatible with Broadwell than their manufacturers lead you to believe. I tried two now-discontinued Asus boards that said they support 5th-gen CPUs and updated the BIOS (with a Haswell installed), but still ran into issues with video output and boot devices with the Broadwell. If my experience is widespread, then desktop Broadwell may still be doomed.
jrs77 - Monday, September 21, 2015 - link
The i7-5775C is a perfect CPU for an SFF workstation. I run it on a Gigabyte H97N-WiFi mITX board with the latest BIOS and it runs perfectly fine. No boot-up issues, no iGPU issues, etc. It just works as intended.

Might be a problem with certain manufacturers rather than a real widespread problem.
Kvaern2 - Friday, September 18, 2015 - link
I don't really get why Intel would want to keep a consumer Broadwell around, unless there are issues with Skylake-S + eDRAM giving them no other choice?
A high-end CPU with top-end integrated graphics really is a pretty niche part. Intel didn't release one with Haswell.

blahsaysblah - Friday, September 18, 2015 - link
Because there has been no practical impact from different memory speeds for a very long time on Intel platforms - unless you are using the integrated GPU, and then you're not going to pay a premium for memory anyway; you can easily buy a GTX 750 Ti instead.

PCI Express SSDs and NVMe are super expensive compared to good vanilla SATA SSDs.
And lastly, 16GB DDR4 sticks are not readily available at prices close to 2x8GB sticks.
Jumping on a first generation has never worked out for a significant subset of buyers. Motherboards will inevitably have incompatibilities and bugs to be worked out of revision 0.
So what do you get, other than maybe a 5% CPU performance bump, from switching platforms right now?
So for the many, many users who have SATA SSDs and free or cheap access to DDR3, getting a top-of-the-line LGA 1150 part to update their aging platform makes a lot of sense.
Since you are force-fed an integrated GPU, why not get a CPU that will enhance your system with the 128MB L4 cache, instead of just a guaranteed waste of space?
That's why my trusty Q6600 - at 3.0GHz from day one and rock solid all these years on the stock Intel cooler - will be replaced by an i7-5775C.
That's my story and I'm sticking to it - unless you give me good feedback, then I might change my mind.
marc1000 - Friday, September 18, 2015 - link
Is it only me, or did anyone else notice the reference to the movie Megamind? From about the end of the movie:

Hal: Woah! I… I thought you were dead!
Metro Man: My death was… greatly exaggerated.
magreen - Friday, September 18, 2015 - link
It's actually a quote ascribed to Mark Twain. ;)

HisDivineOrder - Friday, September 18, 2015 - link
An Intel product that is barely available in the US is dead already? Ehhhh... a thing must live to die. ;)