Skylake DT could lag behind the mobile launch as badly as Broadwell DT has and there would be no issues for Intel. In any case, I am happy to see Intel at least offering a serious iGPU in a desktop form factor without having to use the mobile part. AMD has very little room in the desktop market, and iGPU-heavy parts are the only place where buying AMD makes sense. Yet that still sacrifices CPU performance.
The Star Swarm DX12 preview shows us that Intel's iGPUs are not at all held back by the CPU. I think Intel needs to up the performance considerably on the iGPU. They have come a long way since the 845 chipset GPU, which wasn't hard to do better than.
Iris Pro on the desktop had better not be limited to $300+ CPUs. If it is a $400+ part I will never touch it.
"The Start Swarm DX12 preview shows us that Intel's iGPUs are not at all held back by the CPU. I think Intel needs to up the performance considerably on the iGPU."
Yes, but the performance limitation isn't what you would traditionally think of when you hear of a GPU performance bottleneck. It isn't a shader or ROP limitation; it's something to do with the front end. If Intel could simply address this bottleneck, they could probably drastically improve their Iris Pro performance (at least in DX12 Star Swarm) even with the existing shader/ROP config.
IMO this tells us Intel is not at all interested in pushing its GPUs, because they update them only when NEEDED rather than pursuing GPU advances like Nvidia and AMD do.
You're totally incorrect. Intel is pursuing GPUs very, very heavily. But they have to design around AMD's and Nvidia's near-endless supply of patents. Intel's goal is server/supercomputer performance density with its iGPUs right now. The goal isn't gaming; that's just a bonus. When Skylake Xeons come with the top iGPU SKUs and confuse the hell out of enthusiasts, you'll have your answer. Intel's prepping for a fight to supplant CUDA and fight off HSA by using OpenMP (which can offload to Xeon Phi or to an Intel iGPU).
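For anyone unfamiliar with the OpenMP side of that pitch, here is a minimal sketch of the OpenMP 4.x directive-based offload model being referred to; it is an illustration under assumptions (array names, sizes, and a toolchain with device support are mine), not Intel's actual code. If no accelerator is present, the loop simply runs on the host cores:

```c
#include <stdio.h>

#define N (1 << 20)
static float a[N], b[N];

int main(void) {
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Offload the loop to an available accelerator (e.g. Xeon Phi);
       with no device present the runtime falls back to the host CPU. */
    #pragma omp target teams distribute parallel for map(to: b) map(tofrom: a)
    for (int i = 0; i < N; ++i)
        a[i] += b[i];

    printf("a[42] = %.1f\n", a[42]);
    return 0;
}
```

The selling point against CUDA is that the same single source can target the host, Phi, or a supported GPU, rather than requiring device-specific kernels.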
Not really. You would just have orders of magnitude slower performance due to the transistor constraints.
For instance, nVidia used to make integrated graphics as a separate chip compatible with Intel and ATI/AMD as well as VIA chipsets... and also made integrated graphics combined on the same die with its own chipset solutions. One such chip was the "GeForce 8100", which, funnily enough, can handle CUDA. It would be a hair-pulling process, however; it is slow.
Intel has a cross-licensing agreement with AMD; not sure if that would include GPU patents, but I would assume so.
Fact of the matter is, Intel has never been able to design a fully competent GPU solution. Today is a little better than what we had in the past, but the drivers for current Intel GPUs are still stupidly horrible.
Intel also uses less die space than AMD for its GPUs, so there is that.
That's only Anandtech's theory and not at all based in reality. There have been 7 driver updates since those tests were run. Let's see it done again, with high-end 2400MHz RAM this time.
Broadwell doesn't bring much of interest to the table for desktop users. This is probably Intel's answer to that. I'm still holding out until at least Skylake at this point unless Broadwell turns out to be a great overclocker (which seems unlikely).
Overclocking will always have a purpose. We could look at a 3.3GHz Intel chip and think "that's good enough," but when you can overclock it to 4.4GHz, you're getting a very tangible benefit (20% reduced render times, etc). Now in your future days of "not worth it" overclocking, where you're trying to bump 4GHz to 4.4GHz, the gains will be minimal, yes, but they're gains nonetheless, and could shave 10min off a 1hr render. That's still tangible.
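For what it's worth, the idealized arithmetic behind claims like these, assuming render time scales inversely with core clock (real workloads scale somewhat worse):

```latex
t_{\text{new}} = t_{\text{old}} \cdot \frac{f_{\text{old}}}{f_{\text{new}}}, \qquad
60\,\text{min} \cdot \frac{3.3}{4.4} = 45\,\text{min}, \qquad
60\,\text{min} \cdot \frac{4.0}{4.4} \approx 54.5\,\text{min}
```

So the 3.3GHz to 4.4GHz jump is worth roughly 25% off a render under perfect scaling, while the 4.0GHz to 4.4GHz bump is closer to five or six minutes of an hour-long job; still tangible, but clearly diminishing.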
I just noted with a smirk that my 3920XM (a mobile chip) scored higher than that MSI GT80 Titan beast of a rig (just released) in Intel's CPU benchmark. Which tells me that, almost three years on (the system shipped in July 2012, I just checked the Dell website), my everyday computing experience is likely to be quicker than that thing. I'm aware the 4940XM exists, but due to electrical incompatibility I'm not able to use it, sadly.
Surely Intel must know some of us would buy NEWER chips, even in the desktop-replacement category, were they actually made available.
Small rant - not all of us will replace our systems as often as Intel would like, and with no QHD 18.4" *socketable CPU* (important that one) laptops out there, I may be holding onto this one for some time to come.
I disagree, and it doesn't matter if I do since everything here is simply us predicting the future.
But I feel that overclocking will go the way of the Dodo in say... 4 - 6 years?
Eventually it just won't be worth it, the gains will be too small. Because I believe in 4-6 years you will be lucky to get even 5% with an OC board and unlocked chip.
Right now? Yes, it is worth it, even if it isn't nearly as fun as it used to be. I miss comparing clocks with my friends back in the Sandy Bridge days, those were fun days... :)
Crystal Well also had a 65W TDP for 'desktop' parts. I find it likely that they will launch Broadwell unlocked CPUs with Iris Pro and Skylake unlocked CPUs with HD graphics.
As someone looking to upgrade to a new build from a gen-2 i3 desktop, this is interesting. It's possible that by the end of the year we will have 14nm Skylake instead of good ol' Haswell. I'd love to build an i5 Skylake desktop, but I have to wonder if there will be any real gains, especially as I will be running dedicated graphics.
Really, these changes are intended for mobile, and the improved performance and battery life are much appreciated.
Recently, benchmarks have shown up reporting a 15% IPC gain for Skylake in comparison with Broadwell. That means that Skylake should have a ~20% higher IPC than Haswell and at least 40% more than Sandy Bridge (depending on the application of course, because there is no such thing as "one IPC"). Add slightly higher frequencies and better energy efficiency and you have a decent upgrade.
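Back-of-the-envelope, those figures compound roughly as follows, assuming the commonly cited ~5% Haswell-to-Broadwell IPC gain and ~17% Sandy Bridge-to-Haswell (Ivy Bridge and Haswell combined); the per-generation numbers are assumptions, not measurements:

```latex
\underbrace{1.05}_{\text{Haswell}\to\text{Broadwell}} \times \underbrace{1.15}_{\text{Broadwell}\to\text{Skylake}} \approx 1.21, \qquad
\underbrace{1.17}_{\text{Sandy Bridge}\to\text{Haswell}} \times 1.21 \approx 1.41
```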
Yeah, but for all we know we'll end up with lower frequencies. I'm not saying we absolutely will, but the last few years Intel sure hasn't been pushing increased performance.
The other issue is, all these integrated graphics chips are a complete waste for gamers, and gamers are one of the few groups who actually need higher performance. For your average user Haswell is already well beyond "good enough".
I also think that the maximum boost frequencies will start coming down together with the maximum TDP (read: performance), even on the desktop. There is simply much more money to be made by limiting device refresh rates to 10-20 fps, reducing performance accordingly, and selling a 24-hour or 48-hour battery life to the average consumer. Even for applications that get power from the wall, the allure of a sub-1mm-thick desktop (possible with system TDPs on the order of hundreds of milliwatts) somehow generates much more sales than high-performance gaming ever could.
Together with the ever-increasing R&D and manufacturing costs associated with high-frequency semiconductors, I don't see how any company will be able to convince its investors to build another record-breaking performance CPU in our lifetimes.
PC gaming is an absolutely massive market; until that market dies, Intel will ALWAYS build high-performance, record-breaking CPUs. What you're also not accounting for is the server market. They tried the low-power thing there and found out that high-end CPUs completed the task faster, got back to a low-power idle state faster, and ultimately used significantly less power than the "low power" part.
So even if they stopped "focusing" on the high-end desktop/gaming market for CPUs, they will still be pushing the envelope in the server market, and a lot of that R&D can be easily translated into the desktop market.
Saying we won't see a record breaking CPU in our lifetimes is the height of absurdity.
AMD hasn't been pushing clocks either, and neither of them can. Power (and therefore heat) grows quadratically or even worse with clock increases. Software just needs to catch up to new instruction sets which do more parallel work in the same clocks.
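The standard CMOS dynamic-power relation behind that claim: switching power scales with frequency and with the square of voltage, and because higher clocks generally need higher voltage, the effective cost of frequency ends up closer to cubic.

```latex
P_{\text{dyn}} \approx \alpha\, C\, V^{2} f, \qquad V \;(\text{roughly}) \propto f \;\Rightarrow\; P_{\text{dyn}} \propto f^{3}
```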
You say iGPU is a waste now, but with Nvidia's PhysX engine now going open-source, what would you say to letting the iGPU do the physics calculations and let the big iron handle the rendering and texture painting? Synergy is the name of Intel's game, though it's more focused on compute performance to fight off the Zen APUs and HSA.
I agree: if we're talking strictly about pushing stupidly high clock speeds, yes, I don't think there will be much if any focus on that. But faster is faster; whether they achieve that through better parallelism, new instruction sets, etc etc, the end result is the same. If a 4GHz chip can do 20% more work than the previous generation at the same 4GHz then I couldn't care less what the number is, just how it performs.
That being said, I do still believe we've got a healthy period of time where Intel will continue focusing on making monster-fast chips (not necessarily high clock rate). The idea that they will abandon the high-end desktop market is ludicrous at this point. Even if it's only 10% of their sales, that's still billions of dollars a year, and there's no way they're spending billions of dollars of R&D on that segment for nothing (i.e. there is still profit to be had in that segment).
I think the advances in gaming are coming from different angles than just increasing CPU performance really.
The current bottleneck for gaming has been identified and is being handled with DX12; the upcoming bottleneck for gaming (memory bandwidth) will be handled by stacked DRAM.
Almost no TVs have DP and it doesn't seem likely to change in the near future.
60FPS/Hz content is becoming more and more common (sports broadcasts etc), so there is definitely a need to drive TVs at 4K@60Hz, especially for HTPC users.
Crystal Well also works as a cache for the CPU, or at least that's how it worked with Haswell. This can give double-digit performance increases in some applications (and nothing in others). People should not just dismiss Iris Pro because they're using a discrete GPU. A 4.5 GHz Broadwell with a 10% performance boost from Crystal Well is almost as fast as a 5.0 GHz Broadwell without the cache.
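The arithmetic, assuming the cache benefit simply multiplies effective per-clock throughput:

```latex
4.5\ \text{GHz} \times 1.10 = 4.95\ \text{GHz equivalent} \approx 5.0\ \text{GHz}
```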
If Intel doesn't launch this, I'm switching to AMD for my next platform based on principle alone. I upgraded my motherboard from an H87 to an H97 last year SPECIFICALLY to be guaranteed an upgrade path. If Broadwell DT becomes vaporware I think a lot of people (millions of people) are going to be pissed for the same reason. What was the point of the 90-series chipset (other than adding bootable M.2) if they aren't going to make a CPU for it?
"I'm switching to AMD for my next platform based on principle alone."
Absolutely. AMD is the one to side with, i.e. if one cares the slightest for professional integrity. Well spoken sir, well spoken indeed. Not many men have the moral strength to spend money on a technologically inferior product just for upholding their principles. We must all take the example of this great man and buy AMD CPUs from now on. *two thumbs up*
I think some people, like me, are just tired of being bent over by Intel. But of course many people like being ass raped, lied to and thrown around like a ragdoll by the Intel pimp.
"I think some people, like me, are just tired of being bent over by Intel."
This is modern business mate - everyone wants to bend the customer over. So if you KNOW you're gonna get bent over, then better bend over for the ones who know what they're doing.
"But of course many people like being ass raped"
<nitpicking> If someone likes it, you can't technically call it rape. lol, maybe they're just into rough love. </nitpicking>
"I think some people, like me, are just tired of being bent over by Intel."
Why did you buy an Intel rig? Nobody forced you to. You had the choice to waste your money on a hot, watt-sucking, IPC-crippled, obsolete AMD CPU, but you instead chose to get "bent over by Intel", WHY?
Was it because Intel offered a massive performance-per-watt advantage, or was it because Intel CPUs don't bottleneck high-end GPUs and give you an SLI option when needed without choking your GPU investment to a slow, stuttering death?
If you hate Intel so much for pushing new tech, why did you feel compelled to buy Intel? Please tell us all why you submitted to being bent over by Intel when you're so in love with AMD and its hot, watt-wasting, IPC-crippled CPUs. People want to know: why did an AMD lover buy an Intel rig?
Do tell, please do tell us all why an AMD fanboy who hates Intel pushing new tech ends up buying an Intel rig?
I'm still laughing at the lack of response from him. I love how people equate not getting EXACTLY, to the T, the product they specifically want with getting raped/bent over, etc. by the company. It would be like me complaining that Ford doesn't offer bright neon pink as a standard color option on the Mustang, and if I want that I'd have to pay extra for some shop to repaint it, and Ford is bending me over, etc.
Actually, they had a special trim in California that was a nice pastel hot pink for a while when I worked for Ford.
It was the "Pink Ladies" trim or something like that, for some sort of women's group in CA. Not sure what they did, but they got their own special color and trim level. (Just a California trim with a hot pastel pink paint job and a special emblem.)
You can stick your AMD cheerleading principles where the sun don't shine.
My loyalty is to my hard-earned money, and for years now I haven't upgraded my CPU because Intel's massive performance advantage gives me all the performance I need now and in the future. Unlike AMD's watt-wasting, IPC-crippled CPUs, Intel delivers the performance needed to skip an upgrade cycle or two/three. I have 920, 980 and Sandy Bridge rigs that, with a high-end GPU, run new games with all the features and eye candy turned on, while AMD has nothing but bottlenecking, multi-GPU-strangulating, hot, watt-sucking CPUs. Wasting your money on an AMD rig makes you want to upgrade as soon as you turn it on; not so with Intel, and it's been like that for years now. Invest in an Intel rig and there is no need to upgrade, and by the time the need for speed becomes a reality there's a whole new Intel platform to build without throwing good money away on an old, inferior, obsolete rig (AMD).
The only reason to upgrade a CPU newer than, let's say, 2008 is to play games. Or to get some special tasks done, like rendering... for "typical" use cases, there is barely any need to upgrade.
Absolutely right, Jeffry, but only if you went with an Intel CPU that delivers the performance to skip upgrade cycles. Now if you wasted your hard-earned money on an AMD watt-sucking, IPC-crippled, GPU-bottlenecking CPU, then you would want to upgrade as soon as you plugged it in and cried "I should have bought an Intel."
Agreed. My parents have been running an old Core 2 Duo for the past 6 years and have had literally 0 issues. I threw a Radeon 6670 in there so my nephew could play Minecraft and such when he is visiting his grandparents, and it runs most of the games he wants to play (including stuff like CS:Source/GO, etc etc) at perfectly acceptable framerates at 1080p. 1080p YouTube videos? Not a problem. Overall cost of the PC when we bought it + the video card I later put into it? $570. Years they've been using it problem-free? 6+
Still rock a P965-based E8600 @ 3.8 GHz... hell of a nice system with budget eBay upgrades (4 GB RAM, GTX 560) - actually playing on it more than my main PC at the moment because of its location in the house. I think I'll be keeping my 3570K @ 4.2 + GTX 980 for a while yet; it should last at least another 2-3 years before an upgrade is necessary (might need to go to 16 GB at some point but that's a trivial upgrade).
Yeah, I'm rocking a 2700K @ 4.3, 760 SLI, 16GB RAM, etc, and have no intention of upgrading for a while. Though I will admit I've gotten the itch BAD this last Christmas/tax season. Right now I'm saving my money to buy that new Acer IPS 1440p G-Sync monitor when it comes out. Assuming it's not eleventy billion dollars.
They took their time getting Iris Pro to the desktop. Arriving too late for me. I got the performance I needed for my kid's Sims 3 on the PC with the AMD A8-7600: USD 250 for an A88X mITX mobo, the A8-7600, and 8GB of dual-rank memory to boost performance.
Did the same. I got an A8 laptop and it does everything for my little one: Minecraft, WoT, LoL, WoW, SWTOR, Sims 4, but most importantly (not that it is demanding) his favorite, AC3.
Is it just me or does Intel have it backwards with Iris Pro on the desktop chip? I feel that if you are going to be buying a higher-end chip like this, would you not most likely be pairing it with a dedicated video card, thereby not needing to use the higher-end Iris Pro?
It depends on your use case really. A desktop chip at a 65W TDP with Iris Pro would be phenomenal for an HTPC that also does more casual gaming. The thing really holding back those types of builds has been integrated graphics performance, and Iris Pro really steps up their game in that department.
Though you're right, Iris Pro availability is moot if you're building a traditional gaming rig because you're going to disable the IGP anyway.
I am waiting for a Broadwell or Skylake Iris Pro desktop part before I upgrade my aging system.
I want to build a system with an Intel iGPU as well as a boot disk based on a PCIe SSD (maybe with NVMe?). Intended use is a dual-boot Windows & Linux general-purpose machine to keep running for the next 10 years.
I'm sitting on my hands until Skylake. As for "the next 10 years" part, you'll be lucky if your motherboard lasts that long. Plan on 4 or 6-year upgrade cycles, unless you think your IDE connectors, AGP slot, and floppy drive from 2005 are still relevant...?
We have old Conroe-based (2006) PCs at work which, having been upgraded to Win 7, an SSD and more RAM over the years, are still more than adequate for running browsers, Office 365 and several business-specific applications.
(To the original question) It seems like most Broadwell parts will NOT be desktop, so if you want a desktop, Skylake may end up being your only viable option. Even if Broadwell desktop did exist, I personally would wait for Skylake anyway (and I am).
Similar here. All C2Duo iMacs except for 2 AMDs and 1 SB i3 (the "workhorse"). Average age, 7-8 years. Mice are all Microsoft Intellimouse, vintage 2000,2001, and the printers all Canon, vintage 2007-2009. They are perfect for financial/accounting software+Excel.
The iMacs run Yosemite (10.10) slowly, but are fine in SL (10.6) or Windows.
Maybe others have less luck with hardware longevity and maybe 10 years is too much to expect, but in terms of performance, 10 years is not too much to hope for. For 90% of what we do, the bottleneck in performance is the user, or waiting for external data input.
That said, I haven't had a "gaming" device since a PS2 or a 600MHz Athlon w/ GeForce2 GPU, and would love to get my hands on a good FPS or Skyrim or the hundreds of other games I have been deprived of for the past 11 years.
I agree on that one. I still have an old Yonah Core Duo running Lubuntu Linux on my laptop. It has its second fan, second HDD and 2 GB of RAM. Works fine for business (email, Office, internet).
Hey, I have an Athlon64 system running with an AGP video card, IDE hard drive, and floppy drive (although never used) connected to our 27" CRT TV via component video, thank you very much! Runs Windows XP, Google Chrome, and the Plex Web Client just fine. It's not on 24/7, but it makes watching shows in the bedroom easy enough.
Just waiting for the hardware in that system to actually die before the TV is moved into the kids' room for use with all the old consoles (NES, SNES, some LeapFrog stuff, Wii, etc). Hoping it lasts another 2-3 years, though.
Hehe, amazing. I also have one of these. It's an AMD Athlon XP 'Barton' (not an Athlon 64; those were the first 64-bit AMDs) running an Ubuntu file server (Samba). Works like a charm...
It is a shame that everyone has to pay for those completely useless iGPUs, which take a lot of development resources, while CPU performance has basically been the same for 5 years.
I agree that the lack of improvement is frustrating, but Intel is merely providing what the market desires. The past 5 years the market has demanded focus on: portability, iGPU, and performance per watt.
If Intel focused on the high end desktop market only, they would be losing money right now.
The other factor is competition. You release IP to market when there is a competitive need. A Sandy Bridge i3 can wallop the latest AMD "CPU" (whatever it's called). If Intel had added a feature to Ivy Bridge that added 20% to integer/floating point performance, AMD could have reverse engineered that and put it in their current chips. If Zen is a performance machine, and I hope it is, we will see some big improvements coming from Intel shortly after.
I dunno if you pay that much for it, really. You might pay for the R&D, but it's the same R&D that will get you that GPU on your Ultrabook where you might want it. As for the silicon on the chip, if unused, it's a bit like saying that someone buying a cut-down i5 CPU is "paying" for the disabled 2MB of cache they aren't using, or that someone buying an i3 is "paying" for the silicon on two disabled cores. Since there is no line of Intel CPUs with disabled iGPUs, I suspect they don't pose much of a problem binning-wise.
I'm sitting on a 4.0 GHz quad Nehalem. I have this irrational feeling that I need to upgrade it based on its age alone (I bought mine 6 years ago), but it's still a well running system. This has certainly been my longest running system (by far). IPC improvements might finally make me jump, though. A lowly 2.6 GHz quad Broadwell should, in theory, perform just as well as my Nehalem. I haven't been concerned with power since it requires all four cores to be loaded before it draws 140W, but it would be nice to drop that to less than half.
Classic system but things have come a long way since then. With more Skylake IPC and power consumption improvements I suggest the time to upgrade is near.
Have a look on eBay for old surplus LGA 1366 Xeons. Assuming your motherboard has a BIOS update for them (almost all socket 1366 boards do), you can pick up a 6-core/12-thread Xeon for a few bucks, and even a low-multiplier X5650 will run 4GHz if your motherboard is okay with a BCLK of 186 or so. Just be careful with voltages - the Gulftown IMC isn't quite the indestructible beast Bloomfield is ;)
It's definitely worth looking into if you want to get a few more years out of your rig while you wait to upgrade.
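For reference, the core clock behind that 4GHz figure is just BCLK times the multiplier; assuming an all-core multiplier in the 21-22x range (the exact turbo bins depend on the chip):

```latex
f_{\text{core}} = \text{BCLK} \times \text{multiplier}, \qquad
186\ \text{MHz} \times 21 \approx 3.9\ \text{GHz}, \qquad
186\ \text{MHz} \times 22 \approx 4.1\ \text{GHz}
```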
Intel has never been interested in selling more than 4 cores to mainstream users (115x sockets) because the workloads were generally not using up more than 2 or 3 cores. Be thankful you get HyperThreading. As Ammaross said, the bigger enthusiast sockets are where you would have to go.
What I want to know is whether a lot of Iris Pro chips will in fact be limited to 64MB as a few rumors have suggested. The original Iris Pro was potentially 128MB-256MB but ultimately just 128MB.
Perhaps the performance boost is too light to justify the extra transistors/cost/heat/space, or maybe it is a pricing issue where the 64MB version can command 80% of the price premium of the 128MB version.
Intel probably has the highest usage rate of graphics in computers of anyone, thanks to all those laptops and business PCs out there. However, at the higher end, where top-end and unlocked i5 and i7 processors sit, integrated graphics serves no purpose other than adding wasted silicon until someone properly implements offloading floating-point calculations to the embedded graphics core. Anyone buying an unlocked processor is unlikely to be bothered whether a processor has HD 3000/4000/Iris Pro at this point in time.