lazarpandar - Wednesday, March 19, 2014 - link
first!
tipoo - Wednesday, March 19, 2014 - link
I think "first" posts with nothing else should constitute insta-bans :)R0H1T - Thursday, March 20, 2014 - link
Agreed. Btw, I'm willing to bet that even the most ardent of Intel fans would think thrice (if not more) about spending ~$700 on Iris Pro, even if it's unlocked, because you're better off spending a grand on an 8-core monstrosity (-:
Homeles - Thursday, March 20, 2014 - link
Good thing these won't be selling for $700. Not even close. Iris Pro *today* can be picked up for ~$255.
R0H1T - Thursday, March 20, 2014 - link
So you're saying a socketed version of Iris Pro can be had for ~$255? Hmmm, I'd like to see that, as in where can I buy it? And I'm assuming you meant retail products and not just for OEMs, right?
nathanddrews - Thursday, March 20, 2014 - link
4th Gen (current) Intel IGP supports DX12. Wowza.
8steve8 - Thursday, March 20, 2014 - link
I don't see any reason to believe they'll be significantly more expensive than the Haswell K-series parts. If you have evidence to suggest it'll be $700, share it.
Klimax - Friday, March 21, 2014 - link
Most likely it is taken from the price of the Gigabyte Brix Pro, forgetting that isn't just a CPU...
Samus - Saturday, March 22, 2014 - link
Should be interesting to see how well the Iris GPU core overclocks. Being 14nm, I doubt it will be clocked the same as the HD 5200.
ryrynz - Saturday, April 12, 2014 - link
I remember reading there will be more execution units and no doubt there will be a frequency increase as well. I expect a good 20% improvement, possibly more.
Kevin G - Thursday, March 20, 2014 - link
Depends on your workloads. The L4 cache would boost IPC, and typically you can get the quad-core chips to clock higher than the 6+ core chips. So for a less parallel task (gaming etc.), a Haswell with Iris Pro would make more sense than the 8-core Haswell-E.
Of course I wouldn't object to Intel releasing Haswell-E with some L4 cache, which would make my above point moot. :)
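A quick way to see that quad-vs-8-core trade-off is Amdahl's law. A minimal sketch, assuming a 4-core part at 4.4GHz against an 8-core part at 3.3GHz with identical IPC (illustrative numbers, not measured figures):

```python
# Amdahl's-law sketch: fewer, faster cores vs. more, slower cores.
# Clock speeds and parallel fractions below are illustrative assumptions.

def throughput(cores: int, clock_ghz: float, parallel_fraction: float) -> float:
    """Relative throughput: the serial part runs on one core, the parallel part on all."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return clock_ghz * speedup

for p in (0.5, 0.8, 0.95):
    quad = throughput(4, 4.4, p)
    octo = throughput(8, 3.3, p)
    print(f"parallel fraction {p:.2f}: 4-core {quad:.1f} vs 8-core {octo:.1f}")
# Below some parallel fraction the higher-clocked quad core wins,
# which is the gaming case described above.
```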
willis936 - Friday, March 21, 2014 - link
Those chips usually have significantly more L3, which has a much higher bandwidth than L4. If I had to pick between more L3 and a large L4 I'd probably pick the larger L3. As for games: I doubt they'll need that much performance even five years down the line.
jragonsoul - Thursday, March 20, 2014 - link
Good to know.
Sirk77 - Monday, March 24, 2014 - link
IDK. If memory serves, these Iris Pros have a 128MB L4 cache. Furthermore, if you have a dedicated video card, that HUGE L4 cache is still available to the processor. I would rather have the large cache than extra cores that will more than likely go unused.
purerice - Wednesday, March 19, 2014 - link
So by unlocked does this mean a non-"K" series? If they're just adding Iris Pro to the "K" series it would seem odd, as most "K" buyers probably have dedicated GPUs anyway.
Kevin G - Wednesday, March 19, 2014 - link
It'll still help even with a dedicated GPU: the eDRAM can still be used as a large L4 cache. That'll give a slight boost to IPC.
Solandri - Thursday, March 20, 2014 - link
If the gains from a large L4 cache were "worth it", Intel would be putting it on all their CPUs as a general way to improve performance. If you're going with a dedicated GPU, I can see this only being worthwhile if you've got some specialized task in mind which benefits disproportionately from using the eDRAM as a cache.
qap - Saturday, March 22, 2014 - link
A decade ago you could've made the same argument about L3 cache (and years before that about L2$, and L1$ before that). With the 22nm process it doesn't make sense to add a large L4 to all (high-end) CPUs, but at 14nm I can easily imagine reserving some 35mm² for a 128MB L4 on-die (or half of that for 64MB). Probably we won't see it in Broadwell, but in Skylake it is possible.
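Rough back-of-envelope math behind that 35mm² figure. The inputs are assumptions for illustration: the 22nm Crystalwell eDRAM die is commonly cited at roughly 80mm² for 128MB, and an ideal full-node shrink halves area:

```python
# Back-of-envelope eDRAM area scaling. Assumed inputs, not datasheet values:
# ~80 mm^2 for the 128 MB Crystalwell eDRAM die at 22nm, and an idealized
# 2x density gain from a 22nm -> 14nm shrink.
crystalwell_mm2_22nm = 80.0   # assumed die area for 128 MB at 22nm
shrink_factor = 2.0           # assumed ideal area scaling per node

area_128mb_14nm = crystalwell_mm2_22nm / shrink_factor
area_64mb_14nm = area_128mb_14nm / 2

print(f"128 MB L4 at 14nm: ~{area_128mb_14nm:.0f} mm^2")  # ~40 mm^2
print(f" 64 MB L4 at 14nm: ~{area_64mb_14nm:.0f} mm^2")   # ~20 mm^2
```

Under those assumptions the estimate lands in the same ballpark as the 35mm² guess.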
willis936 - Thursday, March 20, 2014 - link
I don't think K is something to be desired, since it means some fancy virtualization extensions are disabled. If it's unlocked, has those extensions, and L4 cache, then it's a winner of a chip.
ImSpartacus - Thursday, March 20, 2014 - link
It might be nice for users that want a nice GPU, but might not be able to afford a GPU to match.
You could use the IGP for a few months and be moderately satisfied until you could afford a respectable GPU.
MonkeyPaw - Wednesday, March 19, 2014 - link
Unlocked with a higher TDP might be scary-good, especially if Broadwell packs even more EUs and/or higher clocks.
rhx123 - Wednesday, March 19, 2014 - link
Small prediction from me on this CPU: no 16x PCIe 3.0 lanes.
Kevin G - Wednesday, March 19, 2014 - link
To maintain socket compatibility, it will likely have the 16 PCIe lanes.
What Intel will neuter will be everything else like TSX, VT-d, ECC memory support etc.
extide - Wednesday, March 19, 2014 - link
Don't believe everything Charlie says... I mean, remember what the name of his site is, after all...
tviceman - Wednesday, March 19, 2014 - link
I still have absolutely no interest in iGPU graphics, especially when Nvidia is demonstrating better performance AND lower power consumption on 28nm with GM107 in mobile form. I'd personally just rather NOT pay the iGPU tax associated with buying a high end processor, and use that money towards any video card I want to get.
8steve8 - Wednesday, March 19, 2014 - link
It's a pretty safe bet that a Broadwell with Iris Pro will have lower power consumption than a Broadwell + any non-integrated GPU... not to mention all the extra materials, resources, and physical space the non-integrated GPU takes.
tviceman - Wednesday, March 19, 2014 - link
It's not a safe bet. 20nm GM107 Maxwell should be about 25-30% more efficient than its current iteration, and the 840M + CPU is already way more efficient than a CPU with Iris Pro running full tilt.
http://images.anandtech.com/doci/7834/NVIDIA-GeFor...
duriel - Thursday, March 20, 2014 - link
While the 840M is certainly more efficient, I wouldn't take those numbers at face value.
The 840M is paired with a 15W chip (i5-4200U, a dual-core i5) while the Iris Pro is running on a 47W chip (i7-4750HQ, a quad-core i7). That does not explain the whole power gap, but I suspect the power draw of a theoretical Iris Pro + i5-4200U CPU would be sub-25W. Also, while Skyrim is typically a CPU-intensive game, I suppose the low FPS numbers indicate Ultra settings, making it mostly GPU-bottlenecked. Perhaps that is why Nvidia chose to pair the 840M with a low-power CPU.
rhx123 - Thursday, March 20, 2014 - link
The 840M is not paired with a 4200U because the 4200U does not have 16x PCIe 3.0 lanes.
It has to be paired with a standard-voltage CPU.
duriel - Thursday, March 20, 2014 - link
I am just quoting from the fine print on the Nvidia "unmatched efficiency" slide: http://images.anandtech.com/doci/7834/NVIDIA-GeFor...
Ok, it says i5-4210U, but I could not find that particular SKU online. In any case it would be unrealistic to expect 17W system power running Skyrim at max on a standard-voltage CPU.
8steve8 - Saturday, March 22, 2014 - link
Look closely: that Nvidia image you linked to is comparing a 47W CPU/iGPU with Iris Pro to a 15W CPU/iGPU with an 840M... which is irrelevant, since of course a 47W CPU system will take more power than a 15W ultrabook-class CPU system, even when the latter has a non-integrated GPU. The one that takes more power in that comparison will have an order of magnitude more CPU performance.
So let me revise my statement:
It's a pretty safe bet that a Broadwell with Iris Pro will have lower power consumption than a Broadwell with similar CPU performance plus any non-integrated GPU.
hrga - Tuesday, April 15, 2014 - link
It's Nvidia's slideware as usual. It doesn't actually measure anything or make an adequate comparison; Nvidia just releases slides that will lure users towards their camp. Once there, they'll figure out for themselves that they were sold something bad.
The only reason I would take a dedicated GPU, preferably from ATI, PowerVR, or Nvidia, over anything of Intel's is the eye-candy features, which in the past have always beaten Intel's robust, effective office GPU that lacks support for true graphics in any contemporary D3D/OGL application. Yet Intel keeps overpromising to improve that whenever fancy, just-around-the-corner (since 2006) raytracing apps come up.
Kevin G - Wednesday, March 19, 2014 - link
Yes! Finally a desktop chip with the 128 MB of eDRAM L4 cache.
Flunk - Wednesday, March 19, 2014 - link
That's what I was thinking too. The iGPU for high end... no thanks. But that extra cache, it could be interesting to see what you could do with that.
That iGPU would be good for media PCs too, but that's already possible with the BGA chips.
nevertell - Thursday, March 20, 2014 - link
It's the iGPU that ensures that AVX instructions actually bring a performance boost. The same goes for OpenCL applications, which, while nonexistent at the moment, will work faster on an iGPU that shares the same memory address space as the CPU than on a discrete GPU in most consumer/prosumer use cases.
Flunk - Thursday, March 20, 2014 - link
Intel would need to bring some seriously big changes to their iGPUs to make that even remotely possible. AMD is a much better bet in that regard because of hUMA and their more compute-friendly GCN architecture.
I'm always open to being proven wrong, in fact I encourage it. Please prove me wrong, Intel!
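For the shared-address-space point above, the win comes from mapping a buffer that both CPU and GPU can see instead of copying it across PCIe. A minimal pyopencl sketch of that zero-copy pattern, assuming a pyopencl install and a driver that actually backs ALLOC_HOST_PTR with shared memory, as integrated GPUs generally do:

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf, mapf = cl.mem_flags, cl.map_flags

n = 1 << 20
# ALLOC_HOST_PTR asks the driver for host-visible memory; on an iGPU this
# typically lands in the shared address space, so no PCIe copy is needed.
buf = cl.Buffer(ctx, mf.READ_WRITE | mf.ALLOC_HOST_PTR, size=n * 4)

# Map the buffer and fill it directly from the CPU side (zero-copy on an iGPU).
host_view, _ = cl.enqueue_map_buffer(queue, buf, mapf.WRITE, 0, (n,), np.float32)
host_view[:] = np.arange(n, dtype=np.float32)
del host_view  # dropping the mapped array unmaps the buffer

prg = cl.Program(ctx, """
__kernel void double_vals(__global float *a) {
    int i = get_global_id(0);
    a[i] = 2.0f * a[i];
}
""").build()
prg.double_vals(queue, (n,), None, buf)

# Map again to read results; on a discrete GPU this map implies a bus copy.
result, _ = cl.enqueue_map_buffer(queue, buf, mapf.READ, 0, (n,), np.float32)
print(result[:4])  # [0. 2. 4. 6.]
```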
Mondozai - Thursday, March 20, 2014 - link
"the iGPU for high end... no thanks"You're thinking of this from a desktop-centric perspective. The high-end iGPU is primarily for mobile. It would be weird to just cut it out of the processor for desktop. Also, you could stream 4K stuff easier with it than with weaker iGPUs. Most people are still on SB/IB.
larkhon - Thursday, March 20, 2014 - link
"You're thinking of this from a desktop-centric perspective."maybe because in the slide above it's written Iris Pro coming for 5th gen Desktop CPUs?
MrSpadge - Thursday, March 20, 2014 - link
Agreed - that's going to replace my 3770K! The eDRAM will keep the cores happy during heavy BOINC number crunching and will help significantly with iGPU number crunching (Einstein@Home) by feeding that increased number of shaders. DDR3-2400 is barely enough for the HD 4000 with just 16 shaders!
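The bandwidth arithmetic behind that point, as a rough sketch. The eDRAM figure is the commonly quoted ballpark for Crystalwell, treated here as an assumption, and EU counts stand in for "shaders":

```python
# Peak DRAM bandwidth vs. a growing EU count: rough, assumed numbers.
# Dual-channel DDR3-2400: 2 channels * 8 bytes * 2400 MT/s.
ddr3_2400_gbs = 2 * 8 * 2400e6 / 1e9   # ~38.4 GB/s, shared with the CPU cores
edram_gbs = 50.0                        # assumed ~50 GB/s ballpark for Crystalwell eDRAM

for eus in (16, 20, 40, 48):
    # Bandwidth per execution unit if the iGPU had main memory to itself.
    print(f"{eus} EUs: {ddr3_2400_gbs / eus:.1f} GB/s per EU from DDR3-2400")

print(f"eDRAM adds roughly {edram_gbs:.0f} GB/s on top, without touching DDR3")
```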
8steve8 - Wednesday, March 19, 2014 - link
Excited about this. Haswell was the only generation I've skipped in years in the desktop space... because I don't game much, but enough to want the Iris Pro.
willis936 - Wednesday, March 19, 2014 - link
The real story is whether they maintain socket compatibility. They're not really known for that, but since none of these announcements seem to be consumer-chipset-feature centric, there's a chance they're not planning a new chipset for Broadwell.
Laststop311 - Thursday, March 20, 2014 - link
Actually, they usually do keep the same socket if it is just a die shrink of the current architecture. When 32nm Sandy Bridge shrunk to 22nm Ivy Bridge it kept the same socket; then when IB changed to Haswell it was a new architecture, still on 22nm, so the socket changed. Broadwell is just a die shrink to 14nm; it will not have DDR4, PCIe 4.0, or SATA Express, so I don't see any reason why it would not use the same LGA 1150 socket.
When Intel goes from 14nm Broadwell to the new Skylake architecture on 14nm, which will bring DDR4, PCIe 4.0, and SATA Express to mainstream desktop users, we will have to see a new socket for the new features.
Laststop311 - Thursday, March 20, 2014 - link
Broadwell will also still only have 16 PCIe lanes on the mainstream socket, just like Haswell does. Nowhere close to the 40 lanes Haswell-E has on the enthusiast socket. We will have to wait for the Skylake 14nm architecture changes, and even then the mainstream LGA 1151 socket will get 20 PCIe lanes, still very short of the 40 lanes the enthusiast socket gives.
I hate how Intel does that. They purposefully cut the lanes so you'll buy the pricier enthusiast boards and chips to get the full 40 lanes.
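For scale, the raw numbers behind those lane counts (simple peak-bandwidth arithmetic from the PCIe 3.0 line rate and encoding):

```python
# Peak PCIe 3.0 bandwidth per direction: 8 GT/s with 128b/130b encoding,
# i.e. roughly 0.985 GB/s per lane.
per_lane_gbs = 8e9 * (128 / 130) / 8 / 1e9   # ~0.985 GB/s

for lanes in (8, 16, 20, 40):
    print(f"x{lanes}: ~{lanes * per_lane_gbs:.1f} GB/s per direction")
# x8 is ~7.9 GB/s each way, which is why dual-card x8/x8 setups on the
# 16-lane mainstream socket rarely bottleneck games in practice.
```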
Homeles - Thursday, March 20, 2014 - link
The vast, vast majority of consumers don't need remotely close to 40 lanes. You're a niche user -- you have to pay niche pricing. Get over it.
1Angelreloaded - Thursday, March 20, 2014 - link
Socket 2011 isn't enthusiast-centric; X58, X79, and X99 are really hybrid boards that allow workstation/regular PC hybrids, similar to the Titan series GPUs. Fact of the matter is, for an enthusiast, especially gamers, you really don't need anything beyond PCIe 3.0 x8 in a dual configuration (SLI/Crossfire); you will never bottleneck at the bus level. Now for workstations, well, let's just say $3-5k GPU cards, x4 in some cases, isn't really ever enough.
Flunk - Thursday, March 20, 2014 - link
It's a lower cost version of the Xeon platform for servers.
Kevin G - Thursday, March 20, 2014 - link
That 20-lane figure is a bit deceptive. It likely includes the DMI link, which uses a PCIe physical layer.
hrga - Tuesday, April 15, 2014 - link
According to Intel's old slideware (circa IDF 2011), that couldn't be more wrong:
http://www.eteknix.com/intel-14nm-skylake-will-sup...
JBVertexx - Thursday, March 20, 2014 - link
Please guys, enough with using the wrong subject/verb agreement with companies. Intel is singular, not plural. Basic grammar.
Ian Cutress - Thursday, March 20, 2014 - link
In British English, companies are plural entities. Basic grammar.
errorr - Thursday, March 20, 2014 - link
It is called synesis or notional agreement, and it is a regular feature of British English and perfectly grammatical in American English as well, although in America it puts a more explicit emphasis on the collective nature of the noun. Usually in American English you will see the mass noun become a compound noun with the addition of a word like "members" or "employees" &c. if you want to use notional agreement.
mikk - Thursday, March 20, 2014 - link
Of course it's not HD 5200, Broadwell gets a completely new GPU.
1Angelreloaded - Thursday, March 20, 2014 - link
NO, just no, stop with the damn graphics on the CPU die already. I can understand this for OEM applications, but it does not belong in mid to higher end consumer products. A lot of us are made to pay for this poor video solution that we never use, and most likely never will. The stupidity of the industry in the last 10 years has become a contest of who can price gouge, selling individual units at inflated costs to basically price-fix the entire market with micro improvements, and holding on to outdated, poor standards that have become a stifling agent.
MrSpadge - Thursday, March 20, 2014 - link
Guess what? You'll pay the same for an Intel CPU regardless of it containing a GPU or not. Just look at the prices of those socket 1150 Xeons with and without GPUs.
unixbrain - Thursday, March 20, 2014 - link
Actually, the lowest price for the consumer involves leaving the integrated graphics in place. Despite what the AnandTech comment section might lead you to believe, the vast majority of Intel's customers do not want or need discrete graphics, and your preference is the minority. Maybe not the minority of enthusiasts, but the minority nonetheless.
Producing separate versions of each core without the GPU just for you would almost certainly involve charging MORE because of the additional research, engineering, and manufacturing lines that would be required for these separate, minority-oriented products.
Also I am very sorry that you are disappointed with the iterative development process. I happen to think we've (cumulatively) seen some pretty drastic improvement in the last decade.
larkhon - Thursday, March 20, 2014 - link
Seems pretty pointless to invest in an iGPU in high-end CPUs, but I think with new technologies like HSA we might be able to maximize the usage of the whole CPU/GPU. Now the question is which standard is going to be the most popular and/or supported by Intel.
stephenbrooks - Thursday, March 20, 2014 - link
Actually it's not *that* daft. If you imagine an engineer who does mostly CPU-heavy work but still wants to see it rendered in 3D, they probably don't need enough GPU horsepower to run Skyrim on Ultra settings, but they need more than the emulation you get with no GPU at all. Hence these high-end CPUs with mid-end iGPUs are a cost-effective way of getting that (assuming the company is being sensible and avoiding the Xeon price gouge).
jimjamjamie - Thursday, July 10, 2014 - link
An i7 Broadwell with Iris Pro, 16GB RAM and a fast SSD sounds like a sweet CAD/3D workstation. On top of that you could probably get some Minecraft or TF2 in with £0 spent on a graphics card.
just4U - Friday, March 21, 2014 - link
It will be interesting to see what they come up with here. Any ideas (or rough guesses...) on launch dates? I saw no reason to upgrade to the 3x/4x series by Intel, but something like this will be a lot more tempting.
HardwareDufus - Friday, March 21, 2014 - link
Cool. Looks like I'll be getting an i7-5770K for my next mini-ITX build.
ZveX - Sunday, March 23, 2014 - link
Sub-$70 Pentiums with an Iris Pro iGPU would be awesome.
TheinsanegamerN - Sunday, March 23, 2014 - link
The Broadwell unlocked Pentium with the Iris Pro GPU would be the perfect combo.
BMNify - Sunday, March 23, 2014 - link
Several points to consider here. The fact that the Pentium, even the quad core, does NOT have AVX or AVX2 is a severe problem for a mass-market CPU today, as is the fact that people don't seem to realize that Intel has at best two or three tick/tocks to get to generic UHD-2: 7680 pixels wide by 4320 pixels high (33.2 megapixels), requiring 16 times the number of pixels of current 1080p HDTV. It is the generation after 4K, which is set at 3840 pixels wide by 2160 pixels high.
It's also apparent that many are unaware of the true long-term UHD initiatives by NHK and the BBC R&D: http://www.bbc.co.uk/rd/blog/2013/06/defining-the-... You should also be aware that NHK is to give an 8K broadcast demo at the NAB Show:
The Japanese public broadcaster NHK is planning to give a demonstration of "8K" resolution content over a single 6MHz-bandwidth UHF TV channel at the National Association of Broadcasters (NAB) Show coming up in Las Vegas, Nevada, April 5 to 10.
This will be the first demonstration of an 8K Super Hi-Vision over-the-air broadcast outside Japan, according to the NAB Show organizers.
Engineers from NHK are also set to present details of a long-distance, single-channel, over-the-air 8K test broadcast conducted recently in Japan.
In order to transmit the 8K signal, whose raw data requirement is 16 times greater than an HDTV signal, it was necessary to deploy additional technologies.
These include ultra multi-level orthogonal frequency-division multiplexing (OFDM) transmission and dual-polarized multiple-input multiple-output (MIMO) antennas.
This was in addition to image data compression. The broadcast uses 4096-point QAM modulation and MPEG-4 AVC H.264 video coding.
Flagship broadcasting events such as the Olympic Games or the soccer World Cup often do much to stimulate switchover. Tokyo's successful bid to host the summer Olympic Games in 2020 is seen as the event which may mark the beginning of the 8K era...
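A quick sanity check on the "16 times greater" raw data claim; the frame rate and bit depth below are assumptions for illustration:

```python
# Raw (uncompressed) video rates: pixels * bits/pixel * frames/sec.
# Assumed 24 bits per pixel and 60 fps for both formats, for comparison only.
def raw_gbps(width: int, height: int, bpp: int = 24, fps: int = 60) -> float:
    return width * height * bpp * fps / 1e9

hd_1080p = raw_gbps(1920, 1080)   # ~3.0 Gbit/s
uhd2_8k = raw_gbps(7680, 4320)    # ~47.8 Gbit/s

print(f"1080p raw: ~{hd_1080p:.1f} Gbit/s")
print(f"8K raw:    ~{uhd2_8k:.1f} Gbit/s ({uhd2_8k / hd_1080p:.0f}x)")
# 7680*4320 / (1920*1080) = 16, hence the 16x figure; squeezing that into a
# single 6 MHz channel is why 4096-QAM plus H.264 compression is needed.
```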
HardwareDufus - Tuesday, March 25, 2014 - link
When all of this 8K stuff is ready... well, then I'll just use my system with an i7-7770K to run it...
So, I'll get several years of satisfactory computing and media consumption with my shiny new i7-5770K.