icrf - Wednesday, August 26, 2015 - link
Damn, I was hoping for a Skylake powered HP Stream Mini to hook up to my HDMI 2.0 / HDCP 2.2 Vizio M70-C3.
drzzz - Wednesday, August 26, 2015 - link
While I know this is analyzing media performance, I find it sad that AT did not find it newsworthy to mention that the GT4e parts will have 72 execution units, considering this is the first article specifically on Intel Gen9 GPUs. Yes, GT3/GT3e moving up to 48 from 40 is in one of the slides, but nowhere in this article is the EU count for GT4e included. I sure hope AT is going to release an article that discusses GT4e specifically and how much performance we can expect from it.
witeken - Wednesday, August 26, 2015 - link
GT4e will be discussed when they review GT4e...
nathanddrews - Wednesday, August 26, 2015 - link
Yes, that should really get its own review.
jeffkibuule - Wednesday, August 26, 2015 - link
Commenters these days, no respect.
MrSpadge - Wednesday, August 26, 2015 - link
To be fair, Intel did not even specify which SKUs are going to get which GPU. In fact, officially they didn't say anything specific about the SKUs apart from the 2 released K CPUs. So it would be very hard to judge GT4e performance, as it will depend significantly on TDP. Just look at how much power Broadwell GT3e already consumes under GPU load.
Kjella - Wednesday, August 26, 2015 - link
How much power compared to what? The i7-5775C (65W CPU+GPU) benchmarks fairly similarly to an R7 250 (65W GPU) or a GTX 750 (55W GPU). I couldn't find any benchmarks for the i7-5950HQ (47W CPU+GPU), but it seems to me Intel's performance/watt is pretty good. It's their performance/$ that sucks, but that's because they totally own the high-end laptop CPU market. A full GT4e is probably intended to compete with the lower end of the NVIDIA 9xxM series, which use 50-120W by themselves.
rtho782 - Thursday, August 27, 2015 - link
Bear in mind that in the case of the Intel CPUs you have not factored in the RAM, whereas for the graphics cards you have.
Of course, the CPU part not present in the dedicated GPUs probably uses more power than the VRAM not present in the CPU, but it just shows how hard it is to make a fair comparison.
MrSpadge - Thursday, August 27, 2015 - link
Well, compared to CPUs with GT2 GPUs, Broadwell with GT3e seems to consume 30-40 W more under GPU load. Now add 50% more shaders and it's easy to see how this chip might run into power limits, depending on the SKU. The performance of a chip with a 95 W TDP and one with a 37 W TDP will differ wildly.
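As a rough back-of-envelope illustration of that concern (a minimal sketch; the 35 W GT3e graphics-load figure, the assumption of roughly linear scaling with EU count, and the example TDPs are all illustrative assumptions, not measurements):

```python
# Back-of-envelope estimate of how a 72-EU GT4e might fit into different TDPs.
# All inputs are illustrative assumptions, not measured values.

gt3e_gpu_power_w = 35          # assumed extra draw of Broadwell GT3e under GPU load (the "30-40 W" above)
gt3e_eus, gt4e_eus = 48, 72    # EU counts: Broadwell GT3e vs. Gen9 GT4e

# Naive linear scaling with EU count (ignores clocks, binning, eDRAM, process, etc.)
gt4e_gpu_power_w = gt3e_gpu_power_w * gt4e_eus / gt3e_eus

for tdp_w in (95, 65, 45, 37):
    cpu_budget_w = tdp_w - gt4e_gpu_power_w
    print(f"{tdp_w:>3} W TDP -> ~{gt4e_gpu_power_w:.0f} W for the GPU, "
          f"~{cpu_budget_w:.0f} W left for the CPU cores")
```

On those assumptions a full GT4e alone would want around 50 W, which is why the same GPU configuration could behave very differently in a 95 W desktop SKU than in a 37-45 W mobile one.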
Kutark - Wednesday, September 2, 2015 - link
Here's my one issue with integrated graphics. Even if it is as fast as, say, a hypothetical 950/940/935M (whichever happens to be the case), and even if it has a similar TDP, the difference is that with a "discrete" GPU the manufacturer can move the GPU to another area where cooling it is significantly easier. I wonder how much of a cooling issue having the CPU and GPU on the same die/socket/package is going to cause with these higher-end parts.
The other thing I wonder about is where they will come into use in the desktop market, where cooling isn't really an issue thanks to the lack of space limitations. Once you get into that performance area, people start to be proper "gamers" and are much more likely to invest in a higher-cost discrete GPU to get more gaming power.
I guess the AMD integrated GPUs are a good example of where that market might stand, and frankly I don't know anything about the sales figures or market share on those. I can't imagine it's high. But hey, I'm just spitballing here.
lukarak - Thursday, August 27, 2015 - link
GT3 already has 48 EUs in gen 5 (Broadwell). The article says that GT4e has 3 slices while GT3e has 2 slices and 48 units; I think everybody can do some basic math.
jmelan - Wednesday, August 26, 2015 - link
The lack of native HDMI 2.0 and HEVC is disappointing.
HardwareDufus - Wednesday, August 26, 2015 - link
Maybe we will see this improvement with Kaby Lake?
Ken_g6 - Wednesday, August 26, 2015 - link
HEVC Main profile is supported natively. Just not HEVC Main 10.
gue22 - Tuesday, September 1, 2015 - link
The article says HEVC Main10 is supported, but needs the GPU. As far as I understood it, there is some dedicated hardware for the codec, and then there is the integrated GT2 GPU that is needed for Main10.
So it all looks fine to me.
An understanding problem on whose part?
Cheers
G.
jay401 - Wednesday, August 26, 2015 - link
Yeah yeah, now if only the things were in stock in the US so we could, I dunno, BUY THEM.
ganeshts - Wednesday, August 26, 2015 - link
Newegg has the i5-6600K available right now if you are impatient :) http://www.newegg.com/Product/Product.aspx?Item=N8... ; We will soon see the low power Skylake parts come out :)
Impulses - Thursday, August 27, 2015 - link
Yeah, it's the 6700K that's nowhere to be seen... Came in stock 2-3 times at Newegg but only sold in combos, and sold out within 5 min each time. Was put up for order at Amazon once but with no ETA and later pulled (my order stayed active tho).
Came in stock once at Tiger for $50 over MSRP and sold out in short order, not sure if they've actually shipped any... Been available for pre order at BLT & NCIX but ETAs keep getting pushed back.
In summary, the 6700K was a paper launch through and through, early-2000s style... Dunno why the press isn't taking Intel to task over it; I mean, they've had two different launch events already.
kenansadhu - Wednesday, August 26, 2015 - link
I really want to see how good Core M will be on Skylake. If the CPU can sustain higher clock speeds and the GPU is at least 30% faster, we're going to have awesome Windows tablets/convertibles this year. Finally they will be thinner than the Surface Pro 3 yet with fewer compromises.
Byte - Wednesday, August 26, 2015 - link
I wouldn't hold your breath. The power improvements on the CPU side are almost nil on the desktop.
MrSpadge - Thursday, August 27, 2015 - link
The duty cycle optimization is a perfect match for Core M.
frankpc - Wednesday, August 26, 2015 - link
Gigabyte now shows that the GA-Z170X-Gaming 7 complies with HDMI 2.0 and HDCP 2.2, and that it uses the MegaChips LSPCon. It is stated above that the system cost will be higher with that device. Is $220 about as low as a motherboard will be priced? Or is that still too high?
Byte - Wednesday, August 26, 2015 - link
You can wait for a combo deal; the last 4 releases they had CPU and mobo for $50-$75 on the egg. Don't know about this time though, since the prices are all up for the i7 series.
gue22 - Tuesday, September 1, 2015 - link
From http://anandtech.com/show/9485/intel-skylake-z170-... of August 5, 2015:
"For USB 3.1, GIGABYTE would seem to have an initial exclusive of Intel’s Alpine Ridge controller. This is a four-lane PCIe controller that supports full speed on two USB 3.1 ports as well as Power Delivery 2.0 support (up to 36W) on Type-C connectors. For the G1, we get a Type-A and a Type-C here. With HDMI 2.0 support through Alpine Ridge, we’re at a bit of a conundrum here – on the board in white letters it explicitly states HDMI 2.0, but none of the marketing materials from GIGABYTE I have actually use it in any as a marketable point. So we’re unsure if it is indeed HDMI 2.0 capable, what standard, if this is via the AR controller or if this is a separate LS-Pcon. Then again, this is a motherboard designed for discrete gaming cards rather than integrated graphics.
Edit: We can confirm that HDMI 2.0 is via a separate LS-PCon, although it will need a future firmware update before it can be used."
Cheers
G.
Reflex - Wednesday, August 26, 2015 - link
I am curious whether Skylake can drive more than one Miracast display, and whether it counts against the display cap of 3. I found something interesting in my Win10 update: in Windows 8, my Surface Pro 3 i7 could drive 3 displays total via DP (using daisy chaining). If I connected to my TV via Miracast, one of the LCDs would get disabled. But in Windows 10 I can add a Miracast display as a fourth display in addition to my 3 directly connected displays.
I was surprised at this change. Anyone have any info on that?
veek - Wednesday, August 26, 2015 - link
Idiotic article filled with buzzwords - you'd think this was a PhD thesis, except it's filled with marketing crap.
name99 - Thursday, August 27, 2015 - link
Perhaps the problem is with the reader, not with the article?
You are welcome to not be interested in how these features are developing over time. (I'm certainly not especially interested in, for example, how Intel Enterprise features are developing.)
But to imagine that *no-one* is interested in these features is myopic, narcissistic, and, above all, foolish.
BrokenCrayons - Thursday, August 27, 2015 - link
Yeah, agreed. I personally don't find anything at all interesting about media capabilities beyond the basics: whether a video I want to watch actually plays (something rare anyway, since I don't enjoy viewing video content on any screen or device), and whether the battery isn't drained too quickly when the device is on battery power at the time. But there are other people who are very, very interested in this sort of article and find the details interesting and meaningful.
gue22 - Tuesday, September 1, 2015 - link
With idiots it's often like with the cages and the population at the zoo: it's topologically difficult to tell who's inside and who's outside!
ToTTenTranz - Thursday, August 27, 2015 - link
4K60p at 240 Mbps - that's twice the maximum bitrate of the UHD Blu-ray standard. I can only imagine how good that would look, given the proper viewing equipment of course.
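For a sense of scale, here is a quick worked calculation of what 240 Mbps means at 3840x2160 and 60 fps (a minimal sketch of the raw arithmetic only; it says nothing about codec efficiency):

```python
# Raw arithmetic for a 240 Mbps 4K60 stream: bits available per pixel per
# frame, and storage consumed per hour. No codec-specific assumptions.

width, height, fps = 3840, 2160, 60
bitrate_bps = 240e6

bits_per_pixel = bitrate_bps / (width * height * fps)   # ~0.48 bits/pixel/frame
gb_per_hour = bitrate_bps * 3600 / 8 / 1e9               # ~108 GB per hour

print(f"~{bits_per_pixel:.2f} bits per pixel per frame")
print(f"~{gb_per_hour:.0f} GB per hour of video")
```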
hansmuff - Thursday, August 27, 2015 - link
If I bought one of those Skylake CPUs and had dedicated PCI-Express graphics, could I still make use of Quick Sync and/or some of the multimedia magic talked about, or was that something quirky that Virtu tried and that never worked 100%?
ganeshts - Friday, August 28, 2015 - link
You should be able to do this easily with a BIOS setting and Windows 8.x or later: enable the iGPU monitor option in the BIOS and create a dummy display on that adapter in Windows. Quick Sync will be available after that. Step-by-step instructions: https://mirillis.com/en/products/tutorials/action-...
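If you want to verify that Quick Sync is actually reachable once the dummy display is set up, here is a minimal sketch that asks ffmpeg to do a short QSV hardware encode of a synthetic test clip (it assumes an ffmpeg build with Quick Sync/QSV support is installed and on the PATH; this approach is an illustration, not part of the linked tutorial):

```python
# Quick check that Intel Quick Sync (QSV) encoding works on this machine.
# Assumes an ffmpeg build with QSV support is installed and on the PATH.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner", "-loglevel", "error",
    "-f", "lavfi", "-i", "testsrc=duration=1:size=1280x720:rate=30",  # 1-second synthetic clip
    "-c:v", "h264_qsv",   # H.264 encode via the Quick Sync hardware encoder
    "-f", "null", "-",    # discard the output; only the return code matters
]

result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode == 0:
    print("Quick Sync encode succeeded - QSV is usable.")
else:
    print("Quick Sync encode failed:")
    print(result.stderr)
```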
MrBowmore - Friday, August 28, 2015 - link
I'm still waiting for the Skylake CPU rundown. It's been a while since IDF now. I want to look into the architectural differences like in the good ol' days.
willis936 - Saturday, August 29, 2015 - link
I'm reading all of these words like "low power" and "higher bitrate support" but all I see is "image quality in the toilet".
rstuart - Sunday, August 30, 2015 - link
The Alpine Ridge slide says it supports USB 3.0. I thought it supported USB 3.1.