Intel is just trying, at worst, to over-embellish. If you're unable to separate the meat from the puffery, I recommend allocating your purchase decision-making to a parent/wife/family member.
All we do know is that IVB can run one old DX11 game at what looks like 720p. That's it.
All the rest means jack until Anandtech and others get a hold of some IVB hardware and test the shit out of them.
Re: Re 1: No. Turbo is a function that works when you have TDP overhead. When running a game, the CPU cores are being hit continuously as well as the GPU, so neither of them is in a low enough power state to allow the other to turbo. From Jarred and Anand's own review: "The sole victory for Intel comes in the lightly-threaded StarCraft II where Intel can really flex its Turbo Boost muscles." So for any modern game (i.e. more than 2 threads [so a DX10+ render path]) turbo cannot kick in for any APU solution. If turbo is working in the demo then it just means they are running the DX9 code path with a sampling of simple DX11 features, which means they are lying anyway.
Re: Re 2&3: Correct. Nor is there a law that they cannot lie, cheat, or change their mind/product specs whenever they choose (see the new Atom losing DX10.1 capability at launch). All companies overpromise and underdeliver (look at Bulldozer). However, my problem isn't with Intel. Anand has an obligation to accurately report the capabilities of the products he's testing. He doesn't have to (and frequently doesn't), but that violates the implicit agreement he has with his viewers. It's that Anand, who certainly knows these things have a large impact on capability, is covering for Intel at the expense of his readers.
To your comment, I'll take 'much' as 'must' and we'll go from there. I have no doubt that their production silicon (which is heavily binned) will be able to hit 2.5GHz at 35W with the GPU running at X specs. The CPU will probably be able to turbo to over 3.0GHz too, but the issue is that giving the CPU/GPU more thermal headroom allows you to take both parts up to higher performance. So even with final tweaking you'll never get the performance Anand reported on a 17W Ultrabook IVB. Physics doesn't care about your personal preference.
So go ahead and enjoy some revisionist history if you want, but the facts remain: Intel promised DX11 gaming on their Ultrabooks using a 17W IVB chip and will not deliver. Justify your personal preference all you want, but I'm not willing to buy into Atom 2.0 from Intel. My GF's mom's netbook is living up to the book part (sitting on the shelf) since it cannot do what it was promised to do. That was only a $300 mistake; the IVB Ultrabook will be a $1000 mistake for the poor people that listen to you and Anand.
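The turbo-versus-TDP argument above can be sketched as a toy shared-power-budget model. This is a minimal sketch with invented wattage numbers, not Intel's actual power-management algorithm; it only shows the shape of the constraint being argued about:

```python
# Toy model of a shared 17W CPU/GPU power budget (all wattages are invented
# for illustration; real parts manage power far more gradually than this).
TDP = 17.0                       # ULV Ultrabook budget, watts
CPU_BASE, CPU_TURBO = 7.0, 14.0  # assumed CPU draw at base / turbo clocks
GPU_BASE, GPU_TURBO = 6.0, 12.0  # assumed GPU draw at base / turbo clocks

def can_turbo(cpu_load, gpu_load):
    """Which components could turbo without exceeding the TDP?

    cpu_load / gpu_load: 0.0 (idle) .. 1.0 (fully busy).
    A component only wants turbo when it is actually busy.
    """
    headroom = TDP - CPU_BASE * cpu_load - GPU_BASE * gpu_load
    cpu_ok = cpu_load > 0 and (CPU_TURBO - CPU_BASE) * cpu_load <= headroom
    gpu_ok = gpu_load > 0 and (GPU_TURBO - GPU_BASE) * gpu_load <= headroom
    return cpu_ok, gpu_ok

# Lightly threaded benchmark: GPU nearly idle, so the CPU can turbo.
print(can_turbo(1.0, 0.0))  # (True, False)
# A modern game hammering both: neither has budget left to turbo.
print(can_turbo(1.0, 1.0))  # (False, False)
```

With a larger assumed budget (say 35W+, as on the demo laptop), the same model lets both components turbo at once, which is the substance of the objection.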
Re #1: The screenshot with 2.5GHz was in Windows, not the game. Basically 100% guaranteed to be turbo. And since it's only 2.0 -> 2.5, it matches the profile of a ULV part. If it was a high end chip it would turbo up much higher (>3.0GHz).

And anyone who buys ANY ultrabook expecting high performance gaming capability is going to be severely disappointed, as NOBODY is advertising that capability, and NOBODY is claiming it.
I wish we could all own 7970s, but we can't. My GTX 470 struggles with several games, whether DX10 or DX11. It also lacks some features newer and more powerful GPUs offer. That doesn't mean it's not good or not worth owning. It also doesn't mean it's not a DX11 part. I have to adjust settings to match.
In OpenGL too. In Mesa 8.0 they added OpenGL 3.0 support that isn't even complete, and they think all is good. This shitty graphics from Intel sucks. They can't even do good drivers for Linux, and they want DX11 on HD 4000? No more Intel IGPs, only NVIDIA.
Nvidia isn't really an option as an IGP since they do not make chipsets for AMD or Intel anymore. So your only Nvidia option is a discrete card, which adds to the system power draw and potentially lowers battery life. They do have hybrid graphics, but I'm still not sure if that fits into the "ultrabook" big picture. Regardless, I'm not really sure if they have a graphics core that fits into the same power envelope as something from Intel or AMD.
50 Comments
piesquared - Tuesday, January 10, 2012 - link
Ah, I see we have the first victim.....If there was no need to fake it, they 'wooden't' have faked it.
baldun - Tuesday, January 10, 2012 - link
or maybe they just ran out of time to prepare?
grrrrr - Tuesday, January 10, 2012 - link
should be: Intel confirms controlling VLC with steering wheel!
sepht - Tuesday, January 10, 2012 - link
I'd bet Intel would be more likely to screw up showmanship than screw up this product.

I've known some people from corporate demo teams (including some on the current Intel team), and I can assure you: things break on stage, backups exist. Demo lineups change at the last minute, or another demo can fall through & they'll change it up. The demoed product is often still in development and is prone to errors/fixes/bugs/etc.

I don't know exactly what happened, but I bet it's no big deal; maybe they didn't have time, or maybe something unexpected happened & they used a backup option and decided to keep the show running smoothly.
Even SNB can run F1 2011 at 25 FPS on medium settings. There's no reason to try and fake anything. http://www.notebookcheck.net/Intel-HD-Graphics-300...
Alchemist07 - Tuesday, January 10, 2012 - link
Not on an ULTRABOOK... even an AMD E-350 beats any Sandy Bridge CPU in an ultrabook for graphics
davepermen - Tuesday, January 10, 2012 - link
ultrabooks have SNB, too. identical gpu settings. so yes, it runs at identical settings.
Alchemist07 - Tuesday, January 10, 2012 - link
Um, ultrabook processors are ULV versions, the GPU is underclocked, and also has to run slower if it gets a bit hot.
Hector2 - Tuesday, January 10, 2012 - link
You don't have a clue. ALL laptops from either Intel or AMD typically use thinner mobile versions of processors which are cherry-picked to operate at lower voltages and be more energy efficient to run cooler. Yes, Ultrabooks with current 32nm Sandy Bridge processors consume more power than these new, faster, more energy-efficient 22nm Ivy Bridge processors. But there are already Ultrabooks that have top-of-the-line i7 SNB processors. It's not clear how this one with pre-production 22nm IVB processors compares to those, but it's very unlikely that Intel would bring an i3 or i5 equivalent to CES.

Pay attention. When people say that 22nm IVB CPUs are "hot", they aren't talking about temperature.
Alchemist07 - Tuesday, January 10, 2012 - link
Alchemist07 - Tuesday, January 10, 2012 - link
Oh dear, you really don't understand the ULV range... nevermind, enjoy your IVB graphics ;-)
jeremyshaw - Tuesday, January 10, 2012 - link
It gets better. Not only is the max clock lower... the base clock is much lower, too. So when (and it's a case of when) it throttles, it will be worse. Not only that, Intel's texture/image quality is worse than absolute crap anyways, even vs. low-end AMD GPUs/APUs.
Khato - Tuesday, January 10, 2012 - link
Feel free to continue living in denial, but for the rest of us it's easy enough to actually check an ultrabook review and see how that claim pans out - http://www.anandtech.com/show/4985/asus-zenbook-ux...
Wierdo - Tuesday, January 10, 2012 - link
Intel's integrated GPUs got better since they put them on-die, but let's not kid ourselves, they're the weak point. Intel takes shortcuts with rendering that compromise quality; I recall one review showing its IQ was similar to AMD/nVidia products from 2004.

Most important, though, is driver support. Intel absolutely sucks at this; they need to really take it seriously.
fic2 - Tuesday, January 10, 2012 - link
Agree. I spent a week preparing stuff for a show demo a couple of years ago, only to have them decide two days before to replace all the software with newer versions. Of course, when they tried setting up at the show nothing worked, and the marketing/demo people didn't know anything about how to get it fixed. Since I am a contractor I wasn't flown to the show. The guy at the partner company that I set everything up with told me that it was a fiasco and they ended up demoing another vendor's products. Good thing they saved the money by not sending me out.

And, no, this had nothing to do with Intel or CES - it was a completely different trade show, but the same rules apply.
1. marketing people don't know anything about tech and will always screw up a tech demo
2. the higher up the food chain the audience is, the higher the odds of a failure during the demo (like when I had a transformer blow out during an impromptu demo to the board of directors, who just happened to be meeting when we got a next gen x-ray scanner to display in realtime).
name99 - Tuesday, January 10, 2012 - link
"1. marketing people don't know anything about tech and will always screw up a tech demo"

I don't think this is correct, and exhibit (a) is everyone's favorite marketing person, Steve Jobs.
The issue is not "do we know about tech or not". The issue is "do we care about doing a good job or not".
There are some people who CARE about doing a good job, and who will do what is necessary to make that happen. There are plenty more people who just don't give a damn, who half-ass it, and aren't ashamed when their presentation sucks.
Of course, why a company would allow such a person to retain their job is an interesting question...
jjj - Tuesday, January 10, 2012 - link
The presentation was for Ultrabooks, so it has to be done on a low power SKU; no way to tell if this is that.
davepermen - Tuesday, January 10, 2012 - link
Typically, Intel notebook CPUs all get the same IGP (and a better one than the desktop CPUs). so while the low power SKU might not have been ready for demo, the IGP is the same.

anyways, i got banned from gizmodo for stating the obvious :) jesus diaz is rather strict with his "any post against one of his articles" policy.
DigitalFreak - Tuesday, January 10, 2012 - link
That's what you get for posting at Jizmodo.
jjj - Tuesday, January 10, 2012 - link
Well, not really. Even if it's the same GPU, clocked the same and with the same turbo (and it's more than likely not to be), the very limited TDP is likely to not allow both CPU and GPU to go to 100% at the same time. You also get small details like cache size, extra power saving features that might impact perf, and so on.
Alchemist07 - Tuesday, January 10, 2012 - link
just shows me how biased anandtech can be towards INTEL..........

you ignore the fact that the demo was claiming it could run on an ULTRABOOK, where you have to underpower sandy/ivy bridge cpu/gpus because of the thermal constraints.
Conveniently this article ignores ultrabooks and shows us a normal laptop running f1 2011. Show it to me on an ULTRABOOK before I believe anything.
as the poster above said,
"If there was no need to fake it, they 'wooden't' have faked it."
B3an - Tuesday, January 10, 2012 - link
... or like it says they simply could have not had the time to set up a live demo. These things happen, you know.
davepermen - Tuesday, January 10, 2012 - link
the ultrabook version of the hw is not ready yet, but has identical gpu specs. so they showed the gpu part of it. the rest will come when ready. if they were ready, they would not show a demo, they would SELL THE HW.
Alchemist07 - Tuesday, January 10, 2012 - link
Identical gpu specs, except it will be underclocked/underpowered
Hector2 - Tuesday, January 10, 2012 - link
BS. Have you been paying attention? Do you read ANY of the technical literature & reviews? Why do you think Ivy Bridge needs to be underclocked & underpowered? Anyone can make an unsupportable comment. Prove me otherwise. Put up or shut up.

You must be too used to AMD products that have to rely on older technology that consumes more power and tends to overheat.
Ivy Bridge uses the latest 22nm technology with the fastest transistors out there today that are also the most energy efficient. It's the last thing that would require it to be underclocked or underpowered.
Hector2 - Tuesday, January 10, 2012 - link
Who are you? Some high school kid who games all day but doesn't really know what's inside the box?
Chad Boga - Tuesday, January 10, 2012 - link
You need to remember that piesquared is near retarded.

Those F1 games aren't easy to play straight up if you aren't familiar with them, so no doubt Mooly didn't want to look like a retard who couldn't drive.
But hey, if AMD fanboys want to think this is going to make any difference to how much IB is going to kick their beloved's backside, so be it.
Sabresiberian - Friday, January 13, 2012 - link
When I see or hear people use the word "retard" as a pejorative, frankly I wonder about their own ability to think things through. No reasonable person would use that word, regardless of what his peers do, in that way.

Name-calling is generally a bad idea anyway. It throws any reasonable discussion into an emotional swamp without enlightening anything. I know it's hard to get around at times; we are taught in our society (societies) that it's okay to derail logical thinking with emotional appeals. Sometimes we just get exasperated and don't feel like being reasonable. I try (and sometimes fail, myself) not to fall into that trap and to clear my head before I post. I suggest you'll make it better for us all if you adopt a similar policy.
;)
MrSpadge - Tuesday, January 10, 2012 - link
Sure, clock it a little lower and suddenly it will lose DX11 functionality...
Sabresiberian - Friday, January 13, 2012 - link
Anand is one of the most scrupulous hardware analysts on the internet. His bias is towards the facts. I don't know him personally, but I've read his reviews for years, so I know what I'm talking about.

Look around; read the posts after an article. People who are regulars are so used to Anand's work that they are rather surprised when they see misspellings, poorly worded phrasing, or a lack of proof-reading by other testers on this site. Comments about these things would be sneered at on other sites, but here they are taken seriously.
You can bet Anand will analyze the chip thoroughly when he can get his hands on one, and if it fails to deliver he will ring the bell loudly and clearly.
;)
Bateluer - Tuesday, January 10, 2012 - link
It's not going to be powerful enough to actually do any DX11 gaming.
therealnickdanger - Tuesday, January 10, 2012 - link
Well, y'know, except for the smooth performance seen in the video above.
Sabresiberian - Friday, January 13, 2012 - link
DX11 is more efficient than DX10; if the on-die GPU has the code to run DX11, it should provide a frame rate increase in the same scenario (with DX11-capable content). I've experienced this with World of Warcraft myself, after the Cataclysm upgrade.

The caveat to what I just said is that most DX11-capable games available today don't take full advantage of all of its capabilities. A game that does may not see a frame rate increase. Generally speaking, though, any ultrabook that could run DX10 content will be able to run content better, all else being the same, with a DX11-capable GPU inside it.
I have no trouble believing that Intel (or any other serious CPU/GPU manufacturer) could make their chip DX11 capable. There is no reason for Intel to fake anything. They already have proven DX10 capable chips with Sandy Bridge, the leap to DX11 just isn't that big a deal, comparatively.
;)
drmo - Tuesday, January 10, 2012 - link
The link says the graphics on the ULV parts for ultrabooks run at almost half speed, 350MHz vs 650MHz (but turbo to almost the same). So they might not have wanted to show a slideshow in the live demo. Or, like they said, maybe they didn't have time to install a game... but they had plenty of time to install the game, take a video of it, and put said video on another computer.

Regardless, "Will it work in other DX11 titles?" is the real question.
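For a rough sense of what that 350MHz vs 650MHz base-clock gap implies, here is a naive linear-scaling estimate. The demo frame rate is hypothetical, real scaling is usually worse than linear, and turbo narrows the gap, so treat this as a ballpark only:

```python
# Naive estimate: a fully GPU-limited frame rate scales with GPU clock.
demo_fps = 30.0                        # hypothetical fps on the demo part
demo_clock, ulv_clock = 650.0, 350.0   # base clocks quoted above, MHz

ulv_fps = demo_fps * (ulv_clock / demo_clock)
print(round(ulv_fps, 1))  # 16.2 -> roughly half, before any turbo
```

Which is exactly why showing the full-power part instead of the ULV one matters.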
Alexvrb - Friday, January 13, 2012 - link
Always show your best hand. Most people will see the "decent" performance of the highest-clocked, highest-thermal-ceiling parts and assume that anything with HD 4000 is identical in all cases.

But hey, for a gaming laptop you can always get a dedicated Nvidia or AMD GPU. Ultrabooks? Not so much... kind of defeats the purpose.
Vesku - Tuesday, January 10, 2012 - link
Why the racing wheel on stage though? Why not just say "this is a capture of IB graphics running a DX11 game"? It seems to me they wanted to give the impression that the ultrabook on stage was running it live.
Vesku - Tuesday, January 10, 2012 - link
He's walking away from his own presentation and using a jovial tone to say it's happening backstage in "realtime". Not sure how anyone can take that comment of his seriously without additional proof. IMO, he made a joking statement to break up the tension/frustration, until proven otherwise. I definitely don't see it as a serious explanation; he wouldn't be walking away if it was.

Main thing I learned from this fib is that Intel is extremely determined to sell its ULV chips.
Donnie Darko - Tuesday, January 10, 2012 - link
The following points are omitted from Anand's horrible cover-up attempt.

1. Different hardware that is apparently 25% overclocked (2.0GHz CPU running at 2.5GHz).
2. No report on GPU frequencies or shader units/cores/whatever Intel calls them. These are probably seeing the same 25% overclock too.
3. No report on power consumption (i.e., at a 25% overclock, being generous, this 'laptop' is probably pulling around 45W for the CPU/GPU vs the 17W available for the Ultrabook chip).
4. While the game is in 'DX11 mode' there's no indication that it's not running mostly a DX9 code path.
Intel chose a 'DX11' title with a very old base engine that still has DX9 render paths. Since Intel's previous efforts never had their DX10 drivers working with the full instruction set, I'm highly skeptical that DX11 (or even DX10) is really being implemented here. Maybe a couple of easy features are enabled, but the game is almost certainly running most of the code on the DX9 render pipeline.
5. No indication of game resolution or graphics settings.
If they had a good case for DX11 gaming they would have done the demo live with a real DX10/DX11 engine (think BF3's Frostbite 2 engine, with no DX9 render path). They didn't because they couldn't. There's no time issue when you underpromise and overdeliver. People wait their whole lives for that moment, and they would have spent the entire presentation showing an ultrabook crush the competition (AMD and NVIDIA) in gaming if they could have. We just need AMD to show a 'Trinity' demo that was using an Intel CPU to feed 'representative' discrete AMD graphics, and all three will have had their 'woodscrew' moment and we can get on with our lives.
So yes, case closed indeed sir. Too bad the case is that Intel doesn't have competative graphics. Also while I appreciate that you are an ex Intel employee, and still have very strong ties to the company, your treatment of their graphics has gone from humourous and informative, to sad and pathetic. It's ok to say that they aren't function for gaming. You don't even have to say they suck monkey nuts (which they do). They actually have lots of intresting and positive things going for them (which you point out ad naseum). Just stick to the good, and ignore the bad if you must. Don't try to rationalize the bad into good, it's depressing to watch.
Death666Angel - Tuesday, January 10, 2012 - link
I don't think they attempt to prove anything (clock frequencies, awesome fps, high res textures), except to say "DX11 works on IVB". It does not go on saying how powerful it will be or anything that you mention.
Donnie Darko - Tuesday, January 10, 2012 - link
Very specifically, they said "DX11 games will be playable on an Ultrabook using IVB." They didn't say, "we can get a DX11 game to load."
There's a huge difference between saying something has feature X but it can't actually be used, and saying something is a viable platform to do X with.
So yes, they specifically promised that Ultrabook IVB is powerful enough to play DX11 games. This is what they are now attempting to cover with FUD.
If they had just said, "look at us demo a DX11 game on IVB. We recorded it as a video to save time," then they would have had no problem. I would have believed them and left it at that. I want Intel to be able to do these things. I don't want to have to explain to my friends and family that the thing they just bought cannot do what it is advertised to do. I really don't want to try to explain to them that I cannot fix it either with my magic computer powers, no matter how 'good at computers' I appear to be to them. People buying an IVB Ultraportable are going to be horribly fucked over because of Intel's lying, and because of people like Anand going to bat for them.
nategator - Tuesday, January 10, 2012 - link
Technically, you are spreading FUD. Intel is just trying, at worst, to over-embellish. If you're unable to separate the meat from the puffery, I recommend allocating your purchase decision making to a parent/wife/family member.
All we do know is that IVB can run one old DX11 game at what looks like 720p. That's it.
All the rest means jack until Anandtech and others get a hold of some IVB hardware and test the shit out of them.
To be continued...
MrSpadge - Tuesday, January 10, 2012 - link
Re 1: could it be Turbo?
Re 2 & 3: There's no law requiring either Intel or AMD to disclose the full specs of pre-production hardware.
And you know SNB has no problem hitting 3.5 GHz in 35 W TDP packages? So, sure, the more advanced part must draw 45 W at 2.5 GHz...
+1 @ Death666Angel
Donnie Darko - Tuesday, January 10, 2012 - link
Re: Re 1: No. Turbo is a function that works when you have TDP overhead. When running a game, the CPU cores are being hit continuously as well as the GPU, so neither of them is in a low enough power state to allow the other to turbo. From Jarred and Anand's own review:
"The sole victory for Intel comes in the lightly-threaded StarCraft II where Intel can really flex its Turbo Boost muscles."
So for any modern game (i.e., more than 2 threads, so a DX10+ render path), turbo cannot kick in for any APU solution. If turbo is working in the demo then they are just running the DX9 code path with a sampling of simple DX11 features, which means they are lying anyway.
Re: Re 2&3: Correct. Nor is there a law that they cannot lie, cheat, or change their mind/product specs whenever they choose (see the new Atom losing DX10.1 capability at launch). All companies over-promise and under-deliver (look at Bulldozer). However, my problem isn't with Intel. Anand has an obligation to accurately report the capabilities of the products he's testing. He doesn't have to (and frequently doesn't), but that violates the implicit agreement he has with his viewers. It's that Anand, who certainly knows these things have a large impact on capability, is covering for Intel at the expense of his readers.
To your comment: I have no doubt that their production silicon (which is heavily binned) will be able to hit 2.5 GHz at 35 W with the GPU running at X specs. The CPU will probably be able to turbo to over 3.0 GHz too, but the issue is that giving the CPU/GPU more thermal headroom allows you to take both parts up to higher performance. So even with final tweaking you'll never get the performance Anand reported on a 17 W Ultrabook IVB. Physics doesn't care about your personal preference.
So go ahead and enjoy some revisionist history if you want, but the facts remain: Intel promised DX11 gaming on their Ultrabooks using a 17 W IVB chip and will not deliver. Justify your personal preference all you want, but I'm not willing to buy into Atom 2.0 from Intel. My GF's mom's netbook is living up to the 'book' part (sitting on the shelf) since it cannot do what it was promised to do. That was only a $300 mistake; the IVB Ultrabook will be a $1000 mistake for the poor people who listen to you and Anand.
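The shared-TDP argument above can be sketched as a toy model. Every number here is made up purely for illustration (not a measured spec): the CPU clock can only scale up with whatever power budget the GPU leaves behind.

```python
# Toy model of a shared CPU/GPU TDP budget: turbo can only raise the CPU
# clock while combined draw stays under the package limit. All numbers
# are hypothetical, chosen only to illustrate the argument.

def turbo_clock(base_ghz, max_turbo_ghz, tdp_w, gpu_draw_w, cpu_base_draw_w):
    """Scale the CPU clock linearly with the power left after the GPU's share."""
    headroom = tdp_w - gpu_draw_w
    if headroom <= 0:
        return 0.0  # no budget left for the CPU at all
    # Assume clock scales roughly linearly with the CPU's power allowance.
    clock = base_ghz * headroom / cpu_base_draw_w
    return min(clock, max_turbo_ghz)

# Lightly threaded load, GPU mostly idle: plenty of headroom, full turbo.
print(turbo_clock(2.0, 3.1, tdp_w=17, gpu_draw_w=3, cpu_base_draw_w=9))   # 3.1

# Game load, GPU eating most of the budget: CPU pinned at or below base.
print(turbo_clock(2.0, 3.1, tdp_w=17, gpu_draw_w=12, cpu_base_draw_w=9))  # ~1.1
```

Which is exactly the StarCraft II point: turbo shines when the GPU is idle, and evaporates the moment a real game loads both halves of the chip.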
extide - Wednesday, January 11, 2012 - link
Re #1: The screenshot with 2.5 GHz was in Windows, not the game. Basically 100% guaranteed to be turbo. And since it's only 2.0 -> 2.5, it matches the profile of a ULV part. If it was a high-end chip it would turbo up much higher (>3.0 GHz).
extide - Wednesday, January 11, 2012 - link
Also, they promised DX11 support, THAT'S IT. And anyone who buys ANY Ultrabook expecting high-performance gaming capability is going to be severely disappointed, as NOBODY is advertising that capability, and NOBODY is claiming it.
Get back to reality dude.
extide - Wednesday, January 11, 2012 - link
Check this out: http://www.anandtech.com/show/5192/ivy-bridge-mobi...
It is probably an i7-3667U.
Base clock 2.0 GHz, with turbo at 2.9-3.1 GHz depending on the number of cores active. This is a 17 W part.
therealnickdanger - Tuesday, January 10, 2012 - link
I wish we could all own 7970s, but we can't. My GTX 470 struggles with several games, whether DX10 or DX11. It also lacks some features that newer and more powerful GPUs have. That doesn't mean it's not good or not worth owning. It also doesn't mean it's not a DX11 part. I have to adjust settings to match.
Alexvrb - Friday, January 13, 2012 - link
Yeah, and your GTX 470 is several times more powerful than this. So if you're struggling, what does that say for the Intel GPU?
MrSpadge - Tuesday, January 10, 2012 - link
Not if it's just a random hardware failure, which can happen to any product any time. You'd just need to get another one and it would work again.chizow - Tuesday, January 10, 2012 - link
of lip syncing the halftime show during the Super Bowl? Seems like a lot of wasted ink over something so trivial, but thanks for clearing up the non-controversy for good with the video, Anand. :)
If Intel was making false claims about graphics performance, it would've been exposed in a few months when IVB launched anyway.
Olbi - Monday, January 16, 2012 - link
in OpenGL too. In Mesa 8.0 they added incomplete OpenGL 3.0 support and think all is good. This shitty graphics from Intel sucks. They can't even do good drivers for Linux, and they want DX11 on HD 4000? No more Intel IGPs for me, only NVIDIA.
artk2219 - Monday, January 16, 2012 - link
Nvidia isn't really an option as an IGP, since they don't make chipsets for AMD or Intel anymore. So your only Nvidia option is a discrete card, which adds to system power draw and potentially lowers battery life. They do have hybrid graphics, but I'm still not sure that fits into the "ultrabook" big picture. Regardless, I'm not really sure they have a graphics core that fits into the same power envelope as something from Intel or AMD.