64 Comments

  • marioyohanes - Thursday, May 31, 2012 - link

    Intel should learn from Apple users when it comes to keyboard design... We all know how much Apple users hated the white keyboards on previous MacBook models; that's why Apple uses black keyboards now...
  • JarredWalton - Thursday, May 31, 2012 - link

    Hahaha... I realized I never took a picture of the open laptop and so I used the image Intel sent along. I'll post an updated picture to the article in a moment, but suffice it to say that the review sample doesn't have a white keyboard.
  • marioyohanes - Thursday, May 31, 2012 - link

    I love the keyboard design on your new pic... :)
  • kamalppc - Tuesday, January 29, 2013 - link

    You can find the reviews here -

    http://cpudistro.com/intel/core-i5-3427u/
    http://ultra-book.co/core-i5-3427u-review/
  • mcquade181 - Thursday, May 31, 2012 - link

    I guess Apple users must value aesthetics over functionality!
    From a usability point of view, black keyboards are horrible.
    In low light conditions it is very difficult to see individual keys, which, unless you can touch type, makes them very hard to use.
    I suspect that's why there is so much desire for backlit keyboards.
  • Concillian - Sunday, June 3, 2012 - link

    We should make keyboards for people who can't touch type?

    This is 2012, dude. If you can't touch type, you have a problem. Everything you do in the business world and half the things in today's social world require using a keyboard of some kind as an input.

    Catering hardware (and software) to people who can't use them properly ends up hindering productivity for those who are actually productive on their computers.
  • Argedut - Sunday, June 10, 2012 - link

    You're totally right. Also, if you don't understand the Dewey Decimal System, don't even BOTHER looking for a job. Am I right?
  • iwod - Thursday, May 31, 2012 - link

    I am amazed by the chipset TDP and die size. But as transistors shrink while I/O lanes remain constant, maybe we could pull more things into the chipset? Things like an SSD controller? Or, a few years down the road, there will be no more chipset, just an SoC.
  • Shadowmaster625 - Thursday, May 31, 2012 - link

    The SSD controller should be right on the CPU die, right next to the memory controller. But I guess Intel doesn't mind getting totally devoured by Apple, who was actually smart enough to make a chip with a flash controller. Granted, it's not a very fast one, but at least it is there.

    There is no reason why every new computer should not have at least 32GB of flash that reads just as fast as DRAM, and with DRAM caching it would basically write just as fast too. With the controller in the CPU, the cost of 32GB drops to just a few dollars... the current spot price of four 8Gbitx8 MLC NAND chips is just $18. With a good integrated flash controller, the lower latencies on random reads could push real-world random read speeds well past even a Vertex 4.
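    As a quick sanity check on that spot-price math, a minimal sketch using the $18 figure quoted above (the commenter's number, not live pricing):

        # Sketch only: the $18 spot price is the figure quoted above, not live data.
        chips = 4
        gb_per_chip = (8 * 8) / 8       # 8Gbit dies x 8 per package = 64Gbit = 8GB
        total_gb = chips * gb_per_chip  # 32GB total
        spot_price_usd = 18.0
        print(f"{total_gb:.0f}GB for ${spot_price_usd:.0f} -> ${spot_price_usd / total_gb:.2f}/GB")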
  • ZeDestructor - Thursday, May 31, 2012 - link

    What exactly is the difference between the QS77 and the QM77? As far as I can see, there's no reason to use the QM77 at all since the QS77 has a better TDP and power usage than the QM77 while retaining all the features...
  • Ryan Smith - Thursday, May 31, 2012 - link

    Price.
  • ZeDestructor - Thursday, May 31, 2012 - link

    And what's the difference in price? I shouldn't think it's that big a difference between the two compared to, say, the CPU cost...
  • JarredWalton - Thursday, May 31, 2012 - link

    The chart we put together didn't include this information, but QS77 is targeted at SFF systems, so it has slightly lower power characteristics and it comes in a 22x22mm package instead of the 25x25mm packaging used on the other chipsets. Interestingly, even UM77 is 25x25mm, and that seems like the perfect chipset to have a smaller footprint.
  • JarredWalton - Thursday, May 31, 2012 - link

    Oh, never mind: it's there. You just need to look for it.
  • ZeDestructor - Thursday, May 31, 2012 - link

    Still no price info though; I had to go Google it up :(

    The QS77 is USD 53 vs. USD 48 for the QM77.

    To me, eating $5 worth of profit to gain some battery life is worth it, and it also allows for smaller board designs. Once you factor in economies of scale, I don't see why the QS77 couldn't shed at least a further USD 2 and become even better value if the QM77 were cut out entirely.

    Source: http://laptoport.com/2011/12/28/intel-will-unveil-...

    It's a good chart, it just needs a price line ;)
  • JarredWalton - Thursday, May 31, 2012 - link

    Intel didn't release pricing information, so I'd have to go to other sources to see what Intel has said in the past. Of course, it's basically meaningless to talk about chipset prices as the only people buying chipsets are the OEMs and motherboard manufacturers. Sadly, those are the same folks that will save $10 in most cases to ship a crappy LCD in place of something much better.
  • thunderising - Thursday, May 31, 2012 - link

    Wow, even with HD4000, IVB dual cores perform terribly in games.

    And even now, I'm pretty sure these are priced way above the old and new Llano parts.

    I give my vote to AMD's Trinity this round
  • JarredWalton - Thursday, May 31, 2012 - link

    Don't jump ahead of the data: dual-core Ivy Bridge ULV parts have generally weak graphics performance. In a 17W TDP, it's very likely the maximum GPU clocks aren't coming anywhere near 1150MHz on some titles. I suspect the standard voltage parts will be within 5-10% of the quad-core parts for HD 4000 performance.

    As for AMD, their ULV/LV parts take some major GPU performance cuts as well. The A10-4655M will have GPU clocks that are 30% lower than the A10-4600M, while the A6-4455M has lower clocks and half as many Radeon cores. Given how far the A6 is cut back to fit in a 17W TDP, I don't expect it to fare much better on games.
  • ananduser - Thursday, May 31, 2012 - link

    Well let's wait for a review first shall we ? :)
  • JarredWalton - Thursday, May 31, 2012 - link

    Sure, before we make any final statements, but based on the clocks and core counts we can certainly make a pretty good guess of where Trinity will rank with LV/ULV configurations. It's sort of like Brazos in that aspect, which is to say cutting the GPU or CPU down that far isn't going to help performance. On the compute side, though, funny things happen and compute doesn't always scale directly with core counts and clock speeds. Heck, look at the i5-3470 review from Anand: HD 2500 is sometimes 1/3 the performance of HD 4000. Ouch.

    Anyway, let's make some wagers. My guess on A6-4455M is that it will be about half as fast as the A10-4600M in games, while delivering about 80% of the single-threaded performance and less than half of the multi-threaded performance. The A10-4655M should do better, but it will probably still be 20-25% slower than A10-4600M for games, maybe 90% of the single-threaded performance, and 80% of the multi-threaded performance.
  • ananduser - Thursday, May 31, 2012 - link

    I'm going to chicken out of this one :)
  • Hector2 - Thursday, May 31, 2012 - link

    Jarred is spot on. There's a mobility review on another tech website that shows how badly Llano's performance dives when fitting it into the lower TDP platform as compared to IVB.
  • ssiu - Thursday, May 31, 2012 - link

    So what is the holdup in getting a 17W mobile Trinity review? Didn't AMD announce the 17W part at the same time as the 35W part? (Unlike Intel, which announced the quad-core Ivy Bridge first, with the 17W parts under NDA until now.)
  • JarredWalton - Thursday, May 31, 2012 - link

    Getting hardware. No one is shipping the LV/ULV Trinity stuff yet, and AMD didn't send out a prototype with one or both of those chips. They may not be under NDA, but they're not available for purchase anywhere that I know of.
  • R3MF - Thursday, May 31, 2012 - link

    "Given the 17W and 25W TDP on the A6-4455M and A10-4655M, they could easily fit in similar sized laptops (e.g. HP’s “Sleekbooks”). "

    Unfortunately HP are being utter cretins and only offering Trinity in the 15.6" Sleekbook, and not the 14" version.

    WHY!
  • Spunjji - Thursday, May 31, 2012 - link

    Because Intel, duh. :/
  • R3MF - Thursday, May 31, 2012 - link

    I'm not sure that logic applies.

    They allow the 15.6" Sleekbook to roll with an AMD APU...

    But why not the 14" Envy 4!
  • kallogan - Thursday, May 31, 2012 - link

    So ULV GPU base clocks are half those of the regular Core i7 HD 4000. Not surprising that they aren't performing very well.

    On the CPU side it's pretty awesome. I mean, max dual-core turbo is 3.0GHz for the highest ULV CPU. I guess it's possible to run on turbo forever if the cooling system allows it. Too bad it's BGA only.

    I don't know if there are any 17-inch notebooks with ULV on board. I've seen some 17-inchers sitting at 10-15W while idling, even with a regular Core i5. I would love a low-power 17-inch.
  • JarredWalton - Thursday, May 31, 2012 - link

    The HD 4000 ULV clocks are interesting. Base clocks are very low, but maximum clocks are quite high. With better cooling and configurable TDP (e.g. TDP Up or whatever it's called), it's possible there will be Ultrabooks that manage to get within 10% of the quad-core HD 4000 for graphics performance. However, Intel is only guaranteeing a rather low 350MHz iGPU clock, so in practice I bet average gaming clocks will be in the 700-900MHz range.
  • IntelUser2000 - Thursday, May 31, 2012 - link

    If there's one thing Apple is clearly ahead at, it's managing thermals. The MacBook Air's graphics performance is pretty darn close to top performance.
  • mikk - Thursday, May 31, 2012 - link

    Why don't you record the frequency used in games with GPU-Z? That would be interesting. It would also be interesting to see how it performs with the CPU turbo disabled to give more headroom to the iGPU.
  • JarredWalton - Thursday, May 31, 2012 - link

    GPU-Z doesn't detect HD 3000/4000 frequency; in fact, I'm not sure anyone has a utility that correctly reports HD 4000 core clocks. If I'm wrong, please let me know as I'd love to be able to do a FRAPS run and log the iGPU clocks! If you know of one, please post and/or email me.
  • mikk - Thursday, May 31, 2012 - link

    JarredWalton: "GPU-Z doesn't detect HD 3000/4000 frequency; in fact, I'm not sure anyone has a utility that correctly reports HD 4000 core clocks. If I'm wrong, please let me know as I'd love to be able to do a FRAPS run and log the iGPU clocks! If you know of one, please post and/or email me"

    You are wrong; GPU-Z 0.6.2 fully supports the HD 4000, and you can "log to file" the frequency. I have tried it myself on a desktop HD 4000. It's interesting: due to some driver issues, some games did not run at the max turbo frequency on my desktop 77W model. It's fixed with driver build 2752. There are several turbo steps between the base 350/650 and max turbo 1150MHz.
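    A log like that is straightforward to post-process into an average and peak iGPU clock for a benchmark run. A minimal Python sketch, assuming GPU-Z's comma-separated "Log to file" output with a column whose header contains "Core Clock" (adjust the match if your log labels it differently):

        import csv

        def core_clock_stats(log_path):
            # Parse a GPU-Z sensor log and return (average, peak) core clock in MHz.
            with open(log_path, newline="", errors="ignore") as f:
                reader = csv.reader(f)
                header = [h.strip() for h in next(reader)]
                clock_idx = next(i for i, h in enumerate(header) if "Core Clock" in h)
                clocks = []
                for row in reader:
                    try:
                        clocks.append(float(row[clock_idx].strip()))
                    except (ValueError, IndexError):
                        continue  # skip malformed or truncated lines
            return sum(clocks) / len(clocks), max(clocks)

        avg, peak = core_clock_stats("GPU-Z Sensor Log.txt")
        print(f"Average iGPU clock: {avg:.0f} MHz, peak: {peak:.0f} MHz")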
  • JarredWalton - Thursday, May 31, 2012 - link

    Going to go check now... I think I might have been running an older version (6.0?) I'll update the article when I have some details. Thanks for the heads up -- GPU-Z has failed to provide any useful information on HD 3000 for so long that I never noticed anything had changed! :-)
  • JarredWalton - Friday, June 1, 2012 - link

    I've posted a follow up article, in case you don't see it over in Pipeline:

    http://www.anandtech.com/show/5878/mobile-ivy-brid...
  • vegemeister - Wednesday, June 6, 2012 - link

    CPU idle power doesn't mean much when you have to drive the backlight for a 17" screen. Smaller screens are more portable anyhow, and look better given the same resolution.
  • sonofsanta - Thursday, May 31, 2012 - link

    So are stardates always one month ahead of the real date, even if that stardate doesn't technically exist on the Gregorian calendar? ;)

    (I'm guessing you meant May 31...)
  • JarredWalton - Thursday, May 31, 2012 - link

    Oh crap! And there aren't even 31 days in June! Hahahaha.... wrote that too late at night after a long day of testing/writing! But of course, actual Star Trek Stardates are never expressed with a month and year:
    http://en.wikipedia.org/wiki/Stardate
  • mschira - Thursday, May 31, 2012 - link

    I am amazed that Intel professionals fail to realize that if a laptop does not have VGA out, not even via a converter (note that Apple has VGA out with the right converter), it's useless for presentations.
    This may be different for other people, but for me one of the most important jobs of a laptop is to run presentations at conferences and the like.

    I have yet to find a conference location where you could hook up your laptop via DisplayPort. Or even DVI. They simply don't exist.

    -> no VGA = useless.
    M.
  • JarredWalton - Thursday, May 31, 2012 - link

    HDMI? I've seen plenty of HDMI projectors at least. But this is a prototype, and there will undoubtedly be Ultrabooks with VGA outputs (possibly via a converter). Anyway, Ultrabooks are a specialized market, so I don't expect most of them to target business users that need VGA outputs. Just because some people find VGA indispensable doesn't mean there aren't many others who wish the connector would just die already. It's basically just hanging around for legacy purposes, sort of like PS/2 mouse/keyboard connectors. In five more years I hope to be rid of all the old-style connections on the majority of products (with those who absolutely need them catered to by niche products).
  • A5 - Thursday, May 31, 2012 - link

    Don't buy an Ultrabook if that's really important to you, then. HP, Dell, etc. will all sell your company "professional" laptops that have all of that legacy functionality.
  • Hector2 - Thursday, May 31, 2012 - link

    You realize, of course, that Intel doesn't sell or make laptops --- they make chips. The same ones that are inside Apple's Macs. And I'm pretty sure that there are a lot of engineers at Intel that use laptops for projecting presentations. For that matter, I'm pretty sure that their counterparts at HP, Asus, Lenovo, Dell, etc, do too. :-)
  • name99 - Thursday, May 31, 2012 - link

    And why can't people buying Ultrabooks use exactly the same converter that Apple customers use?

    This seems to me a whinge exactly along the lines of "OMG they don't have a floppy slot anymore --- AND no parallel port".
  • mschira - Thursday, May 31, 2012 - link

    Because if the chipset doesn't support VGA a simple adapter won't work.

    The chipset targeted for ultrabooks does not support VGA.

    Ultrabooks are perfect for presentations or conferences. They are light so it is easy to carry them around all day. But if they don't have any VGA connectivity they are useless.
    M.
  • Ryan Smith - Thursday, May 31, 2012 - link

    If they use DisplayPort a simple adapter will work. DP->VGA adapters are all active devices that require nothing on the part of the source device. The only purpose of having on-board VGA is to have an on-board VGA port, since you can't do mini-DVI or HDMI to VGA in the first place.
  • mschira - Thursday, May 31, 2012 - link

    For a serial port you can get a very simple USB-to-serial adapter; they are active converters, but the logic is so small it fits in the plug. That doesn't work for VGA.
    Floppies and CD-ROMs take a lot of space (and work perfectly as external drives).
    One could accept a passive VGA adapter (like Apple's), but NO VGA is just a killer.
  • IntelUser2000 - Thursday, May 31, 2012 - link

    The low performance in Civ V being attributed to drivers just might be the most accurate explanation. Here's a statement from RWT's Ivy Bridge article:

    "Intel is planning to reduce the driver overhead to comparable levels when measured in CPU cycles per draw call."

    So with current Intel drivers, the CPU overhead for a draw call is high. The high number of objects in Civ V means lots of draw calls are happening. The overhead might be in an acceptable range for most other games, but maybe not for Civ V.

    For high-TDP chips, the high number of draw calls plus the high overhead makes it CPU bound, and the GPU isn't being fully utilized.
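    A rough back-of-envelope illustration of why that bites (the call count and cycles-per-call below are made-up numbers for the sake of the example, not measurements):

        def draw_call_cpu_ms(calls_per_frame, cycles_per_call, cpu_hz):
            # CPU time per frame spent purely on driver draw-call overhead.
            return calls_per_frame * cycles_per_call / cpu_hz * 1000.0

        # e.g. 5,000 draw calls in a unit-heavy RTS frame at 30,000 cycles of
        # driver overhead per call on a 3.0GHz core comes to ~50ms of CPU time,
        # already past the ~33ms frame budget for 30FPS before any game logic
        # or actual rendering work is counted.
        print(draw_call_cpu_ms(5000, 30000, 3.0e9))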
  • JarredWalton - Thursday, May 31, 2012 - link

    I have a bunch of other inside information on Civ5, but I was told it's "strictly confidential" so I didn't feel I could discuss it. Basically, drivers are part of the problem. The rest, well, let's just say that the way Civ5 does some things is sort of a pathological worst-case scenario for HD 3000/4000.
  • IntelUser2000 - Thursday, May 31, 2012 - link

    It's not just Civ 5; it applies to Total War as well, which is another RTS with LOTS of units on the battlefield, and I think, to a much lesser extent, even StarCraft 2.
  • JarredWalton - Sunday, June 3, 2012 - link

    I think TWS2 and SC2 both use instancing, which can reduce some operations that would otherwise incur CPU overhead. If you look at our performance results for TWS2 and SC2 with HD 4000, they're not quite as bad as Civ5 relative to other games:

    http://www.anandtech.com/show/5772/mobile-ivy-brid...

    HD 4000 quad-core is 2/3 of Llano 6620G performance in TWS2, and it's actually slightly faster in SC2. In Civ5, on the other hand, Llano is 70% faster. Interestingly, on Trinity, those three titles do appear to be some of the worst on HD 4000. (http://www.anandtech.com/show/5831/amd-trinity-rev... I'm not sure what the cause is for the dramatic change in SC2, other than perhaps the CPU performance improvements in Trinity really helping. Llano is almost certainly CPU limited in SC2, even with HD 6620G.
  • Oatmeal25 - Thursday, May 31, 2012 - link

    Shift+End.

    Ultrabooks are almost there (for anyone doing more than the web). They just need to take fewer shortcuts with the screen, GPU, and storage, and put less emphasis on the CPU. Consistent build quality and lower prices wouldn't hurt either.

    I wish Intel would work harder on their integrated GPUs. I have an HD 3000 in my Lenovo Y570, and when it's in use (the laptop also has a GT 555M with Optimus switching), dragging windows in Win7 is choppy.
  • JarredWalton - Thursday, May 31, 2012 - link

    But there's no "End" key, which is why I list the Fn+Right key combination. Just using Shift+Right, or Control+Shift+Right doesn't trigger the issue I experienced much if at all; it's when I have to hit a lot of keys that it gets iffy. Like Fn+Control+Shift+Right to do "select to end of document" frequently ends up with the Control key registered as pressed when I'm done. So then I have to tap it just to let the OS know I've released the key.
  • mikk - Thursday, May 31, 2012 - link

    JarredWalton: "The HD 4000 ULV clocks are interesting. Base clocks are very low, but maximum clocks are quite high. WIth better cooling and configurable TDP (e.g. TDP Up or whatever it's called), it's possible there will be Ultrabooks that manage to get within 10% of the quad-core HD 4000 for graphics performance. However, Intel is only guaranteeing a rather low 350MHz iGPU clock, so in practice I bet average gaming clocks will be in the 700-900MHz range"

    Why don't you record the frequency used in games with GPU-Z? That would be interesting. It would also be interesting to see how it performs with the CPU turbo disabled to give more headroom to the iGPU.
  • JarredWalton - Thursday, May 31, 2012 - link

    Working on it (see above). I'll update the article when I have some results.
  • thejoelhansen - Thursday, May 31, 2012 - link

    I see the next-gen quad-core part, the 3720QM (2.6GHz), in the mix. However, there isn't a quad-core from last gen. Any chance of an update with a 26xx/27xx part?

    I realize the article was more about the ULV and new dual-core IVB chips for Ultrabooks, but I'm kinda curious how all these new duals stack up against last gen's quads. Might be interesting... ?

    Anyway, thanks for the well written and documented article and benchmarks (as always). :)
  • JarredWalton - Thursday, May 31, 2012 - link

    There's always Mobile Bench. Here's the comparison you're after:
    http://www.anandtech.com/bench/Product/608?vs=327
  • Yangorang - Thursday, May 31, 2012 - link

    So do you guys know what kind of frequencies the GPU was running at while benchmarking games? I am curious as to whether better cooling / lower ambient temps could actually net you significantly better or worse framerates.
  • name99 - Thursday, May 31, 2012 - link

    "There is an unused mini-PCIe slot just above the SSD, which might also support mSATA"

    The last time we went through this (with comments complaining about companies using their own SSD connectors, not mSATA), the informed conclusion seemed to be that mSATA was, at least right now, a "proto-spec" --- a nice idea that was not actually well-defined enough to translate into real, inter-compatible products. The Wikipedia section on mSATA, while not exactly clear, seems to confirm this impression.

    So what's up here? Is mSATA real (as in, I can go buy an mSATA drive from A, slot it into an mSATA slot from B, and have it work)? If not, then why bother speculating about whether slots do or don't support it?
  • JarredWalton - Thursday, May 31, 2012 - link

    It was more a thought along the lines of: "If this were a retail laptop, instead of an SSD they could use an HDD and put an mSATA caching drive right here." Can you buy mSATA drives and use them in different laptops? I don't know -- Apple and ASUS for sure have incompatible "gumstick drive" connections. I was under the impression that mSATA was a standard, but apparently it's not very strict if that's the case. It would benefit the drive makers to all agree on something, though, as right now they might end up making several different SSD models if they want to support the MBA, Zenbook, other mSATA designs, etc.
  • Penti - Friday, June 1, 2012 - link

    mSATA is a standard that Apple and Asus don't use. You can normally use an mSATA slot for a retail mSATA SSD or any conforming product. Some mSATA slots can also support PCIe, but they should be few in today's laptops. Lenovo, HP, and Dell machines should be just fine running a real SSD instead of a cache drive there; just look at what other users have done on the same model to be sure. They only appear to be stupid about it: an mSATA SSD + HDD is an almost perfect solution. mSATA is specified by SATA-IO in the SATA 3.1 specification, and an earlier JEDEC specification defines the mechanical design, i.e. the same size as a normal mini PCIe card. Not all mSATA slots will have PCIe, though, and not all mini PCIe slots will support SATA; you really have to know beforehand, since it requires additional circuitry to have a switchable/multi-signal slot. It's not like it is costly for Asus and Apple to order their custom designs, although you can question Asus's decision; Apple made theirs before mSATA had made any headway. It's not like 256GB mSATA SSDs aren't around, either. SanDisk (which Asus uses) has drives in various form factors, both custom and standardized, including mSATA (sandisk.com/business-solutions/ssd/form-factor-development). It's SanDisk that finances and produces the different boards for different customers; these are fairly simple PCBs and they already know the electrical requirements, so it's not a big deal.
  • pityme - Thursday, May 31, 2012 - link

    Jarred,

    How does Ivy Bridge compare for CAD products, specifically SolidWorks?

    Thanks
  • Frenetic Pony - Friday, June 1, 2012 - link

    Which is disappointing, because that was initially the laptop I was hoping to replace my dinky, falling apart little netbook with. But the worst battery life and performance of the lot, Intel or AMD, means I'm definitely avoiding that stupid thing.
  • Death666Angel - Sunday, June 3, 2012 - link

    You do realize that it has the smallest battery of the bunch as well? It is also the only 11.6" laptop, I think. The 13" counterparts will have more standard 50Wh batteries.
  • lootmaster - Monday, June 4, 2012 - link

    What kind of a performance gap is there between the mobile and desktop lines? Is a Sandy Bridge Mobile i7 more like a desktop i5 or i3? Couldn't find a good answer online.
