
  • tipoo - Wednesday, December 8, 2010 - link

    What percentage of the transistors on die are dedicated to the GPU? And how much of the TDP comes from it?
  • tipoo - Wednesday, December 8, 2010 - link

    Oh, and can we ever expect OpenCL on SB graphics?
  • HibyPrime1 - Wednesday, December 8, 2010 - link

    OpenCL is definitely an important one.

    As a broader question with the same intent: do they plan to put a more comprehensive focus on GPU drivers now that they are focusing on GPU performance?
  • ltcommanderdata - Thursday, December 9, 2010 - link

    As well, what degree of benefit do they foresee for OpenCL from the IGP and CPU sharing the L3 cache? That should greatly reduce the latency and increase the bandwidth of data transfers between the CPU and IGP compared to copying data back and forth over a crossbar (as in AMD's Fusion) or over the FSB or PCIe (as with nVidia IGPs or discrete GPUs).

    Since Intel is working on OpenCL 1.1 drivers for both the CPU and IGP, will we see dynamic load balancing, where the drivers automatically split work between the CPU and IGP (a rough host-code sketch of what that would involve appears at the end of this comment)? Again, the shared L3 cache should help here. And will this dynamic load balancing also extend to any discrete GPUs that are attached?

    Can the IGP stay active even if a discrete GPU is connected? The ideal use-case for a game would be DirectX/OpenGL on the discrete GPU, OpenCL for, say, physics on the IGP, and the CPU doing its standard processing and maybe helping out with OpenCL as needed - rather than the IGP being permanently disabled whenever a discrete GPU is connected.

    And finally, will OpenCL drivers also be made available for Arrandale's IGP? It was once reported that DirectCompute drivers would be coming for Arrandale's IGP this year, which seems increasingly unlikely now.
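
    For anyone wondering what CPU+IGP load balancing would even look like at the API level, here is a minimal host-code sketch in C (purely illustrative, assuming an OpenCL 1.1 driver that exposes both a CPU device and a GPU device on one platform - something Intel has not confirmed for Sandy Bridge at this point):

        /* Minimal sketch: put the platform's CPU and GPU devices into one
           OpenCL context, the prerequisite for splitting work between them. */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_platform_id platform;
            cl_device_id devices[2];
            cl_uint n = 1;

            clGetPlatformIDs(1, &platform, NULL);

            /* Ask for one CPU device, then one GPU device if the driver exposes it. */
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &devices[0], NULL);
            if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &devices[1], NULL) == CL_SUCCESS)
                n = 2;

            /* One context over both devices; the application (or a scheduler in
               the driver) could then split kernels across their command queues. */
            cl_context ctx = clCreateContext(NULL, n, devices, NULL, NULL, NULL);
            printf("created context over %u device(s)\n", n);
            clReleaseContext(ctx);
            return 0;
        }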
  • billythefisherman - Wednesday, January 5, 2011 - link

    OK, so the chap said we can't use the IGP at the same time as a discrete graphics card, but that in the future we will be able to. Does anybody have any idea whether we're talking about Ivy Bridge here, or new motherboards, or motherboard firmware updates?

    Surely the motherboard has to signal SB to turn off the IGP, so if you wanted to use the IGP purely for physics, the motherboard could simply tell SB that no discrete card is plugged in.

    OK, so I'm assuming it's not this straightforward, but looking at Intel's block diagrams the IGP appears to be plugged directly into the ring bus, so surely it must be able to access memory through the L3 and process that memory at the same time as the other cores, even while a discrete graphics card is active?

    This would seem to be a *massive* missed opportunity if it can't be achieved through a motherboard firmware update, as it really could have provided massive processing power to support the CPU - who cares about AVX when you have a GPGPU on a ring bus connected to you!
  • mlavacot - Wednesday, January 19, 2011 - link

    Hi Everyone - Sorry for the delay on this post. I actually started with some later posts so please look through those for a lot more details on various topics. Here is a quick update since the broadcast.

    We do not support OpenCL acceleration on the graphics portion of the Processor for Sandy Bridge. OpenCL would be handled by the CPU.

    The Sandy Bridge parts released so far are not intended to replace the existing X58, i7-9xx desktop platforms, but you can assume we will introduce a platform that will.

    To overclock the CPU of a ‘K’ SKU part, you must use the P67 PCH. To overclock the graphics portion of the ‘K’ SKUs, you must use the H67 PCH.

    You can use both the integrated processor graphics and discrete card graphics at the same time in a multi-monitor configuration. In this configuration, you can also take advantage of the Quick Sync capability. If you only plug one monitor into the discrete card, Quick Sync will not be available. We might be able to change this behavior, so stay tuned.

    There are no "switchable" desktop graphics solutions today; however, some are in the works. The current trend for desktop switchable graphics is to have graphics data from the add-in card transfer via the PCIe bus to the processor and then out the processor graphics port. You might be able to do a poor man's switchable solution today by plugging both graphics outputs into the same monitor (two cables to two different input ports on the monitor), then using the monitor's input button to switch between solutions depending on the app you are running.

    Thanks for the questions and watching the webcast. Mike
  • talevski - Thursday, January 6, 2011 - link

    I think that an AMD 880G motherboard with a CPU around 90 dollars, plus some 55xx-series GPU, can do better in terms of encoding, decoding, video playback, games, etc. - and all that without a lot of money spent on Intel's new sockets, which you have to throw away when they make the next CPU. So please correct me if I am wrong.

    to anandtech&co
  • ltcommanderdata - Thursday, January 6, 2011 - link

    I just wanted to thank Anand and Michael for taking a crack at answering some of my questions. The OpenCL answer seemed vague on whether OpenCL will be accelerated by the IGP or just through an x86 CPU driver, so hopefully that will be clarified in a blog post as promised.

    In terms of switchable graphics on the desktop, Michael did bring up some very practical considerations about transferring the display signal. I wonder more about the ability to operate the IGP as a pure GPGPU compute device, leaving the discrete GPU to handle all display/graphics functionality. This puts the IGP to use without any switchability concerns, since both the IGP and discrete GPU would be operating simultaneously. This presumably could be achieved in Sandy Bridge with appropriate BIOS/EFI updates that don't disable the IGP when a discrete GPU is plugged in, assuming the IGP has sufficient GPGPU programmability and performance to make it worthwhile.
  • tipoo - Wednesday, January 5, 2011 - link

    I got so excited when you said my username and question! But come on, you're Indian, you should know it's pronounced "Tea - Pooh", lol. Awesome interview though!
  • Gary Key - Wednesday, December 8, 2010 - link

    1. Can I call in for the webcast and skip the HD taping session? I am not camera friendly.
  • trae32566 - Wednesday, December 8, 2010 - link

    It's ok man, I'm sure you'll do fine, I get afraid of cameras too.
  • mfenn - Wednesday, December 8, 2010 - link

    Gratz on the new(?) job Gary!
  • freezervv - Wednesday, December 8, 2010 - link

    2. What does Gary Key look like on HD camera?
  • Ryan Smith - Thursday, December 9, 2010 - link

    More or less the same as he looks in person.
  • Kensei - Saturday, December 11, 2010 - link

    Kind of like this... http://3dvision-blog.com/ces-2010-video-interview-...

    I miss not only his articles but his introductory quotes from literature.
  • MrSpadge - Thursday, December 9, 2010 - link

    "I am not camera friendly."

    Haha! Don't worry, just make some grimaces.. that'll teach this camera!

    MrS
  • videogames101 - Wednesday, December 8, 2010 - link

    When using a discrete GPU, how much power will the GPU portion of Sandy Bridge continue to use?
  • tipoo - Wednesday, December 8, 2010 - link

    And will there be switchable graphics implementations for desktops like there is for laptops?
  • Exodite - Wednesday, December 8, 2010 - link

    Seconded.

    IMO there's a good reason for AMD, Intel and Nvidia to sit down and work out a common standard for graphics switching.
  • yzkbug - Wednesday, December 8, 2010 - link

    Yes, I'm very interested in this too. Basically, are there any limitations on Intel's side that prevent NVidia from implementing Optimus on their desktop cards?
  • Stuka87 - Wednesday, December 8, 2010 - link

    I second both. How much power will the GPU use when it is not in use, and do they have plans for a switchable-graphics standard covering their 'HD' graphics and a discrete GPU?
  • Full Ctrl - Thursday, December 9, 2010 - link

    Wouldn't the switching be a bit more complicated on desktops? I'm thinking specifically of cabling: Would you need to have your monitor plugged into both the on-board (Intel) video and separately to the discrete card?

    Not that I don't think it's a great idea from a power consumption standpoint, but it sounds like it would require extra cabling.
  • Nataku - Thursday, December 9, 2010 - link

    I think they can just bypass the discrete graphics card's GPU and go straight to the ports, eliminating the need for 2 cables to be plugged in... then again, I'm just guessing

    My question is probably whether this is the last socket change they will do for a long time... the current socket seems to have died a little too quickly, causing some uproar
  • davmat787 - Sunday, December 19, 2010 - link

    This is where Microsoft comes in, to help the other companies work together to ratify a new mutually beneficial spec. An official spec and API for GPU switching would be great.
  • MrSpadge - Thursday, December 9, 2010 - link

    Seconded!

    If we can get Optimus-style switching, ideally with ATI and nVidia cards, that would make the IGPU really valuable in terms of power savings, even for power users who need a real GPU anyway.

    And there's another point, closely linked to this one, which would be in Intel's interest to push but is not in their direct hands: my main rig feels choppy because the GPU is crunching BOINC in the background. What's that got to do with SNB? Simply this: if I could use the IGPU to drive my display, it wouldn't matter what any number of crunching ATI or nVidia cards were doing in the background - desktop, videos etc. would still be smooth. Given the increasing focus on GPGPU, such scenarios are likely to become numerous, and a free IGPU would be a nice solution, independent of when GPU scheduling, time slicing and partitioning eventually arrive.

    The discrete GPU might even use a device driver (like the Teslas) instead of a graphics driver, enabling faster access to it as a co-processor. If Intel pushed MS and AMD/NV to enable such solutions, more people would be inclined to upgrade to a GPU-enabled CPU.
  • jiffylube1024 - Friday, December 17, 2010 - link

    Building on that, how much power does the integrated GPU use out of the typical Sandy Bridge thermal envelope of 95W (TDP)?

    How much will power consumption/TDP go down (if at all) with integrated graphics disabled and a discrete PCI-e video card installed?
  • mlavacot - Thursday, January 20, 2011 - link

    It is difficult to put a number on how much of the TDP is reserved for the processor graphics and how much is for the CPU, since they both change frequency and load depending on what they are doing. If you use a discrete card and you are not using the processor graphics at all (examples: a desktop with an add-in card and nothing plugged into the processor graphics connector, or a laptop in a non-switchable configuration), the processor graphics is power gated off.

    If the processor graphics is power gated off, all of the TDP headroom of the processor goes to the CPU, so the TDP does not change. The overall processor average power will drop when the graphics is power gated off; I just don't have a number for you. But I can tell you that adding a discrete card will use much more power than processor graphics.
  • Filiprino - Wednesday, December 8, 2010 - link

    What will the encoding capabilities be? I have read a discussion on the doom10 forums between an x264 developer and an Intel developer, and I would like to know what we can expect from Sandy Bridge on this topic.
    Will there be only H.264 acceleration, or could it be capable of more generic acceleration via API calls?
  • Nehemoth - Wednesday, December 8, 2010 - link

    Maybe on another occasion you could please ask ASUS why they don't release an ADSL modem with wireless and Gigabit ports (4).

    A convergence router/switch/modem of this category is really desirable.

    I don't want to have a modem/wireless router and a gigabit switch for these things.

    Thank you
  • tntomek - Wednesday, December 8, 2010 - link

    The N series (e.g. the N53JQ-A1) are some of the prettiest notebooks out there. I'll be pulling the trigger and voting with my $ come January and SB; I hope ASUS will offer a mid-to-high-end product that goes beyond an expensive CPU. If one of your most expensive units, the N53JQ-A1, offers only a 15.6" HD (1366x768) LED screen, you have mental issues.
  • tntomek - Wednesday, December 8, 2010 - link

    I realize the 1080p is available in some markets, (not mine) but 900p should really be lowest res, or at least an option.
  • DanNeely - Wednesday, December 8, 2010 - link

    I agree. I can comfortably use a 125DPI 900p screen even sitting on a desk with a detached keyboard between myself and the screen; but the jump to 140DPI that comes with a 15" 1080p screen is just too small to be comfortable.
  • Marlin1975 - Wednesday, December 8, 2010 - link

    Can SB be overclocked (FSB type) or is it truly locked down as reported?

    What retail prices at launch will there be? (high end only or mid to low level?)

    Will the new supporting chipset/s support native USB3.0?

    With Intel's history of over-promising and under-achieving on the GPU side, will there be more resources to support the new Intel graphics, such as driver support?
  • nitrousoxide - Wednesday, December 8, 2010 - link

    SB without the "K" designation is barely overclockable, with only 3 MHz of headroom for FSB overclocking. I think this has been settled for a while now.

    The new chipset has no native USB 3.0 Support.
  • goldenstandard - Thursday, December 9, 2010 - link

    I don't think Intel will ever support USB 3.0; they want to back their Light Peak technology, and they're heavily invested in it. I think they're going to offer it cheap to manufacturers and penalize them for using USB 3.0, like only Intel knows how to do (think AMD).
  • ble52 - Wednesday, December 8, 2010 - link

    Is my current heatsink (for socket 1156) going to fit into new motherboards with socket 1155?
  • Catalina588 - Wednesday, January 5, 2011 - link

    Yes. They got that right.
  • gevorg - Wednesday, December 8, 2010 - link

    Can Intel confirm that Sandy Bridge chip used in Anand's preview had 12EU graphics?
  • nitrousoxide - Wednesday, December 8, 2010 - link

    How much will both the CPU and GPU benefit from the newly introduced ring bus? Do they compete for resources? Will it alleviate the GPU's lack of memory bandwidth?
  • freezervv - Wednesday, December 8, 2010 - link

    (Seconded, questions on the change in bus)
  • Hrel - Wednesday, December 8, 2010 - link

    The Asus N53JF-XE1 is an extremely attractive laptop at $999. When can we expect to see a similar laptop except with Sandy Bridge for under a thousand bucks? Seriously, don't change anything except the CPU, maybe add some more USB 3.0 and up the GPU a little if it's in the budget. Everything about the laptop itself seems amazing for the money.

    I like that it doesn't FORCE me to buy an expensive quad-core i7 just to get a halfway decent GPU. I personally have very little interest in quad core; it doesn't help much, if at all, in games. A hyperthreaded dual core is more than enough, and while a quad would be nice to have, it's not worth the price premium. Not even close.
  • DanNeely - Wednesday, December 8, 2010 - link

    A lot of the dual vs quad core question will come down to available clock speeds and prices. SB is supposed to push quad core farther into the mainstream, so some reasonably priced quads seem likely, which makes it a question of what the clock speed differences will be. A quad with 2 cores gated shouldn't be using any more power than a dual with both cores running, but Arrandale had a fairly large penalty there for the reasonably priced chips; if SB does better, there might not be much reason to go dual unless you're getting a very budget system. The fact that in leaked mobile parts dual core models only outnumber quads by 2:1, vs 6:1 for Clarkdale/Arrandale, seems to imply that quads will occupy most of the slots currently taken by faster dual core chips.
  • K1rkl4nd - Wednesday, December 8, 2010 - link

    The HTPC market has been waiting for 3D-capable HDMI 1.4, but now we're finding our current receivers don't like a 1.4 signal. High-end Blu-ray players offer an HDMI 1.4 connection to your TV for video and an HDMI 1.3 connection to carry audio to your standalone receiver. Is there going to be an easy way to implement this with Sandy Bridge setups, or are we going to get stuck with measly 5.1 performance through our optical cables, locking us out of DTS-HD Master Audio unless we buy this year's flavor of receiver as well?
  • Wiggy McShades - Wednesday, December 8, 2010 - link

    In a scenario where the GPU is running full tilt and you want to multitask with some other task that is memory bandwidth intensive, how much of an impact can we expect from using the GPU? So basically, are the memory access requests by the CPU and GPU balanced in a situation where the combined memory bandwidth required is larger than what is available? Does one get precedence over the other?
  • Catalina588 - Wednesday, January 5, 2011 - link

    The CPU cores and the GPU share (compete for) the level 3 cache. That's good when the CPU hands physics off to the GPU, but it's contention when the two are doing something completely different.

    My understanding is that the OS, as usual, is the traffic cop, not the chip. However, you can upclock and downclock the GPU in the BIOS. (Yes, I know that's crude).
  • GeorgeH - Wednesday, December 8, 2010 - link

    1) How long will LGA-1155 last?

    2) Why did Intel need to go with a new socket? ASRock has made an LGA-1156 P67 motherboard; are their engineers smarter than Intel's?

    3) If Intel is going to be releasing a new socket every 12 months, why should I spend lots of money on a fancy motherboard? Especially when overclocking is locked down the way it is, wouldn't it make much more sense to buy the cheapest and most disposable motherboard and CPU combo that'll still get the job done and spend the savings on quality network, audio, USB 3.0 and whatever else add-on PCIe cards that traditionally have helped differentiate high end motherboards?

    4) Will there ever be more than 4 cores on 1155, or will it live and die as nothing more than a Lynnfield/1156 speed bump?

    5) Why should I bother with LGA-1155 instead of waiting for Sandy Bridge on LGA-2011? With next-gen SSDs already rumored to be pushing the limits of SATA 6Gbps, will an LGA-1155 motherboard's PCIe lanes be completely saturated with PCIe SSDs, Light Peak cards (which will exist, right?), and GPU traffic long before the performance of the CPU itself is an issue?

    6) Why should I buy Sandy Bridge now, before we know what Bulldozer is capable of?
  • DanNeely - Wednesday, December 8, 2010 - link

    #4 is probably no. Current Intel CPUs need half a channel of DDR3 per core to avoid bottlenecking; I don't see Sandy Bridge changing that calculation.

    My gut feeling is that Ivy Bridge will either bring mainstream-ish hex core chips via a resurrected LGA 1356 socket, or a new DDR4 socket (LGA1154?). The 2012 ETA for DDR4 would make that possible, although it seems questionable that Intel would launch DDR4 on a lower-end platform first, because the initial supply will almost certainly be tight and pricey.
  • Agamemnon_71 - Thursday, December 9, 2010 - link

    #2 Greed.

    #3 I don't buy high-end mobos anymore. A total waste of money.
  • anactoraaron - Wednesday, December 8, 2010 - link

    Will the chip be able to "power gate" the GPU when it's not being used - essentially powering off the GPU when a dedicated card is in use? Is that in the works for Ivy Bridge? It would be great for allowing additional "turbo" headroom on desktops and as a battery saver in the notebook space.

    Oh, and will the GPU have different operating frequencies in notebook versions? I.e., clock down when only viewing web content, etc., and go full power when watching HD movies?
  • dlwilliams21 - Wednesday, December 8, 2010 - link

    The new socket 1155 will support sandy bridge. Will that socket be sticking around for ivy bridge?

    How soon can we expect to see the UEFI interface on ASUS motherboards?
  • LyCannon - Thursday, December 9, 2010 - link

    1) Bump on the UEFI! With HDDs breaking the 2TB barrier, UEFI is even more important now!

    2) Will ASUS have triple channel memory support?

    3) Will the GPU support CUDA-like software?
  • Mr Perfect - Thursday, December 9, 2010 - link

    I'd definitely like to see the EFI issue clearly spelled out. After reviewing the UEFI group's homepage, http://www.uefi.org/ , there are a few questions I still have.

    1. Is EFI/UEFI compulsory for the 6-series motherboards, or do we have to make sure to find ones with this option?

    2. Are they using EFI, or UEFI? Most people seem to use the two terms interchangeably, but the EFI spec is the older, Intel developed system, with UEFI being the name for the newer versions moving forward.

    3. It is mentioned in the UEFI group's FAQ that UEFI generally sits on top of BIOS. What does that mean in practice? How much will this speed up boot time if BIOS is still handling things like POST?
  • chaoticlusts - Wednesday, December 8, 2010 - link

    I'd like to know the marketing reasoning behind putting the more powerful on-die GPU in the high-end CPUs, which will mainly sell to people with discrete GPUs anyway.

    On a related note, will systems like Hydra or Optimus work if you have a discrete GPU coupled with Sandy Bridge, or will it have its own way of taking advantage of the double-up?
  • Nickel020 - Friday, December 10, 2010 - link

    I second this. I'd really like to hear the reasoning behind only enabling the more powerful GPU on the expensive CPUs; it doesn't make any sense to me. At these prices, cost isn't so much of a concern anymore, and most people who need a performant GPU will just buy a discrete video card.

    With the cheap CPUs, however, price is very much a concern, and the more powerful GPU would go a long way here, since it might save money otherwise spent on a discrete video card. This money could instead be spent on the CPU itself. So if Intel were to offer the low-end CPUs with the better GPU (for a mark-up) as well, it might make sense to spend more money on the CPU instead of buying a cheaper CPU + video card that ends up costing the same overall. This would mean more money for Intel and less money for the GPU vendors, but also more options/value for the consumer.

    I guess Intel either didn't think this through or (conspiracy theory!) is purposefully letting AMD have the better CPU+GPU platform at the low end. More financial problems for AMD would probably mean more anti-trust problems for Intel, so making sure AMD stays somewhat financially healthy is actually very important for Intel.
  • bah12 - Wednesday, January 5, 2011 - link

    Agreed, and not only that: why can't I "have my cake and eat it too" with regard to Quick Sync and overclocking? Overclocking the K parts requires the P chipset, which does not support Quick Sync. Arguably the 2 best features of the new chip cannot be enjoyed together; we are being forced to choose between an overclockable setup and the industry's fastest transcoding.
  • ppokorny - Wednesday, December 8, 2010 - link

    Will ASUS have motherboards with the SAS capable southbridge?

    What slot organization can we expect? x16 slots spaced 3 apart for SLI/CF configs? x16 slots that shift to x8/x8 with PCI-e mux chips when both are populated? Will the slots covered by a dual-slot GPU be "valuable" PCI-E sockets, empty (save money) or useless PCI-32/33 slots?

    Will they have ATX, EATX, and mATX motherboard designs?

    What will be the state of the art in VRM design for high efficiency across low to high CPU power load levels? Number of phases? Dual- or tri-mode cycle-skipping techniques for low load levels?

    Has ASUS considered a "12V only" motherboard? A power supply could be more efficient by not producing -12V, 5V, 3.3V, etc. (just 12V on/off and 5V standby), with the motherboard producing the various chip voltages required using efficient 12V DC/DC converters at the "point of load". Intel has an S5520WB motherboard with this option today, and most "twin 1U" servers and blades use this technique.
  • DanNeely - Wednesday, December 8, 2010 - link

    Aren't hard drives typically powered off of the +5V rail? Do the server PSUs have a residual 5V output for that purpose, or is a 12V to 5V converter attached to the harddrive?
  • ppokorny - Thursday, December 9, 2010 - link

    The Intel S5520WB motherboard provides a 4-pin harddrive molex connector to power the hard drives with 5V generated from the 12V power supply.

    Most 3.5" drives draw from both 5V and 12V. Some SSDs and 2.5" spinning drives draw only 5V. 1.8" SSDs require 3.3V.

    A desktop power supply with modular power cables could have 12V connectors that could support plugging in either a PEG connector cable for graphics or a cable with in-line 5V and 3.3V converters (They are about the size of a postage stamp) for hard drive connectors. Wouldn't it be nice to have the flexibility of additional PCI-Express Graphics or additional hard drive cables?

    See http://www.mini-box.com/DC-DC for examples of small DC-DC converters that generate all the ATX voltages from a single 12V input.
  • BSMonitor - Wednesday, December 8, 2010 - link

    When developing the final specs of the CPU/GPU, how much influence do companies like Apple have? These chips are usually followed shortly by a refresh of their MacBook and Mac Pro lines.

    aka. "We'd like the GPU to put out X FPS in video encoding, minimum."

    I cannot wait for a Sandy Bridge 13" MacBook Pro.
  • Mathieu Bourgie - Wednesday, December 8, 2010 - link

    To Gary:

    More of a wish, but here I go anyway:

    I'm getting ready to buy a SB based laptop and I hope that we get a laptop like the ASUS U30Jc, that offers:

    - Good CPU Performance (Core i3-like is plenty enough)
    - Dedicated GPU with Optimus (Or the Radeon equivalent, if it's automatic like Optimus) for GPU acceleration.
    - Outstanding battery life
    - A nice aluminum casing (Get rid of the glossy plastic around the screen though, how about more aluminium? Make it solid please)
    - No Optical Drive (who uses these anymore? I just load my movies as a file, .ISO for disks if necessary)
    - Between 12" and 15" and light-weight (Preferably under 4lbs, the lower, the better)

    I would have bought the ASUS U30Jc, but the 1366 x 768 screen, with no upgrades available, was a deal breaker for me. It's just too small a resolution to work with all day, so please offer a higher resolution, 1600 x 900 or even better 1920 x 1080, at least as an available upgrade.

    I realize such a screen is more expensive for ASUS to use in their design, but I wouldn't mind paying more for it and I think that this would make for a worthy U30Jc successor.

    In short: what I'd like is a relatively solid laptop (bonus points for lack of glossy plastic and inclusion of brushed aluminum), with "enough" CPU power, a switchable consumer-class GPU, outstanding battery life and a resolution above 1366 x 768.

    Basically, the perfect laptop for those of us who want a workstation-like laptop to work on all day long without carrying a charger. Unlike the typical workstation-like laptop, we don't want/need an ultra-powerful quad-core CPU, nor a crazy expensive workstation-class GPU; we just want "enough" power and an outstanding battery, in a laptop that's ultra-portable and doesn't cost $2000+ ($1200-$1400 instead).

    Somewhere halfway between a consumer laptop and the usual business laptop I guess?

    P.S. For the love of whatever you believe in, no glossy plastic on the touchpad or places that we will obviously touch!

    Thanks!

    To Intel:
    Any chance that we'll see a CPU without an integrated GPU? Or at least the option within the BIOS to entirely turn off the IGP, for those of us with a dedicated video card who don't want those extra Watts of power consumption?
  • hybrid2d4x4 - Wednesday, January 5, 2011 - link

    Seconding the good screen option on the ASUS UL series laptops! 1600x900 is good for me, but if you can give me a screen that isn't glossy or overly dull (low contrast, poor color gamut), I'll live with 768p. No glossy plastics anywhere, please! If you can do it on a netbook costing ~270 on sale (the 1001P...), you should be able to do it on a laptop under $1k (assuming a similar feature set to the current version - and yes, I'm perfectly willing to accept a $50-150 price hike to get a better screen). Keep using the big batteries, and don't bother with bottom-of-the-barrel discrete cards - either go midrange to higher-end or don't do it at all. The latter should be even more obvious with SB's on-die GPU.
  • freezervv - Wednesday, December 8, 2010 - link

    Ignoring exactly how quickly the PC will be outsold by mobile devices of various flavors ( http://www.computerworld.com/s/article/9199918/In_... ), it definitely does seem to represent a trend going forward (aka ubiquitous computing).

    Along the lines of Intel Wireless display ( http://www.intel.com/consumer/products/technology/... ), what are Intel's thoughts on the support of ubiquitous-style technologies in their consumer chipsets? Do they see this interface being served solely by motherboard integration (add-on chips) or does it have a place inside Intel chipsets?
  • ilkhan - Wednesday, December 8, 2010 - link

    Why couldn't they launch a week before the most fun house LAN of the year for me instead of a week after?

    Are we going to see a 6-core Sandy Bridge on s1155?

    How much performance is lost with single channel memory?
  • Stuka87 - Wednesday, December 8, 2010 - link

    What is Intel's timeline for integrating USB 3.0 support into one of their desktop and/or mobile chipsets? Can we expect SB chipsets to have this support?
  • Casper42 - Thursday, December 9, 2010 - link

    Intel has stated a few times already that SB-related chipsets will have 2 SATA 6Gbps ports but no native USB3.

    I would expect that damn near every SB board from the likes of Asus/Gigabyte/MSI/etc. will have a USB3 chip added, though.
  • adonn78 - Thursday, December 9, 2010 - link

    How much of a performance increase will we see in games vs the current Core i7? Will it be faster than the 1366 Core i7 chips?
  • Threekings - Thursday, December 9, 2010 - link

    I want to know if ASUS has plans to compete with Dell in the 11-13" gaming/power laptop market. Dell has only one real product in this category right now, the Alienware M11x, and I would like to see a response from ASUS.

    I would like a slightly bigger 13.3" laptop though, because an 11" gaming laptop is just too small for me. There is a lot of enthusiasm for a powerful, portable laptop in the 13" range on Dell's IdeaStorm website.

    This suggestion notes the pros of the M11x and makes suggestions for improvements in the proposed Alienware M13x: http://www.ideastorm.com/ideaView?id=087700000000f...

    ASUS might find the suggestions in that link very useful if they intend to make a 11-13" gaming laptop. Just don't make a laptop that shouts "NEERRRRDDD!" from the rooftops. ;)
  • landerf - Thursday, December 9, 2010 - link

    Will socket 2011 have 4 or 8 RAM slots?
  • Catalina588 - Wednesday, January 5, 2011 - link

    8 GB DDR3 DIMMs are expected by the time socket 2011 rolls out with 4 memory channels. So 4 x 8 = 32 GB on a desktop. That's probably enough for 2012. Servers get more, and maybe some odd workstations will support more than 4 slots.
  • chemist1 - Thursday, December 9, 2010 - link

    What is Intel doing to future-proof its devices against SSDs that may soon saturate the 6 Gb/s SATA 3 standard? If I buy a Sandy Bridge-based computer in 2011, it would be nice if I could upgrade to a >6 Gb/s SSD in, say, 2013 and take full advantage of the performance improvement.
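
    For reference, the ceiling in question works out to roughly 600 MB/s: SATA 6 Gb/s uses 8b/10b encoding, so the usable payload bandwidth is about 6 Gb/s x 8/10 = 4.8 Gb/s = 600 MB/s, which current SSDs are already approaching in sequential transfers.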
  • ppokorny - Thursday, December 9, 2010 - link

    Here's a thought. SAS drives have dual channels. 2x 6Gbps...
  • allingm - Thursday, December 9, 2010 - link

    How does Sandy Bridge communicate between the CPU and GPU, and how is this similar to any of the consoles?

    How will the GPU evolve in future versions of Sandy Bridge and future chips?

    Do you see Sandy Bridge as a revolution in how graphics are written for the PC?

    Does Sandy Bridge suggest a shift away from Larrabee ideals?

    Is it possible to have DirectX/OpenGL run on both the CPU and GPU seamlessly (to the programmer)?

    When is DirectX 11 support coming?

    Will Sandy Bridge enable smaller laptops?

    How will Sandy Bridge affect ASUS's product line-up?
  • Whizzard9992 - Thursday, December 9, 2010 - link

    There were rumors of "Turbo 2.0," but I was disappointed to see fixed turbo figures in the leaked specs.

    Will SB have the new temperature-controlled turbo? If so, how does it work? If not, when should we expect to see it?
  • Catalina588 - Wednesday, January 5, 2011 - link

    Temperature-controlled Turbo 2.0 works great. My stock i5-2500K runs Folding@Home 24x7 with 100% CPU utilization at 3.7 GHz, four bins over rated speed, with a Core Temp reading of 60C. Multi-tasking, it seems to me that Turbo 2.0 does a good job of powering up (i.e., at app startup) and then idling when it can. I am very pleased with Turbo, and SB overall.
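
    (For reference, a "bin" is one 100 MHz multiplier step, so four bins over the i5-2500K's 3.3 GHz rated speed is 3.3 GHz + 4 x 0.1 GHz = 3.7 GHz.)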
  • Casper42 - Thursday, December 9, 2010 - link

    I don't care much at all about 1155.

    1) When will we see the 1366 replacement?

    2) will it be Socket B or R or what?

    3) Triple channel or quad channel memory?

    4) PCIe 3 like the server roadmap?

    5) Any chance of native USB3 for X68 considering the extra 4-6 months lead time?
  • plutopia - Thursday, December 9, 2010 - link

    What role does the SB graphics' EUs play when a dedicated GPU is used? Do they just idle, or can they be turned into an APU/PPU/etc. or even, work in conjunction with a dedicated GPU?

    Or if they don't do anything in this scenario, are they turned off?
  • plutopia - Thursday, December 9, 2010 - link

    --early-morning-dozyness-corrected--
    What role do the SB graphics' EUs play when a dedicated GPU is used? Do they just idle, or can they be turned into a functional APU/PPU/etc. or else, work in conjunction with a dedicated GPU?

    Or if they don't do anything in this scenario, are they turned off to save power and heat?
  • Catalina588 - Wednesday, January 5, 2011 - link

    No, they don't morph. Yes, they power down and free up that power envelope for higher Turbo 2.0 performance by the CPUs. CPU and GPU share the thermal headroom.
  • white2011A - Thursday, December 9, 2010 - link

    How does power consumption of the new Sandy Bridge compare with the previous Sandy Bridge? Thanks.
  • Venya - Thursday, December 9, 2010 - link

    What is "previous Sandy Bridge" ???
  • Venya - Thursday, December 9, 2010 - link

    Can I connect dual-link DVI monitors to the Sandy Bridge integrated GPU using H67-based motherboards?
    I am waiting to upgrade my computer to Sandy Bridge and believe its integrated graphics will suit my needs; as I don't play games, I don't need a discrete GPU anymore... The only open question is whether I can use my 30" Dell monitor with it (resolution 2560*1600).
    None of the previous Intel-based motherboards was able to support dual-link DVI :-(
  • gookpwr - Thursday, December 9, 2010 - link

    First, I also want to know about the UEFI BIOS and when ASUS will be implementing it on their mobos.

    Also, if I have a discrete GPU attached, can I use the extra TDP for overclocking the CPU, and if so, will that only apply to the K-series CPUs?

    What is the officially expected overclockability of the higher-end SB chips?

    Will any SB boards have Light Peak, and if so, when? If not, how far away are Light Peak-based boards?

    Thank you
  • prdola0 - Thursday, December 9, 2010 - link

    Hello,
    since the mobile Sandy Bridge seems to be a wonderful mobile CPU, I have a question for ASUS. There is currently a dogma that small computers like netbooks have to be cheap and have a weak CPU and graphics, slow hard drives and so on, compared to full-sized notebooks. I wonder if, with the Sandy Bridge CPU near, ASUS could introduce a small form factor mobile PC (10") that has a decent Sandy Bridge mobile CPU and a good Intel SSD? I am looking for a device that I could work and decently game on while on the go, but that I could connect to my bigger LCD when I come to my office and still work comfortably. Currently my only option is a device like the 1015PN (because it has a matte screen and ION), which I have to upgrade to Win 7 Professional myself and have to disassemble to insert a decent SSD, and I still can't do much about the slow CPU inside. The point is that I don't really need a big-screen notebook if I have a big screen at the office, but I do need the computing (and gaming) power of a notebook. A low-clocked Sandy Bridge CPU with the full 12 EU graphics would be ideal for this.

    One more thing for ASUS: could you please make the specifications tab in the device descriptions on your website more detailed? Because if you, for example, just say your notebook has an SSD, I won't buy it. I will only buy it if I know what brand and type it is. This is true for all the components, not just SSDs.

    Best regards,
    Prdola
  • prdola0 - Thursday, December 9, 2010 - link

    I forgot to mention that at least 8 hours of real battery life (not idling at the desktop, but for work) is a must for this type of device. I would be willing to pay as much as for a full-sized notebook for such a device, and I think there are many more people like me.

    Best regards,
    Prdola
  • ibudic1 - Thursday, December 9, 2010 - link

    Dude, get a spare battery.

    If the spare battery is too heavy start working out.

    If that is too hard, turn down the brightness to 10% and get the lowest-clocked UM version.
    If that does not work for you, do everything by hand.
  • mindless1 - Sunday, January 9, 2011 - link

    Dude, we don't want to carry around a lot of separate pieces, nor shut down to swap a battery, just so some weakling can have the option of a lighter device when the anemic battery is in it.

    I too am rather annoyed at the notion that customers would rather shave a mere few cubic inches and half a kg off a device than have good runtime. I am very put off by the idea that I either need to be a slave to constantly plugging a portable device in to charge, or recall past usage and check charge level before I go anywhere with a laptop. This is not the case with my phone, MP3 player, etc.

    I propose that no portable device should need recharging within the same 8-hour work day, and I consider swapping a battery no better, because that is even MORE of a hassle: then you have to recharge 2 batteries, remembering to swap them around later to recharge both.
  • Oxyrus - Thursday, December 9, 2010 - link

    Will LGA 1155 support Ivy Bridge processors?

    Are there any plans to release more CPUs (Sandy B. or Ivy B.) for the 1366 socket?
  • iwodo - Thursday, December 9, 2010 - link

    Why no FMA (Fused Multiply-Add) in Sandy Bridge? What is happening to it? Postponed to Ivy Bridge?
    What is the transistor ratio between GPU, CPU and cache?
    What is the TDP ratio between GPU and CPU?
    OpenCL for your GPU? OpenCL 1.1 compatible?
    Is the GPU inside SB totally new? Any PowerVR tech in it, as you are one of the licensees?
    GPU hardware means nothing without decent drivers; otherwise it is nothing more than a piece of useless silicon. Is Intel going to do something about its drivers? Like at least a constant stream of driver updates, not a once-per-year event.
    Will all iGPUs be 12 EU, where the 6 EU version is a 12 EU part with 6 EUs disabled? Or will there be a native 6 EU iGPU?
    Will hardware encoding be a fixed-function unit, i.e., of no use to x264?
  • IntelUser2000 - Thursday, December 9, 2010 - link

    - Judging from the questions you ask, you have obviously never had to download Intel graphics drivers. They update their drivers every 1-2 months or so.
    - Intel developers have mentioned multiple times before that FMA will only appear with Haswell, the tock after Ivy Bridge. They said adding FMA wasn't worth it in terms of die size and power efficiency.
    - I'm not sure why they'd suddenly go for PowerVR tech on their non-Atom chips, since they have never used it there. They are starting to demonstrate that they can make decent graphics hardware in-house.
  • Catalina588 - Wednesday, January 5, 2011 - link

    Q. Will all iGPU be 12 EU where the 6 EU version will be 12 EU with 6 EU disabled? Or will there be native 6 EU iGPU?

    A. Only the 2500K and 2600K overclockable chips contain the Intel HD Graphics 3000 with 12 EUs. All the rest have 6 EUs.
  • marc1000 - Thursday, December 9, 2010 - link

    1. To Intel: when we use a discrete card, what happens to the on-die GPU? Will it turn completely off?

    2. To ASUS: in which form factors will we have boards at launch? ATX only? mATX? uATX?

    thanks,
  • Mr Perfect - Thursday, December 9, 2010 - link

    Not to come across as a grammar Nazi, but mATX and uATX are both shorthand for Micro ATX. You're referring to the tiny 170mm x 170mm boards, right? They're called Mini-ITX, or mITX for short.

    Hopefully Asus has some mITX boards at launch, because that's what I'm building. :)
  • marc1000 - Friday, December 10, 2010 - link

    Yeah, mITX, my mistake. Up to now I've only seen full ATX boards, and I prefer the small ones. Micro ATX is the best of all.
  • richough3 - Thursday, December 9, 2010 - link

    Will ASUS motherboards ever incorporate an onboard slot for an SSD or USB flash drive to run the Express Gate software? The reason I would use Express Gate as opposed to booting a full OS off the hard drive is that I don't want everything else, such as the hard drive and primary video card, to power up; I just want a quick-booting OS that runs on integrated video at 1280 x 1024 minimum, so you can do basic tasks with minimal power consumption.
  • mindless1 - Sunday, January 9, 2011 - link

    Please explain why you feel that if you had an onboard slot for an SSD or a USB flash socket onboard, that would magically keep your hard drive or primary video card from powering up when the PSU turns on? At present that tech does not exist in contemporary PC designs, and adding a slot makes no difference; it would be the same as plugging a USB thumbdrive into the back of the board. But if you really want it all self-contained, you could solder a 0.1"-spaced pin header socket to the flash drive's USB contacts and plug that directly into a motherboard header.

    Presently, what meets your desires best is to own a notebook with a low power profile and a KVM to use your other keyboard, mouse and monitor. This option has other virtues, since a notebook is handy for mobile computing too; you can leverage tech already available and meet the goal.
  • play2learn - Thursday, December 9, 2010 - link

    Please consider an on/off button on the outside of the laptops... Do I need to explain why?
  • ibudic1 - Thursday, December 9, 2010 - link

    yes, why?
  • Michael REMY - Thursday, December 9, 2010 - link

    hi!

    I'm working in the 3D business and I deal with render operations every day.
    Today, the most powerful desktop Intel CPU is the Core i7-980X.

    I'd like to know when an Extreme, 6-core or 8-core Sandy Bridge CPU will be released.

    Thanks for the answer.

    Best regards
  • n0b0dykn0ws - Thursday, December 9, 2010 - link

    Ask if they have fixed the 23.976 fps issue that Clarkdales suffer from.

    The only way I will buy Sandy Bridge is if this issue has been resolved.

    n0b0dyk n0ws
  • Catalina588 - Wednesday, January 5, 2011 - link

    No, it's not fixed until Ivy Bridge. Read Anand's review.
  • n0b0dykn0ws - Thursday, December 9, 2010 - link

    Ask if they have fixed the 24 FPS issue that exists in Clarkdale.
  • n0b0dykn0ws - Thursday, December 9, 2010 - link

    For some reason my comment regarding twenty-four versus twenty-three point ninety-seven isn't staying.

    Please ask if they have fixed the issue that Clarkdale chips suffer from.
  • aeassa - Thursday, December 9, 2010 - link

    Will there be a high end version of SB that has more cores/cache and perhaps no integrated graphics?
  • Catalina588 - Wednesday, January 5, 2011 - link

    Yes, late 2011 using Socket 2011.
  • James5mith - Thursday, December 9, 2010 - link

    Will the Sandy Bridge platforms from Asus be natively UEFI?

    Will they have USB 3.0? How many ports? The traditional 2, or 4+?

    With Sandy Bridge, will there be any kind of desktop variant of "Optimus" or an equivalent, i.e. switching between on-die and discrete GPUs? Is this even possible with an Intel chipset instead of an nVidia one?

    Have there been any design headaches for motherboards with the new native SATA 6Gbps/3Gbps mixed controller? Routing issues, etc.?

    What kind of power phases are required by the new platform?
  • Shadowmaster625 - Thursday, December 9, 2010 - link

    Why is there a multi-channel memory controller, but no integrated SSD controller? Especially since it is almost universally accepted that the major bottleneck in most systems is the storage subsystem. By having an integrated SSD controller as part of the CPU, you allow OEMs to place a flash DIMM socket on the motherboard, which gives us expandable flash memory at a fairly low cost. Or they could just solder 16-128GB of flash onto the mobo. Either way, the lower cost of flash memory without having to pay for the controller would encourage OEMs and consumers alike to use flash as their primary OS storage. Then we could totally get rid of these slow, lowest-common-denominator PCs that all developers must cater to.
  • xxxxxl - Saturday, December 11, 2010 - link

    Thumbs up!
    I want to know this too.

    But with PCIe SSDs coming out, I think that is one way to do it (since there is integrated PCIe)?
  • mindless1 - Sunday, January 9, 2011 - link

    Because you have to stop somewhere, and the fact is, most PCs do not sell with SSDs installed, nor can most people even run SSDs, because there is not enough flash chip manufacturing capacity for so many SSDs - possibly not even if the mobile market didn't grab up the majority of chips.

    You wrote "allow OEMs to place a flash DIMM socket on the motherboard", but that is an extra slot they have to find room for when you can already buy a PCIe SSD. And let's not forget that Intel would have to either develop this controller for on-die use or buy someone's IP, making the CPU die ever larger and more expensive while reducing yields. It makes little sense when even then you still have a southbridge. In other words, the southbridge is a more feasible place to put such a controller, if there were no other obstacles.

    Soldering the memory onto the board is not a good option; it ties up valuable chips in stock that may end up as surplus goods sold at a loss, and puts motherboards beyond mass-market appeal into an extremely expensive pricing tier.

    Consumers don't need to be "encouraged" to do what you want; they need to decide for themselves when to do it. Plus, you seem to imply there is a benefit when there probably is not in this day and age: it is cost vs. flash chip shortages/pricing, not the performance bottleneck of SATA, that is keeping mass adoption at bay.

    Further, most people are not very concerned about their HDD performance, because contrary to benchmarks, which seem to imply an SSD is really important, real people in the real world tend to use and reuse the same apps and OS files, which today are cached into gigs of main system memory.

    Further, flash controllers have been getting faster every few quarters. Do you really want a multi-hundred-dollar investment built into your CPU and soldered onto your motherboard, only to find that 6 months later you could have had a much higher-performing regular SSD for no additional cost vs. the integration you seek?
  • Itany - Thursday, December 9, 2010 - link

    I heard that the AVX execution pipeline is a combination of the SSE int and fp pipelines, so the int instruction throughput is doubled while the fp throughput remains the same as SSE.

    Is that true?

    If the complexity of a full-width pipeline could not be overcome, would the FMA instruction be a better way to enhance throughput on the x64 architecture?
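
    For readers unfamiliar with the instruction, FMA computes a*b + c in one operation with a single rounding. A minimal illustration in C intrinsics (Sandy Bridge's AVX does not include FMA, so the two-instruction form is what SB actually supports; the fused form is shown only for comparison and requires a later CPU):

        #include <immintrin.h>

        /* AVX as shipped in Sandy Bridge: multiply then add,
           two instructions and two roundings. */
        __m256 madd_avx(__m256 a, __m256 b, __m256 c) {
            return _mm256_add_ps(_mm256_mul_ps(a, b), c);
        }

        #ifdef __FMA__
        /* Fused multiply-add: one instruction, one rounding
           (not available on Sandy Bridge). */
        __m256 madd_fma(__m256 a, __m256 b, __m256 c) {
            return _mm256_fmadd_ps(a, b, c);
        }
        #endif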
  • sihv - Thursday, December 9, 2010 - link

    I did not notice any other comments regarding virtualization, which is surprising. Will SB CPUs and chipsets support VT-x and VT-d?

    Secondly I'd like to know if the integrated GPU is disabled when a discrete card is in use. Others have asked this too, but I don't want to disable it, I want to keep both! If virtualization support is there, I'm hoping I can set up a Xen environment with a Linux host using integrated graphics and a Windows guest with a dedicated discrete graphics card given to it via Xen's VGAPassthrough. I know this might not work yet as it's quite experimental but I'd like to at least have the possibility to try it out.
  • austonia - Thursday, December 9, 2010 - link

    Hi Anand/tech. I'm still on a Q6600 (from Q3 2007) and looking for a reason to upgrade. I had planned to hold out for hexacore; will SB deliver a reasonably priced ($300) part?

    I am most interested in the Transcode Engine on SB and how much faster it is compared to software transcode, with the task being 720p or 1080p HD videos (h264 HiP/DTS/mkv) resized to 480p (h264 SP/AAC/mp4) for use on portable devices or remote network streaming. I hope they will help integrate support into x264.

    Also, another vote here for including USB 3.0... what is the holdup?
  • Catalina588 - Wednesday, January 5, 2011 - link

    At the press conference, Intel quoted a 400 MB transcode in 14 seconds. Reviewers are saying it's the fastest transcoder in the industry, beating even high-end discrete graphics cards. The Quick Sync transcoder was built to do very fast work while keeping up image quality.
  • gtnx - Thursday, December 9, 2010 - link

    Regarding the "K" series with unlocked multipliers, how much will the motherboard factor into OCing capabilities? We know that currently the difference between a $100 and a $300 motherboard can be huge in terms of OC potential because of the stability of the FSB. But if it's all going to depend on the chip multiplier, will a low-end motherboard OC just as well as a high-end one?
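
    (To spell out the arithmetic: core clock = BCLK x multiplier, and on SB the BCLK is essentially fixed near 100 MHz, so e.g. 100 MHz x 45 = 4.5 GHz. That leaves power delivery and stability, rather than bus speed, as the motherboard's remaining contribution.)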
  • Catalina588 - Wednesday, January 5, 2011 - link

    First, pick the right chipset!

    The H67 chipset does not support CPU overclocking, even with a 2500K or 2600K overclockable processor. But the P67 chipset means you have to buy a discrete graphics card and lose the on-chip graphics (but not, I believe, the Quick Sync transcoder).

    You have limited PCIe lanes with any socket 1155 compared to socket 1366, so your mobo choice gets down to what features you're willing to pay for.

    Although it's early, the 15 SB reviews I read yesterday did not have much to bitch about regarding differences between the motherboards for OCing. Everybody is getting well over 4.3 GHz by pushing the multipliers on the K processors. Since I run my kit 24x7, I am more interested in long-term stability (e.g., not burning up) than absolute one-time superiority. All that said, I think you get what you pay for.
  • dougri - Friday, January 7, 2011 - link

    The jury is still out on H67 and overclocking... PC Pro (UK) published a review in which they claimed to have OC'd a 2500K to 4.4GHz on an Intel H67 board with the stock heatsink. Correspondence I've had with a system builder indicates it may have been an assumption in other reviews, without documentation or verification from Intel (e.g. the ES SB chips could not overclock on H67, but the retail chips can). Intel's H67 product brief states it supports overclocking 'features' with unlocked chips (but that could mean only the IGP). Anyway, I would not state definitively that H67 cannot overclock. I mean seriously, why would Intel only put HD 3000 on the unlocked chips? Do they really think people will pay an extra $100 for a step up in IGP graphics only to the performance of a $50 discrete card? I hope not.
  • sakanade - Thursday, December 9, 2010 - link

    I already own a Lynnfield build and I'm pretty happy with it.

    I saw in one of the articles that there's a 20-35% performance increase over the 1156 platform with Sandy Bridge.

    Would you recommend upgrading to 1155 next year?
  • GTVic - Thursday, December 9, 2010 - link

    Will Sandy Bridge desktop motherboards have EFI replacing the BIOS this year? How soon, and what will the split be between EFI and BIOS? Same questions for laptops.
  • GullLars - Thursday, December 9, 2010 - link

    Is there any improvement in interrupt handling over Nehalem? If so, what is the difference?
    It seems systems using high-performance SSD RAID on integrated chipset controllers are IOPS-limited to the 75,000-150,000 range by CPU interrupt handling, resulting in a massive CPU usage increase when approaching the limit even though the RAID is capable of scaling further - especially when running multiple threads executing IOs with a queue.
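
    To put that limit in perspective: a single 3 GHz core fielding 150,000 interrupts per second has only 3x10^9 / 1.5x10^5 = 20,000 cycles per I/O for the entire interrupt and completion path, before any application work.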

    PS.
    AMD's K10 seems to be less effective than Nehalem at interrupt handling as it relates to IOPS limits. A Phenom II X6 1090T, even when overclocked past 4GHz, can barely pass 100,000 4KB random read IOPS (but can do around 70,000 with one thread), while an i5 750 at stock can pass 120,000. This may also be related to ICHxR vs. SBxxx.
  • mindless1 - Sunday, January 9, 2011 - link

    I'd use the word "instead", instead of "also" in your last sentence, possibly even replacing "may" with "is probably".
  • piker84 - Thursday, December 9, 2010 - link

    I am one of many who would also like to know if Sandy Bridge processors will see a noticeable improvement in gaming over the current Core i5 and i7 processors when combined with a high-end dedicated GPU.

    Will any significant gains be seen with, say, dual GTX 580's running in SLi?
  • mrmbmh - Thursday, December 9, 2010 - link

    When will we see Sandy Bridge in laptops?
  • MrSpadge - Thursday, December 9, 2010 - link

    I noticed in the roadmaps that many SNB quads don't feature HT. I find this strange - if one already chooses a quad over a highly clocked dual, one would probably also benefit from even more threads. HT is in there and yields "performance for free" at the same clock speed. So I'd rather see the base clock turned down one step, the voltage adjusted downwards a bit, and HT activated, while the maximum Turbo should obviously remain similar. The resulting CPU would perform similarly in lightly threaded apps and be more efficient in heavily threaded ones.

    Does Intel not agree? Glad to see the Duals equipped with HT, though.

    MrS
  • ibudic1 - Thursday, December 9, 2010 - link

    1. Will we have an option to switch off SB graphics?
    2. Will we have an option to switch off discrete graphics?
    3. What will Asus offer for USB 3 since Intel can't?
    4. What is the highest officially/unofficially supported memory speed? (If I were to buy some additional DDR3 today, what should it be?)
    5. Maybe you answered this before, but is cooling backward compatible with 1156, like it is between 2011 and 1366?
  • M-ManLA - Thursday, December 9, 2010 - link

    I would like to know about the new Sandy Bridge socket 2011. What are the plans? Will it have dual QPI links like the socket 1366 CPUs? I heard about quad-channel memory. Does that mean I can have eight RAM slots (16 on a dual-CPU mobo)? Will they have a GPU built in like the 1155 CPUs? How many PCIe 2.0 (or maybe even PCIe 3.0) lanes will they have? How many SATA 6Gb ports will be supported? Will USB 3.0 and Light Peak be supported in the chips?

    For Asus: What motherboard offerings will they have? Will they finally have motherboards with UEFI instead of BIOS? Will they finally get rid of PCI slots (I use programs like Pro Tools, where the HD cards need three adjacent slots)? Will the motherboards have 10Gb LAN ports?

    Maybe sneak in a question about Intel's Knights Corner as well.
  • digarda - Thursday, December 9, 2010 - link

    Is it to be expected (as usual :-)) that major OEMs like Dell, HP and Co. will buy up all available SB processors for the first X months after the official launch on January 5, and - if so - what's the value of X likely to be?
  • chester457 - Thursday, December 9, 2010 - link

    My ASUS laptop w/ Sandy Bridge inside will arrive from Amazon/other internet retailer on ____________ (insert date here)?
  • awktane - Friday, December 10, 2010 - link

    90% of the people asking questions are obviously not avid readers of AnandTech... Most of the answers can be found in the Sandy Bridge preview:

    http://www.anandtech.com/show/3871/the-sandy-bridg...
  • Luke2.0 - Friday, December 10, 2010 - link

    The official specification of each chip, please. =)
    Both desktop and mobile ones.
    Complete with Turbo Boost multiplier and/or indicator numbers.

    Thank you.
  • Curious121345 - Friday, December 10, 2010 - link

    Can we turn off the internal GPU completely during gaming, and the external graphics card during 2D work?

    If we turn off the internal GPU completely, how much extra performance would we get from the CPU? What kind of MHz increase are we talking about?

    When will we get a Sandy Bridge-like CPU without the GPU?
  • Shadowmaster625 - Friday, December 10, 2010 - link

    BIOSTAR G31D-M7 LGA 775 Intel motherboard: $39 at Newegg, with good reviews. For just $10 more you can get a Gigabyte model with over 600 reviews and a 4-star rating. Either way, you can overclock a $50 Celeron or Pentium, drop in a $50 video card, and get FAR better gaming performance than you'll ever get from any Sandy Bridge combination at $150. In fact, I bet you can't even get a Sandy Bridge CPU plus motherboard for anything less than $200. Yet for $200 you can get a $50 Celeron, a $50 socket 775 mobo, and a Radeon 5750. No way in the world does it make any sense to go with the newer Intel. My question is and remains: WHO IN THE WORLD WOULD EVER BUY SANDY BRIDGE? (Besides dumbed-down morons who don't know what they're buying?) Socket 775 is far from dead and will be far from dead 2 years from now, even though Intel is stopping production at the end of 2011. My question for Anand is: has Intel always worked this way? I cannot recall a time when Intel screwed over consumers so badly that the newer product line was so vastly inferior to the older one that it is actually possible to get a totally FREE Radeon 5750 just by going with the older product line. That is just pure insanity, and I want some kind of answer from Intel.
  • dougri - Friday, December 10, 2010 - link

    Two benefits that I can see: 1) mobile market, and 2) encoding.
    1) For the past two years, laptops have outsold desktops. Intel, above all, is a corporation out to make money and satisfy its shareholders. Those who build their own PCs are a fraction of a fraction of the big picture. Having the GPU on die allows lower power consumption (which is marketable in the desktop segment as well, for the HPs and Dells of the world).
    2) With the explosion of portable devices and media, transcoding is a prime use for new processors/systems. Mainstream buyers will transcode if the software is easy enough to use and fast. Pop in the DVD, transcode, move it to your Droid and go... wireless even better.

    Not everyone is an overclocker, and Intel knows this. I doubt they expect SB to be a hit with those targeting a $200 DIY build. They do expect it to be a hit with the millions who will be buying Android/BlackBerry/webOS tablets this year and cursing the time it takes to encode video on their 2-year-old laptop. Intel is counting on those with specific needs to purchase SB and build their own. Other than that, SB is made for the big boys. LGA2011 will be for those willing to spend >$1k on a system.
  • mindless1 - Sunday, January 9, 2011 - link

    You are falsely presuming gaming is important to "most" people. In fact, the even slower current and past generations of Intel integrated video have been the most popular graphics in PCs sold!

    Who WOULDN'T buy SB is the better question, since it is nice to have integrated video for the day your primary gaming rig - the one you put a gaming card in - gets retired to a secondary role that doesn't require gaming performance.

    The answer to that one is: price-sensitive customers. Decent Intel boards are going to cost a premium disproportionate to the actual performance benefit (needs) of most people. You often see benchmarks of some high-end application, but the truth is most people only rarely do those kinds of tasks, and when they do, it is not a race to shave a few seconds off the total time. Instead, customers will welcome SB as a cost-cutting solution with integrated video - but only if it actually saves money. Remember that for non-gaming use you can pick up a basic PCIe video card for about $10+ after rebate; to be fairer, let's call it $30, meaning the Intel platform must stay within a $30 price difference to make sense from a competitive pricing perspective.
  • rostrow - Friday, December 10, 2010 - link

    Will Asus make a motherboard similar to the ASRock P67 Transformer? That board supports an LGA-1156 socket CPU with the Intel P67 chipset.
  • dougri - Friday, December 10, 2010 - link

    There have been ramblings in the x264 development crowd that Intel will open up the encoding 'parameters', enabling MUCH better encoding performance than is available today (in terms of speed at very high quality). The IDF demo showed ~400fps for encoding to iPod resolution (with undisclosed quality settings); valuable for a laptop, but well within reach of both 2009 i7 and AMD X6 CPUs. What improvements to high-quality transcoding have been made or are underway (i.e., encoding to 1080p at high quality settings), and what partners are involved?
  • Agamemnon_71 - Friday, December 10, 2010 - link

    If I don't want to pay extra for a useless on-die GPU, do I have to turn to AMD?
    Having an on-die GPU might make sense for the mobile market, people buying on a tight budget, HTPCs, or business solutions.
    But for those of us building our own higher-end systems, it's a complete waste of just about everything...
  • xxxxxl - Friday, December 10, 2010 - link

    How much would a basic LGA1155 MB roughly cost?

    A little off-topic but :
    Will mobile Ivy Bridge come with 35W Quad-cores?
  • xxxxxl - Saturday, December 11, 2010 - link

    My main aim is a quad-core i7 low-voltage part (LM at 25W) with at least 1.6GHz.
  • xxxxxl - Sunday, December 12, 2010 - link

    1. According to some websites, each Sandy Bridge core is ~20 mm² (true?)
    And according to http://aceshardware.freeforums.org/intel-avx-kills... , Westmere's core seems to be bigger (true?).

    2. The CPU and graphics core are connected to the cache, and the cache clock rate is dependent on the CPU clock. So if a game doesn't use much CPU and the CPU clocks down as a result, how does Intel (or whoever) ensure that the graphics aren't affected?

    Thanks!
  • CSMR - Tuesday, December 14, 2010 - link

    Are we going to get any improvements in the correctness of the drivers?
    A lot of correct DirectX/OpenGL software (Photoshop CS5, FastPictureViewer, MPC-HC) does not work completely with current Intel Clarkdale/Arrandale graphics.
    Are there people working on fixing the API support?
  • don^don - Wednesday, December 15, 2010 - link

    Hi guys, does Sandy Bridge boost performance that much compared to the current socket 1156 parts? My friend is planning to buy a new midrange gaming rig this month, and I advised him to wait for Sandy Bridge. But after thinking for a while, I suddenly felt there's really no need to wait, since we wouldn't use the integrated graphics at all - just an i5 with probably a GTX 460. Should I tell my friend to get one now, or wait for SB? SB will only arrive around January, and availability might even push it back as late as March or even April.
  • xxxxxl - Wednesday, December 15, 2010 - link

    Whether for Sandy Bridge, Ivy Bridge, or future mobile processors, I want to know if Intel will provide a way to disable/enable Hyper-Threading. Intel is concentrating on the mobile graphics market, and Hyper-Threading can sometimes slow games down, so if Intel provided a way to disable HT, power consumption could be reduced while also making games faster.

    Thanks!
  • xxxxxl - Thursday, December 16, 2010 - link

    One of the main attractions of SB is AVX, so I have to ask: how much chance is there of AVX actually being used?
  • Offwego - Friday, December 17, 2010 - link

    Sorry for repetitions:
    1. (For Intel) Does Intel have plans to add extra EUs prior to Ivy Bridge?
    2. (For Anand) What makes you think the i5-2400 has 12 EUs? The retail part is listed with only 6!
    3. (For Anand) Do you think turbo was enabled on the graphics, and what difference would turbo make, say from 650 to 1100 MHz, for a given system? I assume it would be more complicated than the simple percentage.
    4. Comment: Great website!
  • xxxxxl - Saturday, December 18, 2010 - link

    1. The EUs double with Ivy Bridge.
    2. Anand's engineering sample (ES) i5-2400 had 12 EUs, though the retail part will only come with 6.
  • glad2meetu - Friday, December 17, 2010 - link

    I would like to say thank you to Intel for all the hard work that went into developing and manufacturing these new processors. My question is: what sort of applications was Intel targeting with the extra 2MB of L3 cache enabled on the new 2600 products, relative to the other new processors such as the 2500 line? Another question: does Intel see 3-D video and image processing as a critical new application in the future? Currently I see early adoption of 3-D video and image processing happening in the marketplace, with lots of opportunity for improvement as new display technology is introduced. Personally, I am very interested in the 2600 product line due to the 8MB L3 cache and multithreading.
  • xxxxxl - Saturday, December 18, 2010 - link

    I've been thinking about the impact of Turbo 2.0 on mobile, where power consumption can be crucial.
    While the increased performance is great, the increase in power consumption isn't. So I hope there will be an option in the BIOS where users are allowed to turn off Turbo 2.0 if they wish.

    Take, for example, a 45W quad-core: a 20% over-TDP boost would mean the processor goes to 54W. Likewise, if I buy a 25W processor, why would I want it to go to 35W? Make sense?
  • warden00 - Monday, December 20, 2010 - link

    Will Sandy Bridge offer the possibility of ECC memory on desktop or mobile variants? Optionally, is this something Intel might consider adding in future architecture upgrades?
  • smyter - Tuesday, December 21, 2010 - link

    Will the motherboards that support Sandy Bridge also support the 24 EU graphics system that will be out with Ivy Bridge?
  • RazerDan - Thursday, December 23, 2010 - link

    To Intel:
    1) Intel traditionally has not taken graphics performance seriously. A reasonable increase in performance was shown now that the GPU is tied to the CPU die. Can we expect a performance increase to be at least as good for Sandy Bridge's die shrink? Does Intel think they can eat into the sales of the low end discrete GPUs enough to remove some "families"?
    Side Note: Sandy Bridge performs favorably against the 5450, but in actuality it will need to perform well against the low end 6000 series to compete, which should release in the same time frame. Also, integrated performance in general seemed to be so poor for last generation that AMD was allowed to create a 5th family (Cedar/5450) with comparable performance to RV710, to which Sandy Bridge was compared.

    2) Does the increased interest mean we can expect to see reasonable support for video game titles in the future?

    3) How does Intel get around the memory bandwidth limitation for its GPU now that it shares bandwidth/cache directly with the CPU cores? It appears to have only one stop on the ring bus in the architecture review. I can easily see Sandy Bridge stuttering any time it needs to fetch new textures, which would make many games unplayable without hurting avg. frame rates too much. If any laptop (rumor: such as the low-end MacBook Pros) used Sandy Bridge graphics, I would be skeptical of its performance before buying.

    To Anand:
    4) Every time a new generation of graphics cards comes out, the review seems to discuss anisotropic filtering quality. What is Sandy Bridge's filtering quality and why was it not shown in the Sandy Bridge architecture review?
  • aeyrg - Saturday, December 25, 2010 - link

    Why doesn't Intel put the PCH on the CPU package or die?

    And do they have any plans to integrate memory on the CPU package and ship CPUs with 2/4/8GB/... of RAM?
  • DanNeely - Sunday, January 2, 2011 - link

    RE the PCH: it's a combination of wiring issues (the CPU socket area is already extremely crowded with traces, resulting in more EMI problems and the need for more expensive boards with more layers) and the fact that there's nothing in the PCH that actually requires the expense of a current-generation process versus one that's 2 or 3 generations old. Most of the cost of a process is the R&D/construction cost, so legacy processes are almost free to use. You'll see newer processes used on mobile chipsets, where shaving off an extra watt or two matters; but doing the same for desktops isn't worth spending billions on additional fabs.

    Putting DRAM on die is never going to happen due to size constraints. DRAM chips are already made on current-generation processes, and your DIMMs are already sized around the chips attached to them. Trying to combine them with the CPU would result in insanely huge die sizes.
  • Out of Box Experience - Sunday, December 26, 2010 - link


    If Turbo Boost can temporarily overclock a Sandy Bridge chip for 20 seconds or so, until the temp sensors start knocking the clocks back to "normal"...

    ...then will Turbo Boost overclock for much longer periods of time if we simply replace the stock air cooler with a water-cooled unit???
  • Out of Box Experience - Sunday, December 26, 2010 - link

    Will Intel make drivers available for XP, or is it now Windows 7 or bust with Sandy Bridge, especially in notebooks???
  • stun - Tuesday, December 28, 2010 - link

    Isn't LGA 1155's lifespan going to be just a year, with LGA 2011 coming in the 2nd half of 2011?
    So what is the point of upgrading to Core i7 Sandy Bridge CPUs?
  • Out of Box Experience - Wednesday, December 29, 2010 - link

    I guess some people can't wait another 6-8 months for a really fast 35 watt dual-core processor..

    I might be one of them
  • DanNeely - Wednesday, January 5, 2011 - link

    2011 is the LGA1366 replacement for high-performance computers/servers; it's not intended to replace LGA1155's mainstream systems. That said, dual-channel DDR3 was only just sufficient to feed a quad-core Nehalem system, and I don't expect Sandy Bridge to have significantly lower memory needs, so I assume that's still the case. This would require a different socket if Ivy Bridge brings mass-market hex-core chips. A resurrection of LGA 1356, or a new LGA1154 (dual-channel DDR4) socket, would be my guesses.
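    Rough numbers behind that bandwidth point (peak theoretical rates, not sustained):

        one DDR3-1333 channel:        1333 MT/s x 8 bytes = ~10.7 GB/s
        dual channel (LGA1155/1156):  2 x 10.7 GB/s       = ~21.3 GB/s
        triple channel (LGA1366):     3 x 10.7 GB/s       = ~32.0 GB/s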
  • René André Poeltl - Wednesday, December 29, 2010 - link

    I do not develop games but financial software.

    While I have already tried programming with OpenCL (I wanted to speed up function calculations), I would say that the idea of sending a string containing the OpenCL source to the GPU driver and having it compiled there at run time is not really user-friendly.

    Why not have the source compiled by the CPU ahead of time, as usual for a computer program, and declare its execution target (GPU) and the parallelism factor (how many times in parallel) in the source of that program? That would not be much of a departure from what most programmers are used to, and it would be a well-integrated implementation of GPU programming. It is not impossible. I know there is an opencl.dll I can use from C or Pascal, but the functions are so poorly developed that there is little user-friendliness available.

    I know that a GPU is not a CPU and that the two are different. But if OpenCL is the best that is on offer, then I doubt that NVIDIA understands what developers want.
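    To make the complaint concrete, here is roughly what the run-time compilation model looks like from the host side - a minimal OpenCL 1.x sketch (error checking removed, kernel and sizes arbitrary). Note that the kernel travels as a plain string and is only compiled by the driver while the program is running:

    /* Minimal OpenCL host-side sketch (error checks omitted for brevity).
       Build with: gcc host.c -lOpenCL */
    #include <CL/cl.h>
    #include <stdio.h>

    /* The kernel ships as a source string, compiled by the driver at run time. */
    static const char *src =
        "__kernel void scale(__global float *v, float f) {"
        "    size_t i = get_global_id(0);"
        "    v[i] *= f;"
        "}";

    int main(void) {
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Hand the source string to the driver and build it here, at run time. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        float data[1024];
        for (int i = 0; i < 1024; i++) data[i] = 1.0f;
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
        clSetKernelArg(k, 1, sizeof(float), &factor);

        size_t global = 1024;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
        printf("data[0] = %f\n", data[0]);   /* prints 2.000000 */
        return 0;
    }

    Compare that with ordinary C, where the compiler has already done all of this work before the program ever ships.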
  • will2 - Wednesday, December 29, 2010 - link

    I have followed your main articles on SB, but have seen no specs or pricing for the 2-core/4-thread lower-TDP versions. All I see (I have no interest in desktops) are limited specs and negligible benchmarks for the 4-core, expensive power guzzlers. With only a week to go to CES, surely more info should now be available?

    A further question: I am looking for a 14" or 15" SB notebook with 900-line or better resolution AND a more photo-realistic display (perhaps like the one on the XPS 15 you reviewed last October), at least 1 USB 3.0 port, and hopefully 2 mini-PCIe slots that will accept an mSSD for running the OS & apps. Will SB allow this class of notebook to be lighter & thinner?
  • René André Poeltl - Thursday, December 30, 2010 - link

    Well, what you want are mobo features, not SB features.
  • will2 - Friday, December 31, 2010 - link

    OK, as it's now less than a week to what we were told is the official launch: when will CPU specs for the 18W, 25W, and 35W Sandy Bridge CPUs be available?
  • will2 - Wednesday, December 29, 2010 - link

    As a follow-on to my SB post just sent: I know that over the last few months your reviews have frequently commented on the poor quality of notebook displays, and occasionally you list the make & part no. of the LCD panel in the spec list. But it would help if you could publish more info on WHO makes the BETTER panels (and maybe which models), and also publish who has made the BAD panels - as an incentive to improve.
  • René André Poeltl - Thursday, December 30, 2010 - link

    This is OT.
  • will2 - Friday, December 31, 2010 - link

    Agreed. It's just that AnandTech (and others) have noted the all-too-common poor screen quality; my suggestion is simply that more info be given on both the 'good' and the 'bad' panels, as an incentive for notebook makers to improve.
  • René André Poeltl - Sunday, January 2, 2011 - link

    In that case it would have been 'good' to avoid OT posts ;-)
    There is no need to judge TFTs as 'good' or 'bad' when the topic is SB.
  • realbabilu - Thursday, December 30, 2010 - link

    1. Why is Intel creating several chipsets in just a few years?
    Too many choices = confusion in the client market. Please remember the cooler, socket, etc., and upgradeability.
    2. Why did Intel reduce to dual channel again?
  • René André Poeltl - Thursday, December 30, 2010 - link

    1. They did care about that - just not in the way you might expect ;-)
    2. They didn't reduce again.
  • DanNeely - Sunday, January 2, 2011 - link

    Because 95% of CPUs sold to mass-market/business customers go into computers that need neither the high-end features nor the 'mine's bigger than yours' collection of ports that will never be used; those are only needed to compete in the enthusiast market. Saving a dollar or two per system adds up over the hundreds of millions that will be sold. Also, Intel (and AMD) can charge higher margins on the top-end parts while subsidizing the cheaper mass-market ones.
  • mindless1 - Sunday, January 9, 2011 - link

    Certainly there are higher margins on the top end parts, but that cannot subsidize the mid to low end because that is where the bulk of sales are.
  • René André Poeltl - Monday, January 3, 2011 - link

    1. Why does innovation mean change?

    If you wanted a more conservative upgrade strategy from the company that manufactured your CPU, then AMD would have been the better choice in past years.
    The AM2, AM2+, and AM3 mobos were able to handle AMD CPUs that were sold years afterwards.
    The six-core AMD CPU (quite new) even runs on some AM2+ boards! (AFAIK those were sold in 2007.)
  • geofelt - Saturday, January 1, 2011 - link

    For the gamer type of user, I have a couple of questions.
    1) The reason for getting a "K" model is to be able to overclock. Are the 2600K units binned better, usually permitting higher overclocks than the 2500K units?
    2) Is there any reason to think that a 2500 CPU can clock to the same limits as a 2600K CPU?
    3) Are there any estimates of the performance value of the extra 2MB of L3 cache in the 2600K?
    4) For the gamer who only needs 4 cores (and usually fewer), will there be a performance benefit to deactivating hyperthreading?
    5) Is the integrated graphics capability of any use to the gamer with a discrete graphics card? Is there any value in deactivating it?
    6) Are the required RAM specs the same as for LGA1156?
    7) It appears that the P67-based motherboards will support the K CPUs and their unlocked-multiplier overclocking.
    Will the H67 chipset also allow multiplier-based overclocking of the K CPUs?
  • bckai2003 - Monday, January 3, 2011 - link

    I would also like to know the answers to these questions. From the benchmarks, hyperthreading on the i7-2600K appears to be a detriment in a portion of the games tested. Of course, I would like to get the most performance for my dollar, but I'm also looking to be as future-proof as possible, at the very least in terms of gaming.

    Ultimately, is the $100 premium for the i7 over the i5 worth it?
  • smilingcrow - Wednesday, January 5, 2011 - link

    Please ask the question below as it seems as if this is not possible which seems absurd.

    Will the H67 chipset also allow multiplier based overclocking of the K cpu's?
  • mlavacot - Wednesday, January 19, 2011 - link

    The P67 is required to allow multiplier based processor overclocking on the K SKUs. The H67 is required to do Graphics based overclocking on the K SKUs. It might sound crazy, but there is a reason. We are trying to address two different use models with one processor; processor overclocking for the hard core gamers (using discrete graphics) and graphics overclocking for the All-in-one or smaller form factors (that use Integrated graphics).
  • Teh_tourist - Monday, January 3, 2011 - link

    Can you please run some tests on SLI/Crossfire performance? I'm worried about purchasing two GTX580's if I'm not going to get a good performance increase due to the x8 by x8 PCI-E lanes in SLI.
  • mad_hatter - Tuesday, January 4, 2011 - link

    Will there be any i7-2xxx Xeon equivalents in the near future to replace the current socket 1156 Xeons (which are due for a refresh)?

    While not strictly CPU-related: do you know when Intel will release the 3rd-gen SSDs?
  • Piyono - Wednesday, January 5, 2011 - link

    DAWbench (dawbench.com) is a benchmark suite for testing the performance of Digital Audio Workstations (DAWs).

    Scott Chichelli of ADK Pro Audio is a DAW builder of high repute and his test results indicate that the 2300 and 2600 have trouble keeping up with the previous generation of CPUs.

    Can you offer any insight on the matter?

    Thanks,

    Piyono
  • Piyono - Wednesday, January 5, 2011 - link

    You can find Scott's (AKA jcschild) test results on the Cakewalk forums in the Computers subforum.

    This bbs system won't allow me to post a direct link, but it's easy enough to find.
  • ATOmega - Wednesday, January 5, 2011 - link

    It didn't seem to be quite clearly answered....

    So if I plug in a video card, and I use it, I can't do any OpenCL on the video portion of the CPU??

    Seems a waste. It would be nice if with OpenCL, it could use all OpenCL capable resources on a system.
  • semo - Wednesday, January 5, 2011 - link

    This was the same thing I was going to ask.

    Also, why so few PCIe lanes?
  • DanNeely - Wednesday, January 5, 2011 - link

    The same reason LGA 1156 only had 16 on the CPU die. To keep costs down for the 99% of systems that are sold which don't need more.
  • semo - Wednesday, January 5, 2011 - link

    Oh please. Intel can be considered a premium brand. Also look at their bewildering array of products... with all that supposed choice, there is no option for more PCIe lanes (x58 is a dodo so don't even mention it)
  • mlavacot - Tuesday, January 18, 2011 - link

    yep
  • mlavacot - Tuesday, January 18, 2011 - link

    More lanes means bigger package, higher cost, and higher power. We have more lanes in the X58 class platform. You can also expect us to refresh that class of product.
  • mlavacot - Tuesday, January 18, 2011 - link

    Sorry about not having the answer on the webcast. Next time I will make sure to get a copy of the questions before the webcast. No OpenCL offload to the graphics section with Sandy Bridge. You can bet we are working on it though.
  • piesquared - Wednesday, January 5, 2011 - link

    Who gives a shit about sandy bridge? DRM infested junk, and anand is shamelessly showing his paycheck from intel. where's the outcry about DRM like years past? intel tasting pretty good there anand? what a fraud lol
  • DigitalFreak - Wednesday, January 5, 2011 - link

    STFU troll
  • landerf - Wednesday, January 5, 2011 - link

    What? The streaming thing? Who does that even affect? Seriously, who would rent an HD stream just to rip it? Do what every other pirate does: download a torrent.
  • ricin - Wednesday, January 5, 2011 - link

    Trololol.
  • metafor - Wednesday, January 5, 2011 - link

    I'm curious about what the limitation is now with using both GPUs simultaneously for different data-processing sets. Before, since the GPU resided on the motherboard and connected directly to the output driver, this was complicated to achieve.

    But the GPU is part of the SB ring bus now. Does SB drive the frame outputs on its own set of pins, or does it send them over the bus to the north bridge? If the latter, why could it not send them to the discrete GPU's framebuffer?
  • mlavacot - Tuesday, January 18, 2011 - link

    Sounds like you should be in the graphics architecture business… There are Hybrid graphics solutions for Laptops in the market today. There are various implementations being done and some are really clever. For desktop, it is a bit trickier. The standard Desktop model is adding a discrete graphics card that has its own display connector. So now you have two graphics connectors on the back of your desktop. One solution might be to use clone mode for both outputs and plug them into the same monitor using two separate cables. You would then have to change the monitor input (usually just a button on the monitor) to switch between them. Outside of that, some tricks need to be done to pass graphics information on the PCIe bus between the add in card and the Processor graphics. It can be done as we have seen in some Laptops, it just does not exist yet in Desktops. I think you will see some progress in the near future.
  • Hulk - Wednesday, January 5, 2011 - link

    The Intel guest didn't seem to know much about SB. I mean, he seems like a very nice guy and quite intelligent, but Anand seemed to know more about SB than he did. Most questions were answered with "maybe" or "I'm not sure."

    It would have been great for Anand to speak with an actual SB engineer who knows the design details inside and out.
  • mlavacot - Tuesday, January 18, 2011 - link

    Yeah, if I had known the types of questions that were coming, I could have prepared a bit better or brought in an architecture guy. Have you ever had an exam and studied the wrong material? Next time…
  • MeanBruce - Wednesday, January 5, 2011 - link

    Any new details on Sandy Bridge Enthusiast LGA-2011 platform?
  • mlavacot - Tuesday, January 18, 2011 - link

    Just not quite ready to comment.
  • Shadowmaster625 - Wednesday, January 5, 2011 - link

    Will it be possible to use all 4 cores (8 threads) to software-transcode video while simultaneously using Quick Sync? Are any developers working on maximizing output from both parts of the CPU?

    I don't know about anyone else, but I'd feel pretty silly if I spent $300 on a CPU just to have it sit mostly idle while it plugs away at my video. Even if a tiny little portion of the chip is built for transcoding, I'd still want to push my CPU cores to the maximum possible limit. Also, what about using the EUs to help? If you could encode a clip using x86 in 3 minutes, and also encode that same clip using Quick Sync in 2 minutes, then at least in theory you could use both together to encode the clip in about 72 seconds. Using the EUs as well would theoretically push that time down even further. So now we're talking about roughly doubling the output of Quick Sync by fully utilizing the entire CPU.
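    That 72-second figure is just the idealized rate sum, assuming the clip splits cleanly between the two encoders and there's no contention for memory bandwidth:

        combined rate = 1/180 + 1/120 = 5/360 of the clip per second
        combined time = 360/5         = 72 seconds

    Real-world splitting overhead would land somewhat above that floor.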
  • mlavacot - Tuesday, January 18, 2011 - link

    Sounds logical to me, but I am not aware of that happening today. I'll see if I can find anything to share.
  • ricin - Wednesday, January 5, 2011 - link

    I'm not upgrading my Q6700 until I can get one. BTW, Avinash says, "Hi!"
  • Postoasted - Thursday, January 6, 2011 - link

    Totally agree. Still using a C2D 6300 on a 965P motherboard. We need AMD to put some serious pressure on Intel before we get what we really want. As it is now, Intel is just drip-feeding us with these slow incremental die shrinks and goofy on-die GPUs, which only make the NB makers happy. Can't blame them, though; it's always been about the dosh.
  • mlavacot - Tuesday, January 18, 2011 - link

    Hi Avinash - No specific comments yet for unannounced products but we do have plans for the X58 replacement platform space.
  • mlavacot - Tuesday, January 18, 2011 - link

    And BTW, even last year's Core i3 pretty much beats the Q6700. Get on the bus!
  • boe - Wednesday, January 5, 2011 - link

    I typically download TV shows, which are often MKV files, and convert them for DVD viewing when time permits.

    I'd love to see some benchmarks closer to my particular conversion needs.

    Thanks
  • mlavacot - Tuesday, January 18, 2011 - link

    We have a list of software vendors with applications compatible with Quick Sync on our website. Just search "Quick Sync" on intel.com to find them. You can then check out their products to see if they do what you need. Remember, Quick Sync is used when you convert to H.264 formats; otherwise the processor will do the conversion.
  • blit - Wednesday, January 5, 2011 - link

    Hi
    Love AnandTech.

    Please ask your videographer to place the left-channel and right-channel audio on center stage.
    In video/film, all voice audio is center stage. You currently have one mic on the left channel and one mic on the right channel, with lots of sound bleeding across. This is very distracting to listen to.

    So please either center both channels if you want to retain stereo, or convert it to mono.

    I mention this so I, and I am sure many others, can enjoy your truly wonderful work even more.

    Best Regards
  • triclops41 - Wednesday, January 5, 2011 - link

    I don't need a discrete GPU, just something to play StarCraft 2 at my brother's house and do work without killing the battery in 4 hours.

    When will we see a 13.3" notebook with an i5-2520M?
  • Xtasy26 - Wednesday, January 5, 2011 - link

    Why does Sandy Bridge have such weak GPU performance? It can't even run Call of Duty: Black Ops with everything maxed out at 720p, and it can't beat AMD's $50 HD 5570. Even AMD's Fusion has a more powerful GPU than Sandy Bridge.
  • mlavacot - Tuesday, January 18, 2011 - link

    “It can't even run Call of Duty: Black Ops”??? Black Ops is extremely hard to play when you talk about integrated or Processor graphics. For that matter, I still have problems with Black Ops on my dual GPU ATI card with an i7-980 processor. Activision has had some updates but it is still really tough to play. Our gaming target was more mainstream gaming and not FPS games. It is a matter of economics. Adding more silicon costs more money. If most of the users don’t use it, they don’t want to pay for it. If you are looking for a performance bar, I can play COD MW2 competitively on the new mobile i7’s and the i7-2600K SKUs for Desktop. That said, we spent a lot of time focusing on Media performance and playback quality. I think it is better than anything available (integrated or discrete).
  • fatbaldandhappy - Wednesday, January 5, 2011 - link

    I love how you ask them to clarify their titles. "Technical Marketing Manager" - blah, blah, blah. "So what does that mean? What do you actually do?" I work in this corporate world where people often speak full paragraphs without saying anything anyone else understands. I thought it was great how you broke that down.
  • RagingDragon - Thursday, January 6, 2011 - link

    Why no VT-d (IO virtualization) on the unlocked K chips? For example the i7-2600 supports VT-d but the i7-2600K doesn't.
  • mlavacot - Tuesday, January 18, 2011 - link

    We are only targeting VT-d support on our vPro qualified processors since it mostly targets advanced business usages.
  • marraco - Thursday, January 6, 2011 - link

    Please, can the videos be subtitled? (Or a transcript added to the article?)

    I understand written English, but have great difficulty understanding spoken English.
  • marraco - Thursday, January 6, 2011 - link

    An important question:

    x86 programming is well known by programmers, because the information is public and anybody (suitably prepared) can write a compiler.

    But most info on GPUs has been kept secret. That's why the open source drivers for nVidia and ATI perform much worse than the proprietary, closed ones.

    So, since the Sandy Bridge GPU is part of the same chip, the million dollar question is:

    Will the information be widely available? Will it be as open as it is on the CPU side?

    I think it makes no sense to integrate the GPU onto the processor and keep the old GPU practices.

    No portion of the chip should be kept hidden from the programmer if Intel wants it to be successful.

    It makes no sense to be able to freely access some portions of the chip while being forced to access the rest only through an API.

    On the other hand, an API may enhance portability to other architectures, and maybe Intel doesn't want to be constrained to keep compatibility with its present architecture, the way it is constrained to maintain obsolete x86 instructions to preserve compatibility.

    Also, opening the GPU architecture would enable nVidia and AMD to take advantage of it to enhance discrete card performance, by allowing their drivers to run instructions on the GPU's processors. But Intel may prefer to hinder performance for competitors' software.

    But if Intel keeps the information away from programmers, then it makes no sense to integrate the GPU. It's just a non-upgradeable GPU wasting thermal and power resources on the processor - an inferior solution compared to a discrete GPU and a pure CPU.
  • mlavacot - Tuesday, January 18, 2011 - link

    Not as much as you want, at least not yet. You can assume we will start with API calls via DirectX, OpenGL, and OpenCL and then move to lower levels from there. Interesting thought on an open GPU architecture, but that is where the graphics chip vendors have their secret sauce. I think it will have to be abstracted to an API level for some time to get commonality.
  • marraco - Thursday, January 6, 2011 - link

    A GPU driver is a software layer made by the manufacturer, but for software to rely on it, it needs to preserve compatibility between generations.

    CPUs generally don't depend on drivers (to some extent), because CPU programming is handled by the operating system maker.

    Is there any plan to create a standard common to Intel and AMD? GPGPU programming should be done by OS writers, as is done for the CPU. Otherwise Intel will find itself burdened with maintaining increasingly complex OS tasks. Or are the plans to limit GPU programming and management to the DirectX/OpenCL APIs?
  • CreativeStandard - Thursday, January 6, 2011 - link

    When are you going to have time to update the GPU Bench with Intel's new built-in GPU? It may serve as a good baseline for the cards.
  • will2 - Thursday, January 6, 2011 - link

    I did ask in your Q&A on SB last week why there was a dearth of info on specs, benchmarks, and release dates for the 25W and 17W TDP versions - it wasn't answered, and today I still see no answers on any of those things. Can you now supply whatever info you have on the specs, benchmarks, and availability dates? The i7-2629M (25W) and i5-2537M (17W) are of particular interest, but any info on the notebook LV & ULV parts is welcome. Even something as basic as the supported DDR3 speed - 1333 or 1600 - should be well known by now.

    The only other thing I noticed since my post last week is a reference in 'theinquirer' or 'theregister' & elsewhere to 'Sandy Bridge sucks in Hollywood DRM' - but few details. I am interested to know the scope of the DRM built into the hardware and how pervasive its effect is on the usability of a PC built with SB.
  • mlavacot - Tuesday, January 18, 2011 - link

    We do not comment on products that have not been released, so I recommend you go to intel.com for answers that are available. There is a lot if you dig around. Sorry.
  • inaphasia - Friday, January 7, 2011 - link

    You should have grilled him on the USB 3.0 question. And when did Intel wave their hand saying "you don't NEED USB 3.0 just yet", making a lot of people here actually believe it? I missed that part.

    Don't get me wrong, I don't need the speed either... But the power would be nice.
  • mlavacot - Tuesday, January 18, 2011 - link

    We would have loved to have USB 3.0 integrated into this processor, we just could not get it done in time. We will get there, just not yet. BTW, OEMs can integrate a USB 3.0 controller on their boards as a separate feature, it just costs a few bucks more and takes up some extra board space.
  • CSMR - Friday, January 7, 2011 - link

    The major problem for me with Intel HD graphics is driver quality. It's abysmal, to the extent that half the things I try that are supposed to work don't.

    You mentioned MPC-HC does not work. For Clarkdale/Arrandale this was a problem with Intel not supporting DXVA using standard methods. I presume this is still a problem on Intel's side. You say "It's an issue with MPC-HC and not properly detecting SNB": could you elaborate on this? For Clarkdale/Arrandale the acceleration mechanism wasn't even documented, and it required an Intel engineer working with MPC-HC in his spare time (without success).

    Could you test compatibility with Photoshop CS5 too? Clarkdale/Arrandale support is described as "basic" by the application.

    Also, color management does not work on Clarkdale/Arrandale; the LUT is perpetually reset.

    Also, FastPictureViewer does not work on Clarkdale/Arrandale, despite working correctly with the reference DX implementation. There is corruption when rendering images.

    Are they going to make any progress with Sandy Bridge? Unfortunately it's hard to tell, as reviewers are obsessed exclusively with gaming performance.

    Is it possible to ask them about this?
  • ProDigit - Friday, January 7, 2011 - link

    I hope Intel sees this comment:
    it would have been nice to allow both graphics outputs to work at the same time,
    e.g.:
    on the integrated graphics I could have windows open - messenger, Excel, the internet - on a primary monitor,
    while on the discrete card I could play a game on the secondary monitor;

    or,
    on the discrete card I could play a game, while on the integrated I could have a dashboard for the game open, like an inventory tab, hull dashboard, or backpack in first-person shooters;

    or,
    I had hoped it was at least possible to use dual monitors for office work.

    It's a pity that one of the graphics adapters gets disabled, because the user doesn't always want to switch between them.
    It'd be nice to have a driver update that would allow dual-monitor setups like this with a discrete graphics card.
  • mindless1 - Sunday, January 9, 2011 - link

    What gaming video card would you use that doesn't already support dual monitors (some even more)? I see no point in these three scenarios for the integrated video to be operational. If, on the other hand, you wanted to use 4 or more monitors simultaneously...
  • mlavacot - Tuesday, January 18, 2011 - link

    Okay, I have some good news and some bad news. We actually do support the use of both Processor graphics and discrete graphics at the same time with the new Sandy Bridge processors. This is with respect to having two displays working at the same time. Also, in this configuration, you can use the Quick Sync feature that allows you to transcode video REALLY fast. The bad news is if you only connect one monitor to the discrete and nothing to the Processor graphics, you cannot use the Quick Sync feature. We are working on a solution to offer this capability as well.
