101 Comments

  • tviceman - Wednesday, October 21, 2015 - link

    It blows my mind that they didn't go with 2gb vram, and it's even more of a travesty they didn't go with a downclocked GM107 GPU, a la GTX 950m. It shouldn't have been too hard since it'd be the only component requiring active cooling.

    A great concept, a shame that it wasn't fleshed out better. Perhaps when Pascal hits, Microsoft can offer new keyboards with an upgraded GPU that works with the existing Surface Book. Upgrading a notebook/tablet would be a pretty sweet way to ride out the product cycle for both consumers and Microsoft.
  • tipoo - Wednesday, October 21, 2015 - link

    They're probably already at their thermal limits, and GDDR5 is a power hog. Unfortunate, but this sort of GPU power in an ultrabook is still impressive.
  • ImSpartacus - Wednesday, October 21, 2015 - link

    Sounds like a job for hbm... :3
  • DanNeely - Wednesday, October 21, 2015 - link

    Maybe, but the 850M was a full 40W GM107 GDDR5 part. The difference in power brick sizes (36 vs 65W) suggests a 30W TDP for the GPU, but as heavily cut down as this die is, I wonder if it's actually a bit less than that.
  • ImSpartacus - Wednesday, October 21, 2015 - link

    You're talking about the part in this year's model?

    Since it was branded as a custom part, I doubt it is a partially disabled die harvest. It looks like a GM108 with a "custom" GDDR5 setup, as proposed in the article.

    If I'm interpreting everything correctly, a cut-down GM107 wouldn't need any "custom" work. So there would be no need to call it a custom part.
  • Visual - Thursday, October 22, 2015 - link

    I wouldn't call it impressive. I'd call it a useless waste of space and power.
    An Iris chip (especially GT4e) can likely beat that performance, and save total power while doing so.
  • vithrell - Thursday, October 22, 2015 - link

    Yes, but you can't detach an iGPU from the CPU, and GT4e CPUs won't fit into the tablet's form factor and TDP envelope.
  • Alexvrb - Thursday, October 22, 2015 - link

    Yeah you're right, a too-high-TDP-to-work-in-the-tablet-section CPU *would* be faster! Why didn't THEY think of that!? Why not just take a higher TDP chip with GT4E and cut it in half and stick half of it in the base?? BRILLIANT! Visual you should make your own Surface Book killer.
  • Visual - Friday, October 23, 2015 - link

    Your sarcasm doesn't help your case. There are GT3e 15W U-series parts. MS uses them in the SP4.
  • Alexvrb - Friday, October 23, 2015 - link

    Make up your mind, Visual. GT3e or GT4e? First you question why they didn't use GT4e. Now you're clucking about GT3e. Here, let me break it down for you:

    The dGPU they chose is faster than GT3e, yet it can be located in a separate compartment - hence installed in the keyboard section (splitting the thermal load up nicely). GT4e might be about as fast, but it didn't fit in their thermal envelope.

    But you'll no doubt still consider yourself superior to all their dumb dumb engineers. I await the Visual Pro Book with bated breath.
  • TEAMSWITCHER - Friday, October 23, 2015 - link

    Why didn't they make a 15" Surface Pro with a quad-core i7 CPU with Iris Pro graphics? For me, the dGPU in the Surface Book is too meager for the extra money. The rest of the Surface Book specs are too much like the ultra-portable Surface Pro 4. The Surface Book in its current form is Microsoft's "Why the hell did they even bother making this?" device.
  • Alexvrb - Friday, October 23, 2015 - link

    15" is a different class of device. Obviously a larger device can be equipped with more horsepower; there are plenty of larger devices on the market that prove that, and they often come with more-powerful dGPUs. This 13.5" device competes with other 13.x-14" Ultrabooks and slim convertibles, which is a more lucrative segment right now. 15" has too much competition and lower margins.

    Anyway just because this isn't what you're looking for doesn't mean other people aren't looking for a device of this size. Do you often look at compact sedans and go "Why the hell did they even bother making this? I'm in the market for a large car and thus all vehicles produced should be full size sedans." :-P
  • Landiepete - Monday, October 26, 2015 - link

    Because then you would be complaining about the weight of the tablet. Or something else.
  • trane - Wednesday, October 21, 2015 - link

    Actually, 1GB is a pretty good match for a 384CC part. It's never going to play core games well anyway; the GPU will always bottleneck before the memory buffer. So most people are going to use this for Photoshop, CAD, e-sports games etc., which use even less VRAM, and 1GB GDDR5 is a better option than 2GB DDR3 any day for performance. Let's face it, 2GB GDDR5 isn't worth the battery life impact, which could be an hour or two.

    Can't wait to see an HBM GPU inside the next one though! I wonder if that was the intention, but with the delays to FinFET it never materialised in time.
  • ddriver - Wednesday, October 21, 2015 - link

    1GB is simply too low for this device's native resolution.
  • dragonsqrrl - Wednesday, October 21, 2015 - link

    I don't think anyone is expecting to run games at native resolution. And even if you did, would the memory capacity really be the bottleneck?
  • ddriver - Wednesday, October 21, 2015 - link

    For games you can run at a lower resolution, but for professional graphics applications this is not an option.
  • neonspark - Wednesday, October 21, 2015 - link

    I'm not sure what you mean exactly. You can run integrated graphics and drive dual 4K displays at 60fps as the windows UI doesn't use any gpu use. Photoshop doesn't need more VRAM to "work" and it will not be faster because of it; as pointed out, the GPU computations ARE NOT done on your resolution but on the raw data. That could mean a very large RAW file, a video file, etc. Typically these apps' UIs don't do anything fancy. Toolboxes, bitmaps, vectors do not stress the GPU in any way. Even with 3D CAD software, your VRAM isn't going to be the limit because of the resolution, in particular because you typically render in a preview window at lower settings than your final output, as that is likely going to be a job run on a server farm. So I'm not sure why you imagine resolution = VRAM for anything other than gaming.

    Skylake integrated graphics can drive the display panel plus two external 4K screens at 60Hz, and it doesn't even rely on GDDR5 to do it, let alone have 1GB of it.

    I get why you may "imagine" that 2GB would yield a benefit, but that is to a gamer, clearly not this product's target. For the professional, the biggest issue will be execution unit count, far before VRAM and resolution ever will be.
  • ddriver - Wednesday, October 21, 2015 - link

    You sound like you've never used professional software. All of it is now GPU accelerated, and the larger your viewport the more video memory you need. And on a high resolution display the viewport will be quite large, and it is not like you will not go fullscreen, considering it is a mere 13.5 inches.

    I am willing to bet the single gigabyte of vram will prove to be a bottleneck in such scenarios. I would not be surprised if the integrated GPU with 2 gigs of memory ends up being better in some cases, despite being overall slower and ram being slower than vram.
  • neonspark - Wednesday, October 21, 2015 - link

    I have used plenty of professional software and I'm a software engineer too. The viewport isn't going to be an issue here. For example, I process 36MP RAW images. The viewport will scale them because no LCD could fit them. When I run filters and processing using the GPU, in Adobe CC and LR, the amount of VRAM isn't limiting here. I have run the same jobs on machines with integrated graphics at 4K resolution, both using just a mediocre Intel HD, as well as an older GTX card with 2GB and a newer GTX card with 1GB. The performance isn't tied to my resolution. I can change it to 4K, 2K, or 1080p. This is because Adobe's efficient viewports operate on the raw data, not the UI, and when they operate on the UI, such as in LR CC, what gets pinged first is the CPU, followed by the GPU. Remember that the datasets for frame buffers in games are distinct from the datasets in scientific and graphics computation. Thus the most important aspect of the GPU for non-gamers is the computational speed, not the VRAM figure, and last, the display resolution.

    The other part that you fail to understand is that VRAM is only needed for very fast memory access. The software can fall back on system RAM for everything else, and just leave the VRAM for the core algorithms that need it. The viewport rendering will be a factor in, say, Premiere CS, IF you want to do 4K editing at full res. But then again the screen isn't large enough for this. There is no question that the real-time preview of the Premiere viewport will increase the load, but your claim that VRAM determines the performance of this window is ludicrous. The Mercury engine will first tap out your GPU's execution units, and 1GB is plenty for the frame buffer. Everything I've ever done has indicated this and you can try it for yourself. If you want to speed up, get more execution units before you get VRAM. Don't waste your money unless you're a gamer running huge frame buffers like say 16X AA, which does require the data to be in VRAM so as to reduce the latency so that the framerate is sustained.
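    (For what it's worth, a rough sketch of that framebuffer math - a back-of-the-envelope estimate under my own assumptions of 4 bytes per pixel and a same-size depth buffer, not figures published for this device:)

```python
# Back-of-the-envelope framebuffer sizes at the Surface Book's native
# 3000x2000 resolution. Assumptions: 4 bytes per pixel and a depth buffer
# the same size as a color buffer (mine, not from Adobe or Microsoft).
width, height, bytes_per_pixel = 3000, 2000, 4

buffer_mib = width * height * bytes_per_pixel / 2**20

print(f"one color buffer:        {buffer_mib:.1f} MiB")       # ~22.9 MiB
print(f"double-buffered + depth: {3 * buffer_mib:.1f} MiB")   # ~68.7 MiB
print(f"share of 1GB of VRAM:    {3 * buffer_mib / 1024:.1%}")  # ~6.7%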
  • Morawka - Wednesday, October 21, 2015 - link

    looks like ddriver started something he couldn't finish.. well played
  • ddriver - Thursday, October 22, 2015 - link

    That's for sure, you cannot argue with a fence post ;)

    "Don't waste your money unless you're a gamer running huge frame buffers like say 16X AA"

    I guess nobody told GPU vendors that VRAM only makes sense for games; the fools put twice as much memory on those professional graphics cards.
  • LukaP - Thursday, October 22, 2015 - link

    Simply because, while it may not provide as much benefit as increasing the computational power of the chip, it allows for more data to be in the location with the smallest latency, meaning smoother operation (much less waiting for data to reload after you switch context, etc.).
  • Spunjji - Thursday, October 22, 2015 - link

    Those professional GPUs also have vastly more execution resources... neonspark's point still stands. The VRAM is mostly valuable in GPU computation scenarios.

    It would be "nice" to have 2GB of VRAM on this, but it is also clearly not the best decision for its intended use.

    Bonus point: there is no such thing as an integrated GPU with 2GB of RAM. They use shared system memory. You're going to get way, way better performance out of 1GB of dedicated GDDR5 than you are out of whatever memory bandwidth you can scrape off the floor of a ~1600MHz dual-channel DDR3 interface after the CPU is done with it.
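    (A rough sketch of that bandwidth math - the GDDR5 transfer rate below is my assumption, since the actual memory clock of this part wasn't published:)

```python
# Peak theoretical bandwidth comparison. The DDR3-1600 dual-channel figure
# is the standard spec; the GDDR5 transfer rate is an assumption.
def bandwidth_gbs(bus_width_bits: int, transfer_mts: int) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) x (transfer rate)."""
    return bus_width_bits / 8 * transfer_mts / 1000

ddr3 = bandwidth_gbs(2 * 64, 1600)   # dual-channel DDR3-1600, shared with CPU
gddr5 = bandwidth_gbs(64, 5000)      # 64-bit GDDR5 at an assumed 5 GT/s

print(f"DDR3-1600 dual channel (shared):   {ddr3:.1f} GB/s")   # 25.6
print(f"GDDR5 64-bit @ 5 GT/s (dedicated): {gddr5:.1f} GB/s")  # 40.0
```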
  • neonspark - Wednesday, October 21, 2015 - link

    Looking at Adobe's site, Photoshop, Premiere, After Effects and LR recommend 1GB of VRAM. So for the record: if you live and breathe in the professional world of Adobe, you pass. And I never thought I'd see this from a 13 inch device. I'm not sure why there is so much contempt at the fact it meets Adobe's recommendations. That is no small feat, considering the MacBook Pro 13 inch does not.
  • bluevaping - Thursday, October 22, 2015 - link

    Chipset Model: Intel Iris Graphics 6100
    Type: GPU
    Bus: Built-In
    VRAM (Dynamic, Max): 1536 MB
    Vendor: Intel (0x8086)

    Info from a MacBook Pro 13" 2015 (16GB RAM in this version). I think 8GB will give you 1GB of VRAM. So that statement is incorrect.
  • Gigaplex - Wednesday, October 21, 2015 - link

    "as the windows UI doesn't use any gpu use"

    Surely, as a self-proclaimed software engineer, you'd know that the Windows compositor, and other desktop APIs such as DirectWrite, do in fact use the GPU.
  • LukaP - Thursday, October 22, 2015 - link

    Didn't say it uses none at all. Of course it uses the GPU. The goddamn calculator uses the GPU (assuming it's like other Windows apps written in C#), but so little of it that even a low-end Sandy Bridge could drive it (with a sufficient rasterizer to drive all the pixels, of course).
  • ImSpartacus - Wednesday, October 21, 2015 - link

    I know if I'm gaming on this device, I'm running at the quartered resolution (<1080p) to beef up performance.

    It might be more suitable in that situation.
  • MattL - Wednesday, October 21, 2015 - link

    Native resolution is 3000 x 2000... that's nearly 3 times the pixel count of 1080p and about 1.6 times the pixel count of 1440p. 1440p for serious desktop gaming requires at least decent desktop-level video hardware to game well. It's completely unrealistic to expect a 13" ultrabook to game well at 3 times the pixels of 1080p.
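    (The arithmetic, spelled out:)

```python
# Pixel-count ratios for the Surface Book's 3000x2000 panel vs common
# gaming resolutions.
resolutions = {
    "3000x2000 (native)": 3000 * 2000,
    "2560x1440 (1440p)": 2560 * 1440,
    "1920x1080 (1080p)": 1920 * 1080,
}

native = resolutions["3000x2000 (native)"]
for name, pixels in resolutions.items():
    # prints 1.00x, 1.63x, and 2.89x respectively
    print(f"{name}: {pixels / 1e6:.2f} MP, native is {native / pixels:.2f}x")
```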
  • Morawka - Wednesday, October 21, 2015 - link

    You're crazy, iPad Airs have had this resolution for years with 1GB of RAM.
  • DanNeely - Wednesday, October 21, 2015 - link

    NVidia's last mainstreamish (read: listed on their website/copied to Wikipedia) 1GB mobile GPU was the 810M, a Fermi-based part with only 48 cores; although it's possible they've sold other OEMs custom models with only 1GB. With their entire low-end lineup at 2GB, that does somewhat suggest 1GB of RAM isn't enough for these chips; OTOH lots of low-end GPUs end up with stupidly high amounts of VRAM to inflate spec sheets for the ignorant. Has anyone done testing to see how much RAM GM107/108 parts actually use when running games, to see if this is going to be bottlenecking it?
  • neonspark - Wednesday, October 21, 2015 - link

    I think the issue here is that people are looking at one dimension (GB) without looking at the others (power and thermals). If 2GB is the norm, where can I find competitors with this feature spec at 3.5lbs and 9-12hr battery life?

    Clearly if you ignore mobility, weight and thermals, you can start to make all sorts of bold assertions about what it "should" have. But where is this phantom market? Because if we are to be fair, there should be plenty of examples of devices just as light and long-lasting boasting 2GB of GDDR5. Otherwise, all we're doing is nonsense: just because every PS4 ships with 8GB of GDDR5 doesn't mean anything less is sub-standard if it means getting something else in return.

    So this device must be put into context: 3.5lbs, 12hr battery life, will not singe your lap, 13 inches. Now please list below its class competitors. I say this because I looked, and the closest thing was a MBP 15 inch, but that was a beastly 4.5lb laptop, and the similar 13 inch MBP has zero, that is right, zero GDDR5, let alone a dedicated GPU.
  • inighthawki - Wednesday, October 21, 2015 - link

    Microsoft should've put in a Titan Z with the full 12GB of GDDR5 memory. Anything less is absurd at this price range. </s>
  • DanNeely - Wednesday, October 21, 2015 - link

    Exactly. Drop the dGPU, drop the ability to use it as a tablet, and you've more or less got the XPS13. Before the Surface Book was announced that was going to be my next laptop. Now I'm not sure: for the HiDPI screen, i5, 8GB, 128GB configuration the XPS13 is $1300, while the SB with the dGPU is $1700. For the roughly 2 weeks/year when I don't have access to my desktop, that'd give me roughly double the FPS. Which is a nice boost, but still a lot to pay for something I'd only use infrequently, especially since I'd still be limited to indie games or AAA titles at relatively low quality levels. Especially since I've always managed to find something indie that will run OK on a crappy pre-Baytrail Atom, so I know I don't *need* the better GPU.
  • vithrell - Thursday, October 22, 2015 - link

    There is the Lenovo Flex 14 2, and a few more 2-in-1s with a 360-degree hinge and a discrete GPU. The only thing you can do with the Surface Book that you couldn't otherwise is detach the keyboard and leave with an ultra-thin-and-light Surface Pro-like tablet.
  • dragonsqrrl - Wednesday, October 21, 2015 - link

    The answer is TDP; they're probably greatly restricted by the thermal capacity of the form factor. The 940M has a ~25W TDP, which is already quite a bit higher than the CPU options. The 950M has a ~50W TDP. It just doesn't seem like a practical option given the size of the chassis.

    As for the limited VRAM, while it would've been nice on paper to include 2GB of GDDR5, I think it's also important to keep in mind the relative performance of this GPU. GM108 only has 384 CUDA cores and a 64-bit interface. I'm really not sure a 1GB framebuffer will be its biggest performance bottleneck. I'm just glad they decided to use GDDR5; I'm pretty sure the 940M normally uses DDR3.
  • ImSpartacus - Wednesday, October 21, 2015 - link

    I think the use of GDDR5 was the primary reason for the "custom" nature of the GPU, as proposed in the article.
  • ImSpartacus - Wednesday, October 21, 2015 - link

    I also lust for that configuration, but I can sorta understand their restrictions with thermals and battery life.

    Given the choice of GPU, going with 1GB of VRAM isn't that bad.

    I hope that we'll see something much better in the next iteration or two. Remember, the Surface Pro didn't get really cool until the third iteration. The Surface Book might meet a similar fate.
  • DanNeely - Wednesday, October 21, 2015 - link

    GPU-Z data suggests that it's a heavily cut down GM107. The Surface dGPU has 32 TMUs and 16 ROPs, which is possible from a GM107 (which has 40 and 16 at full spec); the GM108 only has 24 and 8.

    The fact that it's been cut down almost everywhere makes me wonder if MS is getting a model designed to use up dies that failed binning due to something being non-functional. NVidia hasn't sold any mainstream parts with a fused-off GM107 since the GTX 745/750. There apparently have been a few other cut-down GM107 laptop GPUs sold to a few OEMs under 940/945M branding; but they're all un-parts as far as NVidia's main page is concerned.

    http://hexus.net/tech/news/laptop/87353-microsoft-...
  • neonspark - Wednesday, October 21, 2015 - link

    TBH, I couldn't care less where they source any of it. Let's see how it performs against its peers in the class, namely 13 inch ultrabooks which rely on Iris or, worse, HD graphics. I don't expect this GPU to beat a gaming "laptop"; thankfully they didn't try this, as it would have ruined the device's mobility.
  • DanNeely - Wednesday, October 21, 2015 - link

    Ars Technica has tested the IGP and dGPU versions. Except for one OpenGL test that the NVidia card faceplanted in and actually lost (presumably a driver problem of some sort), it's about 2x as fast as the HD 520. Since Intel's normal GPUs always scale sub-linearly due to bandwidth limitations, that means it will probably manage to beat the fastest mobile GPU without eDRAM. Comparing to the eDRAM models should be interesting.
  • mczak - Wednesday, October 21, 2015 - link

    GPU-Z is just wrong, and it's obvious. A cut-down GM107 would still have 24 TMUs with 384 "CUDA cores" (8 TMUs per SM, so per 128 cores). Apparently GPU-Z reads 32 TMUs and 16 ROPs for all GM108 parts (looks to me like it can't read that out properly and relies on a bugged database instead).
    FWIW the reason for the 1GB is obvious: the biggest GDDR5 parts you can get nowadays (albeit the memory manufacturers are claiming 8Gbit is in production now) are 4Gbit. These are 32 bits wide, thus you need two chips for a 64-bit interface - which gives you 2x4Gbit, hence 1GB. It would be possible to use the chips in a 2x16bit mode instead, but then you'd require 4 chips (larger footprint, probably slightly more power usage, and of course higher cost).
    Can't deny though that a cut-down GM107 could have possibly been a viable option: because it has a 128-bit memory interface they could have used DDR3 for still decent enough bandwidth (and DDR3 chips are 16 bits wide, thus you'd need a full 8 chips, and with the "common garden variety" which is 4Gbit nowadays that would give a more than sufficient 4GB). Cut down to just 3 SMs (like the GM108 has), power draw likely should be comparable (in fact such cut-down GM107s are indeed sold, but only as Quadro variants).
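    (The chip math above, as a quick sketch:)

```python
# The chip-count math spelled out. Chip widths (GDDR5 = 32-bit, DDR3 =
# 16-bit) and the 4Gbit density are as stated in the comment above.
def capacity_gb(num_chips: int, density_gbit: int) -> float:
    """Total capacity in GB from N memory chips of a given density."""
    return num_chips * density_gbit / 8

# 64-bit GDDR5 bus / 32-bit chips -> 2 chips -> the Surface Book's 1GB:
print(capacity_gb(num_chips=2, density_gbit=4))   # 1.0

# Hypothetical 128-bit DDR3 bus / 16-bit chips -> 8 chips -> 4GB:
print(capacity_gb(num_chips=8, density_gbit=4))   # 4.0
```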
  • Murloc - Thursday, October 22, 2015 - link

    If MOBAs run at quite high fps, and they're playable with high settings even on the integrated GPU alone, why is more required?

    This does the job.

    More is usually better, but I'm sure that they've squeezed in what they could.
  • nikon133 - Thursday, October 22, 2015 - link

    Sometimes, I think they (in general) do these things on purpose - so they can improve the next release.

    The dock for the Surface Pro 3 has only one DisplayPort. It is multi-stream, but getting monitors with DP pass-through is not always possible (at least here in NZ)... and they are always more expensive than standard monitors.

    The dock is targeting corporate users, and two screens are quite a standard setup nowadays. We have "solved" the problem with an EVGA DisplayPort hub, but it is an extra NZ$200 (with required cables), adds clutter to the desk, complicates setup... and is one more thing that can go wrong.

    And voila! Dock 2.0, with an added 2nd DisplayPort. Tears in my eyes :)
  • Lonyo - Friday, October 23, 2015 - link

    When you consider there are already multiple options for each half, there's no reason to think they wouldn't enable that. You can effectively buy a discrete or non-discrete keyboard section, and a choice of processors in the tablet section, already. Just not on their own currently.

    Hopefully reviewers will try and buddy up and swap component elements to experiment. Plus people might choose to swap online if they decide they want the other one (e.g. someone decides they don't need a GPU while someone else wants one and they bought the "wrong" machine for them).
  • backbydemand - Monday, October 26, 2015 - link

    Hardly a travesty; they are topping every benchmark and blowing away everyone remotely close. When the competition tries to catch up, MS will release a Surface Book 2 with 2GB of VRAM. I fully expect people to then moan about not having 4GB.
  • tipoo - Wednesday, October 21, 2015 - link

    Cool, the GPU-in-the-keyboard idea seems like a great way to get much higher wattage GPUs into this form factor. I wonder if it will also mean upgradable GPUs later on.
  • Maximilian122 - Wednesday, October 21, 2015 - link

    Could you check which kind of RAM is used (LPDDR3 or DDR4)? This should be important for the iGPU's performance, too.
  • Shadowmaster625 - Wednesday, October 21, 2015 - link

    $400 markup for a $60 video card? Hahha pass the pipe man you've had too much.
  • ddriver - Wednesday, October 21, 2015 - link

    well, that's what business is all about, raising the bar - the profit margin bar, that is
  • neonspark - Wednesday, October 21, 2015 - link

    That is right. You really didn't think that Apple RAM was also 400 dollars, right? ;) You're really paying for the fact it is there on that device, not because it "costs" that much. They are running a business, not a demo.
  • Murloc - Thursday, October 22, 2015 - link

    If the keyboard was just one big battery block, the additional design work and testing to do this weird detachable dGPU thing wouldn't have been necessary.
    That costs money.

    Plus it's about what customers are willing to pay. Are there cheaper notebooks with these specs with a dGPU?
  • rxc13 - Thursday, October 22, 2015 - link

    Ahem, It is a $200 markup. You should let go of the pipe ;)
  • rdwwdr - Wednesday, October 21, 2015 - link

    Uh, isn't the release date Oct 26th, not today?
  • DanNeely - Wednesday, October 21, 2015 - link

    Review embargoes ending before the launch date are fairly common.
  • Speedfriend - Wednesday, October 21, 2015 - link

    I want one...
  • Nagorak - Wednesday, October 21, 2015 - link

    Can you guys come up with some sort of sustained GPU testing for laptops? Your current testing is very insufficient. A lot of laptops have inadequate cooling that cannot sustain GPU performance for more than a few minutes. So if you run a short test the numbers look great, but after 10 minutes performance is 20% lower.

    Your results with the MSI GE60 were woefully off the mark, claiming it didn't throttle that much even after extended use. My family has 3 of them and I can tell you the cooling in that laptop was totally insufficient. The GPU is 100% thermally limited and drops down toward 900 MHz under sustained use (base boost clock of 1080 MHz). A cursory inspection of the cooling system reveals numerous flaws, such as the fact that the intake for the fan is not even close to lined up with the vent holes on the bottom cover. The problem is so bad I was forced to cut holes in the bottom cover to increase ventilation.

    Finding this sort of thing out is what we trust review sites like you to do, before we spend $4500 on a defective design (the replacement GE62 has 2 fans compared to the GE60's one, and much more ventilation built into the bottom cover, so clearly MSI realized their mistake).

    You posted some good-looking numbers for the Surface Book, but my point is: how much stock can we actually put in them? In a thermally constrained situation, running a benchmark one time tells us very little that is useful. I urge you to seriously reconsider your testing methodology for laptops. At the very least, please consider running a thermal stress test like looping the Metro Last Light benchmark 10 times.
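    (For what it's worth, a minimal sketch of that kind of loop - the benchmark invocation is a hypothetical placeholder, and it assumes an NVIDIA GPU with nvidia-smi available on the PATH:)

```python
# A minimal sketch of the sustained-load test suggested above: loop a
# benchmark 10 times and log the GPU clock and temperature between runs
# to expose thermal throttling.
import subprocess
import time

BENCHMARK_CMD = ["benchmark.exe", "-scene", "metro_ll"]  # hypothetical

def gpu_status():
    """Read the current SM clock (MHz) and core temperature (C)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.sm,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    clock, temp = (int(v) for v in out.strip().split(", "))
    return clock, temp

for run in range(1, 11):
    subprocess.run(BENCHMARK_CMD, check=True)
    clock, temp = gpu_status()
    print(f"run {run:2d}: SM clock {clock} MHz, temp {temp} C")
    time.sleep(5)  # brief pause; clocks should still reflect the hot state
```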
  • meacupla - Wednesday, October 21, 2015 - link

    I want the dGPU model, but oh, that price...
  • abrowne1993 - Wednesday, October 21, 2015 - link

    Whelp, no mistake in going for the XPS 15, then. The novelty of the Surface Book is neat and all, but it's just not worth the price (at least for me).
  • Fiernaq - Wednesday, October 21, 2015 - link

    To me, this looks like a great 1.0 version. Sadly, it has just too many negatives for me to want it as a primary device. I'll definitely consider it as an option for some of my users but I have to be aware of the caveats.

    - no USB C
    - some reported issues with the release mechanism (Ars Technica reviewer) and no public long term testing for reliability of the release mechanism or durability of the hinge
    - no option for a larger battery instead of the dGPU
    - the keyboard layout has smushed up/down keys and uses combo home/end/page-up/page-down keys instead of providing them separately
  • neonspark - Wednesday, October 21, 2015 - link

    v1.0 is definitely a risk. I couldn't care less about USB-C; nobody uses it to any degree, and by the time they do, we'll be on the 4th iteration of this device and talking about the next USB-whatever that is "the future". The big risk is really that this may very well be the equivalent of the bulky Surface Pro 1-2 we saw 3 years ago. Meaning that if you burn 2-3K on this, the next one will make you curse.

    Still, compared to, say, a traditional 2-in-1 or even a 13 inch ultrabook, you're getting a device that will not be matched anytime soon.
  • DanNeely - Wednesday, October 21, 2015 - link

    A lot of next year's phones are rumored to be using USB-C; so it'll be somewhat commonish within the next 6 or so months. Although depending on what your upgrade cycle looks like avoiding it for another 2 or so years might be possible.

    I wouldn't expect dramatic changes over the next model or two of this (unless the hinge fails in the real world); it looks like a much more refined device. The SP1 had a very 1.0 device vibe to it.
  • DarkXale - Wednesday, October 21, 2015 - link

    Yeah, but USB A <-> USB C is a valid cable spec, so phones using it isn't an issue.
  • Topweasel - Thursday, October 22, 2015 - link

    That's why I don't understand the comments here or on the SP4 review bringing up Type-C. Type-C is an impractical connector for the PC end for a long while. Each Type-C connector they include is one less Type-A device that can be plugged in: one less USB stick, one less mouse, one less keyboard, one less ext HDD, one less NIC, one less audio break-out box, and so on.
  • SunnyNW - Friday, October 23, 2015 - link

    Exactly what I was going to post. +1
  • lilmoe - Saturday, October 24, 2015 - link

    +1
  • lilmoe - Saturday, October 24, 2015 - link

    We want USB-C on phones because Micro USB SUCKS ASS. Micro USB gets loose easily, breaks, and wears out over relatively short periods of time. The reversible, reliable nature of Type-C makes a LOT more sense on smartphones and mobile devices that otherwise use Micro USB.

    Just like others said, Type-C to USB A will be very common.
  • vithrell - Wednesday, October 21, 2015 - link

    Considering the small difference between the iGPU's and dGPU's performance, I wonder how much performance it will gain with the DX12 multi-adapter feature.
  • Gunbuster - Wednesday, October 21, 2015 - link

    I wonder if they bring over the terrible Avastar WiFi from the Surface Pros?
  • Carl Bicknell - Wednesday, October 21, 2015 - link

    Can someone please answer this: What's the difference in graphics performance between the 940M which seems to be in the Surface Book and the top Skylake Iris Pro (540) in the core i7 Surface Pro 4?
  • neonspark - Wednesday, October 21, 2015 - link

    Not to spoil it, but do you honestly think Iris can match a dedicated NVidia chip with 1GB of GDDR5? It is going to kill it. Iris is fine if you basically need to render the UI, maybe play mediocre mobile games. The GPU alone probably has a TDP higher than the entire Intel SoC. It is going to be a bloodbath.
  • smartypnt4 - Wednesday, October 21, 2015 - link

    I think it'll be closer than you realize. At lower resolutions (i.e., not bandwidth constrained), the dGPU here is only ~37% faster than the integrated GPU. As has been shown on Broadwell, the eDRAM goes a long way toward removing that bottleneck at higher resolutions. Plus, they doubled the hardware for the SP4's i7 (48 vs 24 EUs). Even in the same TDP, that's going to get you some more throughput - should be ~20-25% - so the gap narrows significantly (rough sketch below). I for one am a bit disappointed that they didn't put a 28W part in the tablet and cTDP it down, and instead went for a 15W part.

    However, I think the main point of including the dGPU was to physically separate the GPU's thermal load from the CPU's, thereby allowing a larger total TDP for the device at full tilt. If they'd put a 28W part in the tablet, they'd see it throttle constantly (which is something I'm worried about seeing in the SP4 with the i7).
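    (For reference, the gap-narrowing math from the first paragraph - the numbers are my illustrative estimates, not measurements:)

```python
# Illustrative only: how the dGPU's lead could narrow against the SP4 i7's
# 48-EU Iris 540, using the estimates above (~37% dGPU lead over the HD 520,
# an assumed ~20-25% uplift from doubling the EUs plus eDRAM).
hd520 = 1.00     # baseline: 24-EU HD 520 in the i5 models
dgpu = 1.37      # dGPU ~37% faster at low resolutions
iris540 = 1.225  # assumed midpoint of the ~20-25% uplift over the HD 520

print(f"dGPU lead over HD 520:   {dgpu / hd520 - 1:.0%}")    # 37%
print(f"dGPU lead over Iris 540: {dgpu / iris540 - 1:.0%}")  # ~12%
```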
  • andyd - Wednesday, October 21, 2015 - link

    Brett, can you include the rMBP as part of the comparison? Microsoft claimed the Surface Book is twice as fast as the Apple counterpart.
  • neonspark - Wednesday, October 21, 2015 - link

    That is like when Apple claims the new iPhone is faster than the old one by whatever amount. Clearly selective benchmarking. Both will use the same chip, as Apple cannot keep within its thermals in the 13 inch model to opt for the 4-core Intel chip. Assuming what MS means is that the dGPU is about 2X more powerful than Iris, I'd believe that. I'm sure some benchmark will show that, and that is all they need to make any claim.
  • digiguy - Wednesday, October 21, 2015 - link

    Yes, exactly. And the benchmarks were published by Ars Technica: the Surface Book i7 is on par with the MBP 13 (Cinebench R15), so that claim (not clear on purpose...) was about the NVidia GPU vs the MBP 13's integrated GPU. Still, if you want, you can have a 2x faster CPU in a SP3/4 format. It's the new Vaio Canvas Z with a quad-core i7, but it weighs 2.7 pounds (1.2 kg). And that CPU kills the 63Wh battery virtually as fast as the (much smaller) battery on the SB clipboard lasts...
  • neo_1221 - Wednesday, October 21, 2015 - link

    Curious that the dGPU model actually performs better at 1920x1080 high than at 1600x900 medium on the Dota 2 benchmark...
  • slashbinslashbash - Wednesday, October 21, 2015 - link

    All I can say is "Finally." I am a noted Mac fan but I have always felt that Microsoft hardware was highly competent... MS keyboards and mice have been my go-to for many years, and the Xbox offerings have been my gaming consoles of choice since they came out. But MS had to maintain its relationships with hardware companies and not box them out by offering their own line of PC's. Finally they are coming around and emulating Apple in actually putting out hardware. First with the Surface tablets and now with an actual laptop. FINALLY we are getting some competent design and execution in the non-Apple, non-ARM hardware arena!
  • osxandwindows - Wednesday, October 21, 2015 - link

    They are also getting better than Apple at charging high prices for their devices, don't cha agree?
  • neonspark - Wednesday, October 21, 2015 - link

    I think if MS is the only PC OEM that makes beautifully designed and finished, Apple-like (and Apple-expensive) hardware, that is OK. Do we really need another "everything must go" Wal-Mart equivalent of a PC OEM? No, we do not.
  • nerd1 - Wednesday, October 21, 2015 - link

    They are charging exactly the same money for the SSD upgrade - so far Apple has been making tons of money, so why not.
  • ikjadoon - Thursday, November 5, 2015 - link

    Except that these SSDs are TLC-based "junk" (the 512GB is on par with a fast SATA SSD, but nowhere near other PCIe AHCI and PCIe NVMe drives).

    Apple's SSDs are actually worth the premium...you get ridiculously fast storage.
  • tdogdfw - Thursday, October 22, 2015 - link

    Apple Pencil: $99, Microsoft Pen: $49. The Apple iPad Pro keyboard is also much more expensive than the Microsoft keyboard and doesn't have a trackpad or fingerprint option.
  • joelypolly - Wednesday, October 21, 2015 - link

    I miss the older Microsoft SideWinder joysticks; unfortunately the drivers are no longer supported in the latest versions of Windows.
  • Anonymous Blowhard - Wednesday, October 21, 2015 - link

    TDP for the GPU is allegedly 18W.
  • Wolfpup - Wednesday, October 21, 2015 - link

    This thing's really interesting, obviously. I wish it could have the GPU in the tablet, and all the battery there too...but obviously it can't. As such, I'm more interested in the Surface Pro 4, but still, really impressive hardware.

    Sounds like Optimus can actually be disabled on this?!? I'm really surprised by that if that's the case. I'm never buying a notebook where it can't be, as I've found it makes games a lot buggier and less stable than running it only on an Nvidia GPU. My few generations old Alienware M17x lets you disable Optimus, but I'm not sure if the newer ones do, and most notebooks don't :-/
  • extide - Wednesday, October 21, 2015 - link

    No, Optimus can't be physically disabled.
  • nerd1 - Wednesday, October 21, 2015 - link

    Laptops with a G-Sync display or SLI lack Optimus.
  • schuey7 - Thursday, October 22, 2015 - link

    Please correct me if I'm wrong, but doesn't the 940M usually get around 45fps in Dota 2 Reborn at enthusiast settings, whereas here we are getting 54fps - a substantial jump?
  • Laststop311 - Thursday, October 22, 2015 - link

    For a version 1 of a brand new device this is a pretty good start. By the time they make a 2nd version with Kaby Lake and Pascal, and then a 3rd with Cannonlake and Volta, that third version should have all the kinks worked out and be an amazing machine. And for people that don't need a ton of storage, a 128GB or 256GB 3D XPoint drive would make this crazy fast.

    I'll be looking to buy this when it has 10Gbps USB Type-C ports, Cannonlake with an Iris Pro iGPU, a decent Volta dGPU with HBM memory, and a 3D XPoint storage drive. If they can make a config like that happen they will take over the laptop market.
  • Klug4Pres - Friday, October 23, 2015 - link

    I love how people think they will get upgradeable GPUs. The PC industry's whole existence has been founded upon flogging you the same stuff year after year. Sure, you might get an upgradeable GPU, but you won't want to pay the upgrade price. Alternatively, next year there will be the "new, improved hinge" which unfortunately just won't allow the new bases to work with the "legacy" machine. Same old same old.
  • BMNify - Friday, October 23, 2015 - link

    The 3:2 aspect ratio display itself makes it a revolutionary laptop in the market. Eagerly waiting for a review from Brett.
  • bobjones32 - Friday, October 23, 2015 - link

    Hey Brett, super eager to see your full review :-D

    Any hint at a timeline for when you think you'll be done with it?
  • ikjadoon - Thursday, November 5, 2015 - link

    I heard this week. :)
  • meacupla - Friday, October 23, 2015 - link

    Looks like there will be an i5/128GB/8GB+dGPU model for $1699, or the same price as the i5/256GB/8GB model without dGPU.
  • metrolights - Saturday, October 24, 2015 - link

    I think adding Surface Pro i7 charts would give a better comparison than the Surface Pro i5. Not only is it an i7 chip, the Surface Pro i7 chip is different from the Surface Book i7, since the Surface Pro 4 i7 is an i7-6650U with Iris 540. The Surface Pro i5 and Surface Book only have Intel HD Graphics 520. I assume this means if I undock the Surface Book it will perform worse than the Surface Pro i7.
  • KimGitz - Tuesday, October 27, 2015 - link

    I love the innovation being shown by Microsoft. The Surface Book, to me, deserves a standing ovation and a round of applause.
    It has been interesting seeing the reaction from both Windows and Mac fans.
    You mentioned that there is no USB port on the clipboard part of the Surface Book, which got me thinking about the proprietary Surface Connector used.
    It has actually frustrated me, because I think that should have just been Thunderbolt 3 using the new USB-C connector.
    It functions the same way as Thunderbolt 3 from what I can tell:

    1. It is reversible
    2. It is PCIe
    3. Charging
    4. External GPU docking (Keyboard base)
    5. Docking for adding ports (Surface Dock)
    6. Ethernet networking (Surface Dock)
    7. Up to dual 4K displays (2x Mini DisplayPort on the Surface Dock or 1 Mini DisplayPort on the keyboard base)

    The clipboard should have come with an Alpine Ridge controller and 2 Thunderbolt 3 ports: one at the exact same position the Surface Connector is, and another at the other edge for use in portrait mode.
    The keyboard base is also just like a Thunderbolt GPU dock, and could have included a Thunderbolt 3 connector instead of the charging port.
  • ikjadoon - Thursday, November 5, 2015 - link

    But Type-C isn't magnetic. /s

    haha, it would've been nice to have USB Type-C/TB3. But I think the issue is that it would require the extra Alpine Ridge chip, and extra one-use dedicated chips on "battery-saving" laptops have always been regarded as bad. I have no evidence, though.

    I sincerely hope they have upgradeable bases in a few years...
  • Mixalis - Saturday, November 7, 2015 - link

    Hi Brett. Any news on when the complete review will be posted? Also, are there any plans to review the new Dell XPS 15? Thanks.
  • shinken1 - Monday, November 9, 2015 - link

    These are the two Windows laptops I am really interested in, so I hope the AnandTech Surface Book review comes soon and is followed up by the new XPS 15.
  • ge0kas - Sunday, January 3, 2016 - link

    I have a simple question.
    When I use an external display (Eizo, 1920x1200 native), will I benefit from the dGPU?
