ikjadoon - Friday, May 29, 2020 - link
The most important questions for consumers and some business customers:
1. Does this laptop's "Windows 10" run all my programs without glitches or lag? That is, is this Windows 10 on ARM or full-fat x86-64? Even Chromium-based Edge doesn't have a stable build on Windows 10 ARM... nearly two years after the idea came to Microsoft.
2. How much does it cost?
Both questions are still unanswered at its "unveiling", meaning this product is 75% likely just another 0.1 Alpha device that will flop globally. It'll die in the tech graveyard, just like the original Galaxy Book S.
Simply "having new technology after being criticized for being too slow" is step 4 out of 100. Microsoft/Intel have made the same clueless mistakes, year after year after year: poor performance, terrible pricing, confusing segmentation, etc.
If you wanted to ask, "How can multi-billion-dollar corporations throw away hundreds of millions with their knack for completely misreading the market?", Exhibits A & B would be Microsoft & Intel's utter failure to accomplish anything in the smaller-than-a-laptop market.
They are truly the Hoolis of our time.
ikjadoon - Friday, May 29, 2020 - link
That should read: is it [Windows 10, full] or [Windows 10S / Windows 10X]?
Atom & Sunny Cove are both x86-based CPUs, so it can't be Windows 10 on ARM.
maratd - Friday, May 29, 2020 - link
> Even Chromium-based Edge doesn't have a stable build on Windows 10 ARM
Writing this comment on Chromium-based Edge for ARM on Windows 10 ARM running on the Samsung Galaxy Book S.
> It'll die in the tech graveyard, just like the original Galaxy Book S.
See above.
ikjadoon - Friday, May 29, 2020 - link
Ah, I stand corrected: Microsoft did release the ARM64 build into the stable channel in February 2020, a month after x64. https://twitter.com/MSEdgeDev/status/1225905079774...
Well, n = 1 there, mate. I'm sure lots of Microsoft Band and Zune users feel "supported", too. ;)
Santoval - Friday, May 29, 2020 - link
"That is, is this Windows 10 on ARM or full-fat x86-64?"In which parallel universe could Lakefield, a x86-64 SoC, be possibly paired with Windows 10 on ARM?
ikjadoon - Friday, May 29, 2020 - link
Already fixed.
This Freudian slip is exactly the symptom: why does Microsoft have multiple cut-down, second-rate versions of Windows 10 that nobody genuinely recommends because of their horrible [performance / compatibility / interface]?
Is this Windows 10S, where users are stuck with the plethora of high-quality, first-party Store Apps?
Is this Windows 10X, a container-bonanza whose touch-centric interface & hardware-level restrictions make it work 9/10 days, unlike the 10/10 days you get with full-fat Windows 10?
And don't get me started on how *well* four Atom cores (even w/ Tremont's claimed upgrades) can handle modern-day x86-64 applications, none of which run particularly well even on Pentium SKUs...
Zeratul56 - Friday, May 29, 2020 - link
The EdgeHTML-based Edge has run on ARM for many years.
There are many people on this earth with different tastes; we don't all have to use computers the same way. I am always perplexed at people hoping Windows on ARM fails. The worst thing Microsoft could do is keep making Windows 7 for x86 PCs for the next 10 years as they fade into obscurity.
Samus - Friday, May 29, 2020 - link
The irony, mark my words, is that after hundreds of millions thrown at the wall, Lakefield will most definitely trail Qualcomm in GPU performance, because we already know past Adreno generations have sometimes delivered significantly better performance.
HStewart - Thursday, June 11, 2020 - link
I believe Lakefield has Gen 11 graphics, so I think it will be better than the Adreno. Plus, keep in mind it has a Sunny Cove CPU and it's not emulated. It will be nice to see a performance comparison between the two - but it's definitely not Windows on ARM. Intel is probably holding off the i7 version to see the response.
cyrusfox - Friday, May 29, 2020 - link
Excited to see some Lakefield reviews and to put the first hybrid product from Intel through its paces. It's been a year since Intel revealed this; shame it is taking so long to get to market.
nandnandnand - Friday, May 29, 2020 - link
Lakefield looks great but is late (teased in January 2019). I'll check out AMD's "Van Gogh" instead.
https://www.notebookcheck.net/Report-AMD-s-upcomin...
yeeeeman - Friday, May 29, 2020 - link
Man, some people are delirious. Spare me with AMD; we're talking about an Intel product here.
Desierz - Friday, May 29, 2020 - link
AMD doesn't have anything similar afaik. This is big.LITTLE + integrated RAM on the die. Presumably this will save power.
Wilco1 - Friday, May 29, 2020 - link
The DRAM is a separate die and uses the same PoP stacking mobile SoCs have been doing for many years: https://fuse.wikichip.org/news/3417/a-look-at-inte...
Desierz - Friday, May 29, 2020 - link
Aha. I guess I was blinded by the cool "Foveros technology" name. It sounded like they had something new and exciting. Turns out it's just a name for something that's been done for years by others already. Well, I hope it lives up to the hype, anyway.
lmcd - Friday, May 29, 2020 - link
The lower stack is interesting tech even if the DRAM stacking is not new.
Santoval - Friday, May 29, 2020 - link
Not yet they don't. However, the link above refers to AMD's *future* APUs. AMD are also going to switch to 3D stacking.
Desierz - Friday, May 29, 2020 - link
Sure. The industry talks about it as a way to further Moore's law. So I expect to see it become a widespread technology in CPUs and other ICs.
lmcd - Friday, May 29, 2020 - link
Yeah, but AMD dropped their cat cores out of monetary need a while ago, even though a cat core would be perfect for this.
nandnandnand - Friday, May 29, 2020 - link
I'll buy the Intel product. For $100.
Santoval - Friday, May 29, 2020 - link
Since when has talking about an Intel (or AMD) product precluded comparing it to an AMD (or Intel) product?
grrrgrrr - Friday, May 29, 2020 - link
AMD processors are not going to compete against Qualcomm anytime soon.
Spunjji - Monday, June 1, 2020 - link
I don't think these are meant to be competitors. Van Gogh is AMD returning to the large fanless laptop / tablet market, but it shows no sign of being a design optimised for always-on and 2 mW idle.
Desierz - Friday, May 29, 2020 - link
It will be interesting to see benchmarks, and how they stack up to the ARM version.
Wilco1 - Saturday, May 30, 2020 - link
It's not looking good; even single-threaded, Lakefield can barely keep up with the 8cx: https://browser.geekbench.com/v5/cpu/compare/14837...
Spunjji - Monday, June 1, 2020 - link
Oof, wow. That is SAD.
Fingers crossed for Intel that the full release version performs a little better. It would be rather humiliating for their competitor to provide a superior experience whilst emulating an instruction set.
ikjadoon - Friday, May 29, 2020 - link
You know what would've made this device even a tiny bit cool?
Upgrading from a 720p webcam for a "high-end" laptop launching in 2020.
Shameless cost-cutting... We've been complaining for a decade about horrible quality webcams and Samsung (who literally has an entire division dedicated to mobile imaging, i.e., the Samsung Galaxy devices) still can only muster a 720p webcam.
Before anyone feels the need to apologize for multi-billion-dollar conglomerates failing to fix their problems, I remind you to look at *any* $500 iPhone or Android phone's front-facing camera. Samsung, please take notes for future devices:
1. 1080p minimum (yes, we have the bandwidth)
2. Larger sensors (for reducing noise at higher ISOs)
3. Better dynamic range
4. Better glass / lenses
5. Functioning IR-cut filters
If we can fit these amazing 7 MP / 12 MP cameras in the tiny front bezels on phones, we can fit them in the tiny bezels on laptops.
boeush - Friday, May 29, 2020 - link
Seconded. Though the problem is far from Samsung's alone. My super-expensive, state-of-the-art Dell mobile workstation laptop has a beautiful display with an absolutely atrocious webcam.
ajp_anton - Friday, May 29, 2020 - link
It's not just the size of the bezel. Look at the thickness of the phone, and then the thickness of the laptop screen/lid.
WPX00 - Saturday, May 30, 2020 - link
1080p is still possible; the Pixelbook Go has one. But yeah, the thickness of the lid is an issue. Would you guys trade for a 4K webcam if it meant moving it to the keyboard location or similar, like on the MateBook X Pro?
Tams80 - Saturday, May 30, 2020 - link
No. I'd much rather be somewhat blurry than have someone staring up my nose while I crane my neck down.
boeush - Sunday, May 31, 2020 - link
First, the camera doesn't span the entire (or even most of the) thickness of a modern smartphone; just about half the thickness is taken up by the display, including the glass protecting it.
Second, even if it took a camera bump on the laptop's lid to fit in a decent camera with decent optics, I would bet that's a trade most people would be willing to make without any hesitation. Far more willing, even, than accepting a camera bump on a smartphone.
Third, somehow Apple has no problem fitting half-way decent webcams into its MacBooks. If Apple can do it, certainly other vendors can!
grrrgrrr - Friday, May 29, 2020 - link
Convince Apple first.
IntelUser2000 - Saturday, May 30, 2020 - link
By the way, some laptops don't have a webcam at all. The Ryzen 9 Asus Zephyrus, for example, has no webcam.
PeachNCream - Saturday, May 30, 2020 - link
The webcam will not be better than 720p until Lakefield ends up inside a laptop that the manufacturer and tech journalists refer to as a "flagship" device. We all know that anything we call a flagship is the biggest, most powerful-est, and betterer than bestest of the best to sail the seven seas and deliver the weight of its broadside to some unsuspecting coastal fort, all while putting up various flags to coordinate the actions of the rest of the fleet, because FLAGSHIP!
IntelUser2000 - Saturday, May 30, 2020 - link
You forget even $5000 laptops have crappy webcams.
PeachNCream - Saturday, May 30, 2020 - link
You forget that any laptop can be referred to as a flagship. I've seen sub-$300 computers advertised on Amazon as flagships. Price does not have anything at all to do with the application of that particular, meaningless term. I'm poking a little fun at Anandtech for falling prey to using it as if it is a defined category.
Retycint - Saturday, May 30, 2020 - link
Better sensors are thicker, which precludes them from fitting in thin laptop lids.
Regardless, I'm pretty sure a 720p webcam is actually a pro rather than a con, because I'm sure nobody wants a 4K60 webcam streaming their pores and pimples to the whole meeting room. People who want to video call their relatives/SO can simply use their phones, which will have a much better sensor anyway. So I'm not sure why manufacturers should include a better webcam when the target market doesn't really exist.
boeush - Sunday, May 31, 2020 - link
It's not just about resolution. Indeed, resolution is the least of the problems. Far worse are the high levels of noise, poor color fidelity and contrast, and poor performance in environments that are either too dark or too bright.
As for the target market, tons of people are working remotely and/or from home - especially nowadays. Everyone who's ever been in a meeting over Zoom or Teams or GoToMeeting or whatever is part of the target market.
Ej24 - Monday, June 1, 2020 - link
It's not the camera that's the issue per se. It's the fact that typical x86 processors don't have a dedicated image processing block. Your iPhone's or Android phone's SoC has a dedicated processing block that handles the data straight from the camera sensors. No x86 CPU has this. The data has to travel to the CPU via a USB signal (usually 2.0 at best) and get brute-forced in software rather than dedicated hardware. You can put a big-ass camera lens and sensor in your laptop, but the limitation is the CPU horsepower required to brute-force the camera signal. This is why capture cards for video input from cameras are a thing. Your CPU can't handle the data from a camera sensor. Example: I use a capture card with HDMI output from my Sony a7 as a webcam because the CPU can't handle it otherwise.
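A rough back-of-envelope calculation illustrates the USB 2.0 bandwidth wall described above. This is a minimal sketch, assuming uncompressed YUY2 frames (2 bytes per pixel) and roughly 280 Mbps of practical USB 2.0 payload throughput out of the nominal 480 Mbps; both figures are assumptions for illustration, not numbers from the thread:

    # Sketch: uncompressed webcam bandwidth vs. USB 2.0 throughput.
    # Assumptions (not from the thread): YUY2 pixel format (2 bytes/pixel)
    # and ~280 Mbps of practical USB 2.0 payload throughput.

    def uncompressed_mbps(width, height, fps, bytes_per_pixel=2):
        """Raw video bandwidth in megabits per second."""
        return width * height * bytes_per_pixel * fps * 8 / 1e6

    USB2_PRACTICAL_MBPS = 280  # nominal 480 Mbps minus protocol overhead (assumed)

    for label, (w, h, fps) in {
        "720p30": (1280, 720, 30),
        "1080p30": (1920, 1080, 30),
        "4K30": (3840, 2160, 30),
    }.items():
        mbps = uncompressed_mbps(w, h, fps)
        verdict = "fits" if mbps <= USB2_PRACTICAL_MBPS else "needs compression"
        print(f"{label}: {mbps:7.1f} Mbps uncompressed -> {verdict} on USB 2.0")

Even 720p30 (~442 Mbps raw) exceeds USB 2.0's practical throughput, which is why internal webcams ship compressed (e.g., MJPEG) streams that the CPU, or a dedicated ISP where one exists, must then decode.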
IntelUser2000 - Tuesday, June 2, 2020 - link
Yes they do. At least the Intel mobile parts: Intel U/Y parts have had a dedicated ISP since Kaby Lake-R.
Actually, the Surface Pro tablets have a decent webcam.
You also gotta remember that smartphones have far better forward-facing cameras than rear ones, while in laptops it's all rear cameras.
ksec - Friday, May 29, 2020 - link
I wonder what the keyboards are like on these machines.
nicolaim - Friday, May 29, 2020 - link
Typo: "Intel has spit up the chip"Ryan Smith - Friday, May 29, 2020 - link
Thanks!
eastcoast_pete - Friday, May 29, 2020 - link
With those layers, I wonder about thermals. Now, I get that this is a low-power SoC, but those four stacks of semiconductor (read: also poor heat dissipation) with the CPU in layer 2 make me wonder how its big "Core" core will avoid baking itself. Any information on how this SoC is cooled? Does it use its base to dissipate heat? Any review or overview or link to one is appreciated!
lmcd - Saturday, May 30, 2020 - link
there's still a "base" layer to each segment of the stack with heat dissipation and interconnect as primary goals. The interconnect components, afaik, can be pretty small, so the majority of that surface area between layers is heat dissipation.Worth emphasizing that it's hard for a single core to "bake itself," hence why single-core turbo speeds are often dramatically higher than full-CPU turbo speeds. When we go through the components, it makes sense how this CPU will stay cool.
LPDDR4X is quite low power, and bog-standard DDR2/3/4 memory is already easily cooled with even the smallest amount of airflow. Direct heatsink access to the top of the die should do more than just cool the memory; it should start cooling the die below it.
The I/O components are similarly passively cooled in all but the highest-end motherboards, so we can assume they dissipate relatively little heat. I don't know if there's a cooling solution on the bottom of the board, but some heat should dissipate there regardless.
Finally, most Atom CPUs are passively cooled, and that's not even necessarily on 10nm. These cores should not greatly impact the thermal performance of the big Core.
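To put crude numbers on the above, here is a minimal steady-state sketch using T_junction = T_ambient + P * theta_ja; the junction-to-ambient thermal resistances below are illustrative assumptions for a thin fanless chassis, not Lakefield measurements:

    # Sketch: steady-state junction temperature, T_j = T_ambient + P * theta_ja.
    # The theta_ja values are assumptions for illustration, not Lakefield specs.

    T_AMBIENT_C = 25.0  # room temperature
    POWER_W = 7.0       # upper end of the 5-7 W TDP range quoted in this thread

    scenarios = {  # assumed junction-to-ambient thermal resistance, degrees C per watt
        "lid / heat spreader only": 15.0,
        "spreader + chassis as heatsink": 10.0,
        "spreader + modest airflow": 8.0,
    }

    for name, theta_ja in scenarios.items():
        t_j = T_AMBIENT_C + POWER_W * theta_ja
        print(f"{name}: ~{t_j:.0f} C sustained at {POWER_W:.0f} W")

Even the pessimistic case only approaches worrying temperatures at a sustained 7 W, which is exactly why the firmware power limit and turbo budgets, rather than the die stacking itself, end up setting the thermal behavior.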
Wilco1 - Saturday, May 30, 2020 - link
The whole SoC is 5-7 W TDP, limited so it will not bake itself. As a result, the big core has just about the same single-threaded performance as the 8cx but really suffers in multithreaded performance (about half that of the 8cx).
eastcoast_pete - Monday, June 1, 2020 - link
Appreciate your and lmcd's comments! "Baking" was a bit of hyperbole, but I am wondering about/fascinated by the effect that stacking multiple semiconductor stacks (each processor is already multiple layers; so is RAM) has on heat dissipation. It might just be that the thickness (thinness, really) of these stacks is such that heat will still find a way out. Especially at the low TDP of these.
watzupken - Sunday, May 31, 2020 - link
It will be interesting to see a review comparing the two. In terms of performance, I feel Intel should still be on top against Qualcomm. Lakefield seems to be an attempt to fend off Qualcomm from eating into their high-margin, ultra-low-power CPU space. Running Windows, Intel should have a significant advantage, though I feel the battery life is going to be poor compared to Qualcomm.