Vitor - Wednesday, October 21, 2020 - link
If the screen is actually decent, it would be a good buy. 16GB RAM and 1080p is enough for me.
cfenton - Wednesday, October 21, 2020 - link
The regular Swift 3 has one of the worst screens in a modern laptop, so I wouldn't expect any miracles.
Spunjji - Thursday, October 22, 2020 - link
Who knows, maybe they were saving all the good screens for this one...
sharathc - Friday, October 23, 2020 - link
It covers 72% of the NTSC gamut. Is it good?
Link: https://www.acer.com/ac/en/IN/content/series/swift...
Jorgp2 - Wednesday, October 21, 2020 - link
>Processor options are the top line Intel Core i7-1165G7 or Core i5-1135G7 – there’s no direct indication about what power level these are set to (a drawback of Intel’s non-fixed TDP marketing messaging), so these could very well be 12 W models with 1.2 GHz base frequency

Ian, why do you keep stating this like it's a new thing? OEMs could always change the TDP and turbo of CPUs. Cooling also affects performance.
Intel is just lining up their specs to what actually happens in the real world; the OEM will just set whatever power configs they're comfortable with.
That's why you should always look for laptops, not laptop CPUs.
It's the same for AMD: why look for a 4800U laptop without checking to see if it has a 10w TDP with 8w of cooling?
Especially since CPPC defines how efficient the CPU will be, not the actual power config itself.
Ian Cutress - Thursday, October 22, 2020 - link
Because Intel has changed its messaging stance to obfuscate. Presentations from partners are saying the 1165G7 has a base frequency of 3.0 GHz, regardless of what power mode it's actually running at, because when the marketing team goes looking for the stats on their own hardware, that's the result they get when the processor SKU is looked up.
Jorgp2 - Thursday, October 22, 2020 - link
>Because Intel has changed its messaging stance to obfuscate. Presentations from partners are saying the 1165G7 has a base frequency of 3.0 GHz, regardless of what power mode it's actually running at

Ian, you don't understand. OEMs were already lowering the TDP of U series SKUs to Y series power levels, which changes the base clock.
But the OEM still advertised the 15w base clock, and the 15w turbo.
Same for OEMs that locked PL1&2 to 28w.
Spunjji - Thursday, October 22, 2020 - link
Turbo, yes. TDP, no?

"Intel is just lining up their specs to what actually happens in the real world"
How can you "line up" a spec that *isn't there*? This is the worst rationalization. "The situation was terrible, so they're being pragmatic by making it worse". 🤦♂️
Spunjji - Thursday, October 22, 2020 - link
Let me put it this way: if they have the clout to drag partners into their Evo program and lock AMD out of high-end gaming laptops, then they have the clout to post multiple specifications for their SoC and dictate which spec their partners use based on what their chassis design is capable of cooling. You'd still see *some* variation in, say, long-duration testing - but you wouldn't get this absolute madness where you have no idea what the guaranteed base clock of the CPU you just bought is.
Jorgp2 - Thursday, October 22, 2020 - link
Yes, and no. TDP can be changed by changing the PL1.
edzieba - Thursday, October 22, 2020 - link
That, and CPUs by design have (for a good decade now) not even come close to having a 'fixed' power draw. Spec the peak power draw, and you vastly overstate the actual power draw. Spec the average draw, and you have complaints that it peaks above that. Spec the actual power criteria, and you get complaints that it's too complex. Then there's the conflation of power draw with TDP, when the two are different things that each affect the other in an indirect manner (due to the time constant of the thermal solution, available sink mass, time since last saturation, etc.).

There's no-win situation, other than to acknowledge that there is no longer a valid single number for CPU power draw. Everyone has managed to wean off of the 'GHz means how fast a CPU is!' idea, so the same can and should happen for power.
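To make the PL1/PL2 mechanics in this thread concrete: Intel's turbo budget is based on an exponentially weighted moving average of package power compared against PL1, with PL2 capping instantaneous draw. The sketch below is a simplified toy model, not any real OEM configuration; the function name and the specific PL1/PL2/tau values are illustrative assumptions. It shows why the same silicon holds turbo for very different lengths of time depending on the power config the OEM picks.

```python
# Toy model of Intel's Turbo Boost power budget (hypothetical values).
# Real firmware tracks an exponentially weighted moving average (EWMA)
# of package power with time constant tau and compares it against PL1;
# PL2 caps the instantaneous draw.

def ewma_power_budget(power_samples, pl1, pl2, tau, dt=1.0):
    """Return, per sample, whether the chip may still run at turbo power
    (True while the EWMA of power stays at or below PL1)."""
    avg = 0.0
    allowed = []
    alpha = dt / tau  # weight of the newest sample in the moving average
    for p in power_samples:
        p = min(p, pl2)                 # PL2 caps instantaneous draw
        avg = (1 - alpha) * avg + alpha * p
        allowed.append(avg <= pl1)      # once avg > PL1, clocks must drop
    return allowed

# Same chip, two OEM configs: a thin chassis at PL1=12 W gives up turbo
# far sooner than a thicker design at PL1=28 W under the same load.
burst = [40.0] * 120  # 120 s of an all-core load wanting ~40 W
thin  = ewma_power_budget(burst, pl1=12, pl2=40, tau=28)
thick = ewma_power_budget(burst, pl1=28, pl2=40, tau=28)
print(sum(thin), "s at turbo power vs", sum(thick), "s")
```

Under these assumed numbers the 12 W config throttles after roughly 10 seconds, the 28 W config after roughly 33 - which is exactly the "same SKU, wildly different laptop" behaviour being argued about here.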
TheinsanegamerN - Thursday, October 22, 2020 - link
Except that different TDP allowances allow for wildly different performance metrics. "GHz means how fast a CPU is" not only still applies for the vast majority of consumers, but when comparing CPUs of the same arch is also TRUE.
TheinsanegamerN - Thursday, October 22, 2020 - link
Because a 4800U with 8w of cooling would be out of spec, duh? Manufacturers can limit turbo, but changing base TDP outside of predefined ranges is out of the question.

The issue here is Intel's defined range now goes anywhere from 12-45 watts with the same chips. AMD has the 4800U and 4800H and 4800HS to denote these ranges; Intel previously did as well, before making everything confusing.
Jorgp2 - Thursday, October 22, 2020 - link
>Because a 4800U with 8w of cooling would be out of spec, duh? Manufacturers can limit turbo, but changing base TDP outside of predefined ranges is out of the question.

Then why do they do it?
>The issue here is Intel's defined range now goes anywhere from 12-45 watts with the same chips. AMD has the 4800U and 4800H and 4800HS to denote these ranges; Intel previously did as well, before making everything confusing.
No.
The defined range on paper was already 15w; that didn't stop manufacturers setting it to 7 or 28w.
Spunjji - Friday, October 23, 2020 - link
@Jorgp2 - Because they can. The better question would be, why do CPU manufacturers let them get away with it? Especially Intel, with their various design programs and their vast sums of "Marketing Development Funds".

You keep responding as if Intel haven't made any meaningful change to their specs. They have.
tipoo - Wednesday, October 21, 2020 - link
If Xe dedicated graphics are priced like better graphics from Nvidia in similar laptops, I don't see the point. It's Intel, so surely they'll do some CPU+GPU bundling deals to OEMs, but... I'm not seeing it yet.
Spunjji - Thursday, October 22, 2020 - link
They'll definitely try to make them cheaper, but even if they succeed, Nvidia will probably hack a few chunks off another higher-performing last-gen GPU and sell it at a knock-down price to regain competitiveness.

They'd have to be a *lot* cheaper to make up for those drivers, too.
tipoo - Thursday, October 22, 2020 - link
I've been hearing decent things about Intel drivers lately, tbh. Don't really have a modern IGP system to test first-hand, though.
Spunjji - Friday, October 23, 2020 - link
They're not as bad as they were, but they're still fairly terrible. Notebookcheck have a couple of articles on it - games crashing, wildly inconsistent performance and graphical errors appear to be the order of the day for a number of fairly high-profile titles.

That's aside from the fact that in-game performance in no way matches up to their results in synthetic tests.
Teckk - Wednesday, October 21, 2020 - link
That's a very decent laptop and config for the price. Is there any guess on what the Xe will match up to - say, GTX 1660 level?
Teckk - Wednesday, October 21, 2020 - link
Also, does it use the Intel 1W display? 17.5 hours of battery life sounds very good.
AshlayW - Wednesday, October 21, 2020 - link
GTX 1050 Ti, tops.
UNCjigga - Wednesday, October 21, 2020 - link
I think I read somewhere that GTX 1650/1660 is the performance target for DG1--which would still blow MX 250/350 out of the water. But the mobile Xe MAX may be memory bandwidth starved as it uses LPDDR4X vs. GDDR6. We'll see soon enough.
IntelUser2000 - Wednesday, October 21, 2020 - link
The DG1 you are thinking of uses GDDR6 memory, so it has much higher bandwidth (rough numbers below).

This Iris Xe Max uses LPDDR4X. It doesn't have to share with the CPU, but as AshlayW says, the peak performance can't be better than the 1050 Ti mobile.
Unless it can work with the iGPU.
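(Rough numbers, assuming a 128-bit bus on both: LPDDR4X-4266 works out to 4266 MT/s × 16 bytes ≈ 68 GB/s, while 12 Gbps GDDR6 on the same bus width would be 12 × 16 ≈ 192 GB/s - roughly three times the bandwidth. The bus widths here are illustrative assumptions, not confirmed specs for either part.)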
UNCjigga - Wednesday, October 21, 2020 - link
Verge says the Type-C connector is in fact Thunderbolt 4. I'm still unclear as to whether Thunderbolt 4 is forwards/backwards compatible with USB 4--or if that requires a software/firmware update--or if USB 4 is entirely different (is it supported in the chipset?)
DigitalFreak - Wednesday, October 21, 2020 - link
Thunderbolt 4 is a superset of USB 4. It includes everything in USB 4 and adds to it. It's built into the Tiger Lake CPUs.
twotwotwo - Wednesday, October 21, 2020 - link
Swift 3 is apparently the new sleeper laptop: a model with the 4800U and another with Intel's shiny new graphics, both under $1k.
yankeeDDL - Wednesday, October 21, 2020 - link
I am a die-hard AMD fan, but this looks interesting. That battery life, a decent GPU, and 512GB of SSD, finally (Optane, on top of everything). $900 does not sound bad at all. When are we going to see a review?
Spunjji - Thursday, October 22, 2020 - link
Part-Optane - I'm guessing it's the H10. Still not a bad proposition, if the GPU can deliver - big IF there, though.
Mil0 - Thursday, October 22, 2020 - link
AMD will probably announce either Van Gogh, Cezanne, or both around CES - especially Van Gogh has a good chance of making the Xe Max and the MX 350 obsolete.
cycomiko - Wednesday, October 21, 2020 - link
"This might be because entry level discrete graphics options are more favored in the APAC market."or might be that the poor folk in APAC get what they are given. Typically favouring higher end SKUs, or extreme low end, so many mid-tier stuff just goes through to USA.
eastcoast_pete - Wednesday, October 21, 2020 - link
I look forward to reading the first reviews! If at all possible, please run it side-by-side with a laptop with the same CPU, but a GTX 1650 or 1660 for dGPU, so we can see what's what. I for one would love to see some more competition in low-mid range dGPU offerings for laptops; right now, it's mostly if not all NVIDIA all the time. Maybe Intel surprises us positively, for a change!
Ian Cutress - Thursday, October 22, 2020 - link
Unfortunately, we've not been sampled. Acer UK has me on their list for a unit, but no ETA on EU sampling.
eastcoast_pete - Thursday, October 22, 2020 - link
Thanks Ian! I guess Acer doesn't need the customers? I am in the market for a laptop in this range, but I will read the reviews first, and so will others.pk1489 - Thursday, October 22, 2020 - link
The price in China is only about $800: 1165G7 + 16GB LPDDR4X-4266 + 512GB SSD + Xe Max = ¥5299 (≈ $800).
See it on JD...
oracle14 - Thursday, October 22, 2020 - link
There is a mini review on an Indonesian YouTube channel: https://www.youtube.com/watch?v=wdi5jvZ8S1k
Mil0 - Thursday, October 22, 2020 - link
Actually has quite a few benchmarks, but not all with comparisons or settings detailed.

3DMark Fire Strike:
Xe Max: 4629
GTX 1050: 6646
GTA 5 very low 1080P: 54fps
Far Cry 5 very low 1080P: 37fps
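(For scale, 4629 / 6646 ≈ 0.70, so the Xe Max lands at roughly 70% of the GTX 1050's Fire Strike score in this test.)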
Pretty disappointing for a dGPU - seems barely faster than the iGPU/Vega 8.
tipoo - Thursday, October 22, 2020 - link
And kind of expected. DG1 is the same spec as the top Tiger Lake IGP, but with a bit more spread-out thermals to play with and dedicated VRAM. Not entirely sure it deserves to exist.
Spunjji - Friday, October 23, 2020 - link
No clear indication that it does. The giveaway will be whether it shows up in many other devices, or just this one test design.