Comments Locked

81 Comments


  • bodmat - Monday, January 31, 2022 - link

    From several tests of 8gen 1, it is disastrous.
    It overheats and throttles so badly that it scores even lower than the 870.
  • ikjadoon - Monday, January 31, 2022 - link

    >We will first have that first instance of that in a PC chip sampling next year, getting to products in 2023.

    For everyone else who skimmed the introduction and went straight to the questions, this interview was in late 2021.

    //

    I need to temper my expectations for the NUVIA SoC (even as we can now confirm it is an SoC and not just the P cores), unfortunately, and it's on Microsoft: Microsoft has zero incentive to push Windows on Arm as even a third-class product; see the failure of the Surface Pro X.

    1. Azure
    2. Office
    3. Xbox
    4. Windows
    5. Surface
    ...
    XXX. Windows on Arm

    Windows RT was the quietest implosion in modern OS history.
    Windows 10X was x86-based, not Arm.
    The Surface Pro X is priced out of the stratosphere for its performance and compatibility ($900 MSRP for 8 GB / 128 GB).

    IIRC, multiple major OEMs were so disillusioned with Windows 8 / RT that they used Microsoft's arrogance as a springboard to Chromebooks and ChromeOS (which have sold far, far better).

    Microsoft talks a big game ("Surface Pro X! CHPE! ARM64EC! Panos just loves Arm!") and then delivers 1/3 of the items we expected, piecemeal and with arbitrary restrictions, four years later. I'm waiting to laugh: "Windows 12 will do what we meant to do in Windows 11 on Arm, you see."

    - Windows on Arm launched four years before macOS (which is now 100% Arm).
    - Windows OEM is the worst-performing Microsoft segment every quarter (https://i.imgur.com/KcU5ZK4.png) and Windows has less than 65% marketshare in the US desktop / laptop market (https://i.imgur.com/VfpfuRB.png)

    Things Microsoft should do, but won't:

    -- shift their Surface devices primarily to Arm
    -- offer significant developer financial incentives to ensure Arm app maintenance over xx years & convince laggard developers
    -- fund their massive game studios to create select Arm-native ports of popular games
    -- invest in multi-vendor SoCs (e.g., Mediatek, Samsung) to ensure Qualcomm doesn't have a needless monopoly over every Windows on Arm PC
    -- significantly improve default / pre-installed Arm driver support so edge-cases (e.g., printers) don't surprise users
    -- re-invest in Windows 10X development that is Arm-native-first (battery life won't be consistent if these things need to be online 8 hours to install an OS update; let's move into the modern age)

    But, Microsoft never figured out how to make money with consumer computing besides Office, so why should they even try?
  • ikjadoon - Monday, January 31, 2022 - link

    Well, I didn't mean that as a reply to you, bodmat, but here we are...
  • bigvlada - Monday, January 31, 2022 - link

    Why should consumers prefer an Arm SoC instead of x86? We had terminals and mainframe Unix machines in the eighties. Individual users got a lot more freedom with personal computers: discrete graphics, upgradable memory, disks, and lots of expansion slots and ports. And we should ditch that and embrace the SoC approach on desktops and in laptops, with no upgradeability, because?

    Windows on Arm was made because someone paid for it. Just as someone paid for Windows on Alpha CPUs. What's the incentive to have professional software on Arm? Is it faster?

    As for games, you want MS to order its studios to make another dumbed-down version of their games because? We also had that situation in the nineties. For instance, in 1994 the game Rise of the Robots, an abysmal Mortal Kombat clone, was made for Amiga, Amiga CD32, DOS, Mega Drive, Game Gear, Super NES, 3DO Interactive Multiplayer, and Philips CD-i. Of these platforms, only the PC remains. And now, when we finally have x86 hardware on (portable) consoles and on PC, you want yet another segmentation in hardware. For what?
  • mode_13h - Monday, January 31, 2022 - link

    > now, when we finally have x86 hardware on (portable)consoles and on PC,
    > you want yet another segmentation in hardware.

    Phones and tablets are one of the biggest gaming platforms, today. So, they wouldn't really be creating a segmentation that doesn't already exist, unless you're talking about specifically *Windows* gaming devices (but then what do you mean by portable x86 gaming machines, if not the Linux-based Steam Deck?).
  • bigvlada - Tuesday, February 1, 2022 - link

    The Aya Neo and One X Player portable consoles also use x86 CPUs. They come with Windows 10/11, but you could probably install Linux on them. You could probably also install Windows on the Steam Deck. Even these versions are not comparable to desktop, because the hardware is not powerful enough.

    Phone and tablet games are... an acquired taste. People can already play them on the desktop using Android emulators. The devices lack the power to play modern desktop games. In my aforementioned example, the PC version was the strongest; the others had lower-resolution textures and fewer colors. Nintendo Switch ports are few and far between.

    Porting games to other form factors is easier when the underlying architecture is the same. The poster above wants the studios to support a different, less powerful architecture for reasons unknown. Even when the hardware is, let's be generous, equally powerful, the amount of work needed to support a different platform is not worth it. How many games are written for Macintosh, in both x86 and M1 variants?
  • Raqia - Wednesday, February 2, 2022 - link

    If by "acquired taste" you mean the dominant form of gaming now:

    https://www.visualcapitalist.com/50-years-gaming-h...

    The quality of a game has little to do with the power of the hardware it's running on, past a pretty easy-to-achieve threshold. The Switch, for instance, has much less power than most high-end cell phones, but its games are indeed full-fat console exclusives.

    I'll agree most mobile games are hors d'oeuvres compared to the sit-down-for-several-hours experiences you get with a console or PC; however, I like both Wendy's and Jean-Georges.
  • bigvlada - Wednesday, February 2, 2022 - link

    It was a nicer way to say that 99% of mobile games are ad infested, lootbox, crystal buying garbage.

    I started gaming on the Sinclair ZX81, which had 1 KB of RAM and a 60*40 resolution. I know the hardware does not solely determine the quality of the game, but it helps. If that wasn't the case, the Neo Geo would still be a thing, not a museum curiosity.

    The Nintendo exclusives and the way Nintendo does business were explained in the Atari v. Nintendo trial in the early nineties. That's one of the reasons why studios prefer not to deal with them.

    Here's an LTT video of a portable x86 console with top-end 96-EU Tiger Lake graphics. It almost chokes when it tries to run Doom Eternal. Running something similar on an Arm device would be like trying to run the first Crysis on a Pentium. Also, until we move to optical computers, portables will always struggle with thermals.
    https://www.youtube.com/watch?v=8NK1I6S_vFs
  • Raqia - Wednesday, February 2, 2022 - link

    Doom runs perfectly fine on the Switch:

    https://youtu.be/2d9bgoLzC4s

    If you complain that it's not running the same binary / assets, I'd say in return that, given the very faithful quality achievable on a pint-sized SoC like the Switch's, the PC version's added fidelity for a fast-moving game like that is well beyond the point of diminishing returns.

    I'd even say that, in general, traditional pixel count, local-kernel anti-aliasing models, poly count, and texture scaling are all well past the point of diminishing returns. Better investments could be made in techniques that properly identify scene-level features, such as DLSS, or that make game-entity movement and behavior more interesting, i.e. good AI. Indeed, camera-shot movies make perfectly seamless use of 1080p displays, while even 1440p games show terrible aliasing, so traditional brute force isn't what's needed for better gaming fidelity.
  • GeoffreyA - Friday, February 4, 2022 - link

    I haven't played any recent games, but feel from an outside point of view that realism hasn't improved all that much over the past decade. Seems, to me at any rate, the critical gains in 3D were laid down in the late '90s and early 2000s, and now it's just "making bigger whatever's there." I wonder if there'll be some new discovery that'll take 3D forward again.
  • Farfolomew - Friday, February 4, 2022 - link

    My sentiments exactly. The 3D-accelerated era brought on by games like Quake was the last true revolution in PC gaming gfx. I thought the next revolution might come in the form of VR, and then ray tracing, but neither of those has really taken the industry by storm. I don't know what the next graphics breakthrough will be. But that's also why I think the discrete GPU is on its last legs. We're at, or have already passed, the point of diminishing returns in modern graphical fidelity, and I think soon APU graphics will be sufficient for the vast majority of gamers (i.e., 1080p).
  • GeoffreyA - Friday, February 4, 2022 - link

    It was normal life back then when each new engine of Carmack's took graphics forward. At least for id, I think Doom 3/idTech 4 was the last massive jump. If I'm not mistaken, Rage/idTech 5 actually regressed to baked-in lighting.

    After shaders, bump mapping, real-time lighting, and HDR, there doesn't seem to be much left any more. I was also of the view that ray tracing was going to advance graphics; but no luck. All we see nowadays is an obsession with increasing resolution and a bus-full of other gimmicks.
  • GeoffreyA - Friday, February 4, 2022 - link

    And I think they've reached the limits of their approximation, which, like a plastic fruit, will never look quite like the real thing, owing to the inward difference. Possibly, the only way to achieve true transparency, where 3D can't be told apart from reality, is to implement the laws of physics from some low level, from which the system painstakingly builds up the image we see. Slender chance, though, it'll break the 60 fps barrier any time this century!
  • mode_13h - Sunday, February 6, 2022 - link

    > I think they've reached the limits of their approximation, which, like a plastic fruit,
    > will never look quite like the real thing

    Ever heard of PBR?

    https://en.wikipedia.org/wiki/Physically_based_ren...

    > Possibly, the only way to achieve true transparency, where 3D can't be
    > told apart from reality, is to implement the laws of physics from some low level

    Like ray tracing or, ideally, path tracing.

    https://en.wikipedia.org/wiki/Path_tracing
  • mode_13h - Sunday, February 6, 2022 - link

    > bump mapping

    Geometry tessellation isn't bump mapping. Bump mapping is a cheap hack, but tessellation is the real deal.

    > All we see nowadays is an obsession with increasing resolution

    More resolution means you need more detailed textures and geometry. So, it's not as simple as just outputting more pixels.
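    To put rough numbers on that (my own back-of-the-envelope arithmetic, not figures from anyone in this thread): going from 1080p to 4K quadruples the pixels a GPU must shade every frame, before you even account for the larger textures and denser geometry needed to make those pixels worthwhile.

```python
# Illustrative arithmetic only: how per-frame shading work scales with resolution.
def pixels(width: int, height: int) -> int:
    """Number of pixels the GPU must shade each frame."""
    return width * height

p_1080 = pixels(1920, 1080)   # 2,073,600 pixels
p_4k   = pixels(3840, 2160)   # 8,294,400 pixels

print(p_4k / p_1080)  # 4.0 -- four times the shading work per frame
```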

    > a bus-full of other gimmicks.

    Visual fidelity is hard. Ray tracing is truly the next level, but it'll take a while before it goes fully mainstream. Meanwhile, more horsepower lets developers deliver better fidelity at higher resolutions and framerates.
  • Raqia - Friday, February 4, 2022 - link

    I do think untethered VR is taking off, with the Oculus Quest 2 doing some 10 million units prior to the '21/'22 holiday season.

    I'd say ray tracing is pretty meh in most implementations, and is most useful if it alleviates artists' burden of creating a better, coherent presentation. By itself, it's just like any other tool for simulating non-primary light bounces, like cube maps or ambient occlusion. DLSS (and other neural-net-based image enhancers) is a game changer in terms of frame rate and visual quality. I also think most people don't realize what power is in their pockets:

    https://www.youtube.com/watch?v=nrvnpFCcZeA

    The build of Windows 11 in the vid mostly doesn't use the SoC's DSP, which provides the main grunt for neural-net-style algos. I think a good DLSS-style resolution-enhancement implementation, plus the freedom to load OSes onto phones (say, from an app store), would make a lot of us ditch the archaic ATX form-factor boxes except for balls-to-the-wall hobby purposes.
  • GeoffreyA - Saturday, February 5, 2022 - link

    While VR is interesting, and I'd certainly like to give Alyx a try, at the end of the day, it's really just a screen that wraps round in all planes. Still lacking is some new technique in 3D itself.

    As for these ML resolution enhancers, I reckon history will see them as a gimmick. These corporations have got nothing new in 3D, so they've got to make a din about the scalers, not to mention ML, which is fashionable and in vogue right now.
  • mode_13h - Sunday, February 6, 2022 - link

    > Still lacking is some new technique in 3D itself.

    The problem of visual fidelity was solved long ago. It essentially boils down to path tracing + BRDF material models. This is basically what the motion picture industry uses.
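    A minimal sketch of the idea (my own toy example, not any renderer's actual code): at its core, path tracing is Monte Carlo integration of the rendering equation. Here we estimate the hemisphere integral of cos(theta), whose exact value is pi and which is the normalization constant behind Lambertian diffuse BRDFs.

```python
import math
import random

def estimate_diffuse_integral(n_samples: int, seed: int = 1) -> float:
    """Monte Carlo estimate of the hemisphere integral of cos(theta).

    Exact value is pi; a path tracer plays this same trick, per bounce,
    on the full rendering equation.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # Uniform sampling over the hemisphere: z = cos(theta) ~ U[0, 1].
        z = rng.random()
        total += z                      # integrand: cos(theta)
    # Divide by the sampling pdf, 1 / (2*pi), i.e. multiply by 2*pi.
    return 2.0 * math.pi * total / n_samples

print(estimate_diffuse_integral(200_000))  # ~3.14, converging to pi
```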

    The challenge is how to deliver the best quality on consumer hardware. And that's where things like VRS and DLSS come in.

    > As for these ML resolution enhancers, I reckon history will see them as a gimmick.

    I doubt it. You're always trying to stretch the rendering power of whatever silicon and Watts you have at your disposal. Now that they're here, they'll probably never go away.

    > These corporations have got nothing new in 3D

    That's not true, but mostly what hardware is delivering is more horsepower and better optimizations (e.g. VRS, better texture compression, new geometry shaders, etc.). If you followed developments of graphics hardware, graphics APIs, and rendering engines, you wouldn't say they're not doing anything new.
  • mode_13h - Sunday, February 6, 2022 - link

    > The 3D-accelerated era brought on by games like Quake was the last
    > true revolution in PC gaming gfx.

    Quake-era 3D hardware looks like garbage, compared with what we have today. It had limitations like 16-bit color, poor texture interpolation, no procedural shading, few light sources, etc. etc.

    If you want to look at big leaps in 3D hardware capabilities, the next was programmable pixel (fragment) and geometry shaders. Then, geometry tessellation and compute shaders. Next in line would probably be ray tracing. Along the way, there were advancements in countless areas, like texture compression and GPU ISA. Plus, optimizations like VRS and the horsepower to enable techniques like MSAA and eventually DLSS.

    > I think the discrete GPU is on its last legs.

    I think hardcore gamers couldn't disagree more.

    > I think soon APU graphics will be sufficient for the vast majority of gamers (ie, 1080p)

    APUs offer just a tiny fraction of what dGPUs can do. Plus, the world doesn't want 1080p gaming. Higher-res displays are now very affordable.

    Graphics performance is ultimately about die area and memory bandwidth. And APUs will never be able to devote enough silicon or have enough memory bandwidth to threaten dGPUs.
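    The bandwidth gap is easy to put numbers on. A rough sketch (the part specs below are illustrative choices of mine, not quoted from anyone): a midrange dGPU on a 256-bit GDDR6 bus versus an APU sharing dual-channel DDR5 with the CPU.

```python
# Peak memory bandwidth: bytes per transfer times transfer rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s for a memory bus."""
    return (bus_width_bits / 8) * data_rate_gt_s

# Illustrative, assumed part specs:
dgpu = bandwidth_gb_s(256, 16.0)   # 256-bit GDDR6 at 16 GT/s -> 512.0 GB/s
apu  = bandwidth_gb_s(128, 4.8)    # dual-channel DDR5-4800   ->  76.8 GB/s

print(dgpu / apu)  # roughly 6.7x more bandwidth for the dGPU
```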
  • mode_13h - Sunday, February 6, 2022 - link

    > the critical gains in 3D were laid down in the late '90s and early 2000s

    I don't agree with that. Most of my gaming experience was from the PS3, where there seemed a huge difference between the visual fidelity of the early titles vs. the late ones. The PS4 then took it to another level.

    And looking at PC gaming, even Crysis looks incredibly dated, compared to current AAA games. Plus, don't forget that's Crysis running on today's hardware, so it's not even a fair comparison.
  • GeoffreyA - Sunday, February 6, 2022 - link

    As you can see, I am terribly uninformed and out of touch with gaming and the graphics industry, and perhaps was judging it not from facts but perception. What I saw was a difference in degree, not kind, whereas in the old days, going from say Quake III to Doom 3, it was evident new technology was at work.

    I must say, I hadn't heard of path tracing, and of PBR only in passing as a term, but path tracing seems particularly intriguing to me. Just the sort of thing games are in want of. As for the motion picture industry, I can't resist commenting that films have never looked more fake than today. They should just do away with actors altogether and let ILM make the whole film in the studio.
  • GeoffreyA - Sunday, February 6, 2022 - link

    (To cite just one example, the scale models of 1968's "2001: A Space Odyssey" make a laughing stock of today's hi-tech CGI gunk in movies.)
  • Kangal - Sunday, February 6, 2022 - link

    You guys are an odd bunch.
    You started out debating x86 vs ARM, which morphed into a power-draw debate, then into a graphics debate, which then morphed into tech history. There's no real consistent argument here.

    At the end of the day, the hardware matters but it means nothing if you don't have the software for it. There is code that runs on both x86 and ARM. So this isn't even a CISC vs RISC problem.
  • mode_13h - Monday, February 7, 2022 - link

    > You guys are an odd bunch. ... There's no real consistent argument here.

    With so few articles being published on this site, the situation is ripe for discussions to meander. As long as it's good-natured and topically adjacent, I don't see a problem.
  • Kangal - Monday, February 7, 2022 - link

    I guess.
    But all this postulating is due to two big factors:
    - x86 performance and efficiency plateau
    - ARM performance and efficiency exponential growth

    Hypothetically, suppose ARM SoCs were today still floundering around the Cortex-A17-to-Cortex-A57 level of performance and efficiency, whilst both Intel and AMD had managed to develop their processors to keep the relative performance at only one-third of the power requirements. In other words, suppose they had flipped.

    The landscape would be vastly different. People wouldn't care much about iPhone and Android as they do today. We wouldn't see any pressure to have macOS or Windows running on ARM. And people wouldn't be that interested in porting emulators to ARM.

    Instead, there would be efforts to try to build a modern smartphone with x86 internals. That way it becomes omni: it can do your light tasks and calls, but could get docked in your office or home, where it transforms into a full-fledged working PC, or get docked to your TV and become a gaming PC.

    Since we are where we are, we need to acknowledge the limitations of such ambitions (either port everything to ARM, or go all-in on cloud computing). It's a bit meaningless to discuss how x86's evolution in gaming applies to the current state of ARM.
  • GeoffreyA - Tuesday, February 8, 2022 - link

    Popular wisdom has it that ARM, in all its shiny glory, is the future of computing, heaps of performance just waiting in its Apple-run tanks. But Nature shows that everything reaches a diminishing return, and why not ARM too, if it hasn't done so already? Apple Silicon is fast, we admit, but Intel and AMD aren't very far behind. In the larger picture, there isn't much of a difference. I think what's happening is that computing is reaching a limit, under the current scheme of things, and no amount of shuffling will change that. It's physically evident in manufacturing, and doubtless computation, too, has some obscure limit, consistent with the world we live in.

    As for cloud computing, it's had brilliant success and is great, but people need to ponder what it means to have the world's calculations centred in the hands of a few big companies.
  • mode_13h - Tuesday, February 8, 2022 - link

    > Nature shows that everything reaches a diminishing return, and why not ARM too

    I think x86 will run out of gas before more modern ISAs, like ARMv9 and maybe RISC-V. Kangal's point about the inferior perf/W of x86 is good evidence of this fact.

    Let's wait and see what Qualcomm/Nuvia can actually deliver. Mediatek has also promised to deliver ARM SoCs with Nvidia graphics for small form factor NUC-like desktop machines. That should prove interesting.

    On the RISC-V front, Intel just joined the foundation as a Premium member. So, that deepens their investment in the ISA.
  • Kangal - Tuesday, February 8, 2022 - link

    And ages ago Intel was also the owner of Moblin, and was collaborating with Nokia to make a Linux distribution that was open-source, modern, and ready for the next generation of devices (e.g. iPhone 4, iPad 2, ThinkPad Helix, etc.). They called it MeeGo. What ended up happening is that development was stifled, much to the dismay of Nokia, who ended up releasing a bastardised version on the Nokia N9 and N950. Then the entire project (MeeGo) was scrapped, Nokia was sold to MS, and Intel was happy to continue its partnership with Windows.

    So Intel joining RISC-V is not a good thing. In fact, I see it as a negative.

    As for the future of computing?
    I think ARMv9 will soon surpass AMD and Intel with higher performance AND lower energy drain, on all markets: watches, phones, tablets, laptops, desktops, servers. However, it won't matter much for AMD or Intel, at least not for another 5-10 years. They can still push forward to 128-bit, 3D Stacking, better lithography binning, overclocking, and stronger cooling. So there's still some runway left.

    My postulation?
    We will hit the atomic limits of silicon, and will have to move away from processing with electrons to photons/light, using miniaturised lasers. And/or we will move away from binary computing, whether into quantum computing or into ternary computing, aka base-3. So instead of an on/off state, it will be a negative/neutral/positive charge state: a trit instead of a bit, or a tryte instead of a byte. In particular, 27-trit computing (higher efficiency) and 81-trit computing (higher performance) will be the targets in the coming decades, when we've run out of runway on x86, silicon, and binary computing.
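    For what it's worth, ternary hardware has been built before (the Soviet Setun machine used balanced ternary), and the encoding itself is easy to sketch. A toy example of mine, purely illustrative: representing integers with digits -1/0/+1, i.e. the negative/neutral/positive charge states described above.

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Encode an integer as balanced-ternary digits (-1, 0, +1), most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 0:
            digits.append(0)
        elif r == 1:
            digits.append(1)
        else:              # remainder 2 becomes digit -1, with a carry
            digits.append(-1)
            n += 1
        n //= 3
    return digits[::-1]

def from_balanced_ternary(digits: list[int]) -> int:
    """Decode balanced-ternary digits back to an integer."""
    value = 0
    for d in digits:
        value = value * 3 + d
    return value

print(to_balanced_ternary(5))   # [1, -1, -1], i.e. 9 - 3 - 1
```

    One nicety of balanced ternary: negation is just flipping every digit's sign, so no separate sign bit is needed.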
  • GeoffreyA - Wednesday, February 9, 2022 - link

    Using photons could be the next step; but on the other hand, electrons already are able to reach a sizeable piece of the speed of light, so at best, light-based CPUs might only be marginally faster and come with tradeoffs of their own. I also think there's a practical limit to computation itself, and we're now scraping the surface of that coffin. If the universe is being computed, we can't operate faster than that.

    Will quantum computers replace classical ones? For my part, I doubt it, the two excelling at different tasks: QCs in the realm of probability, and classical computers where exact answers are needed. In every quantum thing, there always seems to be a classical touchstone, which shows that these two realms operate at different levels.

    My computer science is rather weak nowadays to hazard a guess on the base-3/binary question, but my feeling tells me it would complicate things a great deal, be slower, and goodness gracious me, what an upheaval in so much code!
  • GeoffreyA - Thursday, February 10, 2022 - link

    "electrons already are able to reach a sizeable piece of the speed of light"

    Realised this mistake only afterwards, that they're certainly not going to reach such velocities in silicon.
  • mode_13h - Thursday, February 10, 2022 - link

    > electrons already are able to reach a sizeable piece of the speed of light

    The promise I've long heard about optical computing is that photons don't interact. This supposedly enables optical computing to scale up better than electronic computers. However, I don't know if this applies to modern approaches to optical computing.
  • GeoffreyA - Thursday, February 10, 2022 - link

    Being both massless and force carriers, it makes sense photons behave differently from ordinary matter.
  • mode_13h - Wednesday, February 9, 2022 - link

    > Then the entire project (MeeGo) was scrapped

    Probably because of Android, if we're being honest.

    > Nokia was sold to MS

    Probably because of iPhone capturing the high-end phone market and Android leading to commoditization of the bottom end.

    > So Intel joining RISC-V is not a good thing. In fact, I see it as a negative.

    An OS distro is very different than an ISA spec. There are already enough members in the RISC-V consortium that I doubt Intel could gum up the works if it tried. It's also very late to the game.

    > it won't matter much for AMD or Intel, at least not for another 5-10 years.

    Maybe not in some markets, like desktops and some laptops, but Intel and AMD will suffer in the server market if CPU cores like the N2 deliver on their promises.

    > They can still push forward to 128-bit, 3D Stacking, better lithography binning,
    > overclocking, and stronger cooling.

    128-bit what? ARM-based SoCs can do 3D stacking and have access to the same TSMC lithography. Overclocking just hurts perf/W even more. Not sure why you think Intel or AMD have any advantage in cooling.
  • ikjadoon - Wednesday, February 2, 2022 - link

    1) Because Qualcomm, Mediatek, and Samsung offer "good enough" performance for far less power. The Surface line-up is prime for this. AMD & Intel could have done better, but they clearly haven't, even on newer nodes. And would they ever honestly care enough?

    2) Discrete graphics, RAM, disks, and expansion all work with Arm. What on Earth does an ISA have to do with expansion and flexibility? Plenty of socketed Arm CPUs exist.

    3) Why would there be any segmentation when high-performance emulators exist? Developers that don't or won't change can keep churning out x86 code until they die.

    4) Why on Earth would you need to dumb down a game? What is with all these non-sequiturs? What does an ISA have to do with .... game design?
  • ChrisGX - Monday, January 31, 2022 - link

    >>Microsoft has zero incentives to push Windows on Arm...<<

    And yet, that is what is happening, albeit not at a pace that meets your expectations.

    Microsoft has expressed its support for Windows on ARM on multiple occasions. Panos Panay went to the trouble of publicly restating Microsoft's position on WoA at the 2021 Snapdragon Tech Summit Day 2 keynote. (Go to 0:50:00 on the video timestamp):
    https://www.qualcomm.com/company/events/snapdragon...

    Further announcements about ongoing collaboration between Qualcomm, Microsoft and several PC manufacturers -- HP, Lenovo, Acer and Asus -- were made at CES earlier this year. (Go to 13:33 and 16:50 on the video timestamp for relevant sections of the CES address):
    https://www.qualcomm.com/company/events/ces
  • ikjadoon - Wednesday, February 2, 2022 - link

    They are not "pushing" Arm. They are merely following Qualcomm's lead (which is a small lead to begin with!).

    Not my expectations: what exactly did you think Windows RT was? Who created that expectation?
  • mode_13h - Monday, January 31, 2022 - link

    > Things Microsoft should do, but won't

    Why do you seem to think it's Microsoft's job to usher in the era of ARM PC? It's not. That's on ARM and their SoC partners to bring compelling chips to market.

    Until then, don't expect Microsoft to get too far ahead of the market. I think Microsoft just wants to keep enough of an effort going that they can ramp it up if/when ARM ever really takes off.

    Remember, Microsoft is fundamentally a software company. They will use & support whatever hardware makes the most sense, at the time.
  • iphonebestgamephone - Wednesday, February 2, 2022 - link

    So they'll just have to get strong enough to emulate all Windows apps and games? If running native, they are already plenty strong.
  • ikjadoon - Wednesday, February 2, 2022 - link

    Because Microsoft does want Windows to run on Arm: Windows RT was 100% Arm and Windows 10X did have allowances for Arm applications (though it should've been 100% Arm native).

    My main gist here: Microsoft has interest in Arm, but its execution has been poor.

    Microsoft, even as a software company, does understand that software + hardware need to be developed together, e.g., their impetus to launch the Surface lineup (x86 hardware + x86 software).

    Microsoft seemingly has 3 out of 5 puzzle pieces, but the last two (corporate will + compelling chips) appear insurmountable.
  • GeoffreyA - Wednesday, February 2, 2022 - link

    Microsoft's toolsets have ARM support seemingly on the same level as x86-64. I'd expect one can compile their Windows application to ARM with minimal change. Seems they've got everything ready in case computing goes over to ARM.
  • domboy - Tuesday, February 1, 2022 - link

    Most of the issues with Windows on ARM were solved with Windows 11 (e.g., the lack of x64 app support). Yes, I'd like to see much better vendor support, but it's really a chicken-and-egg situation, since unlike Apple, Microsoft isn't trying to ditch x86 support. As someone who had a Surface RT, I can say Microsoft definitely did a much better job with Windows on ARM this time around. It's not locked down like the ill-fated Windows RT.

    Yes, I do think it'd be great to see cheaper devices, and I still think a Surface Go with an ARM chip would be a good opportunity. That said, with the introduction of the Surface Pro 8, Microsoft has re-positioned the Surface Pro X to be the cheaper Pro device.
  • The0ne - Monday, January 31, 2022 - link

    Seriously, wtf. Why is this question not being asked when it's so fcking crucial? Who the fck cares if it's Gen fcking 1 million or something when it overheats, throttles, and shuts down. This isn't just a Qualcomm issue; Google and their pathetic overheat-then-shutdown issue is also a problem, and those aren't even using the top chips.
  • tkSteveFOX - Monday, January 31, 2022 - link

    I think we can blame this mostly on the Samsung process. The Dimensity 9000 already looks like the best-performing and most efficient of the new Android premium SoCs. Yes, it may score 10-15% lower in games, but who cares? That GPU power is not needed, and all of us would take the 20-30% better efficiency as a trade-off.
    With Huawei's Kirin SoCs gone, the market has stagnated. Hopefully, MTK can take Kirin's place.
  • nandnandnand - Monday, January 31, 2022 - link

    Only Nuvia can save Qualcomm now.
  • ToTTenTranz - Monday, January 31, 2022 - link

    Qualcomm entering the discrete GPU market would be really interesting. We're in dire need of more players, especially as fab capacity catches up within the next 5 years.
  • mode_13h - Monday, January 31, 2022 - link

    > We're in dire need of more players

    Nvidia, AMD, Intel, and Imagination aren't enough? That's already going from 2 -> 4.

    Four is pretty unstable. Usually, markets settle out at 2-3 main players.
  • jamesindevon - Monday, January 31, 2022 - link

    Trying to read a little bit between the lines of this:

    > Nuvia ... first instance of that in a PC chip ... Then we're going to think about how to bring
    > it into mobile, automotive, and then infrastructure application as well.

    That Nuvia-based infrastructure CPU is going to look an awful lot like a server CPU, but in an area where Qualcomm know they can compete, giving them volume and income. It would be a lot easier to break into the server market from there.
  • Silver5urfer - Monday, January 31, 2022 - link

    So Qcomm will compete against the 2023 Apple M-series processor? Lol. That gives me a big laugh. First they say they love how mobile is coming to PC, which is what ruined the damn PC with all the cancerous BGA trash infestation and Windows 11 going mobile-UX, to the point of making core Explorer and shell downgrades on top of the taskbar changes.

    Also, it's an SoC, not competing with discrete processors. No wonder AMD and Intel would hemorrhage this garbage in 2023: Intel has EMIB set up, with TSMC 3nm ready and chiplet dies already sampling, and AMD has a full plan laid out for complete dominance in server leadership on both high performance AND high thread density (Genoa and Bergamo respectively) on Zen 4 for 2022 and 2023.

    So much talk they do. The past 2 generations of Snapdragons were overheating junk: too much performance with zero efficiency, just made for benchmark BS. In the real world these past 2 generations of processors have the worst battery life; pair that with the extra useless mmWave junk baggage, making the consumer a beta tester for their half-cooked mess. The SD888 is the worst to date. How about they first put some money into CPU R&D, like after the 810 disaster, and make a worthy CPU with real efficiency that helps battery life and gives consistent performance? Then we can talk about the horrible mobile BGA garbage technology and the ARM emulation nonsense for x86, universally the best backward-compat tech. Oh well, how can they do that when they have to make consumers pay yearly for these. Not just Qcomm: Apple and Google refresh their OS and HW every damn year and push this. Qcomm is actually better on the development side but very poor at handling the overall package.
  • Ian Cutress - Tuesday, February 1, 2022 - link

    Qualcomm purchased a company built from the lead engineers of the Apple M-series.
  • Silver5urfer - Tuesday, February 1, 2022 - link

    With all respect Ian, they have been very disappointing for a long time now. Also, that Nuvia graph was way over the top, more so than Apple's marketing.
  • Ian Cutress - Wednesday, February 2, 2022 - link

    By "they", I assume you mean Qualcomm. Well, that doesn't really matter given Nuvia coming in and the expectations for their core. I also didn't much like the graph, but I'm looking at the people doing the work and making the decisions. I remain hopeful.
  • Silver5urfer - Wednesday, February 2, 2022 - link

    Yeah I meant Qcomm. Okay, that's a fair take.
  • Kangal - Thursday, February 3, 2022 - link

    I think your hunch is correct, Silver5urfer.

    Qualcomm doesn't have to compete with Apple; they just need to make sure they're not complacent or falling behind, or consumers and OEMs would start complaining. The only time Qcom starts offering discounts and better chipsets is when they have competitors lighting their bottoms on fire. They're very Intel-like in that respect.

    In case you're interested, I was thinking about this topic the other day, and I listed all the flagship processors Qcom has released, ordering them from most competitive to least competitive: competing not just against other chip manufacturers, but also against itself.
  • Kangal - Thursday, February 3, 2022 - link

    Best/most competitive chipset:
    QSD 800 (Krait-400), Qcm S4 Pro (Krait-200), QSD 801 (Krait-400), QSD 855 (Cortex-A76), QSD 855+ (Cortex-A76), QSD 820 (Kryo-100), QSD 821 (Kryo-100), QSD 805 (Krait-450)
    ...
    Qcm S4 (Krait-200), QSD 845 (Cortex-A75), QSD 870 (Cortex-A77+), QSD 835 (Cortex-A73), Qcm 600 (Krait-300), QSD 860 (Cortex-A76), QSD 865 (Cortex-A77), QSD 865+ (Cortex-A77)
    ...
    QSD 888 (Cortex-A78), QSD 888+ (Cortex-A78), QSD 808 (Cortex-A57), QC 8g1 (Cortex-A710), QSD 810 (Cortex-A57), Qcm S1 (Scorpion-v1), Qcm S2 (Scorpion-v2), Qcm S3 (Scorpion-v2+)
    Worst/least competitive chipset:
  • Silver5urfer - Friday, February 4, 2022 - link

    Yep, you are correct. Qcomm only manages to give that basic bump, nothing else. And that list is one I would definitely agree with.
  • domboy - Wednesday, February 2, 2022 - link

    > ARM emulation nonsense for x86

    I'm assuming you're referring to the x86 translation in Windows on ARM? What's wrong with it? It was a bit buggy at first, but at this point it works pretty darn well. So does the new x64 emulation in Windows 11. No, you can't emulate drivers, and that's really the one weakness: if an app tries to install some kind of driver (e.g. VPNs, or possibly anti-cheat add-ons for games), it won't work.

    Sounds like you've got some beef with Qualcomm in general. I couldn't care less what they do in the mobile space, but I hope they keep improving their chips for PCs.
  • Silver5urfer - Wednesday, February 2, 2022 - link

    Aside from Qcomm and Exynos, there's not a single ARM processor I'd spend my time talking about, because only these two are proper about releasing the kernel source. Pixel does too, but that hardware is an utterly buggy disaster with horrible anti-custom-ROM crap baked in.

    I do not hate Qcomm; I hate how the PC is being forced to become mobile, with useless use-and-throw machines festering the entire computing landscape, and BGA soldering taking things south: soldered SSDs, WiFi cards, DRAM, etc.
  • TheinsanegamerN - Thursday, February 10, 2022 - link

    Consoomers want their super thin integrated devices. What did you expect would happen?
  • GeoffreyA - Wednesday, February 2, 2022 - link

    "Windows 11 going mobile UX to the point of making the core explorer and shell downgrades on top of the taskbar changes"

    Unfortunately. I reckon people will have to stay on 10 as long as they can. 11's a bit like ME and 8. Let's hope 12 brings sense back to Windows.
  • GeoffreyA - Wednesday, February 2, 2022 - link

    "ARM emulation nonsense for x86"

    I think most applications still under development will be ported to ARM. It's just a matter of some changes here and there and recompiling. Notice that 7-Zip and WinMerge, and I suppose others, have ARM builds of late.
  • TheinsanegamerN - Thursday, February 10, 2022 - link

    We've been hearing that for 3 years now. Still waiting.
  • GeoffreyA - Thursday, February 10, 2022 - link

    True. Could be difficulty or lack of interest. Getting it to compile should be easy. The tricky business will be with integers and bit manipulation, where ARM has some nuances.
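    One concrete example of the kind of nuance meant here (a minimal C sketch of a classic porting pitfall; the ABIs named are illustrative: plain char is signed on x86-64 SysV but unsigned on AArch64 Linux, for instance):

    ```c
    #include <stdio.h>

    int main(void) {
        /* Whether plain 'char' is signed is ABI-specific: signed on
           x86-64 SysV, unsigned on AArch64 Linux. Code that assumes one
           or the other can silently change behaviour when recompiled
           for ARM, even though it compiles cleanly. */
        char c = (char)0xFF;
        printf("plain char holds %d on this ABI\n", (int)c); /* -1 or 255 */

        /* Portable fix: spell out the signedness you actually need. */
        signed char sc = (signed char)0xFF;     /* -1 everywhere */
        unsigned char uc = (unsigned char)0xFF; /* 255 everywhere */
        printf("signed: %d, unsigned: %u\n", (int)sc, (unsigned)uc);
        return 0;
    }
    ```

    Similar traps lurk in char-indexed lookup tables and implicit integer conversions, which is why "just recompile" occasionally isn't quite enough.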
  • yeeeeman - Monday, January 31, 2022 - link

    Ian, you didn't ask about the shitty thermals and power consumption of the SD 8 Gen 1, or about the rumours circulating of a Plus version made on TSMC.
  • Ian Cutress - Tuesday, February 1, 2022 - link

    The interview took place before the S8G1; our preview of S8G1 performance wasn't even out yet. This interview was before the first preview benchmark sessions.
  • yankeeDDL - Monday, January 31, 2022 - link

    The Snapdragon 888 and 8 Gen 1 are so far behind Apple's A15 that Nuvia has got to be their "last" chance for a significant step forward. It's astonishing how a mammoth the size of Qualcomm, with a massive head-start, has fallen behind so dramatically in performance/power ratio, to be trounced so miserably by Apple's A15.
    The 888 was a hot mess, hardly any better than the 865. The 8G1 makes no progress on the CPU side, so they are, effectively, stuck on 2-year-old performance that was already significantly behind Apple.
    Chinese competition has caught up and, to a certain extent, surpassed Qualcomm's flagships.
    They need a 3-4 generation jump compressed into one. Not easy. I'd say impossible to achieve in-house, given their lousy track record.
    So I think Nuvia's SoCs are a huge deal, and they know it. If they fail, I wouldn't say they are screwed, because they are still so dominant, but, boy, they'd better get their act together, or they'll end up like Intel was to AMD for the past 5 years.
  • Wereweeb - Monday, January 31, 2022 - link

    1) It's clear to me that Qualcomm thinks too highly of itself. That level of arrogance is only tolerable when it's coming from Intel (Who are indeed too big for their own good) and Apple (Because they subsist on the cult they've gathered around their image).

    2) No one except millionaires and triple letter agencies want to see computers becoming "more like mobile", i.e. unrepairable garbage made to break in five years, filled with hardware backdoors, information harvesting and DRM.

    3) I only see Qualcomm discrete GPU's if they go hard in the server market. There's no reason to enter the cursed market of consumer dGPU's unless you're looking to get market share in the hope that'll get all kinds of software people better acquainted with your hardware.
  • mode_13h - Monday, January 31, 2022 - link

    > No one except millionaires and triple letter agencies want to see
    > computers becoming "more like mobile",

    Corporations do. My employer switched from buying machines to getting them on 3-year leases. They see old hardware as a security liability and one that forces them to employ more capable (i.e. expensive) IT staff. Rather than let employees install upgrades, they'd rather you just get a whole new machine. They also want you to put all your files in the cloud, so they don't even have to maintain any servers.

    The irony in all of this is that my work-from-home laptop had horrible stability problems that only got sorted out more than 2 years into its lease. I don't want a newer, faster machine if it means going back to that same instability I had before.
  • Silver5urfer - Monday, January 31, 2022 - link

    Workstations are laptops for the most part, and nobody is upgrading them; they are all BGA. The point here is that x86 machines have configurable options for the SSD, RAM, CPU, and dGPU (Quadro, or iGPU for thin-and-lights). I used an Ivy Bridge ThinkPad in 2017; it was super slow, but it worked, and when it developed an issue the M.2 made it easy to port all the data. Later I was upgraded to a Xeon Dell Precision workstation. Nobody says put everything in the damn cloud; most corporations use OneDrive integration, and it's optional. With a locked-down BGA junkyard nothing is possible at all: more e-waste when these companies dump them, like Apple does and like Qcomm will once they make theirs.

    Qcomm trash is not fit for the Enterprise at all in the first place, for many reasons. On top of that, ARM-to-x86 emulation is garbage even for Apple, and MS cannot nail it due to the sheer dependency of the world's computer systems on x86 and Windows.
  • mode_13h - Tuesday, February 1, 2022 - link

    > Workstations are laptops for the most part and nobody is upgrading them.

    First of all, many of us have non-laptop machines, and the leasing policy applies to *everything*. Last I checked, the highest spec we could order was a dual 20-core Xeon W machine, but that was a while ago.

    Second, most of the laptops in the catalog have at least one DIMM slot and would have M.2 SSDs.

    > Nobody says put everything in the damn cloud,
    > most of the corporations use OneDrive integration and its optional.

    They don't back up the machines or let us buy servers, except for specific projects that require them. So, if you don't want to lose your data in the event of an SSD failure, etc., then putting your files on OneDrive is the only option.
  • bob27 - Tuesday, February 1, 2022 - link

    "So, if you don't want to lose your data in the event of a SSD failure, etc. then putting your files on OneDrive is the only option."

    Can you buy a thumb drive for backups through your company?
  • mode_13h - Wednesday, February 2, 2022 - link

    LOL, no. They somehow disabled USB storage devices from working.
  • mode_13h - Monday, January 31, 2022 - link

    > I would love to see a scaled Adreno GPU ...
    > Qualcomm could be an amazing fourth competitor in that space.

    Fifth. Imagination's IP is powering upcoming Chinese GPUs:

    https://www.techpowerup.com/289123/innosilicons-fe...

    > what color would it be? ... is an Adreno discrete GPU going to be gold?

    LOL. That's a cute angle.
  • blanarahul - Tuesday, February 1, 2022 - link

    We really need someone to make a good E core more than anything for smartphones. Cortex X1/X2 are simply too hot. Cortex A55/A510 have abysmal performance. A78/A710 are good but they don't seem to be low power enough to be E cores. We need a core that can manage 70-80% of A78's performance but can also operate at low enough power to replace the A55. Then throw 4-6 of these boys in a SoC for decent performance at decent power/heat.
  • nandnandnand - Tuesday, February 1, 2022 - link

    Qualcomm Nuvia cores.

    Meanwhile I'd like to see a successor to the A35, i.e. lower performance than A510.
  • iphonebestgamephone - Wednesday, February 2, 2022 - link

    When are you reviewing the new Dimensity?
  • Vitor - Wednesday, February 2, 2022 - link

    Lol @ Anandtech actually reviewing stuff nowadays
  • Dante Verizon - Friday, February 4, 2022 - link

    @Dr. Ian Cutress: Can you write an article summarizing the current state of the RISC-V architecture, its advances and challenges, etc.? Much is said about its revolutionary potential, but there are no products available to actually replace a simple office computer.
  • mode_13h - Sunday, February 6, 2022 - link

    > there are no products available to actually replace a simple office computer.

    That's because competing with x86 in its home territory is probably the pinnacle achievement, right now. There's nothing "simple" about it. Even ARM took until last year, if you count Apple's M1 Mac Mini. And if you don't, then ARM still isn't there!

    So far, people are using RISC-V where it's easiest to develop new processor cores and provides an immediate economic benefit. And that's in the embedded sector.
  • zamroni - Friday, February 18, 2022 - link

    Qualcomm needs to learn from Apple that a laptop APU doesn't need an on-die modem or a strong DSP.
  • mixmaxmix - Monday, March 21, 2022 - link

    When will the S22 Ultra review be uploaded?
