64 Comments

  • Crono - Friday, June 28, 2013 - link

    So basically ARM is like an author who chooses to work with a bunch of different publishers?

    It amazes me how dominant they are, though, in the mobile processor industry, all without having to manufacture chips themselves. 45 billion chips... wow. And I'm guessing there is still plenty of room to grow with more embedded chips in more devices.
  • airmanchairman - Friday, June 28, 2013 - link

    "It amazes me how dominant they are, though, in the mobile processor industry, all without having to manufacture chips themselves. 45 billion chips... wow."

    Their philosophy has always been centred on the needs of their client industries, which extend far beyond the mobile processor industry into the vast all-purpose market (street lights, lifts, automatic turnstiles etc). These industries strictly specified the maximum power output the chips could be capable of (used to be 700mW) and ARM designed within those parameters. As such their entire R&D/Logistics/Marketing focus has been perfectly suited to the battery-power-constrained mobile industry.

    Intel, on the other hand, after decades of dictating to the booming desktop industry with its faster/more cores/larger plants philosophy, is only just coming to grips with the power efficiency and discipline required to compete with ARM in the rapidly growing mobile sector.

    The new Haswell architecture shows the promise and potential that Intel may bring to the battle against ARM.
  • nadim.kahwaji - Friday, June 28, 2013 - link

    veryyyy nice post, anand i know that you are so busy, but we miss the podcast !!!!!!
  • blanarahul - Friday, July 26, 2013 - link

    Umm... Does Qualcomm also pay royalty for the chips that use Krait?
  • blanarahul - Friday, July 26, 2013 - link

    Is Qualcomm automatically eligible to use any of ARM designs since they are above the "Subscription Licence" level in the pyramid?
  • dealcorn - Friday, June 28, 2013 - link

    I could not follow the sense of your statement "From Intel’s perspective, it made the mistake of licensing the x86 ISA early on in its life, but quickly retreated from that business." I thought IBM required that Intel license x86 to competitors as a precondition of IBM's selection of Intel's 8086 for the IBM PC. Did Intel make a boo boo letting IBM use the 8086 in the IBM PC? It was reported that Intel's selection of x86 license partners was driven by the dual criteria that the partner must be acceptable to IBM and also likely incompetent to exploit the benefits of the license. In retrospect, their selection of partners achieved these goals. What was Intel's licensing mistake?

    The aphorism "Knitter, stick to your knitting" has long been appreciated as a useful business strategy. ARM is properly commended for its strict adherence to this truism in its attempted optimization of architecture design. Intel appears focused on SoC level optimization. Intel's holistic approach is more time consuming and capital intensive. Consumers will ultimately vote with their wallets which approach is better in the contested ultra mobile space.
  • SleepyFE - Friday, June 28, 2013 - link

    Intel licensed its IP to AMD, but AMD didn't throttle its CPUs so it would get more business from constant iteration of the same architecture with marginal improvements (Intel's tick-tock development model). AMD made the best use of what they got from Intel. At that time their (AMD's) CPUs were faster and they were eating away at Intel's business. Intel has since made a new architecture and did not license it to AMD. Consequently AMD fell behind and has not recovered since.
  • Wolfpup - Friday, June 28, 2013 - link

    AMD and Intel cross-license x86. It's not fair to claim x86 is Intel's thing alone, given how much stuff Intel has to license from AMD for it; the most obvious is 64-bit. The updated ISA is AMD's design, and Intel started using it a few years later.

    "At that time their (AMD's) CPU's were faster and they were eating away Intel's business. Intel had since then made a new architecture and did not license it to AMD. Consequently AMD fell behind and had not recovered since."

    I can't remember if AMD ever actually licensed chip designs; if they did, it wasn't recent, and it would have been something like the 486. Everything AMD has sold since then has been AMD's own design. When AMD was faster, that was with entirely AMD designs. And no, Intel doesn't license Core, but that's not new like you're saying; that's been the case since the 486 days (if those were even licensed), and it's not why AMD has fallen a bit behind. Intel was making horrible decisions with their CPUs, and AMD was designing better ones. AMD's stuff is still really strong/good; it's just that Intel is bigger, has thrown more resources at it, usually has a process node advantage, and since getting serious again has been able to field more powerful chips at the high end (though of course the reality is AMD's chips keep getting more powerful too, and are only "bad" in comparison to Intel's newest, and sometimes even then AMD looks better, like that platform comparison using integrated graphics where AMD's new $140 CPU was stomping on one of Intel's best i7s, often 50% better).
  • Arbee - Friday, June 28, 2013 - link

    AMD's clones up through the 486 were true clones of the relevant Intel parts. For the Pentium Intel stopped allowing that and AMD started making their own designs, which at first weren't all that good.
  • ShieTar - Friday, June 28, 2013 - link

    Even these days, AMD is somewhat forced to at least copy the Intel instruction set. At this point, Intel definitely profits from the situation: whatever they decide to include in their CPUs, AMD needs to support soon enough in order not to fall behind.
  • fluxtatic - Monday, July 1, 2013 - link

    The new arch Intel made was Itanium, as they were hoping to get away from x86. We all know how well that worked out.

    What put AMD behind was Intel's incredibly dirty tricks around the days the Athlon64 was stomping all over Intel - all the backroom deals with Dell and HP, among others, to keep AMD parts out of PCs. Intel would kick in significant amounts of money in both marketing dollars and straight (very illegal) kickbacks to companies that were willing to play ball.

    It ended up settling a couple years ago, with Intel paying around $2 billion, as I recall, but that was far too late for the damage done to AMD.

    And to Wolfpup's point - there's likely not even money changing hands between Intel and AMD these days - they've both got significant contributions to the x86 architecture that are covered by the cross-licensing agreement. However, likely a lot of the newer features Intel has developed aren't covered, and AMD has to clean-room reverse engineer them if they want to include them in their own designs.
  • Scannall - Friday, June 28, 2013 - link

    There were many reasons for them licensing x86 at the time. And there used to be a whole lot of licensees. AMD is just the last one standing. Other manufacturers were coming up with very good and compelling CPU architectures, and there was really no standard per se. Between contract requirements, and many companies using licensed x86 IP it became the 'standard'.

    So it looks to me like it was far from a mistake. Use all the licensees to get your architecture king of the hill, then grind them into the grave. Big win for Intel.
  • rabidpeach - Saturday, June 29, 2013 - link

    So ARM could be early Intel.
    1. Build up a cash pile through licensing
    2. Buy a fab
    3. Have it make your exclusive next-gen processor that isn't licensed
    4. Sell it and destroy all the commodity producers

    This strategy would need Intel to stay on the sidelines.
  • TheinsanegamerN - Sunday, June 30, 2013 - link

    Except AMD went and sold its fab.
  • slatanek - Friday, June 28, 2013 - link

    Great article Anand! It's actually something I have been thinking about for a while too (ARM). But I have to admit I miss the PODCASTS too! (I'm not the only one, it seems)
  • Callitrax - Friday, June 28, 2013 - link

    I'm not so sure the statement "In the PC world, Intel ... ends up being the biggest part of the BoM (Bill of Materials)" is all that accurate. I know it hasn't been for me for 10+ years; the largest part of my BoM is, I think, the same as a mobile device's: the display. Anybody running multiple displays or 1200+ vertical lines (which probably constitutes a large fraction of the readers here) spent somewhere between $300 and $800+, which means they would need an i7 at the low end or an i7 Extreme at the high end (+PCH) for Intel's cut to exceed the display cost (rough numbers sketched below). And for that matter, Samsung just topped the processor in my computer (SSD).

    * okay technically you could break the monitor BoM into payments to a few companies, but my panel I would guess still exceeds my cpu since the display was $600 on sale.

    ** and the economics get fuzzed up a little since desktop displays can last through several CPUs (mine is on its 3rd CPU and 4th graphics card) whereas mobile device displays are glued to the CPU.
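    Putting rough numbers on that (a minimal sketch; every price below is an assumption for illustration, such as the $600 panel mentioned above and a ~$220 Core i5, not a real bill of materials):

    ```python
    # Illustrative only: rough cost-share comparison for a desktop build,
    # using assumed street prices rather than real BoM figures.
    parts = {
        "display": 600,        # assumed, per the $600-on-sale panel above
        "GPU": 250,
        "CPU (Core i5)": 220,  # assumed retail price
        "RAM + PSU + case": 200,
        "SSD": 150,
        "motherboard": 140,
    }

    total = sum(parts.values())
    for name, price in sorted(parts.items(), key=lambda kv: -kv[1]):
        print(f"{name:<18} ${price:>4}  ({price / total:.0%} of the build)")
    ```

    On these assumptions the display is still the single biggest line item, and Intel's cut only overtakes it with a much more expensive CPU or a much cheaper panel.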
  • mitcoes - Friday, June 28, 2013 - link

    It is possible we will see future ARM64 models where the SoC is plugged in, maybe even with a standard connector/format, and we would be able to upgrade the SoC 2 or 3 times as we currently do with desktop computers. I suggested it to several brands, and I suppose they read the suggestions, and perhaps one will do it.
  • name99 - Friday, June 28, 2013 - link

    It makes zero sense to optimize for something that is cheap (and part of a cheap system). The days of replaceable cores are gone, just like the days of replaceable batteries. Complaining about them just reveals you as out of touch.

    There is good physics behind this, not just bitchiness. Mounting which allows devices to be plugged in and out is physically larger, uses more power, and limits frequencies and so performance. There's no point in providing it in most circumstances.
    Next to go (for this sort of reason) will be replaceable RAM. It's already not part of the phone/tablet/ultrabook experience, and at some point the performance limitations (as opposed to the size and power issues) will probably mean it's also removed from mainstream desktops.

    Complain all you like, but physics is physics. The impedance mismatch from these sorts of connectors is a real problem.
  • Spunjji - Friday, June 28, 2013 - link

    I think you'll find that for the vast majority of users it probably is. I know that I am a distinct minority in spending more than ~£120 on a monitor for a desktop PC, and notebooks are now the dominant segment in PC sales where the cost of the display is going to be reduced compared to a desktop unit.
  • name99 - Friday, June 28, 2013 - link

    Look at an MBA. That's a far more representative type of machine than a multi-screen desktop. That's what, $1200 or so, of which, what, $300-400 or so is Intel.
    No one else is close. The display is, what, maybe $100 at the most. The storage is maybe $200 at the most (being charitable in both cases).

    Lower end laptops which are still using Intel are cheaper and so the fraction is even higher.
  • madmilk - Friday, June 28, 2013 - link

    In the PC world, the usual display is a $70 (eBay panel pricing, which is definitely an overestimate of BOM costs) 1366x768 display, or maybe a $140 1920x1080 display for nicer laptops and desktops. Even the rMBP 15 panel is only $300. In comparison, Core i5 processors start at over $200, and are very common in anything mid-range or above.
  • lukarak - Friday, June 28, 2013 - link

    All of the iOS devices are ARM based as well. Apple just designs their own processors based on the ARM instruction set, like Qualcomm with Scorpion and Krait, and unlike Samsung, NVIDIA or TI.
    All WP devices are also ARM based.
  • aryonoco - Friday, June 28, 2013 - link

    I know it's OT, but since others have mentioned, I thought I'd add my voice as well: I miss the podcast. I REALLY miss the podcast :-)
  • Deepak Chamarthi - Friday, June 28, 2013 - link

    Nice info, thanks Anand. Indeed, companies like ARM need this type of dynamic model. Mobiles everywhere :-)
  • dishayu - Friday, June 28, 2013 - link

    So, who would the chosen 3 be? I can guess Qualcomm and Samsung. Which one's the 3rd? Apple? TI? nVidia?
  • PEPCK - Friday, June 28, 2013 - link

    Wouldn't be Qualcomm or Apple, as they only use the ARM architecture, preferring to build their own designs. TI is a partner for the A15, but is unlikely to be for future chips, due to their retreat from the high performance SoC business.
  • dishayu - Friday, June 28, 2013 - link

    Hmm... But then same logic should apply to Samsung and their Exynos chips as well.
  • Ryan Smith - Friday, June 28, 2013 - link

    Exynos uses standard ARM core designs though. Whereas Snapdragon and A6 (and beyond) are using custom core designs. There's a big difference between a custom chip using standard cores, and a chip using custom cores.
  • pluto7777 - Friday, June 28, 2013 - link

    Exactly what is this IP they sell? Is it for hardware or software because Mali is a giant black box. Intel came to realize they could do more good for the end users by embracing open source. It's a shame this idea completely escapes ARM.
  • ShieTar - Friday, June 28, 2013 - link

    ARM's main IP is the processor design. Intel has absolutely no plans to make their Haswell designs open source. They also still sell their compiler and high performance libraries, so I really don't see how they are "embracing open source". Care to clarify your statement?
  • Spunjji - Friday, June 28, 2013 - link

    I'm fairly sure they're talking complete nonsense.
  • name99 - Friday, June 28, 2013 - link

    And it's not even Haswell's design. Intel produces a ton of software (like ICC and their math libs) which aren't open source...
  • pluto7777 - Friday, June 28, 2013 - link

    True, but it's evident Intel is at least trying. Even AMD has made some positive steps. Do yourselves a favor and look into Lima. ARM apparently doesn't like people reverse engineering their stuff.
  • pluto7777 - Friday, June 28, 2013 - link

    Intel has open source drivers on Linux. They actually employ people to work on this, unlike other companies who only have closed source drivers and force other people to reverse engineer everything if they want an open driver.
  • iwod - Friday, June 28, 2013 - link

    Seeing as both IMG (MIPS) and ARM are IP licensing companies, do both companies' business models work similarly?

    And when will AnandTech do a piece on the newly announced MIPS v5 "Warrior"?
  • pluto7777 - Friday, June 28, 2013 - link

    That's another true loss. I can't imagine how anyone could want anything from Imagination Technologies. I hope they only get Microsoft support and don't try to leech off the open source community through some loophole like Android does.
  • munsie - Friday, June 28, 2013 - link

    I assume you mean the MIPS processor, not their GPUs? Because Apple has been shipping Imagination Tech GPUs since the first iPhone and Intel used them originally for their integrated graphics.
  • ShieTar - Friday, June 28, 2013 - link

    "In the PC industry, the concept of a fabless semiconductor isn’t unusual."

    And most chemistry labs also work with fabless semiconductors. I think this sentence is missing a word ;-)
  • Krysto - Friday, June 28, 2013 - link

    "It must frustrate ARM just how much attention is given to Intel in the ultra mobile space, especially considering the chip giant’s effectively non-existent market share"

    Well, it helps to have people like you Anand, who make baseless statements such as "Haswell will take over tablets and beat ARM", a whole year before you even know the details about the Haswell architecture or review it.

    Maybe if the media didn't fall for Intel's misleading marketing and press releases so easily, Intel wouldn't get as much attention with nothing to show for it.
  • cnxsoft - Friday, June 28, 2013 - link

    Interesting that no growth at all is forecast for "Desktop PC and Servers" up to 2017. What about ARMv8 based servers? Or are those in another category (Networking?).

    I also thought ARM had a bigger market share in micro-controllers, but 8-bit and 16-bit designs still have a significant market share, so I guess that's partly why.
  • bill5 - Friday, June 28, 2013 - link

    Anand might find it interesting that according to a Forbes article, ARM was one of the two finalists for Sony/MS next gen consoles...

    They supposedly had a performance "bake off", and decided on the Jaguars. With the caveat that although ARM would not be ready with enough performance in time for PS4/XB1, they would be soon after. So, evidently they just missed.
  • bill5 - Friday, June 28, 2013 - link

    linkkkkk http://www.forbes.com/sites/patrickmoorhead/2013/0...
  • A5 - Friday, June 28, 2013 - link

    I believe it. I'm thinking A57 will match or beat Jaguar performance, but it wasn't going to be ready for a Q4 2013 mass production launch. Considering that the home console environment is particularly power-constrained, there was probably no reason to wait for ARM to be ready.
  • A5 - Friday, June 28, 2013 - link

    *is not, damnit
  • maybeimwrong - Friday, June 28, 2013 - link

    I've been a fan of AT since the 90s, and will undoubtedly continue to be; there's no better tech site out there. That said, I'm uncomfortable with these "Featured Reviews." Perhaps I'm making unwarranted assumptions, but I can't help but read that phrase as "this article was commissioned by a third party, and may remain at the top of the main page for longer than it would have in the absence of payment." I'm not interested in what advertisers think should be featured; I want to see what Anand thinks should be featured. I understand that advertising pays the bills, but there will be far fewer advertisers if readers stop believing in the integrity of the site. While I'm sure this reads as an overreaction, allowing money to dictate editorial content is a slippery slope. Anand, please be careful with these advertising programs.
  • Ryan Smith - Friday, June 28, 2013 - link

    In this case I can assure you that you're reading into "featured review" a little too much. Anand likes to use that label for his major industry articles; this article wasn't commissioned by anyone.

    It also won't remain at the top of the page any longer than any other article. All of our articles are bumped down sequentially based on date of publication.
  • maybeimwrong - Friday, June 28, 2013 - link

    Thanks for the clarification, Ryan. I was paranoid because some of the language in the advertising section seems to allow for the kind of scenario I described. I know articles are bumped down in order of publication, but that doesn't mean the timing can't be adjusted to feature some articles more than others. Keep up the good work :)
  • THF - Monday, July 1, 2013 - link

    Well, how is he able to publish the "please do not publish without approval" presentation slides then? Some degree of ARM approval must have been involved in this article.
  • Wolfpup - Friday, June 28, 2013 - link

    8.7 billion chips in one year... that's mind-blowing :-O And of course that isn't even counting all the multiple CPUs in many chips.

    I guess I should really be supportive of what they do, given their model is probably healthier than Intel's, but I'm still biased against ARM because of how low end their stuff is (not to mention that it's never been used in a really great PC before...I suppose since Android is technically open it could be considered a PC, but...yuck).
  • Qwertilot - Friday, June 28, 2013 - link

    Not never - they did start out in very decent computers. Not for a long while now, of course. Wonder if we'll ever see ARM Linux devices in any sort of real numbers.
  • dealcorn - Friday, June 28, 2013 - link

    As Android is a branded Linux, I think we are already there. The open source community is very supportive of ARM on Linux, but products available in the marketplace generally lack contemporary ARM chips at affordable prices. Linux clearly works on select non-cutting-edge ARM products such as the Raspberry Pi, but the performance falls short of mainstream appeal.
  • Qwertilot - Friday, June 28, 2013 - link

    That's the main issue I guess. That (seemingly very popular) Samsung Chromebook might be closest to counting. Suppose it's mostly whether anyone bothers putting a workable thing together, with all the interfaces you'd like on the SoC etc.
  • ShieTar - Friday, June 28, 2013 - link

    Sir, for the atrocity of calling the ACORN "not a really great PC" I will have to formally demand satisfaction from you! I propose we meet at the break of dawn, and duel by throwing Monkey Island style insults from 20 feet distance!

    The truth is, while Black-Hat-White-Hat thinking is fun in terms of your favorite electronic implementation of the overcomplicated abacus, it's never been very relevant for my own decisions in reality. I've grown up an absolute Commodore addict, both on the C64 and the AMIGA, only leaving an Apple IIe in for a bit of late-night programming (the Commodores were connected to the family TV, the Apple had a tiny monochrome monitor and was firmly located inside my own room). But once I joined university (College for you US guys) and I needed a cost-efficient platform for simulations and LaTeX editing, I jumped over to the PC platform quickly enough. And about 3 months after grudgingly admitting this "Wintel" thing into my home, I tremendously enjoyed myself playing Starcraft.

    Of course I still yell "AMIGA 4 Life" at anybody who proposes I use an Apple instead of an IBM compatible for my PC needs. Which confuses the hell out of everybody, but luckily building a satellite takes 5-10 years, so the team has plenty of time to get used to my antics.

    Still, when some new platform comes around that can do what I need to be done, I will end up buying it. I will complain, I will point out to people that "The Cloud" is really just FTP with a better front-end, and I will yell "AMIGA 4 Life" from time to time. I may be affected by PC-Tourette.

    So, when it comes to ARM, I'm just about to earn them another $0.40. I went to our local tech search engine (http://www.heise.de/preisvergleich for those not scared of German) and figured out that of the 933 tablets sold in Germany, there is exactly one that offers a 10 inch screen at a weight of 320g, with everything else starting at around 500g. Now I personally have exactly 2 tasks which really require me to own a tablet: I need something to read comics on without killing trees, which my Kindle won't really do due to a lack of pictures and colours. And I sometimes need a map when I visit a city which is not the one I live in. For both tasks, the exact kind of processor hardly matters. The screen resolution is a point, but not critical. Cameras are irrelevant. SD cards are nice, but not critical. And so on. So while the majority of tech sites goes on to discuss Apple vs. Samsung vs. Asus vs. I'm not sure what else, I will go and buy myself a "Cube U30GT". I don't know anybody who owns one, I have never read a review from a credible site recommending one, but I know how to use a search engine, and this thing turns up as the best thing for my exact needs.

    To come back to the point:
    Intel engineering is probably "better" than ARM, but I will buy an ARM device next.
    Apple and ASUS are VERY likely better overall designers of tablets than Cube (If that is even the company name), but I will buy a Cube product next.
    I have a specific need, so I will buy a specific product.

    Mmh, and while on the subject, a somewhat random but remotely relevant comment on your post, Mister Wolfpup: "Long-Forgotten Guy from Sumeria" has found and refined the grain of wheat, of which we now trade 650 million tons each year (whatever that means in actual number of corns. 650 trillion corns? Maybe.)

    So, my point? Stop obsessing who is best, or most relevant, and let your voice be heard on what you want and need. This is how we progress, by understanding the needs of people and the capabilities of science and engineering. Random comments on "I won't buy your stuff because people don't buy your stuff" don't really progress anything.

    So, while we're at it: I want a 200g ASUS-Infinity-Tablet to go on a Keyboard-Dock. And I want that dock to have a big battery and a decent 2.5" SSD included, and nice mechanical keys and a decent number of USB-ports.

    Thanks.
  • hazydave - Tuesday, July 2, 2013 - link

    Nice to read the Amiga love... but keep in mind, real Amiga development at Commodore pretty much stopped in 1993, given the troubles with mismanagement and other crimes. The great effect of the x86-PC was democratizing the personal computer, at least at the system level. Anyone could (and can) build their own. Heck, even Apple's Macintosh evolved into a bog standard PC.

    ARM has managed to take that to the next level, by basically doing the same thing for chip designers that Intel and AMD and the huge industry behind them did for system integrators. I doubt that Amiga would have gone x86 in 1994, had Commodore been healthy, but it would have moved off 68K, that's for certain -- the successor to the Amiga 3000/4000 architecture was CPU neutral at the system level, and the custom chips were headed that way as well. But certainly, had things lasted, the Amiga would be x86 today. And maybe looking at ARM versions as well.

    I actually feel a little better about running Android on my Galaxy Nexus or Asus Transformer Infinity (128GB total storage... plenty for Android, you don't really need a larger SSD) than Windows 7 on my "home made" PC. Microsoft WAS the Evil Empire; I'm pretty sure they're still Evil, but not sure about the whole Empire thing anymore. ARM, Asus, Linux and Google... much less so.

    But at the end of the day, you have stuff to do on that PC. It took me a little while to figure out the whole retro-computing thing, but it's the basic idea that, in the early home computer days, certainly when the C64 and Amiga first shipped, the computer WAS your hobby. You bought a computer to "do computing"... probably writing some code, sure, some store-bought programs as well, but the central focus was that computer.

    That hasn't been the case for ages, though. Unless you're buying a Raspberry Pi or some-such, you're probably buying a PC as an engine for something else: web entre, CAD, music, video, gaming, office automation, etc. A few are still coding, of course, but even that's probably more about a specific project than learning every tiny detail of a specific bit of hardware and code. The PC itself is an easily replaceable part. And I think, at least in part, if you're nostalgic for Commodores or Amigas or Apple ][s, you're missing that bit of "soul" that emerged in exploring those machines' depths.
  • mali_07 - Friday, June 28, 2013 - link

    Intel's brand new Haswell is benchmarked against the A9, which has been in the market for like 3 years, and on an Intel-optimized platform at that, giving misleading data. Can Intel now dare to benchmark it against the Cortex A12, which is 30% faster than the A15?
    What about the A50 series, which will be 3x faster than the present series of Cortex? Intel has only survived owing to its money, which it uses on ads through you guys.
    ARM in fact is one generation ahead of Intel.
  • name99 - Friday, June 28, 2013 - link

    Oh for crying out loud.
  • ShieTar - Friday, June 28, 2013 - link

    You need to be more vocal and explicit in your responses. I have extracted almost no amusement at all from your response. I suggest you go ahead and outright call this person an idiot. Or go for imbecile; he seems to have problems counting to 15 anyway. Or maybe you want to go and make my day and declare him a "Cortex-loving son of an ARM-brained RISC-lover".

    Seriously, kids today just don't know how to start a serious riot anymore.
  • zifuxyx - Friday, June 28, 2013 - link

    ARM will finally be defeated by Intel.
    There is no reason to doubt it, it's just a matter of time.
    In 1 or 2 years?
  • 3dcgi - Sunday, June 30, 2013 - link

    Anand, do you have info on how the cost of an architecture license differs from a regular IP license? I assume it's a higher upfront license cost and lower royalty cost, but haven't seen this specified anywhere. Thanks.
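    For a sense of why that split matters, here is a back-of-the-envelope sketch; ARM does not publish its fees, so every upfront, royalty, and ASP number below is invented purely for illustration:

    ```python
    # Hypothetical comparison of the two licensing models; all figures are
    # assumptions for illustration, not ARM's actual (undisclosed) terms.
    def total_paid(upfront_fee, royalty_rate, units, asp):
        """One-time license fee plus a per-chip royalty on the selling price."""
        return upfront_fee + royalty_rate * asp * units

    asp = 20.0  # assumed average selling price per chip, in dollars

    for units in (1e6, 10e6, 100e6, 1e9):
        core_ip = total_paid(upfront_fee=5e6, royalty_rate=0.020, units=units, asp=asp)
        arch = total_paid(upfront_fee=15e6, royalty_rate=0.013, units=units, asp=asp)
        print(f"{units:>13,.0f} chips: core IP ${core_ip / 1e6:6.1f}M vs architecture ${arch / 1e6:6.1f}M")
    ```

    On assumptions like these, the higher-upfront / lower-royalty structure only pays for itself at very high volumes, which would fit with only the largest partners bothering with an architecture license.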
  • hazydave - Tuesday, July 2, 2013 - link

    I actually wonder how much of Intel's fortune is really within their control anymore. On the one hand, it's hard to imagine why Intel, the only company really making a consumer-oriented CPU that sells at retail for about $1,000 every year, is working so hard to match price, power, and performance against a market based on SoCs that OEM at $15.

    On the other hand, Intel's superpower has always been cash, and cash has always depended on volume. Intel clearly sees the Windows market contracting, and feels they need to make the jump to whatever's next. Android can of course run on Intel; it's actually been a standard thing for the last few versions. And that could be key... it's expected about 850 million Android devices ship this year, versus about 350 million Windows devices. Not as much cash, clearly (rough math below), but it does suggest that this doesn't have to happen for too many years in a row before Android replaces most desktop uses and desktops become a niche.

    Intel did a pretty good job destroying every niche processor within range of the x86. SPARC and Power still hang on at the high end, and AMD used their x86-64 to kill off Intel's IA-64 (ok, sure, Intel helped a lot there too). MIPS and PowerPC have been relegated to embedded... only ARM remains as a primary application processor against Intel (and occasionally AMD) these days.

    But Intel's definitely risking getting dragged down with Microsoft. Maybe Microsoft will bounce back on better x86s, maybe not. Intel doesn't seem willing to take that risk, and you can't exactly blame them.
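    To make the volume point concrete with the figures above (a rough sketch; the unit counts come from the comment, while both ASPs are assumptions rather than reported averages):

    ```python
    # Back-of-the-envelope revenue pools using the shipment figures cited above.
    windows_units = 350e6   # Windows devices per year, per the comment
    android_units = 850e6   # Android devices per year, per the comment

    pc_cpu_asp = 100.0      # assumed blended ASP for a PC processor, in dollars
    mobile_soc_asp = 15.0   # the $15 OEM SoC price mentioned above

    print(f"PC CPU revenue pool:     ${windows_units * pc_cpu_asp / 1e9:5.1f}B per year")
    print(f"Mobile SoC revenue pool: ${android_units * mobile_soc_asp / 1e9:5.1f}B per year")
    ```

    Even with well over double the unit volume, the mobile pool comes out far smaller on these assumptions, which is exactly the "not as much cash" problem.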
  • goo.gle - Saturday, July 20, 2013 - link

    Give us the next one already! :D
  • EN10 - Thursday, August 8, 2013 - link

    Looks like Charlie Demerjian is taking your images & article notes and putting his stamp on them. Bit naughty, or just a coincidence?

    http://semiaccurate.com/2013/08/07/a-long-look-at-...
    http://semiaccurate.com/2013/08/08/how-arm-license...
