22 Comments

  • tipoo - Tuesday, May 7, 2019 - link

    I still 'member that they took Project Offset down with Larrabee

    https://www.youtube.com/watch?v=TWNokSt_DjA
  • CiccioB - Tuesday, May 7, 2019 - link

    Larrabee ran Compute Shaders and OpenCL very well - in many cases better (in flops/watt) than rival GPUs

    That was because it was manufactured with almost two process nodes of advantage over the competition (that is, Nvidia).
    As Nvidia got access to better process nodes, Intel's advantage was wiped out in a single generation and never regained, because the idea of using fat x86 cores with even fatter AVX-512 units for parallel computing was simply a bad one. Intel probably thought that, with its (at the time) big process-node advantage over the competition, it could use x86 even where it was a clear loser.
    And in fact they tried to put it everywhere, including smartphones, against ARM - and lost that war miserably too, as ARM closed the gap from two process nodes of disadvantage to just one.

    Intel's hope is that the trend is toward using compute shaders more and more for what fixed-function units do in today's GPUs. Ray tracing is the technology expected to kill off those units (in favor of others, as we have seen with Turing) - units in which Intel has neither experience nor patents.
    As ray tracing makes its way into game engines and development choices (leaving the old and by now quite clumsy rasterization effects and workarounds behind), their new GPUs will close the gap with the competition's.
    Their advantage, like AMD's in the console market today, will be the immense pool of users that will get Intel's technology for free as integrated graphics.
    Any future game (especially simple ones with limited budgets) will try to exploit their technology as much as possible, so that it runs well on the baseline hardware and reaches the largest possible market.
    AMD, and even more so Nvidia, will have to cope with Intel's decisions on the technologies (and implementations) that pave the future. If they do not support them the way Intel does, the chances are high that the new features/extensions/hardware units will fail, even if they are potentially really powerful.
  • AshlayW - Tuesday, May 7, 2019 - link

    Shame that a company such as Intel (anti-competitive, anti-consumer, anti-innovation in the consumer market) would 'pave the future' of graphics simply because a large number of people have its crap in their PCs. Hopefully that number diminishes as Ryzen erodes Intel's market share. No doubt Intel will abuse its position to squeeze every last damn cent out of the consumer at the expense of innovation and fair pricing.
  • Calin - Wednesday, May 8, 2019 - link

    I think you are living in another world, one where Intel plays in midrange and high-end gaming. While Intel made some great leaps in integrated graphics prowess in its high-end chips (the ones with integrated DRAM), most of what it sells is bottom-of-the-barrel performance. Also, on an old process, Intel can't compete economically in the lucrative graphics market.
  • CiccioB - Wednesday, May 8, 2019 - link

    @Calin
    I'm talking about the new graphics processors Intel is creating, not the old ones (which, however, are not that terrible and have decent support, unlike AMD's APUs).

    The process node is beside the point here. They are going to fill roughly 80% of the PC market with their own GPUs, and so they give developers a wide market to target.
    It doesn't matter if they are not top of the hill in performance or efficiency.
    AMD is neither, but they are now guiding how game engines are developed thanks to their monopoly on console hardware. This will soon change, however, as Intel becomes a new big player in the graphics market that developers will want to consider.

    However powerful AMD's GPUs may be, they will still trail Intel in units sold, as Intel ships all of its consumer CPUs with an integrated GPU, while AMD only provides a few APUs in laptops, since its discrete GPUs are terrible in power efficiency and nobody wants them in the mobile market. On the desktop AMD is not (yet) putting a GPU into its Ryzen CPUs, so Intel has an easy path to invade the market with whatever graphics technology (crappy or not) it creates. It just needs to be "powerful enough".
  • mode_13h - Thursday, May 9, 2019 - link

    > AMD is neither, but they are now guiding how game engines are developed thanks to their monopoly on console hardware.

    From talking to some AMD employees I know, I think this advantage is more imagined than real. In many cases, it seems the console and PC engines are developed by separate teams and have little in common.

    The flip side is that if Intel dominates the PC graphics landscape, it won't necessarily give them an advantage in consoles. Plus, gaming might just end up in the cloud, like everything else. The next gen of consoles might be the last.

    Oh, and AMD also sells Ryzen APUs for the desktop, in case you didn't notice.
  • CiccioB - Monday, May 13, 2019 - link

    I didn't say that Intel will have an advantage in the console market (where it is not present).
    I said that game engines and game optimizations will be written to take Intel's new GPUs into account as well, since they will deliver decent performance (even if maybe not the best) and will be everywhere across ultramobile/laptop/desktop devices, enlarging the potential gamer population.

    > Oh, and AMD also sells Ryzen APUs for the desktop, in case you didn't notice.
    Yes, I know, but how many of them did they sell?

    You probably haven't fully grasped the difference in market presence (and share) that Intel will have once it delivers its new integrated GPUs.
  • mode_13h - Thursday, May 9, 2019 - link

    > Intel probably thought that, with its (at the time) big process-node advantage over the competition, it could use x86 even where it was a clear loser.

    I think the reason was simpler than that. Intel had a history of getting burned by non-x86 (remember Itanium?). They probably learned this lesson too well, eyeing anything non-x86 with great skepticism.
  • CiccioB - Thursday, May 9, 2019 - link

    Well, the lesson was even simpler: outside the markets where it benefits from backwards compatibility, x86 simply has no chance of doing anything better than the competition.
    Good for them that they managed to push the competition out of the server market, since they could produce much cheaper CPUs than anyone else.
    But outside those markets, x86 is simply outperformed by everything else. They even failed to keep pace with Nvidia and its GPUs in parallel computing, despite how hard it is to get one GPU (let alone several) to compute decently.
    Not to mention that Xeon Phi exploited almost nothing of those x86 cores (which could have been anything else and enjoyed better energy efficiency); most of the number-crunching power came from the beefy AVX units, as the sketch below illustrates.
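
    To make that concrete (a minimal sketch, not code from Xeon Phi itself; the loop shape is just an assumed example): on a KNL-class core the peak FLOPS come almost entirely from the 512-bit FMA units, so a kernel like the one below, built with -mavx512f on GCC or Clang, is where the throughput lives - the scalar x86 front end mostly just feeds it.

        #include <immintrin.h>
        #include <stddef.h>

        /* y[i] += a * x[i], eight doubles per iteration via one 512-bit FMA.
           For brevity, assumes n is a multiple of 8. */
        void daxpy_avx512(size_t n, double a, const double *x, double *y)
        {
            __m512d va = _mm512_set1_pd(a);
            for (size_t i = 0; i < n; i += 8) {
                __m512d vx = _mm512_loadu_pd(&x[i]);
                __m512d vy = _mm512_loadu_pd(&y[i]);
                vy = _mm512_fmadd_pd(va, vx, vy);   /* the FMA unit does the real work */
                _mm512_storeu_pd(&y[i], vy);
            }
        }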
  • mode_13h - Thursday, May 9, 2019 - link

    Yeah, I'm just saying that within Intel, it was probably very difficult for anything non-x86 to succeed, politically. I get that the ISA is rather poor and there's only so much lipstick you can put on that pig.

    I would be interested in knowing why they killed the successor to the i860 - another big, non-x86 push the company once made. I heard rumors that it was really promising, but maybe it fell victim to grand dreams of Itanium.
  • drexnx - Tuesday, May 7, 2019 - link

    That blog was definitely well worth a read, because it looks like three years later they've done a 180 and graphics is a target again.
  • HStewart - Tuesday, May 7, 2019 - link

    It's funny that, according to that list, Larrabee came out around the time I built a Supermicro dual Xeon 5160 machine and put Nvidia graphics in it. And I had never heard of Larrabee; it obviously was an experiment that went nowhere.

    Things are totally different now: Xe is a completely different type of machine. I don't believe Phi was intended as a graphics processor - the only thing I remember is that it was a many-core machine, a board with many little CPUs instead of one larger CPU.

    One thing I am curious about is that the x200 series is marked as discontinued but not the 72x5 series.

    I don't see Intel continuing it, since in the next year or two they will have Xe processors.
  • Ian Cutress - Wednesday, May 8, 2019 - link

    Larrabee was never launched as a consumer GPU. It went somewhere, just not as graphics. It ended up as Xeon Phi.
  • SanX - Tuesday, May 7, 2019 - link

    Looks like Xeon Phi was not generating the outrageous profits Intel wanted, so they moved on to something else.

    Can anyone show supercomputer benchmarks of KNL versus those 56-core Xeons on molecular dynamics and PIC codes? For KNL versus previous-generation Xeons, the difference on the PIC codes we checked was within about 20%, and I doubt it will be much higher with these new incarnations of Xeons, since supercomputer performance is often bound by memory bandwidth rather than processor performance (see the rough roofline check below).
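
    A back-of-the-envelope roofline check illustrates why (the machine figures are rough assumptions for a KNL-class part, not measured values): a kernel only stops being bandwidth bound once its arithmetic intensity exceeds peak_flops / peak_bandwidth, and stream-like PIC/MD inner loops sit far below that threshold.

        #include <stdio.h>

        /* Toy roofline check: is a kernel compute bound or bandwidth bound?
           The machine figures below are rough, assumed values for a KNL-class
           part, used only for illustration. */
        int main(void)
        {
            const double peak_flops = 3.0e12;   /* ~3 TFLOPS FP64 (assumed)    */
            const double peak_bw    = 450e9;    /* ~450 GB/s MCDRAM (assumed)  */
            const double balance    = peak_flops / peak_bw;  /* flops per byte */

            /* Example kernel: y[i] += a * x[i]
               2 flops per element, 24 bytes moved (load x, load y, store y). */
            const double intensity  = 2.0 / 24.0;

            double attainable = (intensity < balance)
                                ? intensity * peak_bw   /* bandwidth bound */
                                : peak_flops;           /* compute bound   */

            printf("machine balance : %.1f flops/byte\n", balance);
            printf("kernel intensity: %.3f flops/byte (%s bound)\n",
                   intensity, intensity < balance ? "bandwidth" : "compute");
            printf("attainable      : %.1f GFLOPS of %.0f GFLOPS peak\n",
                   attainable / 1e9, peak_flops / 1e9);
            return 0;
        }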

    So, OK, Intel, go ahead and discontinue KNL, and don't forget to cut its price tag by 10x-100x so that we can have our own personal home supercomputers. The only reason we don't have them already is Intel's monopoly, with processor prices an order of magnitude or more above production cost.
  • AshlayW - Tuesday, May 7, 2019 - link

    That's just Intel in a nutshell: stall the innovation and advancement of consumer technology to maximise profits and please the shareholders.
  • CiccioB - Wednesday, May 8, 2019 - link

    Ermmm... that's what all companies do. Intel, AMD, Nvidia, Qualcomm, Apple, whoever - they exist to create profits for their shareholders.
    How they do it doesn't really matter to them, and each one follows what it thinks is the best path. The leader tries to extract the maximum from what it already has; the follower tries any maneuver to overtake it, which can include targeting only the most profitable markets, or selling at prices so low that it earns nothing (or loses money) for years, creating the false impression that it cares about prices - in reality it is just applying the best margins it can, same as the leader. The difference is only in the magnitude.
  • Metroid - Wednesday, May 8, 2019 - link

    I wonder what would happen if Intel had no money, as has been AMD's situation for some years. The money Intel has gives them the ability to fail - but not for long.
  • CiccioB - Wednesday, May 8, 2019 - link

    With less money they would do what any other company in the same situation does: focus on the core business, avoid spending on anything else, and try to create a single product that can get them out of that situation.
    Innovation would happen only in one specific segment, sacrificing other technological opportunities.
  • A.k.a.n.e - Wednesday, May 8, 2019 - link

    The table is not correct. The Core i3-8121U supports AVX512-{F,CD,BW,DQ,VL,IFMA,VBMI} (see e.g. http://instlatx64.atw.hu/ ) even though it is a Core i3, so Cannon Lake may support AVX512-{BW,DQ,VL,IFMA,VBMI} on mainstream (non-Xeon) parts in general. Also, Intel is not saying whether mainstream Ice Lake supports AVX512-{BW,DQ,VL,VNNI,VBMI,...}. The same goes for Cooper Lake's AVX512-BF16; you can verify what a given chip reports with a quick check like the one below.
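
    For reference, a minimal sketch using GCC/Clang's __builtin_cpu_supports (which reads the CPUID feature bits); the feature-name strings are the ones GCC documents, and the particular set of flags printed here is just an example:

        #include <stdio.h>

        /* Print which AVX-512 subsets the running CPU reports via CPUID.
           __builtin_cpu_supports requires a literal feature name, hence the macro. */
        #define SHOW(name) \
            printf("%-12s %s\n", name, __builtin_cpu_supports(name) ? "yes" : "no")

        int main(void)
        {
            __builtin_cpu_init();  /* only strictly required before main(), but harmless */

            SHOW("avx512f");
            SHOW("avx512cd");
            SHOW("avx512bw");
            SHOW("avx512dq");
            SHOW("avx512vl");
            SHOW("avx512ifma");
            SHOW("avx512vbmi");

            return 0;
        }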
  • mode_13h - Thursday, May 9, 2019 - link

    I guess nobody can claim Intel didn't try pushing x86 absolutely everywhere. All the way from IoT (via Edison) to GPUs (i.e. original Larrabee), phones, and HPC.

    I guess it had to learn the hard way that not everything is a nail you can just bash with an x86-based hammer.
  • yankeeDDL - Sunday, May 12, 2019 - link

    Who bought these things?
  • SarahKerrigan - Sunday, May 12, 2019 - link

    HPC sites. If you look at the Top500 list, you'll see quite a few Xeon Phi systems on there.
