Intel has officially lost the plot. Years of barely incremental CPU mediocrity, a new low bar in NVMe with the 600p, the hypetane fail, the latest epic fail AT didn't even cover - and now they are pushing defective desktop parts with a failed iGPU as dual-channel, 16-PCIe-lane E-series chips, as if the E line weren't already trashed by Ryzen and needed its performance and features pushed even lower.
Looks like they are changing horses. But I am not entirely optimistic here: if they failed in a market they had practically wrapped around their finger, what's to be expected when they enter uncharted territory? Intel is a very big and bloated corporation, which is the reason it was too late to mobile and ultimately failed to gain any sizable share of that market.
Their "IoT" products are piss-poor value regardless of the price point. "AI"? They're gonna have a hard time there as well. After their accelerator completely failed to materialize as a GPU, and is pretty much a meh in HPC, they will now try to shove it in as a "machine learning" platform. Alas, its design is fundamentally inefficient for machine learning, which favors architectures optimized for maximum throughput of small integers. Not that they can't design such an architecture, but they are already late to the show, and such a part hasn't even been announced to be in the works.
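To make the small-integer point concrete, here's a rough sketch (plain Python, purely illustrative - the toy quantization scheme and numbers are my own, not any real Intel or ML-framework design) of why ML workloads tolerate 8-bit integer math: weights get quantized once, the multiply-accumulates run as integers, and one rescale at the end recovers a result close to the float answer - which is exactly what hardware built for small-integer throughput is good at:

```python
# Purely illustrative toy example: symmetric int8 quantization of a dot product.
def quantize(values, bits=8):
    qmax = 2 ** (bits - 1) - 1                     # 127 for int8
    scale = max(abs(v) for v in values) / qmax
    return [round(v / scale) for v in values], scale

def int_dot(qa, qb, scale_a, scale_b):
    # Integer multiply-accumulate; a single float multiply at the very end.
    return sum(a * b for a, b in zip(qa, qb)) * scale_a * scale_b

weights = [0.42, -1.27, 0.05, 0.9]
activations = [1.0, 0.5, -0.25, 2.0]
q_w, s_w = quantize(weights)
q_a, s_a = quantize(activations)

approx = int_dot(q_w, q_a, s_w, s_a)
exact = sum(w * a for w, a in zip(weights, activations))
print(approx, exact)  # the int8 path lands very close to the float result
```

The error from quantization is tiny for this kind of workload, which is why architectures built around wide, low-precision integer units win at inference.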
Thus they do what big bloated corporations always do - go for nonsense acquisitions in the hope that they will materialize into profit or, at the very least, relevance.
Turning into a "data-centric company" - woot, great news, because we can't possibly have enough of them hoarding private information, mining it, and selling it to the highest bidder. They are such a benefit to society.
Way to go Intel. Also, don't forget to assure us how Moore's law is alive and well.
Intel's AI accelerator is from their Nervana acquisition, not something they built up in-house like Knights Landing (which rose from Larrabee's grave). The AI/deep learning focused part is called Knights Mill and is due later this year. Knights Landing is actually doing pretty OK in HPC, and Omni-Path has had a few adopters.
They have been rather mediocre on recent desktop releases, Optane hasn't lived up to the hype (and is currently mismarketed), and the idea of Kaby Lake in an enthusiast socket is just stupid (I'm holding out hope that it's just a fake rumor, because it really is that stupid).
The one thing Intel has going for it is Skylake-EP and the platform it is built around. On paper it looks rather impressive - something they could have used as the highlight for this year's IDF.
Right. Let's face it: since Intel was awarded the Apple contract (and Apple stopped using the IBM/Motorola PowerPC RISC platform), it honestly doesn't even need to advertise.
The most recent ads featuring Jim Parsons (Sheldon from The Big Bang Theory) are less an ad and more gloating that "98% of the Cloud runs on Intel".
It wasn't long ago that we had Sun/SPARC platforms as well. But it seems Oracle bought Sun Microsystems just to file a Java lawsuit against Android and Google.
If you ask me, way too many things in technology are influenced by Apple.
Not for long. Intel is about to get trashed even worse in servers than it was trashed in pretty much every aspect of desktops where performance matters.
Intel is becoming a hype company, but behind their cockiness hides the fact they don't have anything aside from a huge pile of cash they made on their monopoly, which itself was made on illegal business practices.
Apple's move from Power to x86 also coincided with the company's shift from making decent hardware for professionals to making overpriced toys for vain posers. It didn't really make all that much difference, considering macOS market share is a measly 5%.
In this respect, I don't feel the loss of IDF itself is even a loss, as the event was already all about making a big deal of not-so-big "innovations" - it was about hyping, and had nothing to do with development whatsoever.
So they hire popular dorks to claim how superior they are. That would hardly be needed if it were indeed the case. Sure, they are very big and deeply rooted, which allowed them to do great even when their entire lineup was NetBurst garbage, but having that much money, IP, and fab capacity and doing so little with it qualifies as "pathetic" in my book.
And speaking of pathetic, they probably hit a new low with this video: https://www.youtube.com/watch?v=D5-gKsQgcRo And I'd say that behind that "hip confidence" display they know how pathetic it really is, which is why they have disabled both comments and ratings on the video.
"Intel is becoming a hype company, but behind their cockiness hides the fact they don't have anything aside from a huge pile of cash they made on their monopoly, which itself was made on illegal business practices."
Let's be fair to Intel here: yes, what they did was illegal. But they dug a pothole for AMD; AMD promptly took the shovel, dug a grave, sat in it, and started burying themselves before Lisa came around and hauled them out.
Uhhhh no. The Core 2 Quad Q6600 and Core i5-2500K weren't "overpriced toys for vain posers". In fact, they were so good that people who owned machines with those two processors didn't have to upgrade for a very long time. Many people with a 2500K / 2600K STILL haven't had a need to upgrade; that's how good those processors were.
-- Many people with 2500K / 2600K STILL haven't had a need to upgrade, that's how good those processors were.
Well, or the symbiotic relationship with MS, Windows, and Office has come to an end. There's no new thing to soak up ever-increasing cycles. Good enough is enough.
Seems like silly penny-pinching on the PR front to me. Not too long ago, they also cancelled their sponsorship of the national science fair... It appears, at least, like a case of penny-wise and pound-foolish.
More likely Intel has to go back to the drawing board and rethink their CPU strategy in light of the recent AMD Ryzen releases, and so there won't be anything positive to talk about this year. Hope AMD's foot to Intel's posterior is a wake-up call to actually innovate and not simply iterate.
CPUs take years, verging on decades to go from architecture concept to actual retail silicon. If Intel 'respond' to Ryzen, it will be many years from now. Everything sooner will be things that have been in the pipeline for years already.
Which I suspect has got them rattled. Yes, it's turning a supertanker. Problem is, they need to... I do sense a hell of a lot of disarray at Intel right now, with competitors coming at them from all sides, and I don't think 5G has a big future or IoT a lot of revenue potential. Retreating from IDF gives them some time to recover.
It's still quite a shock though, to go from the big whizz-bang shows of the last years to right now, nothing.
Of course, Intel aren't going anywhere, just as Microsoft or Apple didn't when they hit harder times. But they do need to do a bit of thinking, or they'll end up like Yahoo.
I suspect the leaked roadmap that started showing up last summer, with a 6-core LGA11xx Coffee Lake CPU for (early?) 2018, is Intel's primary response to Ryzen. It gives them a wider mainstream CPU for tasks where core count matters above all, while still presumably capitalizing on their cores' ability to clock higher to maintain wins in single/low-thread-count tasks.
Of course, to compete, these new 6-core chips would have to be 12-thread too, and cost $250. Intel doesn't want to do that, as it would slash their margins. Unless AMD messes up on supply, they may have to.
Do we know how many cores AM4 supports? Could AMD build a 10+ core Ryzen for AM4? A 6C/12T Intel chip as the new i7 is going to be $300+, most likely. That being the case, it leaves AMD a lot of room to respond in terms of product and pricing. If I were AMD, I'd be more concerned about the possibility of Intel releasing 4C/8T i5 and 4C/4T i3 processors. Of course, AMD could still probably undercut them a bit, but that's where the real challenge would occur. You can't just throw threads at the sub-$250 range; they found that out with Bulldozer. So they would need to be aggressive with price and/or release new models.
Some would say the current 64 GB capacity and the memory bandwidth available from the 4-DIMM/2-channel memory controller are a match for 16 threads. If anything, the platform should be getting more capacity and bandwidth for workstation-class workloads for IT professionals.
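As a back-of-envelope check of that capacity-and-bandwidth-per-thread claim (the figures below are assumptions for a typical 2017 AM4 build - DDR4-2400, dual channel, 4x16 GB - not official platform specs):

```python
# Rough per-thread memory budget on an assumed dual-channel DDR4-2400 setup.
capacity_gb = 64          # 4 DIMMs x 16 GB on the dual-channel controller
channels = 2
transfer_rate_mts = 2400  # DDR4-2400: megatransfers per second
bytes_per_transfer = 8    # one 64-bit channel moves 8 bytes per transfer
threads = 16              # e.g. an 8C/16T Ryzen 7

# Peak theoretical bandwidth across both channels, in GB/s
bandwidth_gbs = channels * transfer_rate_mts * bytes_per_transfer / 1000

per_thread_capacity = capacity_gb / threads      # 4 GB per thread
per_thread_bandwidth = bandwidth_gbs / threads   # 2.4 GB/s per thread
print(per_thread_capacity, per_thread_bandwidth)
```

Roughly 4 GB and 2.4 GB/s per thread - workable for desktop loads, but thin for memory-hungry workstation work, which is the point being made.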
You sound naive. Intel has known what Zen is since the very first test silicon, and likely had plenty of solid information from the very beginning of the work. At this point, Intel has known the full truth about Ryzen for at least two years. Why answer a 200 mm² gamble on a critical 14 nm process? There isn't enough profit in the heavily discounted PC market. Try to imagine an 1800X sold at $150 to OEMs... not a big deal for AMD's balance sheet.
Better to wait for finer processes, once they yield well enough.
Well Intel has a 6-core mainstream socket CPU in the works which should be able to compete with 8-core Ryzen. However if Zen 2 is significantly better than Ryzen, especially in gaming, then Intel is really in trouble.
Yes, but Intel has been dragging its feet for years. Intel could have released each of the last few generations at least a year earlier. AMD's new toys will force them to compete again. I am looking forward to it.
Skylake-EP is coming with a selection of accelerators and IO (FPGA, Nervana tech, Omni-Path, etc.) that could have been shown off. Cannon Lake and other 10 nm parts could have also been shown off. Optane has a few products left on its road map to show, like NVDIMMs. Intel's Gen 10 graphics deserve a mention, as they're bringing DP 1.3/1.4 and HDMI 2.0 to the masses, and possibly even FreeSync. These are already in the pipeline and could have been shown off barring any major issues.
I suspect they're finding 10 nm every bit as hard as 14 nm (which is why I'm so dismissive of TSMC's plans -- if Intel's finding it hard, TSMC are finding it harder).
Tbh I've seen Skylake-EP and it was pretty meh; we accepted an offer of discounted Broadwell-EP instead. The fact that Intel was making such an offer spoke volumes.
Not sure what Gen 10 is going to achieve; yes it will support those standards, no it won't be able to drive enough pixels to exploit them.
TSMC's 10 nm is a lot more conservative than Intel's. They are going to do several smaller steps rather than one big leap like Intel is planning, so I suspect TSMC will find their 10 nm node a lot easier than Intel's. Downside is they'll have to follow it up with 7 nm pretty quickly to stay competitive since the improvement from their 10 nm isn't as substantial.
Skylake will bring next to nothing, I guess. I'm only waiting for the new mainboards. Intel's CPU tech has been making baby steps for years. In 2017, a Sandy Bridge at 4.5 GHz from 2011 is still good enough to fight with the latest Kaby Lakes. That's a joke. Optane is a joke so far.
You might want to check out the E7 line. That quad core with tons of cache actually destroys some workloads. We bought ten because they destroyed the overclocked Haswell-E X-series rigs we had before. The many-core E5s are also useful and have more cache per core, which really does matter for VM performance.
I've just joined to praise your comment. Yours is the sharpest and most succinct take on the state of Intel. I just want to add that the current state of the PC market is, IMO, also Intel's fault, with those crazy prices. Of course, now someone else is joining in to help kill it: DRAM manufacturers are colluding to increase prices, and it looks like they are taking advantage of Ryzen to make some bigger profits. Again, thanks for your comment - there should be a prize for really razor-sharp and honest comments.
I'm totally sympathetic to your point, and I've been saying for a while what Meteor2 said, that they're now not just being attacked but being HIT on all sides. However let's try to be more nuanced.
First question: is Intel losing anything valuable here? My guess is no. They have substantially less that they need to communicate each year than, say, Apple or MS does, so they don't NEED the equivalent of something like WWDC. They can (and do) provide tech details on the web, and they don't need to walk people through anything like new APIs and their uses each year. For communicating with Wall Street types, they will still hold the usual quarterly meetings where they tell analysts they have yuge manufacturing advantages over everyone else and bring out charts showing that, by one carefully picked technical metric, they're like so totally ahead of TSMC and Samsung. And for releasing actual new products, they'll continue with the usual "leaked" road maps a few months in advance, with perhaps a press briefing on the day of release.
So from a strictly numbers point of view, it makes sense. More interesting is the optics. WHY do this? And why at such short notice, rather than something like holding the event this year and giving it a decent burial ("it's great to see you all here, we've had a great time holding these for so many years, but the world moves on, everyone uses the web, no longer necessary to fly in, save carbon, make the planet green, retire to spend more time with family, blah blah, no more IDF starting next year")? This scrambling seems to suggest that they really are hurting financially beyond expected levels. Maybe every department has been told to make mandatory 20% cuts for 2H17, and this was Marketing's way of meeting that?
Oh, to add to this: are WE (i.e. AnandTech readers) losing anything? Honestly, I think not. It's been YEARS since anything released by Intel (or anyone else) was actually relevant to what's making the CPU go faster. Knowing that there's a cache of a certain size is not interesting --- we've had caches for twenty years --- what's interesting is the details of things like the cache placement and replacement policies and the prefetch policies. Likewise, seeing that there's branch prediction is not interesting; what is interesting is the details.
And it is those details that have stopped being released. You can draw the block diagram of every one of these CPUs in much the same way, but it's not interesting to know that Sandy Bridge had, I don't know, 128(?) physical registers and Kaby Lake has 148(?). The real magic going forward (and for the past few years) is not in adding a few more issue-window, ROB, and physical-register slots; it is in various techniques that let you get much more value out of that pre-existing machinery --- things like early register re-use, or long-term parking to move memory-dependent instructions out of the issue queue. None of us are hearing anything about these. Even the more technical fora like Hot Chips keep these details secret. The best you can do is look at patents --- but patented doesn't mean implemented, and some of these good ideas were suggested so long ago that they are now out of patent --- but are also now technically feasible.
So point is, what does it matter if Intel has five Coffee Lake sessions at IDF or not? They're not going to say anything actually interesting at any of those sessions anyway...
A good chunk of IDF did indeed go into the technical details, but IDF was the key to putting all these pieces together into a coherent vision - basically tying all of Intel's products together in a coherent strategy. Unlike financial calls, where you can easily say something positive over the phone, IDF was more show and tell.
IDF also highlighted what other companies were doing with Intel technology. While not that exciting for an end consumer, I was hoping that IDF this year would have had some updates on their 10 nm process and on how their foundry business was doing (i.e. showing off some neat 3rd-party designs from their foundries). Things like silicon photonics are reportedly shipping, but I haven't heard much about what is actually using them.
The things that stand out to me are the recent wave of layoffs combined with the numerous acquisitions. The death of IDF signals to me that Intel doesn't have a clear and coherent vision going forward. They are scrambling to latch on to some growth market as their strongholds shrink. A third of this is technical (process shrinks are HARD), a third is economic (gross margins too high to spur new sales), and a third is just bad senior management. I don't think Intel is in any danger of collapsing, but it certainly feels like there will be a reckoning.
Some workloads seriously benefit from that extra cache - I mean a ton. In addition, the cache differences for VMs running on the E5 series make a huge difference for users. If you compare the dual-core Sandy Bridge VM experience to the dual-core Broadwell VM experience, the latter really is significantly better, and that is because each VM is no longer starved for cache. In addition, you cannot deny that Intel's graphics for Broadwell and later Kaby Lake are head and shoulders better than anything Intel ever had before. There are advantages - just not for the i7-buying enthusiast with a discrete GPU.
It'll probably be less convenient for journalists and perhaps it'll be less exciting, but I'm sure Intel will still make sure all the information gets through.
You really are delusional. The only reason you'd cancel an event that draws not only paying customers but also heaps of press attention is that the truth of your situation doesn't reflect well on you.
People already know that Intel is in big trouble. Maybe they didn't know the panic would set in so quickly. Tell yourself whatever makes you feel better.
Intel is not in big trouble, unless AMD enters the server market in a big way. Intel could lose half the retail market and it would not be that big of a deal. The only action that would put the fear of God into them would be a competitor for the E5 series.
I don't feel better or worse depending on how Intel does. Since Intel has a strong business and plenty of new technology in its pipeline, as can be seen if you actually look at the company and its business, your assertion about "the only reason you'd cancel an event" is wrong. I don't know if you really are, but you certainly sound like another AMD fanboy. If so, do you guys get paid?
Regrettably, I think it makes perfect sense. Developers don't buy Intel products; "line of business" IT managers do. Intel doesn't sell developer products (with some quibbling exceptions). Intel does need to evangelize its cool new tech, like FPGAs in Xeon and Optane, and it needs to get the message to the developers who will develop the next generation of software, but it can do so in a targeted fashion. I hate to say it, because IDF has been a great show, far more technical than most, but Intel's budget can now go into supporting other developer shows (O'Reilly, QCon, etc.) and targeting individual developers. The buyers of Intel technology are different cats, and if you had an invite to the "platinum party" at IDF last year (I had to scam my way in, even though I'm exactly the person IDF targets), the people Intel spent real money on weren't developers. They were buyers from places like Lockheed.
The writing for this was on the wall as part of the Optane launch....
Tradeshows across the board are on their way out. As an example, Apple cancelled its participation in Macworld Expo a while ago, and the show stopped happening the next year.
While any event cancellation is a hit to the community, the high cost of putting on or participating in a tradeshow to reach X number of people in person, when you can reach 10-100X that many online, just makes the numbers not work out anymore.
Yeah, for consumer stuff they don't make as much sense as in the past, as you can reach almost everyone online nowadays, and there's nothing to show as far as silicon products go that you can't show online (a chart is a chart, unlike seeing a cat in real life vs. a video of one).
I think trade shows are essential. You get to mingle with clients, hang out with former colleagues, build relationships with vendors... in general, they work well. If it's online, no one will go.
Let's be clear that when you say "trade shows slowly going the way of the dinosaur" you have to be referring to a very particular set of trade shows...
Even in computing, CES, Computex, and MWC seem very much alive and kicking. In other industries - whether it's medical, scientific instruments, or cars and boats - they likewise seem to be doing well. If there's a generic statement to be made encompassing the death of, say, both IDF and Macworld, I'm not sure quite what it is. Sometimes a few isolated happenings are just a few isolated happenings, not a trend of any sort.
Seems like people are underestimating their Altera purchase. If they can figure out a programming model, I have to believe that FPGAs will be the future of compute. Why bang away at really hard 5% general-purpose processor gains when you can lay down hardware to accelerate your specific application by a few hundred percent?
You're right on this. Programmers have yet to catch up, however. Look at OpenCL: it's been around for a while, yet how many use it? Amazon has just a few books on it, and the quality is so-so. DX12 and Vulkan are moving very slowly. Parallel computing on FPGAs is probably the future, but it will take a long time. So far, everyone is stuck on 2-3 threads at most.
Do you think FPGAs will be more important than ASICs? I feel like ASIC-based special-purpose devices will be the future, with custom silicon all over the place.
You're overestimating the gains from FPGAs. If it were that simple, FPGAs would flood the market, but of course it isn't. You're missing two things: CPUs do a lot of varied work, so you'd have to find optimization opportunities in all of it to get an across-the-board improvement; and software is seldom structured in a way that lets you conveniently move the bottlenecks to an FPGA. Even if 90% of the time is spent in 10% of the code, you can't surgically remove that 10% from the code that supports it.
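That intuition is just Amdahl's law: the portion you can't offload caps the overall win, no matter how fast the accelerated part gets. A quick sketch (illustrative numbers, not measurements from any real FPGA port):

```python
def overall_speedup(offloaded_fraction, accel_factor):
    """Amdahl's law: total speedup when only `offloaded_fraction`
    of the runtime is accelerated by `accel_factor`."""
    serial = 1 - offloaded_fraction
    return 1 / (serial + offloaded_fraction / accel_factor)

# Even a 100x FPGA kernel covering the hot 90% of runtime yields
# under 10x overall; if only half the runtime can move, barely 2x.
print(overall_speedup(0.90, 100))  # ~9.2x
print(overall_speedup(0.50, 100))  # ~2.0x
```

So the FPGA win hinges entirely on how much of the runtime you can actually carve out and move, which for most software is not much.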
I'll miss IDF and the leadership Intel worked to provide to the industry.
I always felt that Intel, AMD, and other CPU vendors didn't get much credit for the amazingly difficult work they do. Mostly I see sniping in forums like this, with vague (and not so vague) declarations that they don't do nearly enough or don't support someone's favorite technology-of-the-month initiative. It reminds me of the bit from Louis C.K. about bitching over spotty wi-fi on a plane while losing the wonder that you are traveling miles above the ground at nearly 500 miles per hour, drinking a Coke and nibbling on pretzels, assured that you will travel thousands of miles in just a few hours.
Intel was constantly pushing technologies, often with herculean effort. They pushed fab technology, supported constant streams of new memory technologies, and advanced USB, PCI, Ethernet, business graphics, and general I/O. Unlike software companies, Intel also had to risk billions and billions of dollars to push hardware technology forward.
I'm not PR and don't work for Intel; I'm just acknowledging how difficult what they and other vendors do to make hardware actually is. This is usually lost on SW types, who are often mostly clueless about silicon and complex hardware development.
I was at a forum where Microsoft's WHQL (Windows Hardware Quality Labs, the guys who invent Windows certification requirements) was telling hardware vendors they had 18 months to modify their silicon products to be compliant with a new standard it wanted to add to the OS. This was a ridiculous ask on its face, and they seemed to have no idea of the scope of their hubris. At the time, Microsoft was taking 5-6 years to update the OS, yet they expected the industry to resynthesize, tape out, qualify, and replace old silicon product families - or somehow replace their product portfolios - in 18 months, because of some extra register sets they wanted in silicon.
Work other people have to do somehow seems simple and easy until you're the one trying to make it happen. Happens all the time.
I think the problem revolves around how much money Intel has made, especially during that two-decade golden Wintel monopoly. What you say may fly for AMD or IBM, but Intel is rolling in cash, and has been for so long that people can and should expect a lot from the organization.
Intel management isn't to blame. It's the greedy shareholders: big mutual funds, pension funds, teachers' funds, hedge funds, etc. They park huge amounts of money in tech stocks and then demand big, steady returns, quarter after quarter. If they don't get them, they fire managers and executives. What's Intel to do?
We already have AMD doing GDC for game developers; maybe now that there's no IDF, they can take over and make an ADF for developers in general. Seems ripe for the picking, and having developers on your side and optimizing for your hardware would help AMD.
CajunArson - Monday, April 17, 2017 - link
Something is "afoot" at Intel you say? YES IT IS: https://software.intel.com/en-us/articles/when-the...
BrokenCrayons - Monday, April 17, 2017 - link
That was the best joke ever!
SkipPerk - Wednesday, April 19, 2017 - link
I thought the Xeon Phi was for their deep learning solution?
ddriver - Tuesday, April 18, 2017 - link
How was any of what happened AMD's fault? I also wouldn't be hasty to deem them "hauled out" until they actually get back into the green.
SkipPerk - Wednesday, April 19, 2017 - link
If they cancel their StarCraft II sponsorship I will never buy another Intel product again.
More likely, Intel has to go back to the drawing board and rethink their CPU strategy in light of the recent AMD Ryzen releases, so there won't be anything positive to talk about this year. Hope AMD's foot to Intel's posterior is a wake-up call to actually innovate and not simply iterate.
edzieba - Monday, April 17, 2017 - link
CPUs take years, verging on decades, to go from architecture concept to actual retail silicon. If Intel "responds" to Ryzen, it will be many years from now. Everything sooner will be things that have been in the pipeline for years already.
Meteor2 - Monday, April 17, 2017 - link
Which I suspect has got them rattled. Yes, it's like turning a supertanker; problem is, they need to... I do sense a hell of a lot of disarray at Intel right now, with competitors coming at them from all sides, and I don't think 5G has a big future or IoT a lot of revenue potential. Retreating from IDF gives them some time to recover.
It's still quite a shock though, to go from the big whizz-bang shows of the last few years to, right now, nothing.
Of course, Intel isn't going anywhere, just as Microsoft and Apple didn't disappear when they hit harder times. But they do need to do a bit of thinking, or they'll end up like Yahoo.
Gothmoth - Tuesday, April 18, 2017 - link
Maybe they don't want to tell a big forum why they push back new technologies again and again.
DanNeely - Monday, April 17, 2017 - link
I suspect the leaked roadmap that started showing up last summer, with a 6-core LGA11xx Coffee Lake CPU for (early?) 2018, is Intel's primary response to Ryzen. It gives them a wider mainstream CPU for tasks where core count matters above all, while still presumably capitalizing on their cores' ability to clock higher to maintain wins in single/low-thread-count tasks.
ImSpartacus - Monday, April 17, 2017 - link
Yeah, simply increasing all existing Kaby Lake core counts by 50% would go pretty damn far to compete with Ryzen. And that's Cannonlake.
ImSpartacus - Monday, April 17, 2017 - link
Coffee Lake****Fml
Meteor2 - Monday, April 17, 2017 - link
Of course, to compete, these new 6-core chips would have to be 12-thread too, and cost $250. Intel doesn't want to do that as it would slash their margins. Unless AMD messes up on supply, they may have to.
Alexvrb - Tuesday, April 18, 2017 - link
Do we know how many cores AM4 supports? Could AMD build a 10+ core Ryzen for AM4? A 6C/12T Intel chip as the new Intel i7 is going to be $300+, most likely. That being the case, it leaves AMD a lot of room to respond in terms of product and pricing. If I were AMD, I'd be more concerned about the possibility of Intel releasing 4C/8T i5 and 4C/4T i3 processors. Of course, AMD could still probably undercut them a bit, but that's where the real challenge would occur. You can't just throw threads at the sub-$250 range; they found that out with Bulldozer. So they would need to be aggressive with price and/or release new models.
blahsaysblah - Tuesday, April 18, 2017 - link
Some would say the current 64GB capacity and memory bandwidth available with their 4-DIMM/2-channel memory controller is a match for 16 threads. If anything, it should be getting more capacity/bandwidth for workstation-class workloads for IT professionals.
Stan11003 - Monday, April 17, 2017 - link
My suspicion is that Intel has technology in reserve for such an event as Ryzen.
Gothmoth - Tuesday, April 18, 2017 - link
lol.... yeah, they R&D stuff just to hide it away.. makes sense.. for a carpenter or a milkman maybe.
Gondalf - Monday, April 17, 2017 - link
You look naive. Intel has known what Zen is since the very first test silicon, and likely had plenty of solid information from the beginning of the work. At this point Intel has known the full truth about Ryzen for at least two years.
Why answer a 200mm² gamble on a critical 14nm process? There's not enough profit in the heavily discounted PC market. Try to imagine a 1800X sold at $150 to OEMs..... not a big deal for AMD's balance sheet.
Better to wait for finer processes, once they yield well enough.
JimmiG - Tuesday, April 18, 2017 - link
Well, Intel has a 6-core mainstream-socket CPU in the works, which should be able to compete with 8-core Ryzen. However, if Zen 2 is significantly better than Ryzen, especially in gaming, then Intel is really in trouble.
SkipPerk - Wednesday, April 19, 2017 - link
Yes, but Intel has been dragging its feet for years. Intel could have released the last few generations at least a year earlier. AMD's new toys will force them to compete again. I am looking forward to it.
Kevin G - Monday, April 17, 2017 - link
Skylake-EP is coming with a selection of accelerators and IO (FPGA, Nervana tech, Omnipath, etc.) that could have been shown off. Cannon Lake and other 10 nm parts could have also been shown off. Optane has a few products left on its roadmap to show off, like NVDIMMs. Intel's Gen 10 graphics deserve a bit of mention as they're bringing DP 1.3/1.4 and HDMI 2.0 to the masses, and possibly even FreeSync. These are already in the pipeline and could be shown off barring any major issues.
Meteor2 - Monday, April 17, 2017 - link
I suspect they're finding 10 nm every bit as hard as 14 nm (which is why I'm so dismissive of TSMC's plans -- if Intel's finding it hard, TSMC are finding it harder).
Tbh I've seen Skylake-EP and it was pretty meh; we accepted an offer of discounted Broadwell-EP instead. The fact that Intel was making such an offer spoke volumes.
Not sure what Gen 10 is going to achieve; yes, it will support those standards, but no, it won't be able to drive enough pixels to exploit them.
saratoga4 - Tuesday, April 18, 2017 - link
TSMC's 10 nm is a lot more conservative than Intel's. They are going to take several smaller steps rather than one big leap like Intel is planning, so I suspect TSMC will find their 10 nm node a lot easier than Intel's. The downside is they'll have to follow it up with 7 nm pretty quickly to stay competitive, since the improvement from their 10 nm isn't as substantial.
SkipPerk - Wednesday, April 19, 2017 - link
How many chips were you buying? We were never offered discounted Broadwell-EP.
Gothmoth - Tuesday, April 18, 2017 - link
Skylake will bring next to nothing, I guess. I'm only waiting for the new mainboards. Intel's CPU tech has been making baby steps for years. In 2017, a Sandy Bridge at 4.5 GHz from 2011 is still good enough to fight with the latest Kaby Lakes. That's a joke.
Optane is a joke so far.
SkipPerk - Wednesday, April 19, 2017 - link
You might want to check out the E7 line. That quad core with tons of cache absolutely destroys some workloads. We bought ten because they destroyed the overclocked Haswell-E X-series rigs we had before. The many-core E5s are also useful and have more cache per core, which really does matter for VM performance.
Shadowmaster625 - Monday, April 17, 2017 - link
Well, when you keep rebadging the same chip year after year, I guess there is no need for a dev forum.
Gothmoth - Tuesday, April 18, 2017 - link
+1
Alvaro66 - Tuesday, April 18, 2017 - link
I've just joined, to praise your comment. Your comment is the sharpest and most succinct about the state of Intel. I just want to add that the current state of the PC market is IMO also Intel's fault, with those crazy prices. Of course, now someone else is joining in to help kill it: DRAM manufacturers are colluding to increase prices; it looks like they are taking advantage of Ryzen to make some bigger profits. Again, thanks for your comment; there should be a prize for really razor-sharp and honest comments.
cocochanel - Tuesday, April 18, 2017 - link
+1!!!
Lolimaster - Monday, April 17, 2017 - link
Nothing to show except more Skylake rebrands as 8th-9th-10th gen @ 14nm :D
name99 - Monday, April 17, 2017 - link
I'm totally sympathetic to your point, and I've been saying for a while what Meteor2 said: that they're now not just being attacked but being HIT on all sides.
However, let's try to be more nuanced.
First question. Is Intel losing anything valuable here?
My guess is no. They have substantially less that they need to communicate each year than, say, Apple or MS do, so they don't NEED the equivalent of something like WWDC. They can (and do) provide tech details on the web, and they don't need to walk people through anything like new APIs and their uses each year.
For communicating with Wall Street types, they will still hold the usual quarterly meetings where they tell analysts they have yuge manufacturing advantages over everyone else and bring out the charts showing that, for one carefully picked technical metric, they're like so totally ahead of TSMC and Samsung.
And for releasing actual new products, they'll continue with the usual "leaked" road maps a few months in advance, with perhaps a press briefing on the day of the release.
So from a strictly numbers point of view, it makes sense. More interesting is the optics. WHY do this? And why at such short notice, rather than holding the event this year and giving it a decent burial (something like: "it's great to see you all here, we've had a great time holding these for so many years, but the world moves on, everyone uses the web, it's no longer necessary to fly in, save carbon, make the planet green, retire to spend more time with family, blah blah, no more IDF starting next year")?
But this scrambling seems to suggest that they really are hurting financially beyond expected levels. Maybe something like every department has been told mandatory 20% cuts for 2H17, and this was Marketing's way of meeting that?
name99 - Monday, April 17, 2017 - link
Oh, to add to this. Are WE (i.e. AnandTech readers) losing anything? Honestly, I think not.
It's been YEARS since what was released by Intel (or anyone else) was actually relevant to what's making the CPU go faster. Knowing that there's a cache of a certain size is not interesting --- we've had caches for twenty years --- what's interesting is the details of things like the cache placement and replacement policies and the prefetch policies. Likewise, seeing that there's branch prediction is not interesting; what is interesting is the details.
And it is those details that have stopped being released. You can draw the block diagram of every one of these CPUs in much the same way, but it's not interesting to know that Sandy Bridge had, I don't know, 128(?) physical registers and Kaby Lake has 148(?). The real magic going forward (and for the past few years) is not in things like adding a few more issue window, ROB, and physical register slots; it is in various techniques that allow you to get much more value out of that pre-existing machinery --- things like early register re-use, or long-term parking to move memory-dependent instructions out of the issue queue.
We're none of us hearing anything about these. Even the more technical fora like Hot Chips keep these details secret. The best you can do is look at patents --- but patents don't mean implemented, and some of these good ideas were suggested so long ago that they are now out of patent --- but are also now technically feasible.
So the point is, what does it matter if Intel has five Coffee Lake sessions at IDF or not? They're not going to say anything actually interesting at any of those sessions anyway...
Kevin G - Monday, April 17, 2017 - link
A good chunk of IDF did indeed go into the technical details, but IDF was the key in putting all these pieces together into a coherent vision -- basically tying all of Intel's products together in a coherent strategy. Unlike financial calls, where you can easily say something positive over the phone, IDF was more show and tell.
IDF also highlighted what other companies were doing with Intel technology. While not that exciting for an end consumer, I was hoping that IDF this year would have had some updates on their 10 nm process and on how their foundry business was doing (i.e. showing off some neat 3rd-party designs from their foundries). Things like silicon photonics are reportedly shipping, but I haven't heard much about what is actually using it.
The thing that stands out to me has been the recent wave of layoffs combined with numerous acquisitions. The death of IDF, to me, signals that Intel doesn't have that clear and coherent vision going forward. They are scrambling to latch on to some growth market as their strongholds shrink. A third of this is technical (process shrinks are HARD), a third of this is economic (gross margins too high to spur new sales), and a third is just bad senior management. I don't think Intel is in any danger of collapsing, but it certainly feels like there will be a reckoning.
SkipPerk - Wednesday, April 19, 2017 - link
Some workloads seriously benefit from that extra cache, I mean a ton. In addition, the cache differences on the E5 series make a huge difference for users running VMs. If you compare the dual-core Sandy Bridge VM experience to the dual-core Broadwell VM experience, it really is significantly better, and that is due to each VM no longer being starved for cache.
In addition, you cannot deny that Intel's graphics for Broadwell and later Kaby Lake are head and shoulders better than anything Intel ever had before. There are advantages, just not for the i7-buying enthusiast with a discrete GPU.
valinor89 - Monday, April 17, 2017 - link
Might want to amend the LGA3647 article, as it mentions IDF as a possible source of information.
Yojimbo - Monday, April 17, 2017 - link
It'll probably be less convenient for journalists and perhaps it'll be less exciting, but I'm sure Intel will still make sure all the information gets through.
prisonerX - Monday, April 17, 2017 - link
Yes, and here is all that information: "We got nothing to show."
Yojimbo - Monday, April 17, 2017 - link
Nonsense. They have a lot to show. The information here is just "we won't have IDF any more."
prisonerX - Tuesday, April 18, 2017 - link
You really are delusional. The only reason you'd cancel an event that draws not only paying customers but also heaps of press attention is that the truth of your situation doesn't reflect well on you.
People already know that Intel is in big trouble. Maybe they didn't know the panic would set in so quickly. Tell yourself whatever makes you feel better.
SkipPerk - Wednesday, April 19, 2017 - link
Intel is not in big trouble, unless AMD enters the server market in a big way. Intel could lose half the retail market and it would not be that big of a deal. The only thing that would put the fear of God into them would be a competitor for the E5 series.
Yojimbo - Sunday, April 23, 2017 - link
I don't feel better or worse depending on how Intel does. Since Intel has a strong business and plenty of new technology in its pipeline, as can be seen if you actually look at the company and its business, your assertion about "the only reason you'd cancel an event" is wrong. I don't know if you really are, but you certainly sound like another AMD fanboy. If so, do you guys get paid?
bbulkow - Monday, April 17, 2017 - link
Regrettably, I think it makes perfect sense. Developers don't buy Intel products; "line of business" IT managers do. Intel doesn't sell developer products (with some quibbling exceptions). Intel does need to evangelize its cool new tech, like FPGAs in Xeon, like Optane, and it needs to get the message to the developers who will develop the next generation of software, but it can do so in a targeted fashion. I hate to say it, because IDF has been a great show, far more technical than most shows, but Intel's budget can now go into supporting other developer shows (O'Reilly, QCon, etc.) and targeting individual developers. The buyers of Intel technology are different cats, and if you had an invite to the "platinum party" at IDF last year (I had to scam my way in, even though I'm exactly the person IDF targets), the people Intel spent real money on weren't developers. They were buyers from places like Lockheed.
The writing for this was on the wall as part of the Optane launch....
Gastec - Monday, April 17, 2017 - link
So the next move would be to retire the desktop CPUs, and the one after that to run away with all the money to a paradise island/tax haven/space colony? :)
GoMoeJoe - Monday, April 17, 2017 - link
L o L
zdw - Monday, April 17, 2017 - link
Trade shows across the board are on their way out. As an example, Apple cancelled its participation in Macworld Expo a while ago, and the show stopped happening the next year.
While any event cancellation is a hit to the community, the high cost of putting on or participating in a trade show to reach X number of people who can attend in person, when you can reach 10-100X that many online, just makes the numbers not work out anymore.
Murloc - Tuesday, April 18, 2017 - link
Yeah, for consumer stuff they don't make as much sense as in the past anymore, as you can reach almost everyone online nowadays, and there's nothing about silicon products that you can't show online (a chart is a chart, unlike seeing a cat in real life vs. a video of it).
Murloc - Tuesday, April 18, 2017 - link
Obviously they still make sense for business stuff.
SkipPerk - Wednesday, April 19, 2017 - link
I think trade shows are essential. You get to mingle with clients, hang out with former colleagues, build relationships with vendors... in general, they work well. If it is online, no one will go.
GoMoeJoe - Monday, April 17, 2017 - link
Save money ... ditch the circus ... good for business.
Holliday75 - Monday, April 17, 2017 - link
With trade shows slowly going the way of the dinosaur, there will be some in the industry struggling to stay busy.
"Did you notice that since all those shows were cancelled Bob has nothing to do? What is his job anyway?"
name99 - Monday, April 17, 2017 - link
Let's be clear: when you say "trade shows slowly going the way of the dinosaur", you have to be referring to a very particular set of trade shows...
Even in computing, CES, Computex, and MWC seem very much alive and kicking. In other industries, whether it's medical, scientific instruments, or cars and boats, they likewise seem to be doing well.
If there's a generic statement to be made encompassing the death of, say, both IDF and Macworld, I'm not sure quite what it is. Sometimes a few isolated happenings are just a few isolated happenings, not a trend of any sort.
SkipPerk - Wednesday, April 19, 2017 - link
I agree. I think they may be speaking of more consumer-oriented shows as opposed to actual "trade" shows.
flgt - Monday, April 17, 2017 - link
Seems like people are underestimating their Altera purchase. If they can figure out a programming model, I have to believe that FPGAs will be the future of compute. Why bang away at really hard 5% general-purpose processor gains when you can lay down hardware to accelerate your specific application by a few hundred percent?
cocochanel - Tuesday, April 18, 2017 - link
You're right on this. Programmers have yet to catch up, however. Look at OpenCL: it's been around for a while, yet how many use it? Amazon has just a few books on it, and the quality is so-so. DX12 and Vulkan are moving very slowly. Parallel computing on FPGAs is probably the future, but it will take a long time. So far, everyone is stuck on 2-3 threads at the most.
SkipPerk - Wednesday, April 19, 2017 - link
Do you think FPGAs will be more important than ASICs? I feel like ASIC-based special-purpose devices will be the future, with custom silicon all over the place.
prisonerX - Tuesday, April 18, 2017 - link
You're overestimating the gains from FPGAs. If it were that simple, FPGAs would flood the market, but of course it isn't. You miss two things: CPUs do a lot of varied things, so you'd have to find optimization opportunities in all of them to get an across-the-board improvement, and software is seldom structured in a way that lets you conveniently move the bottlenecks to FPGAs. Even if 90% of the time is spent in 10% of the code, you can't surgically remove that 10% from the code that supports it. (That's Amdahl's law in action: even an infinite speedup on that 90% of the runtime leaves the other 10% untouched, capping the overall gain at 10x.)
SkipPerk - Wednesday, April 19, 2017 - link
I understand what you say about FPGAs, but what about ASICs? Where they have been developed, they transformed everything.
Gothmoth - Tuesday, April 18, 2017 - link
What would Intel tell us this year?
Our next CPU will have 5-10% more IPC performance.
Our Optane tech is still overhyped stuff.
We will drain you cash cows nevertheless.....
helvete - Wednesday, June 21, 2017 - link
And, oh, 10nm is not going to be next year, again..
BillR - Tuesday, April 18, 2017 - link
I'll miss IDF and the leadership Intel worked to provide to the industry.
I always felt that Intel, AMD, and other CPU vendors didn't get much in the way of credit for the amazingly difficult work they do. Mostly I see sniping in forums like this, with vague (and not so vague) declarations that they don't do nearly enough or don't support someone's favorite technology-of-the-month initiative. It reminds me of the bit from Louis CK about bitching about spotty wi-fi on a plane, having lost the wonder that you are traveling miles above the ground at nearly 500 miles per hour while drinking a Coke and nibbling on pretzels, assured that you will travel thousands of miles in just a few hours.
Intel was constantly pushing technologies with often herculean effort. They pushed Fab technology, supported constant streams of new memory technologies, advanced USB, PCI, Ethernet, business graphics, and general I/O improvements. Unlike software companies, Intel also had to risk Billions and Billions of dollars to push hardware technology forward.
They're leaving some big shoes to fill...
prisonerX - Tuesday, April 18, 2017 - link
Jeez, please give us all a break. Who are you, Intel's PR department?
BillR - Wednesday, April 19, 2017 - link
Not PR, and not working for Intel; just acknowledging how difficult what they and other vendors do to make hardware actually is. This is usually lost on SW types, who are often mostly clueless about silicon or complex hardware development.
I was at a forum where Microsoft's WHQL (Windows Hardware Quality Labs, the guys that invent Windows certification requirements) were telling hardware vendors they had 18 months to modify their silicon products to be compliant with a new standard they wanted to add to the OS. This was a ridiculous ask on the surface, and they seemed to have no idea of the scope of their hubris. At the time, Microsoft was taking 5-6 years to update the OS, but they expected the industry to resynthesize, tape out, qualify, and replace old silicon product families -- or somehow replace their product portfolios -- in 18 months because of some extra register sets they wanted in silicon.
Work other people have to do somehow seems simple and easy until you're the one trying to make it happen. Happens all the time.
SkipPerk - Wednesday, April 19, 2017 - link
I think the problem revolves around how much money Intel has made, especially during that two-decade golden Wintel monopoly. What you say may fly for AMD or IBM, but Intel is rolling in cash, and has been for so long that people can and should expect a lot from the organization.
cocochanel - Friday, April 21, 2017 - link
Intel management isn't to blame. It's the greedy shareholders: big mutual funds, pension funds, teachers' funds, hedge funds, etc. They park huge amounts of money in tech stocks and then demand big, steady returns, quarter after quarter. If they don't get them, they fire managers and executives. What's Intel to do?
James S - Wednesday, April 19, 2017 - link
We already have AMD doing GDC for game developers; maybe now that there is no IDF, they can take over and make an ADF for developers in general. Seems ripe for the picking, and having developers on your side and optimizing for you would help AMD.
loller86 - Sunday, April 23, 2017 - link
The IDF will never lose as you claim here.