Intel has about as much credibility as Trump, and their current actions are no different from the decades of BS. They went too far with the spin, folks got angry, and now they are backing off, promising to behave. Look at their shared perf numbers: they conveniently forget to note that older CPUs will see a larger impact, and of course there is no word on power consumption. They call that transparency.
There will be nothing but BS from Intel; talk is cheap, while doing the right thing costs money.
"they conveniently forget to note that older CPUs will see a larger impact"
-------------------------------------------------------------------------------------------------
Only when using Windows 7, 8, or 10, where the MS firewall cannot be "completely" replaced
I can block Java and anything else I want with an aftermarket firewall on XP and still use the Internet with no worries about malware, so I won't be seeing a slowdown here, as I won't be getting "the Fix" for Meltdown
Secure-boot platforms will not be able to avoid the problem, however, as the back doors cannot be closed by the end users and will require "the Fix"
"If you think you can run XP and still surf the internet safely, then you have far more issues than operating systems." -------------------------------- Yes, the biggest issue is all the Trolls at this site! I've been using XP-SP2 without any MS security updates and an antivirus that expired 2 years ago, and I'm not getting infected by any malware, even when searching for the worst types of malware
It's easy for a real Windows security expert to lock down XP to prevent persistent threats of any kind
Just make your OS read-only and stop using Java & Flash. Any remaining threats are easy to stop as long as you have a quality aftermarket firewall and a few additional tweaks
It is not used for banking and no passwords are ever entered, so I'm not worried about the latest temporary-memory hack of the week
But it is indestructible and reliable for security research
"Yes, the biggest issue is all the Trolls at this site!"
But I've seen worse sites - actually, I left one that was so biased the site lost all credibility for me.
In the end, the trolls on sites don't mean anything. One should purchase based on one's own experience and not on other people's opinions. I will clearly state: I prefer Intel products. This is because I have three decades of experience in the technical field, and personally I had a bad experience with ATI. But I also own an Xbox One S, primarily for UHD 4K playback, as well as a couple of Samsung Galaxy tablets, and I have been using an iPhone since the iPhone 3. If you want to use AMD or another product, go ahead - but don't tell me that Intel has no credibility and turn it into a political debate without any merit.
"but don't tell me that Intel has no credibility and turn it into a political debate without any merit." ------------------------------------------------------------------------------------------------------------------------------ I would never do that, Stewie
ALL of the computers I use are INTEL
Others have given me Apple devices and AMD computers over the years but I would never use them as a primary (or even secondary) device
INTEL is Great when I can run any compatible O.S. and not be forced to use what Microsoft graciously allows me to use
You have me beat with your 3 decades of experience however as I only have 29 years experience with INTEL ONLY!
I actually have an original IBM PC - the 256K version - in my closet somewhere.
Intel is no different from any other x86 CPU vendor when it comes to Windows - you have other OS choices, you just have to deal with their installation quirks - for a while I actually worked with OS/2
One more thing, and it may be just slang, but please don't shorten my name. Maybe I should use an unreal name also.
Let's leave politics out of this. Not all of us want it 24/7 from every angle. As for Intel, BINGO! Now that they have been hit the hardest, they want everyone to share their toys so they won't be hit again. I say screw 'em. They got here by themselves; let them fix it by themselves.
"Now that they have been hit the hardest, they want everyone to share their toys so they won't be hit again. I say screw 'em. They got here by themselves; let them fix it by themselves."
some folks have been warning about the perils of mono-culture in IT for some years. say hello to peril.
Agreed - there is enough fake news in the political arena. It's basically the same problem in the tech industry with Intel. If someone is on top, others try to disgrace whoever or whatever to justify picking the one that is not.
This security issue affects the entire computer industry, yet they bash Intel - who knows, they may even try to claim Intel caused the entire thing.
Bringing politics into a technical discussion shows a lack of credibility in the person who brings it up. And not researching the information, then responding from a biased viewpoint, is uneducated.
Intel didn't do this - the researchers who came up with these problems did - and it is not just Intel that has this problem; there are just people on the net that are extremely biased against Intel. I think the real problem started with IBM, and possibly Microsoft, with the original IBM PC. IBM desired to have a second source of CPUs for the original IBM PC design, so AMD was designated to clone the Intel design. Imagine spending millions of dollars on a design and being told that another company can also make it. To make it worse, they made unofficial changes to the product, and now there are people who want to say Intel is really bad.
Are we better off now? Not sure. Competition does make Intel push to the limits, but what is the real competition for Intel now? Maybe initially it was AMD, but now it's ARM and smaller units. Some may say 64-bit, but 64-bit is just a natural extension of the original Intel 32-bit design, and I believe it would have come naturally.
"We also commit to adding incremental funding for academic and independent research into potential security threats." ---------------------------------
I'm not entirely sure "security threats" means what you think it means, but fine..... SHOW ME THE MONEY!
Free work? I got 1.5x what I made after getting my BS in computer engineering (amusingly, also at Intel). The MS is in computer engineering. I suppose benefits and some other bonuses would make things a bit more equal.
"Intel randomly gave me a cash bonus as a graduate intern awhile back. They can show the money."
ah, your kind will be replaced by Medicaid recipients working off their healthcare "gift". sort of like outsourcing to poor countries without the bother of travel. :)
I am not willing to sacrifice the kind of performance noted for my Windows 7 laptop running a Sandy Bridge i7 CPU. That is stupid, especially given that there really is NO threat. Now that so many systems are going to be updated, there is little reason for any scumbags to try to exploit these vulnerabilities, IMO. From my perspective, the cure is far worse than the disease, especially on older hardware / OS combinations. It just is not worth it. So, I believe Microsoft should make a way to have these patches be OPTIONAL and AVOIDABLE and UNINSTALLABLE. This is crap!
I don't know if I am ignorant or something, but I agree with you. I'd rather have it optional if it will have an impact on the performance of my machines. Personally, I don't care if people are looking at my machine.
1. This can be implemented in malware easily without excluding other information gathering techniques. This just enhances malware's ability to collect information. So, there is little reason for them not to exploit this.
2. You're still running Windows 7, making your computer at least 3 times more vulnerable to getting malware. This makes your information more vulnerable to being collected.
Good luck to you and your credit rating. Very likely, you'll have your identity stolen by the end of this year.
3. This patch has almost no impact on desktop performance. The big performance hit is on database servers, not desktop apps. If you knew anything about this vulnerability instead of lapping up the hype, you would already know this.
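The reason the hit concentrates on servers is syscall frequency: the Meltdown patch (KPTI) adds a page-table switch on every kernel entry and exit, so workloads that cross the user/kernel boundary constantly pay the most. A rough way to see the asymmetry on your own machine is to compare a syscall-bound loop with a compute-bound one (a sketch; the absolute numbers depend entirely on your kernel, patches, and hardware):

```python
import os
import time

def time_loop(fn, n):
    """Seconds to call fn() n times."""
    t0 = time.perf_counter()
    for _ in range(n):
        fn()
    return time.perf_counter() - t0

N = 50_000
# Syscall-heavy: every os.stat() crosses the user/kernel boundary, paying
# the KPTI page-table switch twice per call on a patched kernel.
syscall_time = time_loop(lambda: os.stat("/"), N)
# Compute-heavy: stays entirely in user space, so KPTI adds nothing.
compute_time = time_loop(lambda: sum(range(50)), N)
print(f"syscall loop: {syscall_time:.3f}s  compute loop: {compute_time:.3f}s")
```

Run it before and after patching and only the syscall loop should move noticeably; a database server is much closer to the first loop than the second.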
1. Not a single word of apology; instead they are "very proud of how our industry has pulled together".
2. They had 6 months to prepare, yet they still don't even have all microcode updates ready for CPUs sold in the last 5 years. Worse, my Arrandale laptop, still in use, will stay unfixed.
I feel sorry for all those clients to whom I've recommended the Intel option due to the particularities of their workload and production scenario.
Often it was for some extra performance Intel's chips were offering, or some extra energy efficiency, but now most of that will be completely negated by the loss of performance and the cost of the unplanned upgrade.
This is a complete mess, and we're in it because the market has allowed such an entity to have so much power, influence, and money.
They do say markets regulate themselves, but if you look at how the global x86 and HDD markets have turned out, they obviously don't do any "regulation" except the one on us, the customers.
After being tried and convicted in 6 different countries on 3 different continents for highly illegal actions, Intel should have been seen with different eyes in the "educated", "modern", "objective" Western world.
But that obviously has never happened. Money rules everything - press, authorities, and governments - and that's life.
Does anybody wonder how AMD's Mullins never got one SINGLE design win in the tablet world despite being over 60% faster and way more capable and efficient than Intel's Atom ?!
4 billion USD per year from Intel is the explanation.
The whole world bought useless, unproductive Intel-based tablets. Buyers have realized they're useless, and now the tablet market is shrinking.
Too bad we can live without tablets, but we can't live without workstations, laptops and servers.
"Too bad we can live without tablets, but we can't live without workstations, laptops and servers."
but, but, but... the pundits have been telling us for years that the tablet is the new desktop!!!! and the Watch will diagnose whatever ails you. in a while.
I'm surprised you brought up Mullins. That chip never got a design win with anyone, not even cheap Chinese OEMs, and Cherry Trail SoCs are still being used in Windows tablets. I'm crazy enough to use Bay Trail and Cherry Trail tablets as my main Windows machines - I think they're fine for basic usage but their GPUs are woeful. A faster Mullins GPU would have meant snappier UI and accelerated web rendering.
Water under the bridge, I guess. I'd hate for Spectre and Meltdown patches to slow down an already slow Bay Trail machine.
Customer-First Urgency - Those who buy a lot get priority; the little guy can go fish.
Transparent and Timely Communications - We'll make an announcement months later, after our board members sell their shares.
Ongoing Security Assurance - The NSA has asked us to keep the remaining holes quiet.
No mention of how future CPU designs will mitigate Meltdown/Spectre. Designs for CPUs coming out this year were frozen a while back, and no major redesign has been planned.
When can we expect Meltdown/Spectre-free CPUs? 2020/2021?
I am seeing a LOT of very ignorant responses here, and this surprises me.
1. The big performance hit of this patch is on servers, particularly database and web servers. Most desktop apps won't see anything, regardless of how old your system may be. So stop whining about the patch being mandatory, as it will likely not impact your system in any meaningful way, unless you are being so pedantic as to be upset about a 3% decrease in your benchmark scores.
2. All this whining about "being forced to upgrade" to either a new processor or Windows 10 is really annoying. You aren't 'forced' into anything with this.
3. There are FAR more compelling reasons to upgrade to Windows 10. The security behind the scenes is FAR better, especially over Windows 7, and MASSIVELY over Windows XP. I'm not even going to address the ignorant idiots still using WinXP, but those of you still sticking to Windows 7, THERE IS NO REASON FOR YOU TO FEAR WINDOWS 10! Some things have moved around a little, but mostly, it's the same as Windows 7, but has far better security and runs many legacy apps better than Windows 7. It is not smarter to stick with a less secure, less stable, older OS.
4. There are far better reasons to upgrade your old hardware. Believe it or not, even electronics wear down over time. Your 7-year-old laptop with Sandy Bridge isn't just less efficient; it will also become less stable over time, and it is NOT Microsoft's fault. Capacitors degrade with use. Batteries degrade with time, use, and heat, and wind up producing less stable power, which can damage those previously mentioned capacitors. Even silicon degrades over time, causing instability. These complaints about patches making your laptop less stable or slower are very ignorant. It is your hardware being old that is making it less stable, and the system compensating for the resulting errors that is making it slower - not the operating system or the patches.
Now you are informed. Stop being intentionally ignorant and stop whining about things you know nothing about.
Yeah, that's been a known likely vulnerable point for a while. It's also not easily patchable. It's a poor design choice from its inception, and most users have voiced their objections to it, but they haven't been listening to users for a long time. The only way to really protect against this is to disable the onboard wireless (or take out the wireless adapter card) and put in a USB wireless adapter.
#4 is probably not true. AFAIK, older electronics built on bigger, now-obsolete nodes have expected lifetimes of hundreds or tens of years, in contrast to current nodes, which are down to single-digit numbers. See the International Technology Roadmap for Semiconductors 2013 whitepaper, page 18, Figure INTC7 - lifetime versus technology node. If I understand it correctly, the a.u. unit is atomic units, which means there are probably only dozens of atoms across some structures.
The reliability of new nodes is of particular interest to manufacturers of autonomous cars, because they need the high compute power of the latest generation of nanometer-scale ICs (i.e. 7nm etc.) but have to cope with the 20-year expected lifetime of the units (in a car that is moving, hot, and vibrating). I think Nvidia's CEO has addressed this with a reliability model that takes into account elevated temperatures and accepts only a 99% survival rate after the 20 years.
Which, by the way, means that an Uber autonomous taxi without a steering wheel would kill the passengers (and maybe some more) in 1% of cases. I'm exaggerating, but you see the point.
Your laptop is on the same node as automotive AI-enabled SoCs. Hence your statement may not necessarily be true: even if you subtract the years spent powered on, the survival rate of older CPUs may be even better than that of the new ones. Old tech from the '90s, like dumbphones, lives and works today, but even a 4-year-old smartphone could have reliability problems on, say, a 7nm node.
The most detrimental factors are, AFAIK, temperature and voltage: electromigration depends on voltage and on the current density of the interconnect (as per the ITRS whitepaper), some parts of the chip are always on, and temperature matters greatly (higher temp means much-accelerated aging). If I remember correctly, going from 60°C to 80°C accelerates aging manyfold, though by single-digit factors.
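The 60°C-to-80°C claim can be sanity-checked with the standard Arrhenius acceleration model for thermally activated wear-out (a sketch: the 0.7 eV activation energy is an assumed, textbook-style value for electromigration, not a figure for any specific chip):

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_acceleration(t_low_c, t_high_c, ea_ev=0.7):
    """Acceleration factor of a thermally activated failure mechanism when
    junction temperature rises from t_low_c to t_high_c (Celsius).
    ea_ev is the activation energy; 0.7 eV is an assumed illustrative value."""
    t1 = t_low_c + 273.15
    t2 = t_high_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN_EV * (1.0 / t1 - 1.0 / t2))

# Going from 60 degC to 80 degC:
print(f"aging acceleration: {arrhenius_acceleration(60, 80):.1f}x")  # prints "aging acceleration: 4.0x"
```

A roughly 4x factor for a 20°C rise is indeed "manyfold yet single-digit", consistent with the comment above.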
It may not be a bad measure to use old tech with proper cooling, beating hands down all the new stuff.
Much of what you said here is pretty reasonable. There are just a few things, however, that my research and experiences disagree with.
3. I'll leave aside the discussion of security between Win10 and older versions of Windows (particularly once you've taken the time to update and set them to a non-default configuration baseline). Privacy issues (one reason to "FEAR" Windows 10) are making their way into older versions of Windows via updates, so you are correct in that sense.
I disagree, however, with the notion that Windows 10 runs many legacy apps better than Windows 7. Perhaps there are a few out there, and maybe those are the only ones that concern you, but in my experience with many business clients, I don't have a single scenario of legacy apps running better on Win10. There are a good number that run fine, but there are more than a few (many internally developed) applications that work on Win7 and not on Win10. I also find it hard to believe that Win10 is more stable than Win7. There were more than a few hiccups, many driver-related (think Nvidia), that caused some pretty severe headaches for Win10 users. Windows 7 has had a lot more time to mature and stabilize. You might be able to convince me that Windows 10 is starting to become as stable as Windows 7, but my experiences even in the last month make even that less than convincing.
4. What you said about degradation of components is true and your overall statement about upgrading systems holds if for no other reason than the processor can't work without the motherboard. Life expectancy of components on a motherboard is generally less than that of the processor it hosts (under normal conditions).
However in regards to silicon semiconductors, older silicon based chips designed using larger feature sizes and fabricated with less complex processes are often far more resilient to silicon degradation than the newer chips with very small feature sizes and little room for error in the processes. Electromigration, that can occur when high currents traverse small metal traces in the silicon, can (and has) gone completely unnoticed in older chips, due to the original thickness of the trace. The amount of material that migrates is proportional to the current density on the trace. Smaller chips with smaller feature sizes (smaller traces), but similar power usage have higher current densities. These chips need special considerations for the power delivery in part because if the power delivery remained the same as on a larger chip, the traces would have both higher current densities (more metal ion displacement) and lower tolerance for electromigration (fewer metal atoms can be moved before failure).
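The current-density argument above can be made concrete with a little arithmetic (a sketch with made-up trace dimensions; real interconnect stacks are far more complicated, and lifetime models like Black's equation scale as an inverse power of this density):

```python
def current_density(current_a, width_nm, thickness_nm):
    """Current density in A/cm^2 through a rectangular interconnect cross-section."""
    area_cm2 = (width_nm * 1e-7) * (thickness_nm * 1e-7)  # 1 nm = 1e-7 cm
    return current_a / area_cm2

# The same 1 mA through a trace whose width and thickness are both halved:
j_old = current_density(1e-3, width_nm=100, thickness_nm=50)
j_new = current_density(1e-3, width_nm=50, thickness_nm=25)
print(f"density ratio: {j_new / j_old:.0f}x")  # prints "density ratio: 4x"
```

Halving both dimensions quadruples the current density at the same current, which is exactly why shrunken chips need their power delivery rethought rather than carried over.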
Pretty much ALL CPUs are impacted by Spectre and yeah, Meltdown will likely impact any C2Q processor too. As A5 is saying, it's really unlikely that C2Q will get any sort of support due to its age. I've got very low expectations for my two Core i5-540Ms when it comes to patching.
The impact is likely to be minimal, even on a Core2Quad. You'll probably see about a 10% hit on some apps, but most games are seeing no noticeable impact.
You'll see the OS patch if you're running Windows 7, 8.x, or 10, but it is highly unlikely you'll see a firmware/BIOS patch. I doubt even Z97 or X99 boards will see a firmware/BIOS patch. It's a firm maybe for Intel's 100-series chipset, and even the 200-series chipsets are a "probably, but not guaranteed".
Simply put, the patching of this issue is a non-issue for desktop users. The big impact is server side.
It can be inferred from what Intel has published so far that Skylake and Kaby Lake will get microcode updates and older generations will not. Then there are the motherboard vendors, who have to incorporate these updates into BIOS updates and release them.
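For anyone wanting to verify what they actually received, the Linux kernel reports the microcode revision it loaded at boot; a small sketch (the `microcode` field in /proc/cpuinfo is x86/Linux-specific and absent in some virtual machines):

```python
def microcode_revision(cpuinfo_path="/proc/cpuinfo"):
    """Return the CPU microcode revision the kernel reports, or None if
    the field is absent (non-x86, some VMs, or a non-Linux system)."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("microcode"):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return None

print("loaded microcode revision:", microcode_revision() or "not reported")
```

Comparing this value before and after a BIOS update (or a distro microcode package update) tells you whether the new microcode is actually in effect.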
@Anton Shilov (article): "Intel intends to release software and firmware patches for 90% of its CPUs launched in the past five years by January 15. By the end of the month, Intel plans to issue software updates for the remainder 10% of processors introduced in the same period."
Haswell was launched in 2013, making it definitely within the 5-year window. Ivy Bridge, launched in 2012, may or may not be considered within the 5-year window, depending on where they start counting from and how they are rounding/truncating the age. That said, I would expect Haswell to be the oldest chip to get it. I figure similarities with Broadwell made it relatively easy to extend support to Haswell. If they were intent on covering Ivy Bridge, there is a decent probability that Sandy Bridge would also have been supported due to architectural similarity.
I agree on the motherboard vendor issue, though. Who knows how far back the vendors will actually apply the microcode update. I think this will be very telling of the extent to which vendors are willing to support their customers, and I will most definitely be basing my future purchasing decisions on how this plays out. Here's hoping there will be a few follow-up articles to assess the state of support by the major manufacturers and get it into the public eye.
I have done quite a few tests before and after the patches on several of my machines (from gen 2 to gen 7), and I have followed quite a few others who have done the same. First and most important:
1. The impact on CPU performance is minimal (less than 5%, with any CPU and any Windows version).
2. The impact on SSD speeds is noticeable, but only with older CPUs, and not for sequential read and write - so 512K, 4K, and above all QD32 4K and similar.
3. The claim that Windows 7 is more impacted than 10 is crap from Microsoft: the same dual-boot Sandy Bridge PC lost much more SSD speed in Windows 10 than in 7.
All these tests were done with the Windows update, which only addresses Meltdown and one of the 2 variants of Spectre, as far as I know. The biggest impact would come from a microcode update, which requires a BIOS update from the motherboard manufacturer. I don't expect this to come for anything older than Haswell (Intel mentioned the "last 5 years"...). Still, those who applied the BIOS update for the only Asus motherboard available so far saw a big impact only on SSD speeds, not on CPU performance.
Thanks for posting your findings. I would love to see the actual numbers associated with the systems and setups, but for now I'll take it at face value. These findings do seem to agree with my personal findings so far (Skylake/Haswell).
That said, it was my understanding that Microsoft's patch needed the microcode update in order for it to fully work. Without the microcode, only part of the patch should be active. Microsoft's comment about Windows 10 vs Windows 7 was clearly (to me) assuming the system was patched and had the microcode update. Sandy Bridge won't likely be getting new microcode. Do you have numbers on a Haswell system that is patched and has a microcode update? My Haswell doesn't have the microcode yet.
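The SSD-speed sensitivity discussed in this thread comes from the same mechanism as the server-side hit: small random reads are one syscall each, while sequential transfers amortize the kernel crossings. A crude, cache-hot, queue-depth-1 sketch of that access pattern (real tools like fio or CrystalDiskMark use deep queues and direct I/O, so this illustrates the pattern, not real SSD numbers):

```python
import os
import random
import tempfile
import time

def random_read_mb_s(path, file_size, block=4096, reads=2000):
    """Very rough QD1 4KiB random-read rate in MB/s. Opened unbuffered, so
    each read is a separate syscall - the pattern the patches penalize most."""
    with open(path, "rb", buffering=0) as f:
        t0 = time.perf_counter()
        for _ in range(reads):
            f.seek(random.randrange(0, file_size - block))
            f.read(block)
        elapsed = time.perf_counter() - t0
    return reads * block / elapsed / 1e6

# Scratch file; it stays in the page cache, so the figure is optimistic.
size = 4 * 1024 * 1024
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(size))
    path = f.name
try:
    print(f"{random_read_mb_s(path, size):.0f} MB/s (cache-hot)")
finally:
    os.unlink(path)
```

Running this before and after the OS patch (and again after a microcode update) would show the same QD-sensitive pattern the benchmarks above describe, just at a much coarser resolution.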
Where's the outcry over Intel doing sloppy design such that threats like Meltdown and Spectre can even exist? Sure, it's great that they pledge to fix things, but how about not dicking it up in the first place? Cutting corners to increase performance? That's like VW futzing the emissions system when they knew they couldn't actually perform if constrained. "Our branch prediction improves performance because it can predict correctly 99% of the time." Uh, no - that sounds like just executing Step A-1 and Step A-2 at the same time as A so that you can have both ready as soon as A finishes and go to either one with no penalty. That's not prediction.
That is indeed not prediction. The feature you are describing is called speculative execution. Both speculative execution and branch prediction are used in modern processors, and they are different features.
Branch prediction is very much up to the task of getting branches predicted correctly 99% of the time. In fact that's relatively trivial: if you assume a branch will go the same direction it did last time, you'll be right 99% of the time in loops, and getting it right in other contexts is only a little bit more complicated than that.
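That "same direction as last time" scheme is easy to simulate (a toy sketch of a 1-bit predictor, not how any real front-end is implemented; real predictors layer 2-bit counters, history registers, and more on top of this idea):

```python
def last_time_predictor_accuracy(outcomes):
    """Fraction of branches predicted correctly by a 1-bit 'same direction
    as last time' predictor. outcomes is a list of booleans (taken or not)."""
    prediction = True  # start out predicting 'taken'
    correct = 0
    for taken in outcomes:
        if taken == prediction:
            correct += 1
        prediction = taken  # remember only the last outcome
    return correct / len(outcomes)

# A loop of 100 iterations executed 10 times: the backward branch is taken
# 99 times, then falls through once, per execution.
outcomes = ([True] * 99 + [False]) * 10
print(f"accuracy: {last_time_predictor_accuracy(outcomes):.1%}")  # prints "accuracy: 98.1%"
```

Even this simplest possible scheme only mispredicts at the loop exit and re-entry, which is why high-90s accuracy on loop-heavy code is unremarkable.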
Do you realize that this threat is not just Intel's? It also affects other products, and not just CPUs - ARM and AMD, and also GPUs. It just seems people pick on Intel more.
The real concern is the people who discover these things - are they doing a good thing by letting out technical information that could lead to viruses and such? It should be done in confidence so hackers will not use the information.
I am a developer. I have worked on a project called "Judy arrays" for the last 18 years. Performance is a goal with a very high priority. My last release to the public was about a dozen years ago. In 2014, we made some very significant progress due to the release of Ubuntu 14.xx and the Haswell processor.

My latest test system is an i7-6800K CPU running at 4GHz, with 128GB of RAM running at 3200MHz/CAS 14. It had been running the Ubuntu 16.04 OS, with the BIOS updated to July 2017 (not current). I decided to update the OS to Ubuntu 18.04. The performance of Judy went down by HALF! Tweaks to the mitigations in 18.04 (pti, spectre_v2) did very little to help. The a.out's built on the 16.04 OS were even faster than when built on the 18.04 system.

I have learned the hard way that any software or BIOS released after 2017 is "infected" with Meltdown/Spectre mitigations. The "immoral" part is that there seems to be no way to turn OFF these mitigations in any software or firmware published after 2017. I am really looking forward to a system that runs anywhere near as fast as the non-upgraded 16.04 system. I believe that any comparison done on a system that merely tried to turn off the Spectre/Meltdown mitigations will be very wrong, and the Internet is rife with these types of comparisons. This leaves me unable to trust ANY update to a system on which I measure performance.

So Intel, are you being transparent and honest with your customers? A recipe for disaster? If you want, I would be happy to work with you to demonstrate/verify my findings: dougbaskins -at- yahoo.com
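For reference, kernels of the Ubuntu 18.04 era (4.15 and later) do expose both the mitigation status and boot-time switches to disable mitigations; a Linux-only sketch (whether a given distro kernel honors every switch, and whether disabling them recovers a specific workload, varies):

```shell
# Kernel 4.15+ reports per-vulnerability mitigation status here:
cat /sys/devices/system/cpu/vulnerabilities/* 2>/dev/null \
    || echo "sysfs vulnerabilities interface not available (kernel < 4.15)"

# Mitigations can be disabled at boot via kernel parameters, e.g.:
#   pti=off spectre_v2=off    (per-mitigation switches, kernel 4.15+)
#   mitigations=off           (single master switch, kernel 5.2+, widely backported)
# See what the running kernel was actually booted with:
cat /proc/cmdline 2>/dev/null || true
```

Note that userspace binaries compiled with retpoline-aware toolchains keep their code-generation changes regardless of these switches, which may explain part of the residual slowdown described above.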
Bullwinkle-J-Moose - Thursday, January 11, 2018 - link
Correction... "Most" secure-boot platforms cannot run Windows XP
Bullwinkle-J-Moose - Friday, January 12, 2018 - link
Not for meltdown, but....
New Google Fix claims ZERO IMPACT to performance >
http://www.zdnet.com/article/google-our-brilliant-...
dgingeri - Friday, January 12, 2018 - link
If you think you can run XP and still surf the internet safely, then you have far more issues than operating systems.
0ldman79 - Tuesday, January 16, 2018 - link
I really don't understand how folks actually depend on MS to keep the OS secure. Don't depend on Microsoft for your security and you can keep XP, Vista, 7, 8, 10, whatever, secure.
How hard is this to understand?
Pinn - Friday, January 12, 2018 - link
Intel randomly gave me a cash bonus as a graduate intern awhile back. They can show the money.
Fx1 - Friday, January 12, 2018 - link
So you got paid some money while doing free work! HAHAHA. Interns... that's funny. Only in the USA...
Pinn - Monday, January 15, 2018 - link
Free work? I got 1.5x what I made after getting my BS in computer engineering (amusingly, at Intel as well). The MS is in computer engineering. I suppose benefits and some other bonuses would make things a bit more equal.
FunBunny2 - Friday, January 12, 2018 - link
"Intel randomly gave me a cash bonus as a graduate intern awhile back. They can show the money."
ah, your kind will be replaced by Medicaid recipients working off their healthcare "gift". sort of like outsourcing to poor countries without the bother of travel. :)
Pinn - Monday, January 15, 2018 - link
I don't think a single word here makes sense. Maybe the :) makes sense.
thuckabay - Friday, January 12, 2018 - link
I am not willing to sacrifice the kind of performance noted for my Windows 7 laptop running a Sandy Bridge i7 CPU. That is stupid, especially given that there really is NO threat. Now that so many systems are going to be updated, there is little reason for any scumbags to try to exploit these vulnerabilities, IMO. From my perspective, the cure is far worse than the disease, especially on older hardware / OS combinations. It just is not worth it. So, I believe Microsoft should make a way to have these patches be OPTIONAL and AVOIDABLE and UNINSTALLABLE. This is crap!
Pinn - Friday, January 12, 2018 - link
Look up herd immunity.
Chyll2 - Friday, January 12, 2018 - link
I don't know if I am ignorant or something, but I agree with you. I'd rather have it optional if it will have an impact on the performance of my machines. Personally, I don't care if people are looking at my machine.
nandnandnand - Friday, January 12, 2018 - link
"Personally, I dont care if people are looking at my machine."
Just looking at the keys needed to pwn your trash computer lol
dgingeri - Friday, January 12, 2018 - link
1. This can be implemented in malware easily without excluding other information-gathering techniques. This just enhances malware's ability to collect information. So, there is little reason for them not to exploit this.
2. You're still running Windows 7, making your computer at least 3 times more vulnerable to getting malware. This makes your information more vulnerable to being collected.
Good luck to you and your credit rating. Very likely, you'll have your identity stolen by the end of this year.
dgingeri - Friday, January 12, 2018 - link
Oh, and...
3. This patch has almost no impact on desktop performance. The big performance hit is on database servers, not desktop apps. If you knew anything about this vulnerability instead of lapping up the hype, you would already know this.
LordanSS - Saturday, January 13, 2018 - link
Well, depends on what you do, I guess.
Windows + firmware updates on an Intel i5 8400 + Titan X (Pascal) affect gaming performance by 3.4% to 9.4%, at least from what they tested.
If you're a professional or semi-pro gamer, that's not "negligible". And indeed, it's even worse on Workstation and Server stuff.
LordanSS - Saturday, January 13, 2018 - link
Sorry, forgot to post a link in case you wanted to check yourself.
http://www.eurogamer.net/articles/digitalfoundry-2...
Hurr Durr - Friday, January 12, 2018 - link
Now that is a shlomoface if I ever saw one.
nandnandnand - Friday, January 12, 2018 - link
"We think of ourselves as an Israeli company as much as a US company"
Hurr Durr - Friday, January 12, 2018 - link
Exactly, this very quote.
Pork@III - Friday, January 12, 2018 - link
Happy performance reductions, especially for older processors! Well, if it's not planned obsolescence, then what is it?
ceisserer - Friday, January 12, 2018 - link
1. Not a single word of apology; instead, they are "very proud of how our industry has pulled together".
2. They had 6 months to prepare, yet they still don't even have all the microcode updates ready for CPUs sold in the last 5 years. Worse, my Arrandale laptop, still in use, will stay unfixed.
Hurr Durr - Friday, January 12, 2018 - link
Making excuses is the last thing you want to do in any situation.
IGTrading - Friday, January 12, 2018 - link
I feel sorry for all those clients where I've advised the Intel option, due to the particularities of their workload and production scenario.
Often it was for the extra performance Intel's chips were offering, or some extra energy efficiency, but now most of that will be completely negated by the loss of performance and the cost of the unplanned upgrade.
This is a complete mess, and we're in it because the market has allowed such an entity to have so much power, influence and money.
They do say markets regulate themselves, but if you look at how the global x86 and HDD markets have turned out, they obviously don't do any "regulation" except the one on us, the customers.
After being tried and convicted in 6 different countries on 3 different continents for highly illegal actions, Intel should have been seen with different eyes in the "educated", "modern", "objective" western world.
But that obviously has never happened. Money rules everything: press, authorities & governments, and that's life.
Does anybody wonder how AMD's Mullins never got one SINGLE design win in the tablet world despite being over 60% faster and way more capable and efficient than Intel's Atom ?!
4 billion USD per year from Intel is the explanation.
The whole world bought useless, unproductive Intel-based tablets. They've realized they're useless and now the tablet market is shrinking.
Too bad we can live without tablets, but we can't live without workstations, laptops and servers.
What a mess a single company can make ...
kmi187 - Friday, January 12, 2018 - link
"What a mess a single company can make ..."
That is the risk of having one company own 80% or more of the x86 & x64 market.
They don't screw things up often, but when they do ... the repercussions are immense.
Hurr Durr - Friday, January 12, 2018 - link
Makes sense an AMD shill would be a commie as well.
FunBunny2 - Friday, January 12, 2018 - link
"Too bad we can live without tablets, but we can't live without workstations, laptops and servers."
but, but, but... the pundits have been telling us for years that the tablet is the new desktop!!!! and the Watch will diagnose whatever ails you. in a while.
serendip - Friday, January 12, 2018 - link
I'm surprised you brought up Mullins. That chip never got a design win with anyone, not even cheap Chinese OEMs, and Cherry Trail SoCs are still being used in Windows tablets. I'm crazy enough to use Bay Trail and Cherry Trail tablets as my main Windows machines - I think they're fine for basic usage but their GPUs are woeful. A faster Mullins GPU would have meant a snappier UI and accelerated web rendering.
Water under the bridge, I guess. I'd hate for Spectre and Meltdown patches to slow down an already slow Bay Trail machine.
13xforever - Friday, January 12, 2018 - link
All I want is a roadmap with ALL the CPUs they're planning to patch, preferably with a timetable.
Wolfclaw - Friday, January 12, 2018 - link
Customer-First Urgency - Those who buy a lot get priority; the little guy can go fish.
Transparent and Timely Communications - We'll make an announcement months later, after our board members sell their shares.
Ongoing Security Assurance - NSA has asked us to keep the remaining holes quiet.
Hurr Durr - Friday, January 12, 2018 - link
NSA? Try Mossad.
FunBunny2 - Friday, January 12, 2018 - link
"NSA? Try Mossad."
I knew there were some folks who have all those Trump/Putin chats on tape. :):)
bill44 - Friday, January 12, 2018 - link
No mention of how future CPU designs will mitigate Meltdown/Spectre. Designs for CPUs coming out this year were frozen a while back, and no major redesign has been planned.
When can we expect Meltdown/Spectre-free CPUs? 2020/2021?
Pork@III - Friday, January 12, 2018 - link
Mmm, yes... if these new generations of processors are not drilled by the next versions of Meltdown/Spectre :D
dgingeri - Friday, January 12, 2018 - link
Remember, this is the same guy who sold much of his stock in Intel back in November. That says something about his credibility.
libertytrek - Monday, January 15, 2018 - link
This, 1,000 times. I can't believe no one else is raising this huge red flag. Maybe/hopefully one of the lawsuits will address it.
dgingeri - Friday, January 12, 2018 - link
I am seeing a LOT of very ignorant responses here, and this surprises me.
1. The big performance hit of this patch is on servers, particularly database and web servers. Most desktop apps won't see anything, regardless of how old your system may be. So stop whining about the patch being mandatory, as it will likely not impact your system in any meaningful way, unless you are being so pedantic as to be upset about a 3% decrease in your benchmark scores.
2. All this whining about "being forced to upgrade" to either a new processor or Windows 10 is really annoying. You aren't 'forced' into anything with this.
3. There are FAR more compelling reasons to upgrade to Windows 10. The security behind the scenes is FAR better, especially over Windows 7, and MASSIVELY over Windows XP. I'm not even going to address the ignorant idiots still using WinXP, but those of you still sticking to Windows 7, THERE IS NO REASON FOR YOU TO FEAR WINDOWS 10! Some things have moved around a little, but mostly, it's the same as Windows 7, but has far better security and runs many legacy apps better than Windows 7. It is not smarter to stick with a less secure, less stable, older OS.
4. There are far better reasons to upgrade your old hardware. Believe it or not, even electronics wear down over time. Your 7-year-old laptop with Sandy Bridge isn't just less efficient; it will also become less stable over time, and it is NOT Microsoft's fault. Capacitors degrade with use. Batteries degrade over time with use and heat, and wind up producing less stable power, which can damage those previously mentioned capacitors. Even silicon degrades over time, causing instability. These complaints about patches making your laptop less stable or slower are very ignorant. It is your hardware being old that is making it less stable - and therefore making the system compensate for errors, which makes it slower - not the operating system or the patches.
Now you are informed. Stop being intentionally ignorant and stop whining about things you know nothing about.
Pork@III - Friday, January 12, 2018 - link
I see something new: https://phys.org/news/2018-01-finnish-firm-intel-f...
Brian Krzanich, I think he'll have to sell his Intel shares again. :D
dgingeri - Friday, January 12, 2018 - link
Yeah, that's been a known likely vulnerable point for a while. It's also not easily patchable. It's a poor design choice from its inception, and most users have voiced their objections to it, but they haven't been listening to users for a long time. The only way to really protect against this is to disable the onboard wireless (or take out the wireless adapter card) and put in a USB wireless adapter.
lada - Saturday, January 13, 2018 - link
#4 is probably not true. AFAIK, older electronics built on bigger, now-obsolete nodes have an expected lifetime of hundreds or tens of years, in contrast to current nodes, which have single-digit numbers. See the International Technology Roadmap for Semiconductors 2013 whitepaper and look at page 18, Figure INTC7 - lifetime versus technology node. If I understand it correctly, the a.u. unit is atomic units, which means there are probably only dozens of atoms across some structures.
The reliability of new nodes is of particular interest to autonomous-car manufacturers, because they need the high compute power of the latest generation of nanometer-scale ICs (i.e. 7nm etc.), but they need to cope with the 20-year expected lifetime of the units (in a car which is moving, hot, vibrating). I think the nVidia CEO has addressed this with a reliability model that takes into account elevated temperatures and only a 99% survival rate after the 20 years.
Which, by the way, means that an Uber autonomous taxi without a steering wheel will maybe kill the passengers (and maybe some more) in 1% of cases. I'm exaggerating, but you see the point.
Your laptop is on the same node as automotive A.I.-enabled SoCs. Hence your statement may not necessarily be true: even if you subtract the years spent powered on, the survival rate of older CPUs may be even better than that of the new ones. Old tech from the 90's like dumbphones lives and works today, but even 4-year-old smartphones could have reliability problems on - say - a 7nm node.
The most detrimental factors are, AFAIK, temperature and voltage: electromigration depends on voltage, on some parts of the chip being always on, on the current density of the interconnect (as per the ITRS whitepaper), and on temperature (higher temp means greatly accelerated aging). If I remember correctly, going over 60-80 degC accelerates aging severalfold.
It may not be a bad strategy to use old tech with proper cooling, beating all the new stuff hands down.
BurntMyBacon - Monday, January 15, 2018 - link
@dgingeri
Much of what you said here is pretty reasonable. There are just a few things, however, that my research and experiences disagree with.
3. I'll leave aside the discussion of security between Win10 and older versions of Windows (particularly once you've taken the time to update and set them to a non-default configuration baseline). Privacy issues (one reason to "FEAR" Windows 10) are making their way into older versions of Windows via updates, so you are correct in that sense.
I disagree, however, with the notion that Windows 10 runs many legacy apps better than Windows 7. Perhaps there are a few out there, and maybe these are the only ones that concern you, but in my experience with many business clients, I don't have a single scenario of legacy apps running better on Win10. There are a good number that run fine, but there are more than a few (many internally developed) applications that work on Win7 and not on Win10. I also find it hard to believe that Win10 is more stable than Win7. There were more than a few hiccups, many driver-related (think nVidia), that caused some pretty severe headaches for Win10 users. Windows 7 has had a lot more time to mature and stabilize. You might be able to convince me that Windows 10 is starting to become as stable as Windows 7, but my experiences even in the last month make even that less than convincing.
4. What you said about degradation of components is true and your overall statement about upgrading systems holds if for no other reason than the processor can't work without the motherboard. Life expectancy of components on a motherboard is generally less than that of the processor it hosts (under normal conditions).
However, in regards to silicon semiconductors, older silicon-based chips designed with larger feature sizes and fabricated with less complex processes are often far more resilient to silicon degradation than newer chips with very small feature sizes and little room for error in the processes. Electromigration, which can occur when high currents traverse small metal traces in the silicon, can (and has) gone completely unnoticed in older chips, due to the original thickness of the trace. The amount of material that migrates is proportional to the current density in the trace. Smaller chips with smaller feature sizes (smaller traces) but similar power usage have higher current densities. These chips need special considerations for power delivery, in part because if the power delivery remained the same as on a larger chip, the traces would have both higher current densities (more metal-ion displacement) and lower tolerance for electromigration (fewer metal atoms can be moved before failure).
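The proportionality above is easy to put in rough numbers. A toy calculation, with made-up illustrative trace dimensions (not figures for any real process), showing how shrinking a trace's cross-section raises current density for the same current:

```python
def current_density(current_a, width_m, thickness_m):
    """Current density J = I / A through a rectangular trace
    cross-section, in A/m^2."""
    return current_a / (width_m * thickness_m)

# The same 1 mA through a trace whose width and thickness both shrink 2x:
old_j = current_density(1e-3, 100e-9, 50e-9)  # hypothetical 100 nm x 50 nm trace
new_j = current_density(1e-3, 50e-9, 25e-9)   # hypothetical 50 nm x 25 nm trace

# A quarter of the cross-section carries 4x the current density,
# so electromigration gets worse unless the current is scaled down too.
print(round(new_j / old_j, 6))  # -> 4.0
```

This is only the geometric half of the story; real electromigration lifetime models (e.g. Black's equation) also fold in temperature, which is why the cooling point below matters.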
felipetga - Friday, January 12, 2018 - link
Haven't figured out if my C2Q 9550 rig will be affected, and if so, will it ever be patched?
A5 - Friday, January 12, 2018 - link
...I would be extremely surprised if a C2Q gets a firmware patch for this.PeachNCream - Friday, January 12, 2018 - link
Pretty much ALL CPUs are impacted by Spectre, and yeah, Meltdown will likely impact any C2Q processor too. As A5 is saying, it's really unlikely that C2Q will get any sort of support due to its age. I've got very low expectations for my two Core i5-540Ms when it comes to patching.
dgingeri - Friday, January 12, 2018 - link
The impact is likely to be minimal, even on a Core2Quad. You'll probably see about a 10% hit in some apps, but most games are seeing no noticeable impact.
You'll see the OS patch if you're running Windows 7, 8.x, or 10, but it is highly unlikely you'll see a firmware/BIOS patch. I doubt even Z97 or X99 boards will see a firmware/BIOS patch. It's a firm maybe for Intel's 100-series chipset. Even the 200-series chipsets are a "probably, but not guaranteed".
Simply put, the patching of this issue is a non-issue for desktop users. The big impact is server side.
Hurr Durr - Friday, January 12, 2018 - link
It can be inferred from what Intel has published so far that Skylake and Kaby Lake will get microcode updates, and older generations will not. Then there are the motherboard vendors, who have to implement these updates in BIOS updates and release them.
BurntMyBacon - Monday, January 15, 2018 - link
@Anton Shilov (article): "Intel intends to release software and firmware patches for 90% of its CPUs launched in the past five years by January 15. By the end of the month, Intel plans to issue software updates for the remainder 10% of processors introduced in the same period."
Haswell was launched in 2013, making it definitely within the 5-year window. Ivy Bridge, launched in 2012, may or may not be considered within the 5-year window, depending on where they start counting from and how they round/truncate the age. That said, I would expect Haswell to be the oldest chip to get it. I figure similarities with Broadwell made it relatively easy to extend support to Haswell. If they were intent on covering Ivy Bridge, there is a decent probability that Sandy Bridge would also have been supported, due to architectural similarity.
I agree on the motherboard vendor issue, though. Who knows how far back the vendors will actually apply the microcode update. I think this will be very telling of the extent to which vendors are willing to support their customers, and I will most definitely be basing my future purchasing decisions on how this plays out. Here's hoping there will be a few follow-up articles to assess the state of support by the major manufacturers and get it into the public eye.
digiguy - Friday, January 12, 2018 - link
I have done quite a few tests before and after the patches on several of my machines (from gen 2 to gen 7), and I have followed quite a few of those who have done the same. First and most important:
1. the impact on CPU is minimal (less than 5%, with any CPU and any Windows version)
2. the impact on SSD speeds is noticeable, but only with older CPUs, and not for sequential read and write... rather for 512K, 4K, and above all QD32 4K and similar
3. the claim that Windows 7 is more impacted than 10 is crap from Microsoft... The same dual-boot Sandy Bridge PC lost much more SSD speed in Windows 10 than in 7...
All these tests were done with the Windows updates, which only address Meltdown and one of the two variants of Spectre, as far as I know. The biggest impact would come from a microcode update, which requires a BIOS update from the motherboard manufacturer... I don't expect this to come for anything older than Haswell (Intel mentioned the "last 5 years"...)
Still, those who applied the BIOS update for the only Asus MB available yet saw a big impact only on SSD speeds, not on CPU...
BurntMyBacon - Monday, January 15, 2018 - link
@digiguy
Thanks for posting your findings. I would love to see the actual numbers associated with the systems and setups, but for now I'll take it at face value. These findings do seem to agree with my personal findings so far (Skylake/Haswell).
That said, it was my understanding that Microsoft's patch needed the microcode update in order for it to fully work. Without the microcode, only part of the patch should be active. Microsoft's comment about Windows 10 vs Windows 7 was clearly (to me) assuming the system was patched and had the microcode update. Sandy Bridge won't likely be getting new microcode. Do you have numbers on a Haswell system that is patched and has a microcode update? My Haswell doesn't have the microcode yet.
darckhart - Friday, January 12, 2018 - link
Where's the outcry over Intel doing sloppy design such that threats like Meltdown and Spectre can even exist? Sure, it's great that they pledge to fix things, but how about not dicking it up in the first place? Cutting corners to increase performance? That's like VW futzing the emission system when they know they can't actually perform when constrained. "Our branch prediction improves performance because it can predict correctly 99% of the time." Uh no, that sounds like just executing Step A-1 and Step A-2 at the same time as A, so that you can have both ready as soon as A finishes and go to either one with no penalty. That's not prediction.
surt - Friday, January 12, 2018 - link
That is indeed not prediction. The feature you are describing is called speculative execution. Both speculative execution and branch prediction are used in modern processors, and they are different features.
Branch prediction is very much up to the task of getting branches predicted correctly 99% of the time. In fact, that's relatively trivial: if you assume a branch will go the same direction it did last time, you'll be right 99% of the time in loops, and getting it right in other contexts is only a little more complicated than that.
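The "same direction as last time" idea can be sketched as a tiny simulator. This is an illustrative toy, not how any particular CPU does it - a real front end uses tables of counters like this, indexed by branch address - but it shows why loops are so easy to predict:

```python
def simulate_predictor(outcomes):
    """Simulate a single 2-bit saturating-counter branch predictor.

    Counter states 0-1 predict 'not taken'; states 2-3 predict 'taken'.
    `outcomes` is the sequence of actual branch results (True = taken).
    Returns the fraction of branches predicted correctly.
    """
    state = 2  # start at 'weakly taken'
    correct = 0
    for taken in outcomes:
        prediction = state >= 2
        if prediction == taken:
            correct += 1
        # Nudge the counter toward the actual outcome, saturating at 0 and 3.
        state = min(3, state + 1) if taken else max(0, state - 1)
    return correct / len(outcomes)

# A loop that iterates 99 times then exits: its backward branch is
# taken 99 times and not taken once - only the final exit mispredicts.
loop = [True] * 99 + [False]
print(simulate_predictor(loop))  # -> 0.99
```

The 2-bit counter (rather than just "last outcome") means a single anomalous branch doesn't flip the prediction, which is why nested loops also predict well.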
HStewart - Monday, January 15, 2018 - link
Do you realize that this threat is not just Intel? It affects other products too, and not just CPUs - ARM and AMD, and also GPUs. It just seems people pick on Intel more.
The real concern is the people who discover these things - are they doing a good thing by releasing technical information that could lead to viruses and such? It should be handled in confidence so hackers will not use the information.
Sliderazer - Wednesday, January 17, 2018 - link
Very good
digitaldoug - Friday, August 31, 2018 - link
I am a developer. I have worked on a project called "Judy arrays" for the last 18 years. Performance is a goal with a very high priority. My last release to the public was about a dozen years ago.
In 2014, we made some very significant progress due to the release of Ubuntu 14.xx and the Haswell processor.
My latest test system is an i7-6800K CPU running at 4 GHz with 128 GB of RAM at 3200 MHz / CAS 14. It had been running the Ubuntu 16.04 OS, with the BIOS updated to July 2017 (not current). I decided to update the OS to Ubuntu 18.04. The performance of Judy went down by one HALF! All attempts to counter the mitigations in 18.04 (pti, spectre_v2) did very little to help. The a.out's built on the 16.04 OS were even faster than those built on the 18.04 system. I have learned the hard way that any software or BIOS released after 2017 is "infected" with Meltdown/Spectre mitigations. The "immoral" part is that there seems to be no way to turn OFF these mitigations in any software or firmware published after 2017. I am really looking forward to a system that runs anywhere near as fast as the NOT upgraded 16.04 system. I believe that any comparisons made on a system that tried to turn off the Spectre/Meltdown mitigations will be very wrong. The Internet is rife with these types of comparisons. This leaves me to not trust ANY update to my system that measures performance. So Intel, are you being transparent and honest with your customers? A recipe for disaster? If you want, I would be happy to work with you to demonstrate/verify my findings; dougbaskins -at- yahoo.com
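For anyone trying to reproduce comparisons like the one above: kernels from the 18.04 era (Linux 4.15+) at least expose the active mitigation state through sysfs, so you can record exactly what was on during each benchmark run. A small sketch that reads that standard interface (it returns nothing on older kernels or non-Linux systems):

```python
from pathlib import Path

def read_vulnerabilities(base="/sys/devices/system/cpu/vulnerabilities"):
    """Return {vulnerability_name: mitigation_status} from the kernel's
    sysfs interface, or an empty dict if the interface is absent
    (pre-4.15 kernel, or not Linux)."""
    vuln_dir = Path(base)
    if not vuln_dir.is_dir():
        return {}
    return {f.name: f.read_text().strip() for f in sorted(vuln_dir.iterdir())}

# Print e.g. "meltdown: Mitigation: PTI" for each known vulnerability.
for name, status in read_vulnerabilities().items():
    print(f"{name}: {status}")
```

Logging this alongside each benchmark result makes before/after numbers interpretable; boot parameters such as `pti=off` and `nospectre_v2` existed on those kernels for selectively disabling mitigations, at your own risk.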