Intel chip kernel flaw requires OS-level fix that could impact macOS performance, report says

245 Comments

  • Reply 21 of 90
    Rayz2016 Posts: 6,957 member
    clexman said:
    k2kw said:
    It should be time for an A series based MacBook Air or an iOS laptop?
    Then a 30% slowdown on older hardware will be called a "Feature" and not a bug.
    Well, it would be a feature if the slowdowns prevented a crash that also happens on every other piece of hardware, regardless of age, made by everyone else. 
    watto_cobra
  • Reply 22 of 90
    GG1 Posts: 483 member
    Looks like AMD are not affected (https://www.theregister.co.uk/2018/01/02/intel_cpu_design_flaw/). Time to buy AMD stock. Or time for Apple to use AMD chips?
    watto_cobra
  • Reply 23 of 90
    welshdog Posts: 1,899 member
    dewme said:
    welshdog said:
    This seems like exceptionally bad news - for everyone. Just knowing this flaw exists, even without details, means the bad actors of the world will be working overtime to find an exploit.  And then they'll work to find an exploit to the fix.  This has got to be Intel's biggest screwup ever.  It would be nice to get some info eventually on what chips/machines are affected and by how much.  
    Yeah, this is bad news, like every one of the many thousands of major security issues that exist in all manner of products. Biggest screwup ever? Biggest this week ... but the week is less than half over. As debilitating as a 30% performance hit would be on tweaking a picture in Photoshop, I'm actually more concerned about this type of threat: https://www.wired.com/story/triton-malware-targets-industrial-safety-systems-in-the-middle-east/ . For those who don't know what a safety system is, it's an independent, isolated, and redundant control system that is put in place (at very great expense) to prevent a failure of the primary control system from leading to a catastrophic failure in an important system, like a power plant or refinery. In other words, the safety net that was put in place to prevent the worst-case scenarios in a plant was hacked to break the very system it was supposed to be protecting.

    The "bad actors" have already been working overtime for decades trying (and often succeeding) to exploit everything and anything that has any logic in it, from software, firmware, microcode, markup, macros, scripts, social media, humans, etc. Heck, there are professional-quality development toolkits freely available for anyone to download so you can discover your exploits in your spare time. Unannounced exploits are a worldwide unit of currency. Oh, and who is considered a "bad actor" is entirely relative and depends on who the actor is working for. Are the NSA, FBI, CIA, DOD, and the thousands of public and private companies working on behalf of government agencies, etc., "bad actors?" Depends on whose flag you fly, I guess.

    I don't want to sound like Chicken Little, but cybersecurity is a much greater threat than most lay people can comprehend or deal with at a personal level. It's an ongoing and existential threat that is the primary daily focus of hundreds of thousands of professionals in the US alone, and there are easily as many unfilled jobs as filled ones. The good news is that the previous US administration truly understood the cybersecurity threat from day one and at least got the ball rolling on doing something about it in an apolitical and highly cooperative way between the public and private sectors. I hope the current administration's war on science doesn't lead to regression on this very serious concern. Dealing with the fallout from cybersecurity incidents is simply the new normal today, and it will stay that way at least until managing it gets woven into the fabric of everyday life, like destructive weather, tsunamis, and earthquakes, so workarounds, mitigation, and compensation will be required, especially for legacy systems. Going forward, everything that has logic in it must be designed with cybersecurity in mind, and people must be aware and adapt as well.
    I called it Intel's worst screwup because it probably is. This is a vulnerability in the chip itself that potentially exposes the kernel and thus pretty much everything. The exploits and hacking efforts you listed are attempts to break/attack/exploit a system that is generally working properly. This flaw puts computer users at a disadvantage right out of the gate. So while I appreciate the info you provided, I think you kind of missed the point. If the bad actors (anyone engaged in general-purpose computer fuckery) get around the fixes, it's pretty hard to imagine a worse situation for every single Intel-based computer with this flaw. This would most likely include all the systems you mentioned, with the safety systems being the most critical. I don't think downplaying this flaw is wise.

    To me this is just another reason for some big thinking to be done regarding how computers and their related software work and are designed. As we have become utterly dependent on these machines, it seems pretty ridiculous that we are not able to create systems that are safe. The Russians are becoming specialists at entering and crippling power-distribution grids. Do we really want to wait and see what happens when they try something? Or even worse, what if someone steals their tech and shuts down a grid just for fun? Computers were once seen as a panacea for so many things, and to a degree they are. But now they are a huge risk to world stability and have become a digital albatross around the world's neck. I love computers and what they do, but we have completely blown it when it comes to deploying them responsibly and with restraint.
    Soli fastasleep racerhomie3 mobius tdknox philboogie
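A note on the 30% figure being debated above: a fix that separates kernel and user page tables adds work to every user-to-kernel transition, so syscall-heavy workloads pay the most while compute-bound tasks barely notice. As a rough, hedged sketch (not from the article, and assuming `os.getpid()` issues a real syscall on your platform), this times a cheap kernel round-trip; comparing the number before and after an OS update gives a ballpark for the per-syscall cost on a given machine:

```python
# Illustrative microbenchmark (assumption: os.getpid() is a real syscall on
# this platform): time many cheap kernel round-trips to estimate per-syscall
# cost. A page-table-isolation fix inflates exactly this number, which is why
# syscall-heavy workloads see big slowdowns while pure number-crunching does not.
import os
import time

def avg_syscall_seconds(n: int = 200_000) -> float:
    """Average wall-clock seconds per os.getpid() call."""
    start = time.perf_counter()
    for _ in range(n):
        os.getpid()
    return (time.perf_counter() - start) / n

if __name__ == "__main__":
    print(f"~{avg_syscall_seconds() * 1e9:.0f} ns per getpid() call")
```

The absolute number depends heavily on the interpreter and OS; only the before/after ratio is meaningful.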
  • Reply 24 of 90
    The most likely scenario is Apple getting sued (class action) by customers and Apple suing Intel for compensation. In the end the consumer is buying a Macintosh consisting of multiple components from multiple third-party vendors, so it doesn't make sense for customers to sue Intel directly.
  • Reply 25 of 90
    Rayz2016 Posts: 6,957 member
    rob53 said:
    If Intel has or can fix this HW issue then I will demand a replacement Mac for every Mac I own. Software won’t fix a HW issue that a hack can’t exploit. I see a class action lawsuit against Intel not Apple. 
      A class action against Intel, Apple, Microsoft, or whoever, will go nowhere. 

    https://readplaintext.com/why-can-t-you-sue-software-makers-for-bugs-1d7585132b3b

    Yeah, I know this is a hardware bug, but since it will be very difficult to prove physical injury or harm, I think the same rule applies. If you could sue companies for defects, then every tech company in existence (Apple included) would have gone out of business twenty years ago.

    Now, it's fine that the Register has come up with figures of a 30% performance hit, but bear in mind that the Register doesn't really know the internals of the products they're testing. I'm pretty sure that Microsoft can come up with optimisations to reduce the hit. And what about Apple? Is this something they can guard against by tweaking System Integrity Protection? (That's a genuine question, by the way.)
    edited January 2018 muthuk_vanalingam watto_cobra
  • Reply 26 of 90
    Latko Posts: 398 member
    Rayz2016 said:
    clexman said:
    k2kw said:
    It should be time for an A series based MacBook Air or an iOS laptop?
    Then a 30% slowdown on older hardware will be called a "Feature" and not a bug.
    Well, it would be a feature if the slowdowns prevented a crash that also happens on every other piece of hardware, regardless of age, made by everyone else. 
    Outside of our micro-cosmos, realize that Macs are big in numbers but relatively marginal in importance.
    Think of industrial / infra / nuclear consequences...
    edited January 2018 Soli
  • Reply 27 of 90
    Now this is kernel panic!
    Soli chia baconstang mdwychoff
  • Reply 28 of 90
    It's a good thing I spent all my money on an expensive iPad and a cheap AMD notebook :)

    Yes.  This is Intel’s biggest screwup ever.

    Apple needs to expedite A series development for laptops, etc.

    Processors have become so complicated that they've become buggy, just like software. The problem is that, unlike software, hardware flaws are difficult to patch.

    I wonder if Intel sees a future where x86 is dead?

    Is this why they bought Altera?

    Altera specializes in field-programmable gate arrays (FPGAs) which can be re-programmed to run new tasks. That agility is critical to the future of computing because it lets companies operating data centers (or wrangling millions of devices out in the field) re-purpose the brains of those devices on the fly instead of having to replace or upgrade...

  • Reply 29 of 90
    Soli Posts: 10,038 member
    2018 is off to an interesting start. There's also a zero-day exploit of a reported 15+ year-old flaw in the macOS kernel. You need physical access to the device, and the guy never reported it to Apple, so there's some question of ethics and risk at play.

    emoeller
  • Reply 30 of 90
    fastasleep Posts: 6,425 member
    welshdog said:
    Just knowing this flaw exists, even without details, means the bad actors of the world will be working overtime to find an exploit.  And then they'll work to find an exploit to the fix. 
    I'm sure the NSA already knows about it.
    welshdog
  • Reply 31 of 90
    tyler82 Posts: 1,105 member
    Just bought a new $3,200 Kaby Lake MacBook Pro a few months ago. Is this processor vulnerable? And if so, what are the chances of me getting a refund or the chip replaced by Apple?
    edited January 2018
  • Reply 32 of 90
    netmage Posts: 314 member
    This is bad but hardly as bad as the WPA2 issues recently or any network OS vulnerabilities. If bad actors are running code on your box, you’ve already lost the game - this is just another way they can exploit that. 
    fastasleep dewme watto_cobra
  • Reply 33 of 90
    Rayz2016 Posts: 6,957 member
    Latko said:
    Rayz2016 said:
    clexman said:
    k2kw said:
    It should be time for an A series based MacBook Air or an iOS laptop?
    Then a 30% slowdown on older hardware will be called a "Feature" and not a bug.
    Well, it would be a feature if the slowdowns prevented a crash that also happens on every other piece of hardware, regardless of age, made by everyone else. 
    Outside of our micro-cosmos, realize that Macs are big in numbers but relatively marginal in importance.
    Think of industrial / infra / nuclear consequences...
    You missed the point. He was talking about the battery thing again.
  • Reply 34 of 90
    asdasd Posts: 5,686 member
    Soli said:
    2018 is off to an interesting start. There's also a zero-day bug of a reported 15+ year flaw in macOS kernel. You need physical access to the device and the guy never reported it to Apple so there's some question of ethics and risk at play.

    He literally wrote the hack code and published it on GitHub.  
  • Reply 35 of 90
    wizard69 Posts: 13,377 member
    netrox said:
    Does anyone remember the Intel division bug?
    Sadly, not many. I never understood the free pass Intel gets with these failures. AMD screws up with a tiny problem and everybody and their brother wants blood from the company.

    Lucky for AMD, they are in a position to exploit this failure thanks to finally having very competitive hardware.
    watto_cobra
  • Reply 36 of 90
    wizard69 Posts: 13,377 member

    anome said:
    My first reaction is “Intel have a bug on a mass produced chip? This is news?”
    foggyhill said:
    In a just world this truly horrendous issue would crash Intel's stock, but it probably won't; only Apple seemingly gets any scrutiny for anything, even when it is trivial.
    Intel are too important, they’ll get propped up by their customers. Really makes you wish there was credible competition for processors like there used to be.

    I still don’t think Apple are ready to go with ARM for the desktop, but I bet the custom silicon lobby inside the company are using this to press their case.
    You need to look at AMD's new Ryzen Mobile chips. Very impressive performance for a low-power solution. I just bought an HP Envy with one and I can safely say that it rocks.

    The only real problem is that the laptop and processor combo are a bit bleeding edge. I noticed significant boosts in performance for some behaviors with the last MS update. Still waiting for a bootable Linux.
  • Reply 37 of 90
    wizard69 Posts: 13,377 member

    netrox said:
    Actually, we should sue Apple as well. Think of it: by suing Apple (and every other PC company), we can force their devices to be more modular and more accessible for upgrades/exchanges. There is really no reason to solder everything on a logic board, considering that it can be expensive if they are forced to replace all logic boards with soldered CPUs/RAM. The components keep getting smaller and thinner, yet they can easily be socketed. I cannot think of a reason why they should be soldered.
    Soldered-in components are massively more reliable than socketed components. Any decent engineer will strive to minimize interconnections; by their nature, sockets add many unneeded connections to a system.

    Beyond that, you are living in the past: the trend is to move more and more functionality on-die, into a chip stack, or into the package. We are already seeing this in high-performance GPUs. And of course cell phone chips are wafer stacks these days.
    watto_cobra
  • Reply 38 of 90
    So, what about OS X El Capitan on a mid-2009 MacBook Pro?
  • Reply 39 of 90
    GG1 Posts: 483 member
    wizard69 said:

    netrox said:
    Actually, we should sue Apple as well. Think of it: by suing Apple (and every other PC company), we can force their devices to be more modular and more accessible for upgrades/exchanges. There is really no reason to solder everything on a logic board, considering that it can be expensive if they are forced to replace all logic boards with soldered CPUs/RAM. The components keep getting smaller and thinner, yet they can easily be socketed. I cannot think of a reason why they should be soldered.
    Soldered-in components are massively more reliable than socketed components. Any decent engineer will strive to minimize interconnections; by their nature, sockets add many unneeded connections to a system.

    Beyond that, you are living in the past: the trend is to move more and more functionality on-die, into a chip stack, or into the package. We are already seeing this in high-performance GPUs. And of course cell phone chips are wafer stacks these days.
    +1 on soldered-component reliability. Every physical interconnection is a defect possibility during manufacture and a reliability risk over time, as connectors can work loose.

    Note Apple's first use of "2.5D" PC board construction in the iPhone X (the raised border around the perimeter, which is actually part of the PC board). This method eliminates multiple interconnect components and ribbon cables for much better reliability. From https://www.ifixit.com/Teardown/iPhone+X+Teardown/98975.


    danh fastasleep watto_cobra
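The soldered-versus-socketed reliability point can be put in rough numbers. As a back-of-envelope sketch (the contact counts and survival rates below are made-up assumptions, and failures are treated as independent), a chain of interconnects only works if every contact does:

```python
# Toy model (assumptions: independent failures, made-up rates): a signal path
# through n contacts, each surviving with probability r, works with probability
# r**n, so cutting contact count or raising per-contact reliability pays off
# multiplicatively.
def series_reliability(r_per_contact: float, n_contacts: int) -> float:
    """Probability the whole chain works, assuming independent contacts."""
    return r_per_contact ** n_contacts

# Hypothetical socketed part: 200 contacts at 99.99% each -> about 98.0%
socketed = series_reliability(0.9999, 200)
# Hypothetical soldered path: 20 joints at 99.999% each -> about 99.98%
soldered = series_reliability(0.99999, 20)
```

With those made-up figures the socketed path fails roughly a hundred times more often, which is the intuition behind "every physical interconnection is a defect possibility."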
  • Reply 40 of 90
    Class action lawsuit to follow to replace microprocessor or computer.