Intel aims beyond 5GHz for future MacBook Pro H-series processors

Posted in General Discussion, edited June 2020
The next generation of MacBook Pro could be equipped with processors capable of running faster than 5GHz, after Intel revealed at CES 2020 that it is planning to release a 10th-generation mobile processor capable of outpacing its existing 5GHz flagship component.

Revealed during a CES "sneak peek," Intel claimed the next batch of processors in its 10th-generation lineup will include a Core i7 H-series model with a high clock speed. While exact figures were not mentioned, Intel insisted the model will be capable of speeds above 5GHz, effectively making it the fastest of its mobile processors.

At present, Intel's fastest mobile processor is the Core i9-9980HK, which offers a boosted clock speed of 5GHz. Since the newly teased chip is only in the Core i7 range, Engadget reports, Intel is likely to surpass it again with a future Core i9 announcement.

At the same time, Intel mentioned the H-series processors will provide better scaling across 8 cores and 16 threads. This in turn should also help improve performance on top of the clock speed increases.

Given the performance claims, the unnamed H-series chip may make an appearance in the 2020 MacBook Pro. Current models already use H-series chips, with the highest clock speed available being the 2.4GHz ninth-generation Core i9 in the 16-inch MacBook Pro, which can run at up to 5GHz under Turbo Boost.

Intel is anticipated to provide more details about its future processor product lines during its news conference at 4 p.m. Eastern time on Monday.

Intel already has a number of other processors in its 10th-generation lineup for possible use in Apple notebooks beyond the MacBook Pro. Launched in August, four Y-series chips and four U-series processors are candidates for the MacBook Air, thanks to extremely low thermal design power ratings best suited to ultra-portable notebooks.

Comments

  • Reply 1 of 19
    cornchip Posts: 1,945 member
    oh no! not this whole thing again!
  • Reply 2 of 19
    rob53 Posts: 3,241 member
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon. Single CPU clock speed seems to still be important because too many applications are not programmed to make use of multiple cores. If developers spent more time getting away from single-thread programs, they could see their applications run faster by using more cores. Unless something has changed recently, Adobe products continue to use a single core for the majority of their consumer software. Non-Adobe products are using multiple cores, making them run a lot faster, especially on "slower" CPUs.
  • Reply 3 of 19
    jorjitop
    These had better be 7nm or they will be furnaces!
  • Reply 4 of 19
    Rayz2016 Posts: 6,957 member
    There is a galaxy-wide gap between what Intel says and what Intel delivers.
  • Reply 5 of 19
    canukstorm Posts: 2,694 member
    jorjitop said:
    These had better be 7nm or they will be furnaces!
    Intel's 7nm processors won't be coming until 2022
  • Reply 6 of 19
    "Intel also boasted the chips would consume a mere 1.21 gigawatts of power per second while running as cool as a million suns."
  • Reply 7 of 19
    cornchip Posts: 1,945 member
    "Intel also boasted the chips would consume a mere 1.21 gigawatts of power per second while running as cool as a million suns."

    oh man I actually LoLd
  • Reply 8 of 19
    sflocal Posts: 6,092 member
    It would be nice if Intel stopped resting on its laurels and started getting serious again, like AMD is right now.
  • Reply 9 of 19
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon. Single CPU clock speed seems to still be important because too many applications are not programmed to make use of multiple cores. If developers spent more time getting away from single-thread programs, they could see their applications run faster by using more cores. Unless something has changed recently, Adobe products continue to use a single core for the majority of their consumer software. Non-Adobe products are using multiple cores, making them run a lot faster, especially on "slower" CPUs.
    I’m inferring you’re not an experienced developer, if you’re a developer at all: very few applications can be made much more parallel, because their computations have a purely sequential flow. This is why single-core speed still matters, and why throwing more cores at a problem is fool’s gold, not to mention that adding more threads can often make a task take longer to actually accomplish due to communication overhead between threads.
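    The classic way to quantify this limit is Amdahl's law: if a fraction p of a program's work can be parallelized across n cores, the best possible speedup is

        speedup(n) = 1 / ((1 - p) + p / n)

    so even with unlimited cores the speedup never exceeds 1 / (1 - p). A program that is half sequential can never run more than twice as fast, no matter how many cores you throw at it.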
  • Reply 10 of 19
    cpsro Posts: 3,192 member
    I expect MBP CPUs will continue to throttle themselves until Intel gets off 14nm. MEANWHILE, Apple is at 7nm and was at 10nm 2-1/2 years ago.
  • Reply 11 of 19
    foljs Posts: 390 member
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips.


    Conspiracy theory BS. Apple will move or not move to its own chips on its own agenda. Whatever Intel says, they will evaluate on their own, and the most important criterion would be how ready Apple is for the transition (e.g. emulation for running existing Intel-compiled programs, the future roadmap, the different thermal and performance points covered by its own series, etc.).

    In any case, the high-speed, top-of-the-line MBPs that these CPUs are targeted at would be the very last to go ARM, even if Apple moves forward. We're talking 5+ years in the future.

    So hardly "FUD" and hardly aimed specifically at Apple. It's the same kind of presentation Intel has done for decades for its newer architectures.

    edited January 2020
  • Reply 12 of 19
    mdriftmeyer Posts: 7,503 member
    Apple is out of excuses not to switch to AMD now. And no, Apple isn't switching to ARM, and yes, they have been testing Zen in-house for over a year.
    edited January 2020
  • Reply 13 of 19
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon.
    They will come out before Apple comes out with their own chips, which will be ... never. Seriously, I wish people would give this up. It isn't their area of expertise. CPU and SoC design isn't easy. If it were, everyone would do it. Instead there are only two CPU companies on the planet, and there have been for decades - Intel and AMD, both of whom use the same base x86 design - and the number of SoC companies isn't much bigger (and again, they all start from the same base ARM Holdings design). Sun Microsystems, Motorola and IBM, who were all making CPUs or SoCs as recently as the 1990s? RIP. Also there is this whole "application compatibility" thing. Macs run on x86 just like Windows and Linux computers. Result: while you have to tweak for the different OSes and such, all "PC" applications are developed for the x86 instruction set. If Apple wants to use its ARM SoCs for MacBooks or develop a wholly new SoC/CPU, all those developers would have to port, rewrite or create from scratch their x86 applications, or Apple would have to emulate x86 (more on this later).

    Consider what Google does with Chromebooks. Nearly all Chromebooks run on x86, while nearly all Android apps are ARM. So Chromebooks run the Android apps on an ARM emulator. Were Apple to switch to either their own Ax chips or design completely new ones, they would likely have to offer x86 emulation too. Except this causes a performance hit. Not a big deal for Android apps on Chromebooks, as most Android apps are designed to be able to run on dual-core CPUs with 512 MB of RAM anyway. But for the desktop and productivity applications that people use on MacBook Pros? Yeah, that's a problem.

    And x86 is copyrighted. Where Google can emulate ARM on ChromeOS because they picked up ARM patents when they bought Motorola (plus some ARM stuff is open source anyway) Apple cannot emulate x86 on anything without paying either Intel or AMD a ton of cash. Microsoft is dealing with this right now ... they need to emulate x86 on Qualcomm's chips to get back into the mobile space ... and Intel has responded "fine ... write us a check for every device you sell." They would give Cupertino the same terms that they give Redmond.

    So while an A12 can certainly match the performance of the Intel CPUs in most MacBooks in theory - iMacs and Mac Pros not so much! - the x86 emulation would take a decent bite out of that performance. Add to that Apple needing to pay significant licensing fees to Intel - let's see you call THEM a patent troll! - PLUS the little issue that Apple would need to ensure their x86-on-ARM emulation isn't too similar to Microsoft's - Redmond will sue too if it does - and it doesn't make sense from a technology or business standpoint.
     
    Which is why Apple is never going to do it. And it's why the people who keep claiming that they will clearly have never taken so much as a high school computer architecture class.
    edited January 2020
  • Reply 14 of 19
    Rayz2016 Posts: 6,957 member
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon.
    They will come out before Apple comes out with their own chips, which will be ... never. [...] Which is why Apple is never going to do it. And it's why the people who keep claiming that they will clearly have never taken so much as a high school computer architecture class.
    This post contains so many inaccuracies that I thought it was a joke and that maybe you just forgot the /s at the end.

    But since someone has tagged it as informative, I thought it might be worth working through it, just in case you're serious.

    Let's start with where things started to fall apart: the beginning.

    Seriously, I wish people would give this up. It isn't their area of expertise. CPU and SoC design isn't easy. If it were, everyone would do it. Instead there are only two CPU companies on the planet, and there have been for decades - Intel and AMD, both of whom use the same base x86 design - and the number of SoC companies isn't much bigger (and again, they all start from the same base ARM Holdings design).

    Apple has been designing its own chips for years, and no, it isn't easy. They have succeeded in pulling more performance while using less power than just about every other mobile chip on the market. Apple was within striking distance of high-end laptops two years ago, and they haven't been standing still.

    https://appleinsider.com/articles/18/10/05/apples-a12-bionic-comes-close-to-desktop-cpu-performance-in-benchmarks
    https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets

    I suspect that your error comes from the belief that because the chips use the ARM instruction set, they must be using the same designs. This is incorrect. The chips that Apple uses are their own designs that happen to use the ARM instruction set. There is no real advantage in coming up with another instruction set, especially since Apple had more than a passing hand in the creation of ARM in the first place.

    Also there is this whole "application compatibility" thing. Macs run on x86 just like Windows and Linux computers. Result: while you have to tweak for the different OSes and such, all "PC" applications are developed for the x86 instruction set. If Apple wants to use its ARM SoCs for MacBooks or develop a wholly new SoC/CPU, all those developers would have to port, rewrite or create from scratch their x86 applications, or Apple would have to emulate x86 (more on this later).

    This is just so wrong on so many levels I'm not sure where to start. You seem to think that people are writing apps directly to the x86 instruction set. That is balderdash.

    Okay, here we go: software development from the beginning.

    Developers write apps using a high-level programming language.
    These days, this is usually translated into an intermediate, lower-level representation before being compiled into the runnable app.

    This is a gross over-simplification, but I don't have time to take you through the thirty years of software development innovation that you've missed. 
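
    To make those stages concrete, here's a minimal sketch using clang (the file name is made up, and any LLVM-based toolchain exposes the same steps):

    ```c
    /* pipeline.c -- one source file, lowered in two visible stages.
     *
     *   clang -S -emit-llvm pipeline.c -o pipeline.ll   # high-level C -> LLVM IR (the intermediate form)
     *   clang pipeline.ll -o pipeline                   # LLVM IR -> native executable for the host ISA
     */
    #include <stdio.h>

    int main(void) {
        printf("Compiled from C via an intermediate representation.\n");
        return 0;
    }
    ```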

    If you want to run an app on a different instruction set, then you compile it for a different platform. These days, you don't have to write it from scratch. If you did, then the likes of Adobe and Microsoft wouldn't bother, which is what Jobs discovered when he tried to tell them that if they wanted their apps to run on the newly-minted Mac OS X, they'd have to rewrite them all in Objective-C.

    Microsoft and Adobe said, "No, Steve, I don't think that's going to work for us."

    And so Apple came up with the Carbon layer, which let existing applications be brought across with relatively little rework rather than a full rewrite. This must've been quite a humbling experience for Cupertino, because they took their existing expertise with switching platforms and went hyper.

    Ever heard of the universal binary? When Apple was transitioning from PowerPC to x86, you could build an app from the same code base that contained the code for both PowerPC and x86. The same app could effectively run on the two chipsets. Now how do you think that would've worked if you had to "rewrite your whole application to run on x86"?
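
    The universal binary mechanics still exist in today's toolchain. A minimal sketch (using current architecture names for illustration; in the PowerPC transition the slices were ppc and i386):

    ```c
    /* universal.c -- one code base, multiple architecture "slices" in one file.
     *
     *   clang -arch x86_64 -arch arm64 universal.c -o universal   # build a fat binary
     *   lipo -info universal                                      # list the slices it contains
     */
    #include <stdio.h>

    int main(void) {
    #if defined(__x86_64__)
        puts("Running the x86_64 slice");
    #elif defined(__aarch64__)
        puts("Running the arm64 slice");
    #else
        puts("Running some other architecture");
    #endif
        return 0;
    }
    ```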

    For the past fifteen years, Apple has been coaxing (and none too gently) developers to write their apps to a language and frameworks that can be ported across platforms. Why? To make sure that they can move to another chip if they need to. If/when Apple decides to move to their own processor, then developers who have toed the company line will be able to move their applications to the new architecture with the flip of a compiler switch.

    Did I say flip of a compiler switch? My mistake.

    The application will be compiled to an intermediate bitcode.

    When a user buys the app from the App store, the bitcode will be compiled to whatever platform it's going to before being sent to the end user.
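
    For what it's worth, the embedding half of that flow is visible in the plain toolchain too; a rough sketch (the file names are examples):

    ```c
    /* bitcode_demo.c -- ship an intermediate form alongside the native code.
     *
     *   clang -fembed-bitcode bitcode_demo.c -o demo   # native slice plus embedded LLVM bitcode
     *
     * Whoever holds the embedded bitcode can later re-compile it for a
     * different target without needing the original source.
     */
    #include <stdio.h>

    int main(void) {
        printf("Shipped with LLVM bitcode embedded alongside the native code.\n");
        return 0;
    }
    ```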

    Okay, that's a lot of writing, but the upshot is this: No one who has followed Apple guidelines for the past fifteen years will have to rewrite their applications for a switch to ARM.

    So while an A12 can certainly match the performance of the Intel CPUs in most MacBooks in theory - iMacs and Mac Pros not so much! - the x86 emulation would take a decent bite out of that performance. Add to that Apple needing to pay significant licensing fees to Intel - let's see you call THEM a patent troll! - PLUS the little issue that Apple would need to ensure their x86-on-ARM emulation isn't too similar to Microsoft's - Redmond will sue too if it does - and it doesn't make sense from a technology or business standpoint.

    Jesus Henry Christ on a bicycle … 

    There would be no performance loss because there would be no need to emulate the x86. The applications would need recompiling, not rewriting from scratch.

    In the case of applications on the app store, I'm not even sure that they would need recompiling, since they're stored in an intermediate format that is compiled for the customer's machine when the app is downloaded.

    Something else to bear in mind: because Apple designs the chip, the supporting hardware (along with other chips that – yes – they design in house), the operating system and the development frameworks, they're in a position to optimise the whole experience, which will lessen the performance difference between x86 desktop and ARM even further.

    So, will Apple shift to their own processors across the board? Well, it's been talked about a lot, and last year even Intel engineers were telling people they expected the first sign of the transition to appear in 2020/2021:

    https://www.axios.com/apple-macbook-arm-chips-ea93c38a-d40a-4873-8de9-7727999c588c.html

    But if they don't, it won't be for any of the reasons in your post. Apple was stung badly when IBM started losing interest in designing a power-efficient PowerPC chip. Intel raced ahead, and Apple made sure that they would never again be in a position where they're trapped on an end-of-life architecture.

    Is Apple planning to move? It doesn't matter. What matters is that if they have to, they can.

    edited January 2020
  • Reply 15 of 19
    wizard69 Posts: 13,377 member
    mdriftmeyer said:
    Apple is out of excuses not to switch to AMD now. And no, Apple isn't switching to ARM, and yes, they have been testing Zen in-house for over a year.
    Intel seems to be looking to the past by focusing on high clock rates. That means long execution pipelines for a given process, which didn't do them any favors back then and won't in the future.

    As for AMD, I'm not sure what Apple's problem is with adopting their hardware. Ryzen is excellent even in the older forms that preceded today's releases. Excellent in a way that Apple should love, which is very good GPUs in their APU chips. That was with the initial Ryzens; now AMD leads on all fronts.

    Honestly, Apple's behavior with high prices and crappy performance has me really negative with respect to the Mac lineup. Things like M.2 SSDs are old hat these days in far cheaper machines. I'm not an all-in-one fan, but even if I were, the iMacs are truly pathetic for the price. In some cases you are buying 5-year-old tech at new-hardware prices. Macs, the iMac especially, have become huge rip-offs.
  • Reply 16 of 19
    wizard69 Posts: 13,377 member
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon.
    They will come out before Apple comes out with their own chips, which will be ... never. [...] Which is why Apple is never going to do it.
    This post is wrong in so many ways as to be a joke.   

    Apps these days are not developed for x86. They are developed for operating systems and the APIs they support. The development process tries to separate the programmer from the hardware as much as possible via high-level languages and abstractions. Very few wise developers these days target the processor directly.

    If you want to see how operating systems can operate on different hardware without major strain, just look at Linux. Many distros run on ARM, x86 and POWER without any drama. This is due to the item above, that is, the use of high-level languages and a focus on APIs.
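
    As a minimal illustration (the file name is invented), the exact same POSIX C source builds unchanged on x86, ARM, or POWER Linux, and only the reported machine string differs:

    ```c
    /* arch_probe.c -- identical source for x86, ARM, and POWER Linux hosts.
     *
     *   cc arch_probe.c -o arch_probe && ./arch_probe
     */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void) {
        struct utsname u;
        if (uname(&u) != 0) {
            perror("uname");
            return 1;
        }
        /* Prints e.g. "x86_64", "aarch64", or "ppc64le" depending on the host. */
        printf("Machine: %s\n", u.machine);
        return 0;
    }
    ```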

    As far as the development of ARM-based hardware for Macs by Apple goes, it is actually a really good idea. Why? Simple: it spreads the cost of chip development across multiple products. Today's A series is already good enough to run MacBook-like devices. Given a little more I/O, cache and thermal budget, you cover even more hardware. The design of the A series is already a good base for building higher-performance Mac-oriented processors.

    As for SPARC and the many other dying processor offerings, they died along with the companies offering them. The reasons are complex, but the big factor was the advent of the "buy Intel" mentality. Frankly, the last half decade was a wake-up call for people subscribing to that mentality. Buy-Intel thinking crushed innovation. This is perhaps the biggest advantage for Apple: their own hardware allows for innovation. Again, the A series is a very public demonstration of this advantage.
  • Reply 17 of 19
    mattinoz Posts: 2,299 member
    wizard69 said:
    This post is wrong in so many ways as to be a joke. [...] This is perhaps the biggest advantage for Apple: their own hardware allows for innovation. Again, the A series is a very public demonstration of this advantage.
    And Apple doesn’t have to pick a side to get an advantage from the A-series chip work they are doing for iDevices. We already see that work appearing in Macs with the T-series chips. I suspect we will see more and more platform work on the Mac being handled by Apple’s own hardware in future Macs.
  • Reply 18 of 19
    knowitall Posts: 1,648 member
    In a way this will be extraordinary, possibly violating the laws of thermodynamics by producing even more heat in watts than its electrical input.

    P.S. can't wait for the new A(rm)Mac; that will be truly insane.  
    edited January 2020
  • Reply 19 of 19
    spheric Posts: 2,544 member
    rob53 said:
    FUD from Intel to try and slow down Apple's migration to its own chips. Intel won't come out with these any time soon.
    They will come out before Apple comes out with their own chips, which will be ... never. Seriously, I wish people would give this up. It isn't their area of expertise. CPU and SoC design isn't easy. If it were, everyone would do it. Instead there are only two CPU companies on the planet, and there have been for decades - Intel and AMD, both of whom use the same base x86 design - and the number of SoC companies isn't much bigger (and again, they all start from the same base ARM Holdings design). Sun Microsystems, Motorola and IBM, who were all making CPUs or SoCs as recently as the 1990s? RIP.
    The rest of your post has been torn apart competently and succinctly already. 

    Just posting to point out that Apple is one of the world's two or three biggest designers of SoCs — they've been designing their own SoCs for iPhone, iPad, and Apple Watch, as well as the T1 and T2 subsystems on newer Macs, for years now.