Early 2021 Apple Silicon iMac said to have 'A14T' processor

124 Comments

  • Reply 61 of 85
    Rayz2016 Posts: 6,957 member
    cornchip said:
    I’m also curious at what point they decided to start building chips for the Mac. Was it way back at the beginning? My guess is that this was part of the plan all along or at least very early on, and why they went 64bit in a mobile. It caught everyone off guard, but I suspect they were already ramping up for Mac at that point.


    I’d say when they built their first 64-bit chip. Since then they’ve been refining the software stack and dragging their developer base (with some kicking and screaming) towards the future. 

    I imagine that some will be left behind; it happens with every transition, but it’s a small price to pay to secure the future of the Mac platform. 
  • Reply 64 of 85
    mjtomlin said:
    This rumor again. 

    There is no way Apple is going to use an A-series SoC in a Mac - not even a low-end MacBook. Those SoCs were designed specifically for iOS and need to be highly efficient. There's a reason Apple said they were designing a new family of SoCs for the Mac - different needs. Using an A14X for both the iPad Pro and a MacBook means one of those is not as optimized as Apple usually prefers.
    They will use an A-series chip regardless of name. That's what's in the developer kit. The Macs will get some kind of improved version of the A12Z, which is just an A-series chip. 

    And if rumors are true that Apple wants to go slimmer and lighter with its laptops, it makes total sense that the laptop would use the same SoC as an iPad. They would probably have about the same thickness, and thus the same heat constraints.  
  • Reply 65 of 85
    h4y3s said:
    A silicon atom is only about 0.21 nm, so a 5nm process might be the limit for a while!
    https://en.wikipedia.org/wiki/3_nm_process
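
    To put that in perspective (my own back-of-the-envelope arithmetic, not from the linked article; note that "5nm" is a marketing node name rather than a literal feature size):

    ```latex
    % Number of silicon atoms spanning a literal 5 nm feature,
    % at roughly 0.21 nm per atom:
    \[
    \frac{5\,\mathrm{nm}}{0.21\,\mathrm{nm/atom}} \approx 24\ \text{atoms}
    \]
    ```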
  • Reply 66 of 85
    elijahg said:
    I'm hoping Apple will still offer a discrete GPU; the Metal score for the Apple GPUs is crap compared to AMD's and Nvidia's GPUs. It might be fine for mobile games with graphics on par with the Xbox 360, but desktops have far exceeded that.
    Agreed - it might work for a MacBook Air user but pros need much more. I wonder if they’ll stick with ATI for this.
  • Reply 67 of 85
    jdw Posts: 1,324 member
    1.5nm is technically possible, so we're good for the next decade or so.  
  • Reply 68 of 85
    mjtomlin Posts: 2,673 member
    mikeinca said:
    mjtomlin said:
    This rumor again. 

    There is no way Apple is going to use an A-series SoC in a Mac - not even a low-end MacBook. Those SoCs were designed specifically for iOS and need to be highly efficient. There's a reason Apple said they were designing a new family of SoCs for the Mac - different needs. Using an A14X for both the iPad Pro and a MacBook means one of those is not as optimized as Apple usually prefers.
    They will use an A-series chip regardless of name. That's what's in the developer kit. The Macs will get some kind of improved version of the A12Z, which is just an A-series chip. 

    Umm, the original Intel transition developer kit shipped with a Pentium 4... the actual first Intel Macs shipped with newer Intel Core Solo and Core Duo CPUs. So that's not at all indicative of anything.
  • Reply 69 of 85
    mjtomlin Posts: 2,673 member
    melgross said:
    There’s a lot of confusion by what’s meant by a “new” series of chips just for Macs, which is what Apple said they were doing.

    the way some of you are writing here, you don’t seem to understand that Apple can have all of its chips based on the same cores, but be different chips. Neither AMD or Intel label their chips with x86 in the name of the particular chip itself. But they’re all x86 chips. Apple will be the same. These are not A14 chips and more than they were A6 chips. That’s just the generational indicator, and that’s how Apple labels it’s chips. So there can easily be four or even five A14 chips out there, all having different features. Don’t get caught up by the fact that they may all be called A14 something or other.

    x86 (and x86-64) are ISAs; they are not core designs. ARMv8 is an ISA as well, and yes, all of Apple's CPU cores are based on this ISA, but they are also all different designs. (Hell, they could even move to ARMv9 for their Mac SoCs.) If they weren't, Apple wouldn't need different names for each - a Thunder CPU core is a Thunder core regardless of what generation of SoC it's on. The same goes for every other logical unit on the SoC.

    Also, I am not getting "caught up" in the naming... some people are saying that Apple is going to use the exact same chip in both the Mac and the iPad, and that's where they're wrong; that's where Apple's statement about designing a new family of SoCs specifically for the Mac comes in. Because if they're just going to use the exact same A14 and A14X in the Mac, why bother making that statement? Furthermore, they also said they're making bigger and more performant GPU cores for the Mac. Those SoCs will support virtualization. They will also support Thunderbolt (PCIe). Not to mention they will need to support disparate RAM, as the RAM requirements for macOS are much different from those of iOS, and putting all that RAM onto the SoC is probably not reasonable. I have a feeling all Mac SoCs will have 8GB RAM "on chip", and the rest "off chip".

    And as far as naming goes... current Macs don't use an "A10T", they use a "T2", even though that SoC is just a variant of the A10. So it stands to reason that even though Mac SoCs will share a lot of the same logical units, there will be enough differences to use a different name - just to differentiate their use.


  • Reply 70 of 85
    razorpit Posts: 1,796 member
    I, for one, am not happy about this.  Moving away from x86 to something (anything) else will break sooooo much.  All of a sudden the ability to run Windows (at CPU speed) goes away, all the programs which rely on Wine stop working (or at least working well).  I liked the 6502 (and variants).  I liked the 68K series.  I liked the PPC series.  I (eventually) liked the x86 series.  But this change...  I just don't feel good about it.  That said, perhaps the Apple CPUs will be fast enough to make emulation tolerable (unlike the x86 emulators for the PPC!)...  Perhaps.
    While this is absolutely a concern of mine, do you know this for fact? Personally I’m going to wait until the product is available before I lose sleep over it.
    I, for one, am not happy about this.  Moving away from x86 to something (anything) else will break sooooo much.  All of a sudden the ability to run Windows (at CPU speed) goes away, all the programs which rely on Wine stop working (or at least working well).  I liked the 6502 (and variants).  I liked the 68K series.  I liked the PPC series.  I (eventually) liked the x86 series.  But this change...  I just don't feel good about it.  That said, perhaps the Apple CPUs will be fast enough to make emulation tolerable (unlike the x86 emulators for the PPC!)...  Perhaps.
    Apple actually demonstrated its Rosetta 2 emulation with an x86 3D game (Shadow of the Tomb Raider) at its June WWDC keynote. Did you see it? Wasn't it fast enough for you? https://www.reddit.com/r/macgaming/comments/hdzdo8/shadow_of_the_tomb_raider_running_on_rosetta_2/

    I'm also wondering if you are assuming that Microsoft has no intention of introducing a native Apple Silicon version of Windows. Would that mitigate your concern, since Windows developers would be able to recompile their apps for an Apple Silicon version of Windows, thereby providing "native" support? Microsoft's official line on future OSes for Apple hardware is "we have nothing more to say at this time." To me, that's encouraging, as it suggests they are still considering it.
    The key is getting app developers to recompile. I don’t see it happening for any of the Windows apps I’m using now. The whole reason for having Parallels on my machine is to run applications that aren’t available on the Mac. I doubt they’ll use their resources to build a customized version of their application for a customized version of Windows.
  • Reply 71 of 85
    I, for one, am not happy about this.  Moving away from x86 to something (anything) else will break sooooo much.  All of a sudden the ability to run Windows (at CPU speed) goes away, all the programs which rely on Wine stop working (or at least working well).  I liked the 6502 (and variants).  I liked the 68K series.  I liked the PPC series.  I (eventually) liked the x86 series.  But this change...  I just don't feel good about it.  That said, perhaps the Apple CPUs will be fast enough to make emulation tolerable (unlike the x86 emulators for the PPC!)...  Perhaps.
    Well, skepticism—applied systematically—will presumably eliminate one’s risk of experiencing disappointment!
  • Reply 72 of 85
    elijahg Posts: 2,753 member
    darkvader said:
    melgross said:
    It used to be, a long time ago, that Apple upgraded its Macs every quarter as slightly faster chips came out. Then it was 6 months, then once a year. It stayed that way, along with pretty much every other computer manufacturer, until Intel had problems with new chips that Apple was designing around. So Apple slowed its iterations to match the chips. They’re not happy about that.

    A lie, of course.  When Apple switched to Intel chips, there was a great deal of promise that Apple would, since Macs were essentially Intel reference designs, be moving to a more frequent release schedule.

    It never happened.  Intel released chips, Apple ignored them.  Intel released another generation of chips, Apple ignored them.  Intel kept releasing, Apple kept ignoring.  Apple would occasionally drop some almost out of date hardware, but Apple never lived up to the promise of frequent releases based on new Intel chips.


    This is unfortunately true. Initially Apple was getting special Intel chips ahead of time, but it gradually fell back to parity and then lagged behind. Apple seemed to be very distracted by the iPhone, unfortunately making the Mac a bit of an afterthought. It's reasonably well known that Cook doesn't really like the Mac, as it's not a massive money-spinner like the iPhone.
  • Reply 73 of 85
    elijahg Posts: 2,753 member
    rob53 said:
    mattinoz said:
    rob53 said:
    I know Apple has been moving towards an almost sealed computer enclosure for some time, but I have to wonder if they will continue this process or, hopefully, allow the use of SoC sockets, at least in the beginning, so newer, faster, more powerful SoCs can be added to what might be a basic logic board that only contains IO, WiFi, and ??? circuits and logic chips. If almost everything of value is in/on the SoC, then making this component removable, interchangeable, and able to be secured in a safe when not in use would be an interesting design change, especially for corporate and government users but also for the rest of us.

    Could this even be done? Looking at the iPhone 12, the SoC takes up a large portion of one side of the logic board, while all the other logic chips are associated with support features. I could see many of these staying reasonably consistent on future Macs. Apple could include important chips on the same SoC socket (or two), thereby allowing for updates without replacing everything.


    Been wondering the same. Also, why stop at two?
    They could have "compute" daughterboards that could be mass-produced with their own RAM; then different levels of Macs get 1, 2, ..., 8+ sockets, plus an on-board platform processor to handle all the common functions and the PCIe network that links it all together. That leaves room for other processors hooked into the network.

    Apple gets to make lots of the same processors in the same bundles that can be tested, and build 2-3 bins of modules to sell at higher price points.

    Still leaves the question: if they do go this way, can your iPhone, Mac, and iPad connect as a group of ePUs to the device you're driving right now?

    Sounds good to me. iPhone logic boards continue to get smaller. Strip all the cellular stuff from the board and you'll have a very small computing blade. To keep everything in a thin package, the boards could be arranged side by side. A MBP could get away with 2-3 boards, which would end up being ~3"x4". There's plenty of room in a MBP and iMac for several boards; just give us a trap door to get to them.
    Considering Apple does its utmost to remove access panels and the like, I think whilst this would be an awesome idea, it's not the way it's going to go with Cook at the helm. It would mean people spending $200 on a blade rather than forcing them to spend $2000 on a new machine. Why would he ever allow that? He only allows RAM upgrades in the 27" iMacs and MP.
  • Reply 74 of 85
    elijahg Posts: 2,753 member
    cornchip said:
    Quick question from a dummy.

    As I understand it, the Intel chips i3, i5, i7, and i9 are essentially the same chip, with the “lower grade” chips “simply” being the ones which tested with flaws, underperformed, or had cores not functioning. 

    Is this true? And if so, it would stand to reason that Apple's chips face similar production yields. With Apple’s insane volumes, how does this play out in terms of what is shipped in iPhones and now Macs? 
    You are correct. The better chips tend to come from the centre of the wafer, the cheaper ones out toward the edges. 

    Within the same family they're all the same chip, but they literally laser off the bits of the CPU that aren't working - or sometimes disable them in microcode, which is essentially the chip's firmware - and then sell it as a lower-end ix chip. Sometimes the chip is fully functional but only at lower clocks, so they sell it as a lower-end part too. As the production process improves, they end up crippling perfectly functional high-end CPUs to sell as the higher-volume i3s and i5s.
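
    A toy sketch of how that binning works (purely illustrative - the SKU names, core counts, and clock thresholds here are invented for the example, not Intel's actual criteria):

    ```python
    # Toy model of CPU binning (illustrative only: the SKU names,
    # core counts, and clock thresholds are invented for this sketch).
    import random
    from collections import Counter

    def bin_die(working_cores: int, max_stable_ghz: float) -> str:
        """Assign a tested die to a SKU based on what survived fabrication."""
        if working_cores >= 8 and max_stable_ghz >= 5.0:
            return "i9"     # everything works and clocks high
        if working_cores >= 6 and max_stable_ghz >= 4.6:
            return "i7"     # most cores work, slightly lower clocks
        if working_cores >= 4:
            return "i5"     # defective cores lasered or fused off
        if working_cores >= 2:
            return "i3"     # heavily cut down but still sellable
        return "scrap"      # too defective to ship

    # Simulate testing 100 dies from one wafer.
    random.seed(0)
    dies = [(random.choice([2, 4, 6, 8, 8, 8]),      # cores that passed test
             round(random.uniform(4.2, 5.2), 1))     # max stable clock, GHz
            for _ in range(100)]

    print(Counter(bin_die(cores, ghz) for cores, ghz in dies))
    ```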

    My 2019 iMac's MMU (on the CPU die) failed, which was probably because it was a very early i9, produced before they perfected the lithography (printing) for the i9. 

    I can't remember the specifics, but the Apple TV was using an iPhone A-series CPU with one or more of the cores disabled - probably just rejected iPhone CPUs. It's unlikely Apple will use iPhone CPUs in the mid-to-high-end Macs, though I wouldn't be surprised if the A14X ends up in the iPad and the MacBook Air or equivalent.
  • Reply 75 of 85
    elijahg Posts: 2,753 member
    cornchip said:
    I’m also curious at what point they decided to start building chips for the Mac. Was it way back at the beginning? My guess is that this was part of the plan all along or at least very early on, and why they went 64bit in a mobile. It caught everyone off guard, but I suspect they were already ramping up for Mac at that point.

    edited for cohesion.
    Apple used to produce ASICs (Application Specific Integrated Circuits) for their Macs way back in the early '90s, for SCSI controllers and the like. They also had input into the CPUs Motorola and IBM made, back when the AIM (Apple-IBM-Motorola) alliance was still alive, but I imagine ever since Apple acquired PA Semi they've had 100% Apple-designed chips for Macs in mind. They of course had to reach parity with Intel for this to happen. I suspect the T1/T2 chips were somewhat of an experiment to see how well they could interface standard PC parts (i.e. what a Mac is now) with the ARM architecture, which also shows it wouldn't be impossible to have a hybrid Intel/ASi Mac. 
  • Reply 76 of 85
    elijahg Posts: 2,753 member
    elijahg said:
    I'm hoping Apple will still offer a discrete GPU; the Metal score for the Apple GPUs is crap compared to AMD's and Nvidia's GPUs. It might be fine for mobile games with graphics on par with the Xbox 360, but desktops have far exceeded that.
    Agreed - it might work for a MacBook Air user but pros need much more. I wonder if they’ll stick with ATI for this.
    I hope they stop being so childish and use Nvidia. Whilst AMD/ATI has improved with their new RDNA architecture, Nvidia's GPUs are still ahead. AMD's previous GCN arch was geared toward GPU compute, which never really took off, and all the unused compute silicon made it sluggish and power-hungry for graphics. Also, the drivers for AMD GPUs are written by Apple, and they're rarely updated and not great. I get a significantly better GPU score in Windows on the same machine than I do on macOS. Nvidia drivers were always much better, but Apple has blocked Nvidia's requests to sign their drivers (surely another antitrust suit waiting to happen there). My 2012 iMac used to get Nvidia driver updates right up to 2018, until Mojave, when Apple required signed drivers. I used to get about the same GPU score in Windows as macOS on that iMac.

    Considering Apple was the first to have a commercially available GUI, Macs being graphics-oriented, the first to have a GPU-accelerated UI, and at one point had reasonably tight integration with Pixar, you'd think they'd want to push graphics. But they just don't care about GPUs; it's really weird. I get the impression they're embarrassed that the App Store makes so much from games. 
  • Reply 77 of 85
    georgie01 said:
    I wonder what the upgrade cycle will be for Macs with Apple processors, and how those processors will compare to mobile processors. If a Mac uses an A14 variant and then an iPhone/iPad is upgraded to an A15, will the Macs then be slower than iPhones, and will some Mac users feel the need to upgrade every year to have the fastest available? What Mac user is going to feel good about their Mac being slower than the next iPhone that comes out?

    Or will the Mac variants of the processors be enough faster that an iPhone/iPad using the next generation will still be slower?

    They will probably still be faster, I would suspect, because the fans inside the Macs can keep them cooler than fanless hardware.
  • Reply 78 of 85
    melgross Posts: 33,510 member
    flydog said:
    h4y3s said:
    A silicon atom is only about 0.21 nm, so a 5nm process might be the limit for a while!
    Based on what?  The fact that the two numbers seem close together?

    3nm chips have already been produced, and TSMC, Intel, and Samsung have plants at various stages of construction. There are also plans for 2nm production within a few years. 
    The problem is that we’re reaching the limit. That’s a fact the industry knows. Can they produce 1nm chips, 0.7nm chips, even 0.5nm chips? It’s possible, but will there be any advantage? Not likely. Right now, with 5nm, the advantages over 7nm are less than the advantages of 7nm over 10nm. And the gain of 10nm over 14nm was less than that of 14nm over 22nm. This goes all the way back to 90nm, where all the heat problems began, which ended the MHz race.

    But manufacturers need to have “advances”. Look at the publicity over Intel’s attempts at 10nm, and their problems with 14nm before that. The problem with moving down nodes these days is that manufacturers are using more and more “relaxed”, as they call it, definitions of what a node actually means. Intel got into trouble because they refused, for several years, to relax first the 14nm and then the 10nm specs. They went full-on 10nm, and finally had to give up some of the advanced specs, though they kept a lot more than TSMC and Samsung.

    This is just getting worse as we go down more nodes. What will 3nm actually be? Much less efficiency gain per transistor. We know that, because that’s been happening since 14nm. More heat in a smaller spot. Less frequency increase, etc. It’s already amazing they could get to a hybrid 5nm. The 3nm will be even more hybrid. 
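
    To put a rough number on those diminishing returns (a back-of-the-envelope illustration that treats node names as literal linear dimensions, which, per the above, they no longer are):

    ```latex
    % If "7nm" and "5nm" were literal linear feature sizes, transistor
    % density would scale with the square of the linear shrink:
    \[
    \frac{\text{density at 5nm}}{\text{density at 7nm}}
      \approx \left(\frac{7}{5}\right)^{2} \approx 1.96
    \]
    % Foundries' reported logic-density gains for recent transitions have
    % generally come in below this square-law ideal - the "relaxed
    % definitions" effect described above.
    ```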
  • Reply 79 of 85
    melgross Posts: 33,510 member

    I, for one, am not happy about this.  Moving away from x86 to something (anything) else will break sooooo much.  All of a sudden the ability to run Windows (at CPU speed) goes away, all the programs which rely on Wine stop working (or at least working well).  I liked the 6502 (and variants).  I liked the 68K series.  I liked the PPC series.  I (eventually) liked the x86 series.  But this change...  I just don't feel good about it.  That said, perhaps the Apple CPUs will be fast enough to make emulation tolerable (unlike the x86 emulators for the PPC!)...  Perhaps.
    Apple actually demonstrated its Rosetta 2 emulation with an x86 3D game (Shadow of the Tomb Raider) at its June WWDC keynote. Did you see it? Wasn't it fast enough for you? https://www.reddit.com/r/macgaming/comments/hdzdo8/shadow_of_the_tomb_raider_running_on_rosetta_2/

    I'm also wondering if you are assuming that Microsoft has no intention of introducing a native Apple Silicon version of Windows. Would that mitigate your concern, since Windows developers would be able to recompile their apps for an Apple Silicon version of Windows, thereby providing "native" support? Microsoft's official line on future OSes for Apple hardware is "we have nothing more to say at this time." To me, that's encouraging, as it suggests they are still considering it.
    I doubt Microsoft would release a version of Windows just for Apple Silicon. They’re having enough trouble with what they’re doing now for ARM. The number of Mac users interested would be far too small.
  • Reply 80 of 85
    melgross Posts: 33,510 member
    h4y3s said:
    A silicon atom is only about 0.21 nm, so a 5nm process might be the limit for a while!
    Yes, but there's also the issue of how those atomic spheres are packed. It's like packing baseballs in a box: there are different ways to pack them.
    Silicon has a crystalline structure. That’s about it. You can “strain” the silicon, or dope it, but you can’t change the basic structure much. And it’s not just silicon. The biggest problem is the copper and other materials needed. There’s only so much that can be done. At these small sizes there is the volume problem, and the distance problem. That is, a structure has more boundary relative to volume as the size goes down.

    The difficulty here is quantum mechanics - the Uncertainty Principle. Electrons traveling through the circuitry have a certain chance of leaving the circuit altogether. The narrower the fabrication feature, the closer to the edge any given electron is, and the more likely it is to tunnel through the insulation and end up somewhere else, where it isn’t wanted. There are diodes (tunnel diodes) and other electronic devices that use this effect deliberately. But we don’t want it in a chip.
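
    For the curious, the textbook toy model makes that size dependence explicit (a rectangular-barrier sketch, not a model of any specific process):

    ```latex
    % Probability that an electron of energy E tunnels through an
    % insulating barrier of height V > E and thickness d
    % (thick-barrier / WKB approximation):
    \[
    T \approx e^{-2\kappa d}, \qquad
    \kappa = \frac{\sqrt{2m\,(V - E)}}{\hbar}
    \]
    % T grows exponentially as d shrinks, so ever-thinner insulation
    % between features leaks exponentially more current.
    ```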