Apple's next-gen 'A13' iPhone and iPad chipset will remain 7nm


Comments

  • Reply 21 of 38
    claire1 Posts: 510, unconfirmed member
    Whatever they do, I hope they differentiate it as a relevant big-screen tool.

    iPad seems to get thrown on the back burner for being such a huge tool.
  • Reply 22 of 38
    wizard69 Posts: 13,377, member

    melgross said:
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    You Update your ipad every year?! Really? Isn’t that a bit excessive? To each their own i guess. But I use my ipad for professional artistic purposes, and even I don’t feel the need to update mine every year. I think Apple is moving to longer update cycles for the ipads for that very reason. Unlike iphones, most people don’t feel the need to update their ipads as often. So you may have to get used to that. 
    It’s only excessive to those who don’t do it. My daughter does too. For us, it’s worth while. For others, maybe not. But we all update our phones every other year, and get new 
    Macs, when available, between every three or four years. I didn’t intend to replace my gen 2 Apple Watch with the 3 LTE this year, but since my daughter wanted one, I went ahead with it.

    we’re fortunate that we can afford to do this, and I understand that not everyone can. But it’s a very big mistake to think that extending release dates is good. It’s not. It’s very bad. People want a new model, and when it doesn’t arrive, Apple loses sales, sometimes forever. Never think that because you don’t want a new one now, that others, who don’t have it at all yet, won’t want it now, but won’t buy it if it hasn’t been updated for over a year. This is the widely known problem with the Mac. Apple used to update them four times a year, every time a slightly newer version of a CPU came out. Then it went to two times, then one time, and now, we don’t know when they’ll be updated. That’s very bad.

    we need to have regular upgrades that can be counted on, as it was in the last. What they are doing will just kill sales.
    I have to agree 100% here. While I don't upgrade my iPad every year, right now I'm in the market for a new iPad. However, I won't buy one that is as outdated as the current models. I'm also a bit pissed off over the unwarranted price increases. In any event, Apple isn't meeting expectations here; at the rate we are going, the iPad will suffer like the Mini. By the way, a Mini was on my list of wants some time ago, but Apple has totally screwed that up.

    In any event, I'm kind of hoping that Apple has something up its sleeve here that leads to a bigger-than-expected performance win. One thought is an "A11½X", that is, an A11X built on 7nm tech. A simple half step like this (a process shrink) could be of huge benefit for Apple, giving them a very significant jump in performance without waiting an entire year.

    Another possibility that has been rumored on and off is a bigger split between the cell phone architecture and the processors for the more capable devices. This would be a processor that better fits the needs of the iPad and scales to fit the needs of notebooks and such. Such a processor needs an architecture that better supports the required high-speed ports and GPU requirements. Call these the "B" series for now.

    In any event, I completely understand your concerns. Apple has so screwed up their product lines and release schedules that, after a decade of being on the Apple bandwagon, I went out and purchased a non-Apple computer in January. Frankly, I got a machine with a modern processor and a host of other features, including a working keyboard.
    Alex1N
  • Reply 23 of 38
    wizard69 Posts: 13,377, member
    melgross said:

    mjtomlin said:
    melgross said:
    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.

    You don't know the performance of the upcoming A12, it may in fact not be as powerful as an "A11X". You also don't know if Apple has decided to develop the A12 and A12X in tandem and plan on releasing new iPad Pros along side new iPhones in the Fall.

    However, the latest iPad Pros were released in June of last year at WWDC, so it would not be that unreasonable to have an event in July with an A11X update to the iPad Pro.
    You’re right, we don’t know. But that doesn’t mean that there isn’t logic to this that we can apply, and that’s what I try to do. But if Apple adds another 30% performance per core, then an A12 will be as powerful as an A11X, with the extra core. Generally, the latest GPU for the iPhone is about as powerful as the previous GPU for the iPad, which has twice as many cores.

    that’s what we/ve been seeing for several years. Will Apple change it up? Maybe. But why? The best situation is what we see now. The iPad version has more cores and slightly higher clock speed. It’s about 35% better in performance than the iPhone version for the cpu, and about 50% better for the GPU. You can’t just increase the speed by 30-40%. Power draw and heat will go up by about 50% if you do, but performance will just rise by about 25%. That’s a truly lousy trade off.

    the reason why I feel uncomfortable about a July release with an older chip design is that the new phones are just around the corner. Then iPad performance will be compared to the iPhone performance, and will be seen as wanting. Whether some want to believe it or not, while the phone are compared to other phones, the iPad Pro is compared to notebooks. Performance needs to be enhanced as much as possible.
    I agree with your concerns about performance, but there is another performance driver beyond the iPad: the rumored ARM-based Macs, or possibly entirely new devices. To do this sort of jump really well, I believe Apple will need a new series of processors - not so much to address CPU core performance, but rather to be able to move data around much faster. This means larger caches, wider data paths, and faster I/O channels. All the APU-style chips out there suffer from bandwidth issues, and the A series is no exception. So I can see a split happening where cell phone chips become one line and iPad and laptop chips another. For example, imagine an iPad chip with 4 to 8GB of HBM-type memory in the package.

    Of course, I can imagine all sorts of things. The fact that we are doing so now just highlights how screwed up product delivery is at Apple these days. It reminds me of the years after the Mac Plus, when they couldn't update the Mac on a decent schedule.
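
    A rough sketch of the clock-versus-power trade-off melgross describes in the quote above (hypothetical figures, using the common rule of thumb that dynamic power scales with frequency times voltage squared; the specific percentages are estimates, not measured data):

    ```python
    # Back-of-envelope for why chasing clock speed alone is a poor trade-off.
    # Rule of thumb: dynamic power ~ f * V^2, and holding a higher frequency
    # usually needs some extra voltage.  Illustrative numbers only.

    freq_gain = 1.35       # try for a 35% clock bump
    voltage_bump = 1.08    # assume ~8% more voltage is needed to hold that clock
    perf_gain = 1.25       # real-world speedup is sublinear (memory stalls, etc.)

    power_gain = freq_gain * voltage_bump ** 2
    print(f"Power/heat increase: ~{(power_gain - 1) * 100:.0f}%")   # ~57%
    print(f"Performance increase: ~{(perf_gain - 1) * 100:.0f}%")   # ~25%
    ```
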
    Alex1N
  • Reply 24 of 38
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    I’m sure Apple appreciates your “concern,” but absolutely no one expected new iPad Pro models in March. They were last updated in June 2017, and Apple has no intention of compressing product life cycles on the iPad Pro to 9 months. 

    The 12.9” went 19 months before being updated last June, the 9.7/10.5 was 15 months. 
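
    For what it's worth, those month counts check out if you assume the commonly cited launch dates (my assumption, not stated in the post: November 2015 for the 12.9" iPad Pro and March 2016 for the 9.7"):

    ```python
    # Quick check of the update gaps quoted above, using assumed launch months.
    from datetime import date

    def months_between(a: date, b: date) -> int:
        return (b.year - a.year) * 12 + (b.month - a.month)

    print(months_between(date(2015, 11, 1), date(2017, 6, 1)))  # 19 months (12.9")
    print(months_between(date(2016, 3, 1), date(2017, 6, 1)))   # 15 months (9.7"/10.5")
    ```
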
  • Reply 25 of 38
    melgross Posts: 33,510, member
    hodar said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    Having worked in the semiconductor industry over 15 years ago; A LOT has changed.  But, one of the major problems that always got worse with each die shrink, was the Lithography steps.  I can't imagine how you can mask your layers, with only 5nm geometries.  After awhile, physics simply stops you.
    For years, there's been doubt that going smaller than 7nm was possible - from extreme ultraviolet lithography, to fabrication, to the physics of the operation of the chip, none of it was considered doable. We've been seeing double masking, then quadruple masking, and confocal lenses. The light sources are as big as a fair-sized room now.

    When we look at a magnified chip, we can see how uneven the line thicknesses are, with bumps all along the length. That doesn't matter when lines are thicker, but when the bumps are an appreciable portion of the line width, it causes problems. With the atoms used in chips being about 0.5nm in diameter, a line is just about 10 atoms wide, though practically, because of bonding forces, it's more like 20. Still, what's happening is that the edge of the line is now very close to the line center. That means that a lot of electrons are near the edge where, because of the uncertainty principle, they wander out of the line, through the insulator, and into another line where they don't belong.

    This causes parasitic capacitance, screws around with the inductance and resistance, and results in large losses and interference. The features of the transistors are squeezed into smaller areas, and some of the clever methods used to get around it, such as FinFET, don't work well, or at all, at 5nm, though TSMC is claiming to be using it for 5nm.

    And worse, we're being told about 3nm! I find that to be science fiction. I just can't wait until some company claims to be using it.
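
    A quick back-of-envelope version of the atom-counting argument above (illustrative numbers only; the 0.5nm atomic diameter and the two-atom edge bumps are rough assumptions from the post, not measured values):

    ```python
    # Rough arithmetic for how many atoms wide a "5nm-class" line is,
    # and what fraction of the line a couple of atoms of edge roughness eats up.
    # Illustrative assumptions only; real pitches and lattice spacings differ.

    atom_diameter_nm = 0.5      # assumed effective atom diameter
    line_width_nm = 5.0         # nominal "5nm" feature size
    edge_roughness_atoms = 2    # assumed bumps of ~2 atoms on each edge

    atoms_across = line_width_nm / atom_diameter_nm
    roughness_nm = edge_roughness_atoms * atom_diameter_nm
    roughness_fraction = 2 * roughness_nm / line_width_nm  # both edges

    print(f"Atoms across the line: ~{atoms_across:.0f}")
    print(f"Edge roughness per side: ~{roughness_nm} nm")
    print(f"Fraction of line width lost to roughness: ~{roughness_fraction:.0%}")
    ```
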
    edited June 2018 Alex1N
  • Reply 26 of 38
    mcdave Posts: 1,927, member
    zimmie said:
    I would love for new iPads to have A12X chips, but suspect A11Xs are more likely. I don't recall Apple ever releasing iPads with brand new generation chips. They usually have whatever the iPhone had last. Presumably because yields aren't high enough for lower-margin products.
    The A4 was in the original iPad first, then it was used in iPhone 4 when it was launched a few months later. The A5 hit iPad 2 first, then iPhone 4S a few months later. iPads started getting X variants with the A5X (third-generation iPad) and lagging behind the A# base variant with the A6 (A6 in iPhone 5 in 2012-09, A6X in iPad 4 in 2012-10).
    Odd, my recollection was that the iPad 2 had the A5X already. At the time it didn't make sense, as it had no Retina display to drive, making me think Retina came a year later than expected.
  • Reply 27 of 38
    melgross Posts: 33,510, member
    wizard69 said:

    melgross said:
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    You Update your ipad every year?! Really? Isn’t that a bit excessive? To each their own i guess. But I use my ipad for professional artistic purposes, and even I don’t feel the need to update mine every year. I think Apple is moving to longer update cycles for the ipads for that very reason. Unlike iphones, most people don’t feel the need to update their ipads as often. So you may have to get used to that. 
    It’s only excessive to those who don’t do it. My daughter does too. For us, it’s worth while. For others, maybe not. But we all update our phones every other year, and get new 
    Macs, when available, between every three or four years. I didn’t intend to replace my gen 2 Apple Watch with the 3 LTE this year, but since my daughter wanted one, I went ahead with it.

    we’re fortunate that we can afford to do this, and I understand that not everyone can. But it’s a very big mistake to think that extending release dates is good. It’s not. It’s very bad. People want a new model, and when it doesn’t arrive, Apple loses sales, sometimes forever. Never think that because you don’t want a new one now, that others, who don’t have it at all yet, won’t want it now, but won’t buy it if it hasn’t been updated for over a year. This is the widely known problem with the Mac. Apple used to update them four times a year, every time a slightly newer version of a CPU came out. Then it went to two times, then one time, and now, we don’t know when they’ll be updated. That’s very bad.

    we need to have regular upgrades that can be counted on, as it was in the last. What they are doing will just kill sales.
    I have to agree 100% here.  While i dont upgrade my iPad every year right now im in the market for a new ipad.   However i wont buy one that is so outdated like the current models.  Im also a bit pissed of over the unwarranted price increases. In any event Apple isnt meeting expectations here, at the rate we are going iPad will suffer like the Mini.  By the way a Mini was on my list of wants sometime ago but Apple has totally screwed that up.  

    In any event im kinda hoping that Apple has something up its sleeves here that leads to a bigger than expected performance win.  One thought is an A11&1/2X, that is an A11X built on 7nm tech.  A simple half step like this (a process shrink) could be of huge benefit for Apple giving them a very significant jump in performance without waiting an entire year.   

    Another possibility that has been rumored on and off is a bigger split between the cell phone architecture and the processors for the more capable devices.  This would be a processor that better fits the needs of iPad and scales to fit the needs if notebooks and such.   Such a processor need an architecture that better supports the needed high speed ports and GPU requirements.  Call these the "B" series for now.  

    In any event i completely understand your concerns.   Apple has so screwed up their product lines and release schedules that after a decade of being on the Apple bandwagon i went out and purchased a non Apple computer in Janurary.    Frankly i got a machine with a modern processor and a host of other fearures including a working keyboard.   
    We’re having an agreement fest here.

    I've also been saying that Apple needs to differentiate their iPad chip line from their iPhone line even more, and I've even used the letter B for the series. To all intents and purposes, the iPad Pro 12.9" is the equivalent of a notebook. Therefore, people expect certain performance levels. With a high-res screen which matches that of the 13" notebooks, Apple is thrusting this out into that competitive area, whether they fully admit it or not.

    And, if it's true that Apple may replace Intel with one of their SoCs in the MacBook, they're going to need something far more powerful, maybe even doing what I've been saying about incorporating a few instructions that are used in x86 (which aren't individually patented).
    Alex1N
  • Reply 28 of 38
    widmark Posts: 37, member
    nunzy said:
    Apple skates to where the puck is going to be, but won't release a product until it is fully baked. Always.
    HomePod... the puck was already there, and they're still updating the feature set to compete. Apple Watch... not a fully baked first version; you could even say it still wasn't by version 3 (no Podcasts app until watchOS 5). Apple TV... a rough first model, albeit Steve released it as a hobby rather than a product. Mac Pro... the 'Star Wars' version of the Mac was not fully baked and was buggy. I love Apple, but they stub their toe all the time. If you never stub your toe, though, you're probably playing it too safe.
    Alex1N
  • Reply 29 of 38
    iqatedo Posts: 1,822, member
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    So I suppose that even if you can drop operating voltages as geometries shrink, at some point there is a basic problem with electron mobility in semiconductors due to reduced driving electric fields, which results in slower electron drift speed and therefore, reduced efficiency. New materials with improved physical properties needed?
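
    A minimal sketch of the first-order relation being invoked here (hypothetical operating points; it ignores velocity saturation and other short-channel effects):

    ```python
    # First-order picture of the point above: drift velocity v_d = mu * E, with
    # E ~ V / L across the channel.  If the supply voltage has to fall faster
    # than the channel length shrinks, the field (and so the drift speed) drops.
    # The operating points below are hypothetical, not real process corners.

    MU_SI = 0.14   # bulk-silicon electron mobility, m^2/(V*s) (~1400 cm^2/V*s)

    def drift_velocity(v_supply_volts, channel_nm):
        field = v_supply_volts / (channel_nm * 1e-9)   # V/m
        return MU_SI * field                            # m/s

    older = drift_velocity(0.9, 20)    # hypothetical older node
    scaled = drift_velocity(0.5, 14)   # hypothetical shrunk node, voltage cut harder

    print(f"Relative drift speed after the shrink: {scaled / older:.2f}x")
    ```
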

    hodar said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    Having worked in the semiconductor industry over 15 years ago; A LOT has changed.  But, one of the major problems that always got worse with each die shrink, was the Lithography steps.  I can't imagine how you can mask your layers, with only 5nm geometries.  After awhile, physics simply stops you.
    Assuming illumination is performed by X-rays and grazing incidence optics, 5 nm is at the upper end of X-ray wavelengths (soft X-rays), and so even smaller geometries would be possible, but I agree, the lithography must be insane! The wonders of modern tech!
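
    For scale, a simple hc/λ conversion of the wavelengths under discussion (193nm and 13.5nm are the standard DUV and EUV wavelengths; the 5nm case is just the hypothetical soft X-ray source mentioned above):

    ```python
    # Photon energy E = h*c / lambda, using the handy constant h*c ~ 1239.84 eV*nm.
    HC_EV_NM = 1239.84

    for label, wavelength_nm in [("DUV (ArF)", 193.0),
                                 ("EUV", 13.5),
                                 ("soft X-ray (hypothetical 5 nm)", 5.0)]:
        energy_ev = HC_EV_NM / wavelength_nm
        print(f"{label:32s} {wavelength_nm:6.1f} nm -> {energy_ev:7.1f} eV")
    ```
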
    Alex1N
  • Reply 30 of 38
    melgross Posts: 33,510, member
    iqatedo said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    So I suppose that even if you can drop operating voltages as geometries shrink, at some point there is a basic problem with electron mobility in semiconductors due to reduced driving electric fields, which results in slower electron drift speed and therefore, reduced efficiency. New materials with improved physical properties needed?

    hodar said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    Having worked in the semiconductor industry over 15 years ago; A LOT has changed.  But, one of the major problems that always got worse with each die shrink, was the Lithography steps.  I can't imagine how you can mask your layers, with only 5nm geometries.  After awhile, physics simply stops you.
    Assuming illumination is performed by X-rays and grazing incidence optics, 5 nm is at the upper end of X-ray wavelengths (soft X-rays) and so even smaller geometries would be possible but I agree, the lithography must be insane! The wonders of modern tech!
    You might remember that this problem really came into its own way back with 90nm. It's what killed IBM's and Intel's claims that in a few years we would see CPU speeds of 10 or 15 GHz, and even higher.

    NetBurst, Intel's method of continually increasing speeds, came to an end with Prescott, which ran at a normally cooled speed of 3.8 GHz. It's taken 15 years to get back to that speed, and it's why we have multiple cores. But Intel and others were able to use different materials and clever designs to get around the problem. Now, though, we seem to be reaching fundamental limits. When that happens in a few years, that's it, unless so-far-unproven technologies such as carbon nanotubes and molecular electronics work out - if they ever do, as we don't know that yet. If not, then improvements will creep along, with software developers having to finally write efficient code without a lot of crap features we don't need slowing things down.

    So, in a way, that would be a good thing, until new technology arrives (if it ever does) and they go back to writing crappy code again.
    iqatedoAlex1N
  • Reply 31 of 38
    melgross Posts: 33,510, member
    iqatedo said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    So I suppose that even if you can drop operating voltages as geometries shrink, at some point there is a basic problem with electron mobility in semiconductors due to reduced driving electric fields, which results in slower electron drift speed and therefore, reduced efficiency. New materials with improved physical properties needed?

    hodar said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    Having worked in the semiconductor industry over 15 years ago; A LOT has changed.  But, one of the major problems that always got worse with each die shrink, was the Lithography steps.  I can't imagine how you can mask your layers, with only 5nm geometries.  After awhile, physics simply stops you.
    Assuming illumination is performed by X-rays and grazing incidence optics, 5 nm is at the upper end of X-ray wavelengths (soft X-rays) and so even smaller geometries would be possible but I agree, the lithography must be insane! The wonders of modern tech!
    Look at how long it's taken just to get extreme UV working! They can't use regular lenses, and there's been a host of problems. They were talking about EUV coming in as long as 10 years ago, but it has only just gotten here. X-rays! Wow. I don't think they really know where they're going with that, which is why they are pushing EUV, a step they didn't even want to do.

    Now, with X-rays, they need magnetic lenses, and they're nowhere with that. And that's not the only problem. If a chip won't work, no matter how finely it's designed and made, because of fundamental physics problems, then there's no point in even trying.
    iqatedoAlex1N
  • Reply 32 of 38
    1983 Posts: 1,225, member
    wizard69 said:
    melgross said:

    mjtomlin said:
    melgross said:
    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.

    You don't know the performance of the upcoming A12, it may in fact not be as powerful as an "A11X". You also don't know if Apple has decided to develop the A12 and A12X in tandem and plan on releasing new iPad Pros along side new iPhones in the Fall.

    However, the latest iPad Pros were released in June of last year at WWDC, so it would not be that unreasonable to have an event in July with an A11X update to the iPad Pro.
    You’re right, we don’t know. But that doesn’t mean that there isn’t logic to this that we can apply, and that’s what I try to do. But if Apple adds another 30% performance per core, then an A12 will be as powerful as an A11X, with the extra core. Generally, the latest GPU for the iPhone is about as powerful as the previous GPU for the iPad, which has twice as many cores.

    that’s what we/ve been seeing for several years. Will Apple change it up? Maybe. But why? The best situation is what we see now. The iPad version has more cores and slightly higher clock speed. It’s about 35% better in performance than the iPhone version for the cpu, and about 50% better for the GPU. You can’t just increase the speed by 30-40%. Power draw and heat will go up by about 50% if you do, but performance will just rise by about 25%. That’s a truly lousy trade off.

    the reason why I feel uncomfortable about a July release with an older chip design is that the new phones are just around the corner. Then iPad performance will be compared to the iPhone performance, and will be seen as wanting. Whether some want to believe it or not, while the phone are compared to other phones, the iPad Pro is compared to notebooks. Performance needs to be enhanced as much as possible.
    I agree with your concerns about performance but there is another performance driver beyond iPad, that is the rumored ARM baseed Macs or possibly entirely new devices.  To do this sort of jump really well i believe Apple will need a new series of processors.  Not so much to address CPU core performance but rather to be able to move data around much faster.  This means larger caches, data paths and faster I/O channels.   All the APU stule chips out there suffer from bandwidth issues and the A series are no exception.  So i can see a split happening where cell phone chips become one line and iPad and laptop chips another.  For example imagine an iPad chip with 4 to 8GB of HBM type memory in the package.  

    Of course i can imagin all sorts of things.   The fact that we are doing so now just highlights how screwed up product delivery is at Apple these days.  It reminds me of the years after the Mac Plus when the couldnt update the Mac on a decent schedule.  
    Good points pertaining to a new processor line separate from that of the iPhone. This is highly unlikely, but maybe Apple's rumored new product line, which we all heard about a while back, is a device that replaces both the iPad Pro line and lower-end Macs. Hence this perceived delay in the launch of updated iPad Pros?
    edited June 2018 Alex1N
  • Reply 33 of 38
    mcdave Posts: 1,927, member
    wizard69 said:
    melgross said:

    mjtomlin said:
    melgross said:
    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.

    You don't know the performance of the upcoming A12, it may in fact not be as powerful as an "A11X". You also don't know if Apple has decided to develop the A12 and A12X in tandem and plan on releasing new iPad Pros along side new iPhones in the Fall.

    However, the latest iPad Pros were released in June of last year at WWDC, so it would not be that unreasonable to have an event in July with an A11X update to the iPad Pro.
    You’re right, we don’t know. But that doesn’t mean that there isn’t logic to this that we can apply, and that’s what I try to do. But if Apple adds another 30% performance per core, then an A12 will be as powerful as an A11X, with the extra core. Generally, the latest GPU for the iPhone is about as powerful as the previous GPU for the iPad, which has twice as many cores.

    that’s what we/ve been seeing for several years. Will Apple change it up? Maybe. But why? The best situation is what we see now. The iPad version has more cores and slightly higher clock speed. It’s about 35% better in performance than the iPhone version for the cpu, and about 50% better for the GPU. You can’t just increase the speed by 30-40%. Power draw and heat will go up by about 50% if you do, but performance will just rise by about 25%. That’s a truly lousy trade off.

    the reason why I feel uncomfortable about a July release with an older chip design is that the new phones are just around the corner. Then iPad performance will be compared to the iPhone performance, and will be seen as wanting. Whether some want to believe it or not, while the phone are compared to other phones, the iPad Pro is compared to notebooks. Performance needs to be enhanced as much as possible.
    I agree with your concerns about performance but there is another performance driver beyond iPad, that is the rumored ARM baseed Macs or possibly entirely new devices.  To do this sort of jump really well i believe Apple will need a new series of processors.  Not so much to address CPU core performance but rather to be able to move data around much faster.  This means larger caches, data paths and faster I/O channels.   All the APU stule chips out there suffer from bandwidth issues and the A series are no exception.  So i can see a split happening where cell phone chips become one line and iPad and laptop chips another.  For example imagine an iPad chip with 4 to 8GB of HBM type memory in the package.  

    Of course i can imagin all sorts of things.   The fact that we are doing so now just highlights how screwed up product delivery is at Apple these days.  It reminds me of the years after the Mac Plus when the couldnt update the Mac on a decent schedule.  
    Now you're talking. I think the road to AR is the driver, and I'm seeing renewed interest in ray tracing, with extensions in Metal to push ray intersection to the GPU. If their 5th-gen GPU has ray-intersect logic, that's a huge boost. Imagination were getting 300 Mrays/sec on paper 3 years ago, but with no software layer to deliver it, and Nvidia are still way behind.

    1) Apple sees AR as a new fundamental tech.
    2) ‘Real’ AR requires ray-tracing.
    3) Apple have access to low-power RT HW.
    4) OpenRL never took off, but Apple has Metal baked into stacks of 3D applications.
    5) Transitioning Macs to ARM via the GPU always made sense.

    Conclusion: new iPads & MBPs with stunning GPUs around the corner.
    Alex1N
  • Reply 34 of 38
    iqatedo Posts: 1,822, member
    melgross said:
    iqatedo said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    So I suppose that even if you can drop operating voltages as geometries shrink, at some point there is a basic problem with electron mobility in semiconductors due to reduced driving electric fields, which results in slower electron drift speed and therefore, reduced efficiency. New materials with improved physical properties needed?

    hodar said:
    melgross said:

    wood1208 said:
    More interested in Apple design built GPU. 7nm to 5nm depends on TSMC or anyone's ability to build 5nm chip fabrication facility and produce chips at sufficient yield.
    I’m not confident about 5nm, much less anything smaller. We’re actually seeing transistor efficiency go down since 14nm. It’s been small so far, but the smaller we go, the worse it will get. And certain on chip structures are becoming impossible to make because there’s not enough room at these small sizes. While I think TSMC has claimed that thy will be using a modified FinFet for 5nm, a lot of chip designers have said that FinFet doesn’t work properly at smaller than 7nm, and not as well there as in 10nm.

    so I’m skeptical right now. I’ll believe it when I see it, as they say.
    Having worked in the semiconductor industry over 15 years ago; A LOT has changed.  But, one of the major problems that always got worse with each die shrink, was the Lithography steps.  I can't imagine how you can mask your layers, with only 5nm geometries.  After awhile, physics simply stops you.
    Assuming illumination is performed by X-rays and grazing incidence optics, 5 nm is at the upper end of X-ray wavelengths (soft X-rays) and so even smaller geometries would be possible but I agree, the lithography must be insane! The wonders of modern tech!
    Look at how long it’s taken to just get UV working! They can’t use regular lenses, and there’s been a host of problems. They were talking far UV coming in as long as 10 years ago, but it just got here. X-rays! Wow. I don’t think they really know where they’re going with that which is why they are pushing far UV, a step they didn’t even want to do.

    now, with x-ray, they need magnetic lenses, and they’re nowhere with that. And that’s not the only problem. If a chip won’t work no matter how finely it’s designed and made because of fundamental physics problems, then there’s no point in even trying.
    I was doing some optics design in the '80s and was a member of SPIE. There were papers published then on using grazing incidence optics, which were surface reflectors, not refractive components, to focus X-rays. I assumed that the technology had progressed and didn't realise that they still relied on UV (EUV). Happy to be educated.
    Alex1N
  • Reply 35 of 38
    Apple is likely to stick with 7-nanometer chip designs for 2019 iPhone and iPad processors, a report revealed on Friday.

    Shrinking die size allows more processing power to fit into the same space, while also often improving power efficiency. Apple is frequently concerned about both matters, but may be especially interested as it works on its rumored AR/VR headset, which could ship as soon as 2020. That device may use power-hungry technologies such as 8K and WiGig, and wearables must be as slim and lightweight as possible.
    Ehh, not as bad as the hyperbolic Wccftech article, but nevertheless... methinks the author here has conflated die size with the Apple code name for their ARM-licensed "Axx" series of SoCs. Logically, 'shrinking' (honey, I shrunk the kids?) the die size would mean whatever processing power you have with that shrunken die fits in a smaller space, not the same space.

    Risk production, as they have stated, is not very meaningful, but you could - assuming they have *less* trouble going to 5nm than they had with 7nm (not likely) - extrapolate from the timeline of risk production last year to mass production this year: one year later. If Apple were to want any given Axx-series SoC for a September release, I seriously doubt they could ramp up enough production if they only started in the "summer," as summer runs from June 21 but ends with the beginning of fall, which lands right about the September release/announcement day. Apple could always push announcements to October, shipping a month later... if there are production delays.

    https://wccftech.com/tsmc-7nm-proces-sram-details/

    ^March 27, 2017 "TSMC will commence risk production for first generation 7nm process next month." So, a full year later, they announce the 7nm node is in mass production.


    But from that article we also see: "'We believe we will be the first one to use EUV in volume production,' according to an executive. Risk production will start by June 2018." It's almost the end of June now, and there's no announcement of risk production on that EUV 7nm FF+. Anyone doing online betting in Vegas on whether 5nm will show up on the timeline TSMC is showing this month, next month, next quarter?

    A year later we have the announcement of mass production, with WCC engaging in confusing hyperbole:

    https://wccftech.com/apple-7nm-a12-2018-iphone-mass-production-tsmc/

    "Apple will make another massive jump in mobile processors this year, leaving a lot of folks at Qualcomm short-winded, to say the least" Massive jump, how so? Early rumored GB4 on the A12, shows modest improvement in multi scores up from 11k to 13k, improvement for sure, but "massive"?

    And then: "
    If the rumor mill is accurate once again, then this year’s iPhones will represent only an incremental upgrade over their predecessors" so which is it, 'massive' or 'incremental'?

    "Compared to 10nm, TSMC’s 7nm FinFET features a 20% speed improvement, 40% power reduction, 37% area reduction and 1.6 times the logic density." 

    ^Well at least that bit was helpful, compared to the AI post, which is not particularly informative, imho.
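
    As a sanity check, the quoted 37% area reduction and "1.6 times the logic density" are two ways of saying the same thing, since density goes as the inverse of area:

    ```python
    # The quoted TSMC 7nm-vs-10nm figures are self-consistent:
    # a 37% area reduction implies roughly a 1.6x density gain.
    area_reduction = 0.37
    relative_area = 1.0 - area_reduction      # 0.63 of the original area
    density_gain = 1.0 / relative_area        # transistors per unit area

    print(f"Relative area: {relative_area:.2f}")
    print(f"Implied density gain: {density_gain:.2f}x")   # ~1.59x, i.e. "1.6 times"
    ```
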

    So, as we can see from the better article on WCC (read the comments; if TeamAI is so upset about moderating this forum, they should pray they never have to deal with the likes of the Apple haters over on 'the other side'... the evil 'dark side' :) ), Apple will be doomed for the A13, when the 7nm FF+ EUV SoCs become available.

    "TSMC’s 7FF+, which is the first manufacturing process from the company to utilize EUV will enter tape out in the second half of this year, according to Wei. While there’s a lot of hype around 7FF+, particularly due to EUV’s dramatic decrease in wavelength (13.5nm from 193nm), the gains aren’t that stark. 7FF+, based on TSMC’s N7+ node will provide 10% power efficiency and 20% density increase over the 7FF. This, as you can see, puts Apple in a rut"

    I was thinking... due to AI's aversion to anything political (at least when it comes to the current president), AI readers need to look elsewhere for important news about the tech industry and Apple's future, as it is mostly all tied up with China... there's a huge threat coming from the government-subsidized industry push to be the global leader in emerging silicon tech by 2025; Silicon Valley is geopolitically being relocated to China... sooner than any of us think, unless a trade war 'happens' :)

    http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=67789
    Chinese Chipmaker SMIC Orders $120m EUV 
    ^SMIC, government-subsidized, is going for 7nm soon; they might be 2 nodes behind now, but give it a few years.

    Oh, this SemiWiki piece is a good read:
    https://www.semiwiki.com/forum/content/7513-china-semiconductor-equipment-china-sales-risk.html
    "In the semiconductor world how can we stop selling them chips, as in the ZTE case, yet still sell them equipment to make those same chips so they become independent of us ? This begs the question as to when we will see prohibitions on semiconductor equipment sales to China?"

    "The White House doesn't need to put together a list for June 15th and June 30th, as it already has one.......

    ASML is "perfect" political choice...
    The prohibition of sales of chips to ZTE negatively impacted US companies and jobs, specifically Qualcomm, Lumentum, Google (software) even Intel and others. Blocking ASML sales to China would not directly impact a US based company (and its associated Republican congressman as is the case with QCOM). It would not impact ASML (former Cymer) employees in San Diego as they are needed for all other ASML customers. The Netherlands is probably low on the administrations radar (even if found on a map...).

    In short ASML EUV sales are a politically perfect target and pawn in the trade wars.

    No immunity....Collateral Damage?
    In case you thought ASML is immune or not part of the trade war between the US and China, just ask Aixtron , in Germany, whose sale to a Chinese company was blocked by the US department of defense and CFIUS.

    The US has long arms that reach outside of its borders. In addition, the US could obviously be concerned with "industrially significant technology" inside US borders, such as the laser light source for EUV designed in San Diego. This is the heart and soul of an EUV system.

    That technology had its genesis in Ronald Regans "Star Wars" laser based defense technology. When the program was cancelled both US and Russian laser scientists found their way to little know (at the time) Cymer in San Diego".

    Huh, I did not know that. Fascinating.
    Well, there you go again; we can 'blame' EUV and the 5nm Apple A14-series SoCs on 1980s Republicans and Ronald Reagan < what say you, Pelosi, crying Chuck, Tim Cook?

    Something else TSMC announced last month: 

    https://www.neowin.net/news/tsmc-debuts-wafer-on-wafer-tech-7nm-node-at-volume-production-7nm-and-5nm-nodes-on-track

    " Wafer-on-Wafer (“WoW”) technology, which could be a boon for GPU performance, similar to how vertical stacking has improved DRAM manufacturing and performance. The technology, as the name suggests, stacks layers of logic on top of one another using a through-silicon via (“TSV”) interconnect. With limited lateral space on wafers, WoW technology should allow for faster chips with higher areal density to be crammed in the same amount of space using high-speed, low-latency interconnects."

    "manufacturers could literally stack two fully-functional GPUs on top of one another, as opposed to dual-GPU setups using two independent dies, leading to cost savings, and smaller, more power-efficient video cards."

    Not only video cards, but also the A14 SoC in the iPhone.

    Why won't we see a 3nm node anytime soon, or even next decade? Perhaps only Apple/Samsung (until the Chinese take over) will want to invest enough to make it worthwhile, as the few remaining chip foundry companies have to spend ever-increasing $$$ to get the next node out. See the chart in the article below showing how development costs have doubled going from 7nm to 5nm; many will try to save cost by using older process nodes:

    https://www.extremetech.com/computing/272096-3nm-process-node
    "A recent article at Semiengineering explored this phenomena and the ugly cost curves driving it. Right now, Samsung and GlobalFoundries are hoping to deliver nanosheet FETs as a successor to FinFET, while TSMC is exploring both nanosheets and nanowires (Intel hasn’t said anything about its plans). All of these approaches are highly theoretical (3nm isn’t a near-term node in any case), but there’s a different problem waiting in the wings, even if these new transistor types can be refined and brought to market: design cost. The price of a 3nm chip is expected to range from between $500M to $1.5B, with the latter figure reserved for a high-end GPU from Nvidia."

    As for this mostly uninteresting and uninformative article: nowhere could I find (I tried Google on this site) this iPhone 7 issue covered here on AI, unless it's hidden under a poorly worded title:

    https://www.slashgear.com/if-your-iphone-does-this-get-it-fixed-asap-13534148/
  • Reply 36 of 38
    blastdoor Posts: 3,258, member
    melgross said:
    blastdoor said:
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    Could be the iPad Pro will get a higher clocked A12. Apple doesn't *always* do an X. 

    And there's absolutely nothing wrong with a higher clock instead of more cores. Intel does that all the time --- the fastest quad core i7 in a desktop is much faster than the fastest quad core i7 in a laptop. 
    They’ve done an X for years. The higher screen resolution alone requires it. You can’t increase clock speeds by a large enough amount to make up for performance lack. There’s a good reason why Apple went to more cores, and why they use double the cores for the GPU than what’s on the phone.

    flr the purpose, an iPad Pro needs a significantly more powerful SoC.
    I don't think it's at all obvious that the higher screen resolution alone requires it. Pushing pixels at high resolution has been a solved problem for years. Even the Intel integrated GPU in my 15" MacBook Pro can do a fine job most of the time. These days, the real advantage of a faster GPU is more computational. In the case of games, a higher resolution can mean a higher computational load because you're doing a lot of computations on each pixel. And games need to play at a decent frame rate. So, if you really want to play at full resolution on an iPad Pro, then yes, resolution matters. But playing at full resolution isn't really necessary for most games. So I don't think that's a driver here.

    But for "pro" apps, which presumably is the point of the iPad Pro, it's not clear to me that there's a linear interaction between screen resolution and GPU computation. 

    Also, keep in mind that the A10X was fabbed on a 10nm process whereas the A12 will be on a 7nm process. Apple could, in theory, make the A12 equivalent to a die-shrunk A10X, run it in an iPhone at the same clock it currently runs at in the iPad Pro, and then up the clock for the iPad Pro. That would be a guaranteed performance improvement over the current iPad Pro without the need for an X.

    Finally, one last related way to look at it. GPU performance scales nearly-linearly with both clock speed and parallelism. So you can either increase performance by adding transistors or by adding Hz -- they are almost interchangeable (the same is not true for a CPU, where Hz are always more helpful). But power usage does not scale the same way -- power usage scales linearly when adding transistors, but it scales more than linearly when adding Hz. So when a die shrink comes along, Apple might rightly choose to add more GPU transistors to the iPhone than were there before, but cut the clock speed. That would be a win in battery life on the iPhone. But it also means they have more headroom to amp up the clock speed in the iPad. 
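
    A toy model of the trade-off blastdoor describes (purely illustrative assumptions: throughput taken as cores x clock, dynamic power as cores x f x V^2 with voltage rising roughly in step with frequency; none of these numbers describe any real Apple chip):

    ```python
    # Toy comparison of two ways to double GPU throughput, assuming throughput
    # scales with cores * frequency and dynamic power with cores * f * V^2,
    # where voltage must rise roughly in proportion to frequency.
    # All numbers are illustrative, not measurements of any real chip.

    def power(cores, freq, base_voltage=1.0, v_per_freq=1.0):
        voltage = base_voltage * (freq ** v_per_freq)  # crude V-vs-f assumption
        return cores * freq * voltage ** 2

    baseline = power(cores=1, freq=1.0)
    more_cores = power(cores=2, freq=1.0)    # double the parallelism
    more_hz = power(cores=1, freq=2.0)       # double the clock instead

    print(f"Baseline power:        {baseline:.1f}")
    print(f"2x cores, same clock:  {more_cores:.1f}  (~2x power for 2x throughput)")
    print(f"Same cores, 2x clock:  {more_hz:.1f}  (~8x power for 2x throughput)")
    ```
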

    Bottom line.... don't be so sure that I'm wrong :-) 
    Alex1N
  • Reply 37 of 38
    melgross Posts: 33,510, member
    blastdoor said:
    melgross said:
    blastdoor said:
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    Could be the iPad Pro will get a higher clocked A12. Apple doesn't *always* do an X. 

    And there's absolutely nothing wrong with a higher clock instead of more cores. Intel does that all the time --- the fastest quad core i7 in a desktop is much faster than the fastest quad core i7 in a laptop. 
    They’ve done an X for years. The higher screen resolution alone requires it. You can’t increase clock speeds by a large enough amount to make up for performance lack. There’s a good reason why Apple went to more cores, and why they use double the cores for the GPU than what’s on the phone.

    flr the purpose, an iPad Pro needs a significantly more powerful SoC.
    I don't think it's at all obvious that the higher screen resolution alone requires it. Pushing pixels at high resolution has been a solved problem for years. Even the Intel integrated GPU in my 15" MacBook Pro can do a fine job most of the time. These days, the real advantage of a faster GPU is more computational. In the case of games, a higher resolution can mean a higher computational load because you're doing a lot of computations on each pixel. And games need to play at a decent frame rate. So, I you really want to play at full resolution on an iPad Pro, then yes, resolution matters. But playing at full resolution isn't really necessary for most games. So I don't think that's a driver here. 

    But for "pro" apps, which presumably are the point of the iPad Pro, it's not clear to me that there's a linear relationship between screen resolution and GPU computation. 

    Also, keep in mind that the A10X was fabbed on a 10nm process whereas the A12 will be on a 7nm process. Apple could, in theory, make the A12 the equivalent of a die-shrunk A10X, run it in the iPhone at the same clock the A10X currently runs at in the iPad Pro, and then raise the clock for the iPad Pro. That would be a guaranteed performance improvement over the current iPad Pro without the need for an X. 

    Finally, one last related way to look at it. GPU performance scales nearly-linearly with both clock speed and parallelism. So you can either increase performance by adding transistors or by adding Hz -- they are almost interchangeable (the same is not true for a CPU, where Hz are always more helpful). But power usage does not scale the same way -- power usage scales linearly when adding transistors, but it scales more than linearly when adding Hz. So when a die shrink comes along, Apple might rightly choose to add more GPU transistors to the iPhone than were there before, but cut the clock speed. That would be a win in battery life on the iPhone. But it also means they have more headroom to amp up the clock speed in the iPad. 

    Bottom line.... don't be so sure that I'm wrong :-) 
    I’m not so interested in games, though first-person shooters do require a very good GPU. I use AutoCAD, and every upgrade has made a very noticeable difference. Apple has pretty much always doubled the GPU core count for the iPad.
    Alex1N
  • Reply 38 of 38
    nht Posts: 4,522 member
    wizard69 said:

    melgross said:
    melgross said:
    I’m concerned about the new iPad Pro. We expected the new model around March, which is when it’s been arriving. So I expected an A11X, which would have been in line with the new phone chip, as usual. But here we are, and we’re beginning to pass June - and still no word.

    my concern is whether Apple has an A11X for this new model already. If the delay is due to factors such as case size and form, new screen, Face ID, and some other features, what is Apple planning here? If they expected this to come out much later, when will that be? I don’t like the idea that Apple is stretching out new designs. I get a new iPad every year, and to think that it’s moving to a year and maybe 9 months is too much.

    so if this is an unexpectedly long delay, does Apple have A11X chips for it waiting? If so, I can’t imagine them saving these chips for something else, and going with a new one. If they are going for an A12X, and haven’t made A11X chips, because they expected this long delay, what does that mean in terms of when a new model will arrive? Will we have to wait until September, when the phones come out, as that announcement has priority? That’s a heck of a long time. I don’t see them releasing the A12X before the A12.

    i would not be happy with an A11X now, because normally the newest chip for the phone has approximately the same performance as the previous generation iPad X version. So that would have the new phones on a par with the new iPad Pro 12.9”. That’s really unacceptable. When considering the far higher resolution, it’s even worse.
    You Update your ipad every year?! Really? Isn’t that a bit excessive? To each their own i guess. But I use my ipad for professional artistic purposes, and even I don’t feel the need to update mine every year. I think Apple is moving to longer update cycles for the ipads for that very reason. Unlike iphones, most people don’t feel the need to update their ipads as often. So you may have to get used to that. 
    It’s only excessive to those who don’t do it. My daughter does too. For us, it’s worth while. For others, maybe not. But we all update our phones every other year, and get new 
    Macs, when available, between every three or four years. I didn’t intend to replace my gen 2 Apple Watch with the 3 LTE this year, but since my daughter wanted one, I went ahead with it.

    we’re fortunate that we can afford to do this, and I understand that not everyone can. But it’s a very big mistake to think that extending release dates is good. It’s not. It’s very bad. People want a new model, and when it doesn’t arrive, Apple loses sales, sometimes forever. Never think that because you don’t want a new one now, that others, who don’t have it at all yet, won’t want it now, but won’t buy it if it hasn’t been updated for over a year. This is the widely known problem with the Mac. Apple used to update them four times a year, every time a slightly newer version of a CPU came out. Then it went to two times, then one time, and now, we don’t know when they’ll be updated. That’s very bad.

    we need to have regular upgrades that can be counted on, as it was in the last. What they are doing will just kill sales.
    I have to agree 100% here.  While I don't upgrade my iPad every year, right now I'm in the market for a new iPad.  However, I won't buy one as outdated as the current models.  I'm also a bit pissed off over the unwarranted price increases.  In any event, Apple isn't meeting expectations here; at the rate we're going, the iPad will suffer like the Mini.  By the way, a Mini was on my list of wants some time ago, but Apple has totally screwed that up.  

    OMG, more old "experts" proclaiming the sky is falling.  

    "Unwarranted price increase"?  The iPad is down to $329.  If you can't afford to update the iPad Pro every year, then buy a new iPad every year for less than half the price.

    "Unexpectedly long delay"?  The last iPad Pro was released in June 2017 with an A10X.

    Outdated?  How is an unreleased A11X outdated?  If the A11X iPad Pro appears in 2018 then it's right on time based on past releases, and it's going to be more than competitive in its market segment...which isn't against iPhones.  In any case, the window between iPad Pro and iPhone releases, during which someone can claim the AnX iPad Pro is better than the An+1 iPhone, has been about four months, June to September.  If that window shrinks to two months, it's a big "so what" rather than "very bad" or "unacceptable".

    It's also complete bullshit when it comes to GPU compute performance in Geekbench 4 (i.e., pro-app workloads rather than gaming).

    The iPad Pro 12.9 with the A9X scores higher on the Metal compute benchmark than the iPhone 8 Plus (15745 vs 15309).  The iPad Pro 12.9 with the A10X scores almost double the iPhone 8 Plus (29247 vs 15309). 

    For CPU, the best A11 iPhone does outperform the best A10X iPad Pro on single-core (4219 vs 3908) and multi-core (10184 vs 9307) scores.
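    Putting those quoted scores into ratios; the little sketch below just reuses the numbers above, so it's only as good as those figures.

```swift
import Foundation

// Ratios computed from the Geekbench 4 scores quoted above, taken at face value.
let metalA10XiPad = 29_247.0   // iPad Pro 12.9 (A10X), Metal compute
let metalA9XiPad  = 15_745.0   // iPad Pro 12.9 (A9X), Metal compute
let metalA11Phone = 15_309.0   // iPhone 8 Plus (A11), Metal compute

let singleCore = (a11: 4_219.0, a10x: 3_908.0)
let multiCore  = (a11: 10_184.0, a10x: 9_307.0)

print(String(format: "Metal, A10X iPad Pro vs iPhone 8 Plus: %.2fx", metalA10XiPad / metalA11Phone))  // ~1.91x
print(String(format: "Metal, A9X iPad Pro vs iPhone 8 Plus:  %.2fx", metalA9XiPad / metalA11Phone))   // ~1.03x
print(String(format: "CPU single-core, A11 over A10X: +%.0f%%", (singleCore.a11 / singleCore.a10x - 1) * 100))  // ~+8%
print(String(format: "CPU multi-core,  A11 over A10X: +%.0f%%", (multiCore.a11 / multiCore.a10x - 1) * 100))    // ~+9%
```

    So the newer phone chip buys single-digit CPU gains while the year-older iPad Pro keeps roughly a 2x lead in GPU compute, which is the point being made here.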

    For gaming, on-screen performance matters more, and there the iPad, pushing far more pixels, will have lower frame rates.