And the herd mentality of all analysts. The blind trying to describe the elephant by screaming what they have found… eventually you'll end up with a consensus of either a leather tree, a snake, or the largest bagpipe in the world.
Will an A-series chip make it into a MacBook Air in the next 5 years? That's the question.
And will Apple get into the fab business?
I bet that, if there are still MacBooks in 5 years, they will be running ARM chips. In 5 years, x86 won't be competitive. People forget that Intel's advantage is based on massive capital expenditure, which came from being the dominant player in a growth market. It's now in a terminally declining market, a position it has never been in before. It's not just that ARM is getting better, but that at some point sustaining x86 will become financially impossible, and Intel will lose its advantage. x86 will start to lag. This won't take long because so much money is involved in going to each new process. When this happens, I think Apple will probably dump the Mac and move everything on to some future version of iOS, rather than move the Mac to ARM. But who knows.
I doubt they'll ever get into the fab business though.
For a site whose writers and readership generally excoriate speculation by third-party analysts (see, for example, comment #42 by TheOtherGeoff), that’s Dilger’s stock in trade — and yet when he fails, he never seems to hear the same slagging off that the other analysts receive. His fanboy-driven, armchair wish-is-father-to-the-thought analysis is pretty tired ...
I think the difference is he's not trying to financially manipulate the stock price.
I tend to listen to individual analysts to understand their logic.
DED's speculation is at least rooted in a long-term understanding of the Apple corporate psyche. He may have gotten the Samsung A7 thing wrong (at least partially), but at least he has a story that weaves into an [ir]rational analysis of how Apple got to where it is, and where it's likely going.
DED looks like a damn Nobel-winning economist compared to the analysts, or worse, the Enderles, Thurrotts, and Dvoraks of the world.
And he's entertaining, like a Saturday night revival tent preacher… (and most Nobel-winning anythings are wrong more than they are right).
My iPad 3 is starting to feel a bit weak, especially when using certain apps. And while I do like iOS 7, I do notice that it is slightly more processor intensive than iOS 6.
A bit slower, maybe! However, this release of iOS 7 seems to have a lot of bugs on the iPad 3. At least on my iPad 3 that appears to be the case, so we won't know how bad it will be long term until after a point release or two.
I look forward to getting my new iPad 5 next month, which will have even more power than the iPhone 5s A7 chip, since it will be clocked higher.
I'm of the same mind, to the point that I will milk my iPhone 4 even longer. However, I'm expecting a bit more than a higher clock, in other words another "X" variant. Plus the iPad needs other tweaks, such as more RAM and a doubling, maybe tripling, of flash storage.
In the next 1-3 years, agreed. But always having an MBA prototype that is AxX-powered as you walk into a meeting with Intel for long-term 'partnership' discussions is a strong bargaining chip. Apple is the big consumer of high-end Intel x86 chips… and controls its own ARM chip development. It makes for good negotiations on both pricing and capabilities (for stuff like GPUs).
Actually, I believe Intel is limited in pricing negotiations due to antitrust lawsuits of some time ago. They pretty much have to price based on volume. That still leaves a bunch of other things they can negotiate on.
On your point that 'the Windows universe already uses it…': what happens when 40% of the 'personal computers' sold are ARM-powered? Then 50%? And then Windows 9 comes out fully ARMed… I will argue that the 'Intel Inside' sticker may become a 'stamp tax' for a lot of consumers (I just need to watch Netflix, run a web browser, and do some spreadsheets).
With the A7 there is no doubt at all that Apple could offer up some very low-cost solutions. However, will they offer an open Mac-type OS, which I see as more important than the processor chip? Apple has the browser-and-Netflix crowd covered with the Apple TV and the iPads. An ARM-based laptop that offers all the power of Mac OS on the current machines would go over like hotcakes. You just need to see all the activity in the Linux world trying to get such a platform off the ground to realize that Apple doing it right would be huge.
If you're Apple, it won't be long until the converse of that statement… 'a billion iOS devices already use it [the A-series]'… is true, and iOS is built using the same and similar OS X components that run on Intel [read: OS X on ARM should be trivial, especially without the comet's tail of macOS compatibility]. At what point do the economics cross over?
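Just to make "where the economics cross over" concrete, here's a back-of-the-envelope sketch; every figure in it (chip costs, volumes, porting cost) is an assumption for illustration, not a sourced number:

```python
# Back-of-the-envelope economics; purely illustrative, every figure is assumed.
a_series_unit_cost = 40          # USD, assumed foundry cost per A-series SoC
intel_unit_cost = 250            # USD, assumed price paid per x86 laptop CPU
port_and_validation_cost = 500e6 # USD, assumed one-time cost of moving OS X to ARM laptops
units_per_year = 15e6            # assumed MacBook-class units shipped per year

annual_savings = units_per_year * (intel_unit_cost - a_series_unit_cost)
payback_years = port_and_validation_cost / annual_savings
print(f"Annual BOM savings: ${annual_savings / 1e9:.2f}B")
print(f"One-time porting cost paid back in about {payback_years:.2f} years")
```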
The funny thing here is that people express shock about the Apple/Samsung relationship, but Apple's demand for chips is such that they may very well NEED TWO SOURCES of processor chips.
It may be somewhat sacrilegious, but in 5 years most of the compute power you 'need' will be a network hop away. That may always be big-iron Intel, but on a laptop, when all apps are effectively 'cloud enabled'? At what point is your primary computing bottleneck the size of your AWS instance and the application's QoS setting at your ISP?
I don't think that day will ever come completely. I see distributed computing, but the demand for local intelligence will only increase. One of the big issues with Siri is that it goes off-device for things that really shouldn't have to bother the net. The A7 is a step forward in having enough power to run an AI locally; it might not be there yet, but in the long run it just makes sense to distribute the workload to the local device when possible.
Pretty sure Samsung is already making the A6X. An A7X, if produced, would most likely come from Samsung, just because of the similarities.
I don't see how they would split the A7 between 2 different fabs; it's not simply a 1:1 copy, and a lot of work would need to be done to integrate with TSMC's fab/process. Also, any split would undoubtedly affect unit price, so why would Apple compromise their margin and increase their complexity by splitting across 2 different fabs? Doesn't make sense to me.
A8 and onwards, who knows.
There are two good reasons:
1. Apple's wafer starts are so high that one fab will have trouble keeping up. At one time it was suggested that Apple was using 80% of Samsung's foundry capacity. That was one of the reasons behind Samsung's expansion in Austin.
2. You never want to have all your eggs in one basket. It has been common practice in the electronics industry to have multiple sources for your parts. Think about Apple's exposure here if something happened to Samsung's Austin plant; a lightning strike, tornado or whatever could cost Apple billions in just one quarter (a rough sketch below puts illustrative numbers on this). The CYA approach is to keep suppliers globally separated. Think about how Apple was able to weather the hard disk crisis of a year or so ago, when its balance of parts resulted in many machines not being impacted at all.
Now, you have good points about the difference in processes, but in Apple's situation they are likely developing and sampling with multiple vendors already. It would be pretty hard to get valid quotes from a vendor without the vendor having some idea of what they will be building. It is rational to assume that right now Apple could get working A7 silicon from both Samsung and TSMC if they wanted.
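To put rough, purely illustrative numbers on the eggs-in-one-basket point (the outage probability and dollar exposure below are assumptions, not sourced figures):

```python
# Crude eggs-in-one-basket math; probabilities and exposure are assumed, not sourced.
p_fab_down = 0.02        # assumed yearly chance a given fab loses a full quarter (storm, fire, etc.)
quarter_exposure = 5e9   # assumed revenue at risk if A-series supply stops for a quarter

p_total_stoppage_single = p_fab_down        # one plant down stops everything
p_total_stoppage_dual = p_fab_down ** 2     # both plants down at once (assumed independent)

print(f"P(total supply stoppage), single source: {p_total_stoppage_single:.2%}")
print(f"P(total supply stoppage), dual source:   {p_total_stoppage_dual:.4%}")
print(f"Worst single-event hit, single source: ${quarter_exposure / 1e9:.1f}B")
print(f"Worst single-event hit, dual source:   ${quarter_exposure / 2e9:.1f}B (the other fab keeps shipping)")
```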
I would agree for commodity items like HDDs/DRAM that can be sourced easily from multiple suppliers and don't require a product design change between suppliers. Custom processors are different, IMO. The development effort, time, and complexity may not make it worthwhile to duplicate across multiple fabs. There's a fine line between hedging your bets against supply shortages and increasing complexity and cost. There's no indication that I can find that Samsung's Texas fab was not able to keep up with demand for A7s.
Besides, it was the same situation for the A6: one manufacturer.
P.S. There seems to already be some potential redundancy built into the A7. This article was written in 2011 about the A6, which never made it to 28nm (the A6 was 32nm), so it seems pretty applicable to the current A7.
"Samsung Electronics may not be as large as TSMC, but the company has synchronized its 28nm process technologies with various other industry players, including Globalfoundries and IBM, which theoretically allows Apple to make chips at different factories in order to meet its volume goals. Samsung can produce 28nm HKMG low-power (28LP) chips at S1 fab in Giheung, Korea and the company’s recently expanded fab, S2 in Austin, Texas, Globalfoundries has two 300mm fabs that qualify the 28nm low-power HKMG technology: Fab 1 in Dresden, Germany and Fab 8 in Saratoga County, New York; IBM has a 300mm production facility in Armonk, New York. The fabs represent a global footprint estimated to be the largest in the foundry industry for leading-edge capacity, offering customer choice enabled by close collaboration and an unparalleled de-risking of supply chain uncertainties.
Since Samsung and TSMC have different 28nm process technologies, they cannot produce identical SoCs even in case Apple develops two separate versions of A6 for different fabrication processes. As a result, the relationship with TSMC is unlikely about creating a second-source for A6 SoC, or at least this is not the only business project between Apple and the Taiwanese giant."
Not Open MacOSX… never. Apple will control their experience on all platforms.
The question becomes whether or not Intel is 'worth it.' AxX chips are catching up fast (they are about 5 years behind Intel at the moment… effectively the power of a MacBook from when the iPhone 3GS was released). When that performance curve tightens (and remember, Apple has no compromises to make for Windows or Linux or VMware or any of the other consumers of x86, so it will tighten), it becomes a serious question… people don't care what chip is underneath, just that it's faster than the last release on the platform, has better power characteristics (cooler, longer lasting on battery), and runs their stuff.
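For what it's worth, here's a toy catch-up calculation; the growth rates are assumptions, just to show how quickly a 5-year gap can close if the curves differ:

```python
# Toy catch-up math: ARM starts ~5 years behind but improves faster. Rates are assumed.
arm_gain_per_year = 1.50       # assumed 50% yearly per-chip performance gain for A-series
x86_gain_per_year = 1.15       # assumed 15% yearly gain for mainstream x86
gap = x86_gain_per_year ** 5   # "five years behind", expressed as a performance ratio

years = 0
while gap > 1.0:
    gap /= arm_gain_per_year / x86_gain_per_year
    years += 1
print(f"Starting ~{x86_gain_per_year ** 5:.2f}x behind, parity in about {years} years at these rates")
```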
I agree on two sources, but other than maybe the high-end Mac(Book) Pros, 99.9% of people would rather have the economy of scaling a huge pile of A10X chips into laptops than have competing sources (I know I'm partially arguing against myself, but this is different architectures, not plug-compatible chips). Having 2 vendors supply the same chip is different from 2 vendors supplying 2 different chips with 2 different OSes and 2 different architectures.
What Sun said 25 years ago holds true today… the network is the computer. Skynet be damned, I think we have to consider that being 'off-net' will be like a 'disk drive failure': something rare, something you back up for, but not something you plan on operating with for the long term. I think the key will be the coverage of networks, such as open mesh and LTE, along with fiber to the home (I'm LOVING IT!!!!), which will make for multipath TCP solutions (already in use by Siri) and more seamless connectivity.
The UI will always be local, and where possible, most of the local compute will be used for that… but the big crunch, the big data stuff, where big iron is needed, will be remote.
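A tiny sketch of that local-vs-remote tradeoff; all the latencies, payload sizes, and link speeds below are assumed values, purely for illustration:

```python
# When does shipping the "big crunch" to big iron beat doing it locally?
# All latencies, sizes, and speeds below are assumed, purely for illustration.
def offload_wins(local_compute_s, remote_compute_s, rtt_s, payload_mb, link_mbps):
    """True if remote compute plus network time beats local compute."""
    transfer_s = (payload_mb * 8) / link_mbps
    return rtt_s + transfer_s + remote_compute_s < local_compute_s

# A heavy query: 2.0 s locally, 0.2 s on a server, over LTE (~60 ms RTT, 1 MB payload, 20 Mbps).
print(offload_wins(2.0, 0.2, 0.06, 1.0, 20))    # True: the round trip pays off
# A small task (50 ms locally) is not worth the hop.
print(offload_wins(0.05, 0.005, 0.06, 0.1, 20)) # False: keep it on the local device
```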
Intel's first real attempt is the 64-bit, out-of-order x86 Silvermont on a 22nm process. It costs $30~$40 per SoC (Bay Trail-T), and the performance is above any ARM CPU on the market.
Airmont (14nm), Silvermont's successor, is coming in 2014 and will be using Broadwell graphics cores.
In 2014 NVIDIA is licensing mobile Kepler cores, so Intel might take advantage. Project Logan (using mobile Kepler) can put out 400 GFLOPS (while consuming similar power to the A6X in the iPad 4), and offers support for OpenGL ES 3.0, OpenGL 4.4, DirectX 11, tessellation, and CUDA 5.0.
Intel's Haswell Y-series has a max TDP of 11.5W with an SDP of 4.5W~6W, and the Broadwell Y-series should have a max TDP around 8W with an SDP of 3.5W~4.5W.
x86 is becoming less expensive and more efficient at a faster rate than ARM is becoming more powerful.
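Taking those figures at face value, the perf-per-watt arithmetic looks roughly like this (the A6X GFLOPS and the shared load power are my assumptions, added only for comparison):

```python
# Rough perf-per-watt arithmetic using the figures quoted above.
# The A6X GFLOPS and the shared load power are assumptions added for comparison.
logan_gflops = 400       # quoted for Project Logan (mobile Kepler)
a6x_gflops = 70          # ASSUMED GPU throughput for the iPad 4's A6X
shared_power_w = 5       # ASSUMED load power both parts are said to run at

haswell_y_tdp_w = 11.5   # quoted above
broadwell_y_tdp_w = 8.0  # quoted above

print(f"Logan: ~{logan_gflops / shared_power_w:.0f} GFLOPS/W at an assumed {shared_power_w} W")
print(f"A6X:   ~{a6x_gflops / shared_power_w:.0f} GFLOPS/W at an assumed {shared_power_w} W")
print(f"Haswell-Y to Broadwell-Y TDP drop: {100 * (1 - broadwell_y_tdp_w / haswell_y_tdp_w):.0f}%")
```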
As the chip goes through pilot and into production, Apple will have been guiding the device through the fab. Ultimately, assuming Samsung didn't get much chance to look very closely at the reticles used in photolithography (and Apple will have guarded access to them closely), it's very difficult to tell the structure of a device as it's manufactured, beyond being able to see, generally, that part is memory, that part is logic, etc.
As long as you name your recipes sensibly, it gives little away. As an example, if you called a step, I don't know, SDEIMP for the Source-Drain Extension Implant, you wouldn't know if that was implanting a gate for a 16-bit, 32-bit, or 64-bit processor (or even a gate for a memory block). If, however, you called it SDEIMPFORA64BITPROCESSOR, you'd probably give a little more away…
I get your point, but I was kind of joking. Most of the time, decapping a part and looking at it, you cannot tell who fabricated the part unless there is artwork on the die pointing to the fab, or it was fabbed with technologies no one else has, like copper on silicon or something similar. However, it is very easy to count how many I/Os a device has; that is extremely hard to hide. Lastly, the parts have to be probed and tested, so Apple will have to tell them how many I/Os it has so they can correctly test and verify that all the address lines work properly.
I think they are still guessing that Samsung is involved; this would have been the perfect part to move away from Samsung and let someone else build it.
Intel still has a long road ahead to match ARM's max TDP of around 1 watt; the current mobile x86 generation's TDP is at least 2 to 4 times higher than ARM SoCs'.
If you'll excuse my analogy, the ARM CPU started its life as a two-stroke motorcycle engine that has been beefed up over time, while current Intel CPUs are more like a V12 engine ripped apart and throttled to run with the fuel efficiency of a two-stroke motorcycle engine. Of course, because of the nature of the beast, x86 will forever beat ARM in terms of raw power, but it's also true that x86 will never reach ARM's level of power conservation. Intel can't win this power-conservation battle against ARM; any trick Intel could pull off to lower CPU power consumption, like lowering the clock frequency or shrinking the fab process, can also be applied to ARM cores.
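That last point is really just the dynamic-power relation P ≈ a*C*V^2*f; a minimal sketch (the capacitance, voltage, and frequency values are placeholders):

```python
# Dynamic CMOS power scales roughly as P = a * C * V^2 * f,
# so any clock/voltage/process trick is available to ARM cores too.
def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

# Placeholder values; only the ratio matters.
base = dynamic_power(0.2, 1e-9, 1.0, 2.0e9)
scaled = dynamic_power(0.2, 1e-9, 0.8, 1.5e9)
print(f"1.0 V / 2.0 GHz -> 0.8 V / 1.5 GHz cuts dynamic power to {scaled / base:.0%} of baseline")
```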
They won't be guessing that Samsung are involved. There will be all manner of identifiers in the device that, if you know what you're looking for, will identify it as a Samsung-manufactured part.
You're right that the part will have to be probe-tested, but Samsung won't have been doing that for Apple. The production fab is rarely the place where the device is probe-tested and packaged (even Intel ship a lot of their finished wafers to China for probe test, dicing and packaging). I don't know for sure, but I would assume Apple don't use any Samsung-connected business to do the probe test for them. Based on all that, Samsung still won't have known exactly which device they've been making.
Aaah, the famous Misek.
Oh man, poor DED...
Hopefully next year Apple can cut 'em loose.
Well said.
Probably not.