I'd like to at least see Apple try a Mac with their own CPU inside it. They could easily do it with something like the Mac mini, which doesn't sell a lot, but sells just enough to get a feel for how it would work and what customers' reactions would be (if any). I'm sure they have a version of macOS that works on their own CPUs. I find it very hard to believe there's not a secret project going on somewhere to test macOS on their own CPUs... kind of like there was before the switch to Intel. I bet Apple could literally make the Mac mini the size of an Apple TV, possibly smaller. Apple LOVES going smaller!
It's been my contention that Apple will deliver an ARM version of the MacBook, sans support for x86/x64 but with macOS support. Developers would, in theory, need only a bit of modification to make current applications compatible.
I have also stated that I thought it would be mutually beneficial for MS and Apple to support Windows on ARM, and only Windows on ARM, on such a device.
I think I also know how Apple thinks, and I think Apple actually thinks about how many coal-fired power plants exist to power computers with specs beyond what most users need. It’s a thin line between compute power and compute efficiency, and I think Apple deliberately walks the compute-efficiency side of that line, knowing it could easily spec up (design) a crazy-performing machine and knowing it will constantly take flak for not doing so. But that’s the path Apple walks: better for the environment, worse for performance-hungry consumers who don’t think of that bigger picture.
OT, but I had a palm/forehead moment the other day... watching an ad about power storage for solar arrays and windmills — using lead-acid battery arrays to store the power generated for use when the sun/wind is not available.
What is the environmental cost of:
building the batteries
maintaining the batteries
disposing of the batteries after their useful life
The benefit to the power company is that it smooths out the power output from renewables, especially solar, so that there isn't as much load balancing required, which can be costly.
The plant in Tonopah, NV, uses molten salts for the same purpose, certainly at a larger scale than the lead-acid batteries, but many other companies use pumped storage.
Lead-acid batteries are reliable and easy to recycle; the problems in Richmond, CA, are an example of what happens when they are dumped indiscriminately.
It's hilarious reading some of the comments on various sites that have linked to the AnandTech article.
People are going absolutely apeshit over this. It's amazing how pissed off people are getting that Apple has designed such an amazing processor and how far ahead they are compared to everyone else.
I know right? People are losing their minds!
Just look at some of the top comments from pissed off fanboys on an Android forum:
Wow. The A12 was really undersold by Apple’s own marketing department. It really is quite the beast.
I’m going to guess the SD855 is going to at least catch up or slightly exceed on the GPU side of things, but for most other areas, Qualcomm seems to be a generation or two behind. The JavaScript benchmarks were particularly embarrassing. We’re seeing even the iPhone 6s outperforming flagship Android devices released this year.
What is going on in Qualcomm land?
god I would kill for an a12 in an android phone
In order to command the prices & brand recognition Apple has going for it, it has to comprehensively differentiate itself. At all levels. It's working.
Credit to Apple: they've improved their sustained performance by 40-80%!
I was disappointed with their claim of only a 50% GPU improvement, since Apple's claims have usually been of peak performance. But this year they've accounted for sustained performance in their advertising claim.
Bringing laptop and desktop concerns into A-series processor development may not necessarily be a good thing if done myopically. The beauty of the A-series chips is that Apple wants to target as much capability and performance as possible into the smallest possible package. That sets a very high bar, but Apple is able to consistently ratchet it up on roughly a 12-month cadence. That whole process is working wonderfully. Laptop and desktop platforms aren't nearly as cutting-edge, so I'd be concerned about compromises entering the development mix.
However, that doesn't mean laptops and desktops aren't viable targets for A-power. What I'd actually prefer on the Mac side would be to use A-series processor arrays connected as highly elastic compute clusters, scaling A-series system-level performance to levels that are simply not possible in an iOS device with its limited energy resources and heat-management concerns. A MacBook form-factor device could potentially have a 4-way or 8-way A-power cluster, while an iMac Pro could be 16-way or 32-way. Sure, marshaling all of the A-chips and the underlying cores/threads would require some serious OS chops, but not all apps would require the full array, and coarse-grained allocation of apps to cores would still yield enormous speed-ups with the appropriate memory architecture. Perhaps the next leap from system on a chip (SoC) is cluster on a board (CoB).
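To sketch what I mean by coarse-grained allocation (purely hypothetical Swift; ComputeNode and Cluster are invented types, not any real Apple API): each whole job gets pinned to one node, so the OS never has to synchronize a single app across the array.

```swift
import Foundation

// Hypothetical sketch of coarse-grained job allocation across an
// A-series cluster. ComputeNode/Cluster are invented abstractions.
struct ComputeNode {
    let id: Int
    let queue: DispatchQueue

    init(id: Int) {
        self.id = id
        self.queue = DispatchQueue(label: "cluster.node.\(id)", attributes: .concurrent)
    }
}

final class Cluster {
    private let nodes: [ComputeNode]
    private var next = 0
    private let lock = NSLock()
    private let group = DispatchGroup()

    init(nodeCount: Int) {
        nodes = (0..<nodeCount).map(ComputeNode.init)
    }

    // Round-robin placement: a whole job runs on one node, so there is
    // no fine-grained cross-node synchronization to worry about.
    func submit(_ job: @escaping () -> Void) {
        lock.lock()
        let node = nodes[next % nodes.count]
        next += 1
        lock.unlock()
        node.queue.async(group: group, execute: job)
    }

    func waitForAll() { group.wait() }
}

// An 8-way "MacBook-class" array; an iMac Pro variant would just pass 32.
let cluster = Cluster(nodeCount: 8)
for i in 0..<32 {
    cluster.submit { print("job \(i) done") }
}
cluster.waitForAll()
```

Obviously a real memory architecture and scheduler would be far more involved; the point is only that coarse granularity keeps the coordination cost manageable.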
To move from Intel to the A-series architecture will require another Rosetta miracle, and if anybody can do it, it will be Apple.
dunno, I think it will be much easier.
Before the Intel switch, OS X apps were written in Cocoa, Carbon, and sometimes even Java. Apps were built in both Xcode and CodeWarrior. These products required major rewrites to run on Intel natively, necessitating an emulator.
Modern Mac apps are 64-bit Cocoa written in Xcode. Adding ARM as a compilation target is trivial; they did all the necessary groundwork for that years ago. Most Mac apps should compile just fine.
The notable exception being legacy games, which are already in trouble thanks to the OpenGL deprecation… but yeah, I think all it will take is macOS 10.15 treating non-universal binaries the same way it treats 32-bit apps, and 10.16 refusing to run them at all. The developers will fall in line fast.
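To make the "trivial compilation target" point concrete, here's a minimal, hypothetical sketch of the only kind of code that even notices the architecture; everything else is architecture-neutral and just recompiles:

```swift
// Hypothetical example, not from any real app. The vast majority of a
// Cocoa app (Foundation, AppKit, business logic) simply recompiles;
// only conditionally compiled paths like this care about the target.
func nativeArchitectureName() -> String {
    #if arch(arm64)
    return "arm64"      // a future ARM Mac, or any modern iOS device
    #elseif arch(x86_64)
    return "x86_64"     // today's Intel Macs
    #else
    return "unknown"
    #endif
}

print("Compiled natively for \(nativeArchitectureName())")
```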
I think we'll see ARM Macs for developers within 12 months, users in two years or less. The real question now is "why has it taken so long," which I can only assume is due to Apple having limited resources to spend on this project.
macxpress said: I'm willing to bet the farm Apple is working on it. You're already seeing how they're doing other things to support it as far as hardware. I'll still say the next Mac mini will have an Apple CPU/GPU inside it rather than an Intel CPU. I'm thinking this is what is delaying its release.
I doubt it, especially in light of the more recent concept of it being more of a 'pro' machine. If the Mini is going in that direction, I doubt it would be the new experimental machine based on the A-series. More likely it would be a machine like the MacBook, where most of the software likely to run on it (at least initially) wouldn't have as many ties to x86.
radarthekat said: I think I also know how Apple thinks, and I think Apple actually thinks about how many coal-fired power plants exist to power computers with specs beyond what most users need. It’s a thin line between compute power and compute efficiency, and I think Apple deliberately walks the compute-efficiency side of that line, knowing it could easily spec up (design) a crazy-performing machine and knowing it will constantly take flak for not doing so. But that’s the path Apple walks: better for the environment, worse for performance-hungry consumers who don’t think of that bigger picture.
I think it has a lot more to do with their ability to sell a bunch of phones to eager people and make sleek looking laptops, etc., than walking some environmental line.
tmay said: The plant in Tonopah, NV, uses molten salts for the same purpose, certainly at a larger scale than the lead-acid batteries, but many other companies use pumped storage.
That idea for alternative energy production is one of the few that make sense to me so far... but we'd need a lot more of them and much better distribution systems.
Eric_WVGG said: Modern Mac apps are 64-bit Cocoa written in Xcode. Adding ARM as a compilation target is trivial; they did all the necessary groundwork for that years ago. Most Mac apps should compile just fine.
It isn't so much a problem of rebuilding Mac apps as it is all the people who rely on x86 compatibility for things like VMs and such.
radarthekat said: What is the environmental cost of building the batteries, maintaining the batteries, and disposing of the batteries after their useful life?
Always a good question. Hopefully that’s factored in by those involved.
Just look at some of the top comments from pissed off fanboys on an Android forum: [...]
A few posts on Reddit (not an Android site, mind you) don’t negate anything.
Slightly off topic - in the same AnandTech review of the iPhone XS and XS Max, anyone going through the "Camera - Low Light Evaluation" section would be surprised by the results there.
Not surprised, as it is the author’s first iPhone evaluation. Probably his first time handling an iPhone as well.
Boost the exposure and you get the same low-light shot as the P20’s, so what? The point is to what extent the processing intervenes in the user’s natural perception of that low-light scene. Leaving greater control to the user doesn’t mean that the iPhone “just can’t keep up.”
Well, all that power won’t be harnessed for longer pro tasks until there is a great cooling system. Lately I’ve been using an iPhone 8 with Lightroom as my main camera while traveling. The LR camera does some quite heavy computational stuff, for example making HDR RAW DNGs (so actually blending RAW files on the fly, leaving a fully editable result) or long-exposure shots, where it blends dozens of shots into a single image, again leaving an editable RAW file. While the results are lovely, when taking a few shots in succession the app queues them because it takes quite a while to process each one, and after a few minutes the phone overheats and starts to throttle heavily, crawling to a halt, while also force-dimming the display until all the shots are processed (which starts taking forever due to the throttling). On a recent trip to a hot-ish place I had to periodically get in the car and hold the phone, without a case, against the air-conditioning vent, and that helped immensely. I know this is a somewhat rare and extreme usage scenario, but not out-of-this-world extreme, especially for a product touting its technological superiority in a big way.
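For what it's worth, apps can at least see the clamp-down coming. ProcessInfo's thermal-state API is real (iOS 11 and up); the pause/resume helpers in this sketch are hypothetical stand-ins for app-specific logic like holding back queued HDR merges:

```swift
import Foundation

// ProcessInfo.thermalState and its change notification are real APIs
// (iOS 11+). The pause/resume helpers are hypothetical stand-ins for
// app-specific logic, e.g. holding back a queue of pending HDR merges.
func pauseHeavyProcessing()  { /* app-specific: stop draining the merge queue */ }
func resumeHeavyProcessing() { /* app-specific: start draining it again */ }

// In real code, keep the returned token alive for as long as the
// subscription should last; it is discarded here for brevity.
_ = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .serious, .critical:
        pauseHeavyProcessing()   // back off before the OS force-throttles
    default:
        resumeHeavyProcessing()  // .nominal / .fair: safe to continue
    }
}
```

It wouldn't stop the phone from getting hot, but backing off voluntarily beats crawling to a halt under forced throttling.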
And what is the Android crowd’s reaction? Can’t you guess? They’re babbling on about Intel modems being inferior to Qualcomm modems. You can’t win arguing with that crowd. They’ve got it covered from every angle. When they lose in one area they just jump to something else.
We still haven't seen the A12X yet either.
AnandTech appeared to be hugely impressed with how efficient the A12 is, but noted that Apple needs to make some changes to the iPhone XS for when it is cold. That was entirely due to Apple not designing the A12 to excel at benchmarks, even though it does.
Yes - “In terms of peak performance, I encountered some great issues in 3DMark: I was completely unable to complete a single run on either the iPhone XS or XS Max while the devices were cool. If the device is cool enough, the GPU will boost to such high performance states that it will actually crash”
This is not a question of not focusing on benchmarks, it’s a throttling and power management issue. I shouldn’t be able to crash my new phone based on its temperature when I run something.
Oddly, excepting that test performed by AnandTech, I haven't read of anyone else having a problem running Fortnite in the real world.
"if the device is cool enough" seems to be the giveaway.
muthuk_vanalingam said: Didn't go through that section??? Spoiler alert - Avon B7 would brag about the REAL INNOVATION that Huawei has brought to smartphone cameras. Many people in this forum who believe that Apple is the ONLY innovator would be surprised with the results there. You are NOT one of those who believe ONLY Apple can innovate, so it may not be a surprise to you. But for the knockoff commenters in this forum, it would be a surprise that a Chinese company can actually innovate and move camera technology forward.
Why should they read it? Most Apple users don't give a fuck about Android or Windows.
I've always found it odd that Android/Google fans like to hang around Apple forums constantly braying about how good their platform is. But I've never felt the need to hang around an Android forum because, as you say, I don't actually care about Android.
It's almost as if these folk are trying to convince themselves.
¯\_(ツ)_/¯
I agree and we all know who our main such poster is on AI! He owns no Apple products (according to his own words) but manages to post daily pointing out any negative thing he can about Apple (always with links).