The A10 Fusion chip is cheap compared to the Intel chips. The iMac Pro's margin can easily handle it.
Apple could be working towards direct running of iOS apps on OS X for development or plain ol' operation. Your Mac could become your iPad or your phone... just another iOS device, while still being a Mac.
At the same time, Apple could be testing how well the ARM runs pro-style apps or OSX. If the A10 chip is handling the boot process, then the next step is booting to the OS of choice, be it OSX, Windows, or (dun dun dunnnn...) iOS.
Remember that Apple was running OS X on Intel for FIVE YEARS before they ANNOUNCED the switch to Intel. We could have ARM-based Macs in that time or less from now.
It sounds nuts. N-V-T-S, nuts. But it is plausible.
Ok, but why would they treat OS X and iOS as separate OSes? iOS was derived from OS X. Most of the differences are related to the hardware form (and related I/O) that they run on. Running two OSes on one machine just isn't what Apple does. Too messy. They'll roll them back together -- or more likely just modify OS X as necessary...
The UI is the major difference. It’s like saying watchOS is the same as iOS because they both run on ARM. Architecture can change, as Apple has shown many times. There will never be a single install that will work for iPhones and Macs!
UI is the roadblock? Are you aware of where iOS came from?
UI is always the roadblock. It's why the iPhone uses Cocoa Touch and the Mac uses Aqua. There's a reason why the iPad was never simply a plop-in of macOS with all those tiny little features that are best done with a mouse, not a fat finger.
The main difference comes down to one class: UIResponder on Cocoa Touch instead of NSResponder on the Mac (NeXTSTEP). At the start UIResponder was all multi-touch events, but it slowly evolved to handle keyboard events and the Pencil as well. Apple has closed the gap between them every year, so I wouldn't be surprised if either could work on the other with no work at all. In many ways UIResponder has overtaken its NS brother and would be capable of replacing it with a few restrictions put in place.
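That responder-chain idea (an event bubbles from the view that received it up through its parents until something handles it) can be sketched as a toy. This is a hypothetical Python analogue of the pattern UIResponder and NSResponder share, not Apple's actual classes:

```python
class Responder:
    """Toy analogue of UIResponder/NSResponder: unhandled events walk up a chain."""
    def __init__(self, next_responder=None):
        self.next_responder = next_responder

    def handle(self, event):
        # Default behaviour: forward the event to the next responder, if any.
        if self.next_responder is not None:
            return self.next_responder.handle(event)
        return None  # event fell off the end of the chain unhandled


class TextField(Responder):
    def handle(self, event):
        if event == "key":              # a text field consumes keyboard input
            return "TextField handled key"
        return super().handle(event)    # everything else bubbles upward


class Window(Responder):
    def handle(self, event):
        if event == "touch":            # the window consumes touches
            return "Window handled touch"
        return super().handle(event)


window = Window()
field = TextField(next_responder=window)
print(field.handle("key"))     # handled by the field itself
print(field.handle("touch"))   # bubbles up to the window
print(field.handle("pencil"))  # nothing in this toy chain handles it
```

The point of the comment above is that once both platforms route events through the same kind of chain, whether the event originated from a finger, a keyboard, or a pencil becomes a detail of the first responder, not of the whole framework.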
It's not about objects, APIs, or whatever the correct terminology is for that, I'm talking about how you interact with the device. These buttons are way too small and close together for your fingers. Maybe they'll change the UI so that it's completely touch-friendly, but we've had a decade of iOS for iPhone and 7 years of iOS for iPad (both independent OSes with unique interfaces that idealize them for their respective displays), and they have yet to make macOS have the same look and feel as iOS.
All-in-one (AIO) computers like iMac are a huge aggression to planet Earth. Computers may last for seven years or less, whereas displays may last for more than 20 years.
Oh yeah? Show me your 15-inch Studio Flat Panel LCD from 1997. Let's check its color accuracy.
Or maybe your blue-and-white CRT Studio Display, which consumes 21 times the power that a 28-inch 4K LCD does. What was that about "aggression to planet Earth?"
I was using one of the 15-inch Studio Displays as a secondary email screen till about a year ago, when it gave out. Now have an old 21; colour on either wasn't great but good for email.
3) A10 seems like overkill for the stated functionality, and with FaceTime mentioned I hope that this means the iMac Pro will also include Face ID.
I'm pretty sure that Face ID requires the A11's neural engine... so if the rumored co-processor is an A10, then no Face ID.
I'm not so sure about that?
The A9 & A10 can do face-feature detection using the Vision framework at a reasonable speed. Given the computer is generally in a more physically secure location, they could lower the test standards and make it more about registered-user switching (yes please) than strict security. Still require a password or second factor after, say, 30 minutes of inactivity, or if the machine is moved, assuming MacBooks are also on the cards. Enough time to duck to the loo or make coffee, but come back from a meeting or lunch and you get prompted.
Good point and it wouldn't need to prompt you, it would just work. It would just wake up when it sees you.
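The relaxed-unlock policy suggested above is simple enough to sketch. A hypothetical illustration (the 30-minute limit and the "moved" check are the commenter's proposal, not any real Apple API):

```python
from dataclasses import dataclass


@dataclass
class MacState:
    minutes_idle: float  # time since the user last interacted
    was_moved: bool      # e.g. a MacBook picked up and carried away


def face_unlock_allowed(state: MacState, idle_limit_min: float = 30.0) -> bool:
    """Allow relaxed face unlock only within the idle window and only if the
    machine hasn't been moved; otherwise fall back to password / second factor."""
    return state.minutes_idle < idle_limit_min and not state.was_moved


# Ducked out for coffee: face unlock still works.
print(face_unlock_allowed(MacState(minutes_idle=5, was_moved=False)))   # True
# Back from a long lunch: password required.
print(face_unlock_allowed(MacState(minutes_idle=75, was_moved=False)))  # False
# Machine was moved: password required regardless of idle time.
print(face_unlock_allowed(MacState(minutes_idle=2, was_moved=True)))    # False
```

The appeal of the idea is exactly that the policy layer, not the face-matching layer, is what changes between a phone in a pocket and an iMac bolted to a desk.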
2) I'm curious why the 'B' is being capitalized when all other Apple OSes have the first letter lowercase.
3) A10 seems like overkill for the stated functionality, and with FaceTime mentioned I hope that this means the iMac Pro will also include Face ID.
4) Is this still launching next month?
Since the A10 will also be used in the HomePod I’m guessing it’s easier to use that fabrication already in place rather than have two different lines. Too bad it’s not the A11 with its new Apple GPU and the new Neural Engine.
The HomePod uses the A8, not the A10
Apple really needs to make a SiriOS. If Apple released a dock that turned any iDevice into a home Siri assistant millions of people would use their old iPhones/iPads as a Siri home assistant day one. The dock could have the HomePod mics and a loud speaker.
A lot of people have old iDevices and they don't know what to do with them. This would breathe new life into the otherwise "useless" devices. The market is gigantic and would only grow with time.
I would rather use an old iPad than an Amazon Echo or some other crap.
I always find it easier and quicker to just do a search using a few keys than talking to a digital assistant which works once out of seven attempts. We don't need Siri, stop trying to make us all lazy and lame.
That's because Siri is still just a child, only a few years old. She wants to be Scarlett Johansson when she grows up (and so it shall be).
All-in-one (AIO) computers like iMac are a huge aggression to planet Earth. Computers may last for seven years or less, whereas displays may last for more than 20 years.
Please shut the fuck up with that nonsense. The number of consumer displays in use from 20 years ago is practically nil and we're not going to give up our notebooks, tablets smartphones, smartwatches, and everything else that comes with a built-in display because you have a problem with an already low-yield device, the iMac.
He made a reasonable point. Why so touchy about it?
No he didn’t.
Given the improvements in power efficiency over the past 20 years, he’s not doing the environment any favours by hanging on to his old monitor.
If he can’t afford a new one, or is too tight to replace it, then fine – but the iMac is pretty easy to recycle, so let’s not pretend he’s doing this to help the environment.
Power efficiency is an excellent point. I wonder how much power a 20-year-old CRT uses compared to a modern LCD, even one with a much larger display.
Good for those who can justify the price of an iMac Pro. How about more support for macOS and getting macOS into the hands of more people? A reasonably priced and spec'd headless Mac is needed.
People said that in 2007 and nope, no xMac and Apple is even more wildly successful.
I was irritated yesterday when I made my response to it, but going from a typical 20-inch CRT to a 28-inch LCD, it's about 3-4x more power consumed by the CRT.
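A rough back-of-envelope check of that ratio (the wattages below are assumed typical figures, not measurements of any specific model):

```python
# Assumed typical power draws under normal use:
crt_20in_watts = 120  # a late-1990s 20-inch CRT
lcd_28in_watts = 35   # a modern 28-inch 4K LCD

ratio = crt_20in_watts / lcd_28in_watts
print(f"CRT draws about {ratio:.1f}x the power of the LCD")

# Extra energy over an 8-hour workday, in kilowatt-hours:
extra_kwh_per_day = (crt_20in_watts - lcd_28in_watts) * 8 / 1000
print(f"~{extra_kwh_per_day:.2f} kWh more per day")
```

With those assumptions the CRT comes out around 3.4x, squarely in the 3-4x range claimed above, and wastes roughly two-thirds of a kilowatt-hour per working day.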
It's 2017 and you're saying that most of us want to use displays from 1997? Bullshit! And back in 1997 we were using displays from 1977? Double bullshit.
"... tiny little features that are best done with a mouse, not a fat finger." Nah... Those are just two different sources of input. Not an innate difference. Jobs correctly proposed that five fat fingers made a better input device for a pocket device than the styluses that were common at the time. But today, the stylus that can be used with the iPad Pro is going back to that. ... Computer vendors have been adding and deleting input devices since the dawn of computers. My first used a cassette tape. Don't get too hung up on one over the other. Going to voice input is far more radical than going to a stylus or cursor.
"you're saying that most of us want to use displays from 1997?" LOL... Where did I say THAT? No, I was responding to the claim that NOBODY uses a "20 year old monitor". And, at the same time, pointing out that it has certain advantages over the flat screen. That doesn't make it "better" nor does it mean that "most of us" should be using one. ... The analogy might be comparing a Model T to a Corvette StingRay: Of course the Corvette is "better". But, the Model T can do things that the Corvette would totally fail at -- such as navigating the muddy paths that were common to its day.
Do you really have to ask this question? Seriously? Have you not been paying attention for the past 10 yrs?
Yes, I have. Where were you when they spun iOS off from MacOS?
I don't really need to say anymore...you've already made yourself look like a fool.
So you ARE uninformed! ... But adept at using personal attacks to cover up for it.
This discussion reminds me, after long-not-thinking-about it, of the iMac G4 with the semi-spherical base, and the unbelievable 20" flatscreen "floating" on that stainless arm. In its own way that was one of the best computer designs of all time.
Whenever I think of it I kick myself for not getting that over the last-ever PC we bought; a problematic Dell-tower. I recall punching it so hard (imagine a seated uppercut-motion towards a tower-PC sitting on a low-box on the floor), that it went flying back towards the wall. Junk.
After that (if I have the timeline right) was our 1st Mac in 2005, the white iBook G4. It was $1800 if I recall. Sold it in 2009 for $500 when we got an Aluminum MacBook, which is still going fine for Internet/e-mail.
Then the first ever 27-inch Mac in late 2009, which is still going.
Similarly, I bought my 15” Studio Display (maybe it was a 17”) in 2001 at the opening of the Apple Store at the North Shore Mall in Peabody, MA. I recall it being touted as using 75% less power than an equivalent sized CRT monitor. I think I bought that along with a PowerMac G4 with dual 800 MHz processors.
All-in-one (AIO) computers like iMac are a huge aggression to planet Earth. Computers may last for seven years or less, whereas displays may last for more than 20 years.
go troll this bullshit somewhere else. Macs are extremely long-lived and highly recyclable, making them a more ecological choice than any crappy plastic Dell desktop.
nobody uses a display for TWENTY YEARS. there is something wrong with your brain.
He seems to have forgotten that a 20-year-old display today would be a CRT!
Damn! I better get rid of mine! Such a shame. It works great and I love the picture quality on it. None of that posterized-looking stuff like you get on the flat screens.
macxpress said: I usually disable Siri on the Mac...it's just too useless. I don't see where it really does much of anything.
I think it would be cool to see the A10 or A11 in the next MacBook and/or MacBook Pro alongside the Intel CPU. It could run the OS doing basic things and then, when power is really needed, kick in the Intel CPU. That would be quite a feat of engineering, though. There's a lot of software engineering that needs to take place to make something like that work reliably.
Why would you want this? By that logic wouldn't it be better to just use one chip to do it all? Why would you want it to switch between chips depending on power? I think a better application would be what I suggested above. Assign certain functions to the A-chip to deal with exclusively (Siri, TouchID/FaceID, ApplePay and more we haven't thought of) and let Intel deal with the rest. Then slowly cross functions over to the A-processor as research for future ARM Macs.
To save power for one thing, but your last sentence really explains it all. Not only can Apple put both in there, they can get developers to recompile some of their apps for use with ARM based Macs.
All-in-one (AIO) computers like iMac are a huge aggression to planet Earth. Computers may last for seven years or less, whereas displays may last for more than 20 years.
Multiple people have demolished most of this near-sighted statement, but nobody has addressed your point that “computers may last seven years or less” — let’s talk about that.
Setting aside the obvious fact that computers can remain in use beyond seven years (within families, for example), let’s turn to the problem of macOS leaving perfectly functional hardware behind, which I assume is the basis of your claim. macOS High Sierra (2017) runs on the Late 2009 iMac and later, so seven or eight years sounds about right.
I do think there may be an opportunity here for Apple to make an eco-friendly statement. I’d love to see them periodically choose to maintain security/Safari updates for an older OS to keep older Macs safe to run — Yosemite would have been a very good choice. El Capitan, too, though its RAM requirements complicate that.
Facing forward, some configurations would probably have to downgrade to maintain security/Safari after being left behind, but Apple could mitigate most of those issues by planning for it. High Sierra might be a good choice to start with, assuming the next macOS leaves some machines behind. Apple could start planning now to make High Sierra a “legacy” macOS, supported for, say, an additional three years or more, so those Late 2009 iMacs would have support beyond 2017.
With regard to sales, Apple would then be able to say all Macs receive software support for a minimum of ten years, while still allowing macOS to push the envelope and leave older configurations behind when necessary...
I’m hoping that HomePod will connect to the Apple TV and allow you to use commands like, “Hey, Siri, turn on TV,” “Hey, Siri, play [movie/TV show].”