Apple researching return to distributed computing in iPhone and Mac

Posted in General Discussion, edited September 2020
Just as it once did with Xcode and Xgrid, Apple wants to leverage all the processors you have across your Mac, iPhone, and iPad, and make them work together for you when you need more processing power.

Your iPhone or iPad could hand off processing to a nearby Mac


The screen you're reading this on is part of a very powerful computing device. Yet the odds are that shortly you're going to put this down and pick up a different device that if it isn't as powerful, is still pretty close.

We all pop our iPhones in our pockets or purses, we all put our iPads down on a desk or in a bag, as we open up our Macs or MacBook Pro machines. Maybe those devices are still doing some work for us, but Apple wants them to do more.

"Peer-to-peer distributed computing system for heterogeneous device types," a new US patent, describes what is basically distributed computing. Apple itself used to offer distributed processing to speed up producing apps within Xcode, but now it wants to bring it to us all.

"With the advent of an increased number of portable computing devices," says Apple, "the need for integration between various devices has become increasingly important. For example, a user may have access to multiple types of devices at a given time (e.g. smartphone, laptop, etc.), and accordingly, these devices may communicate amongst each other to provide an enhanced user experience."

"This enhanced user experience may include sharing resources amongst devices in order to increase capabilities such as processing capacity, battery usage, network bandwidth, storage, and the like," it continues.

Apple says that this effectively creates a "distributed computing environment," just by the existence of these devices. However, there are problems, chiefly caused by us.

"The communication link between devices is often severed, for example, when the device is physically removed from an area (e.g. when a user leaves a location with an accompanying portable device)," explains Apple. We're also rubbish at keeping everything charged up.

So we have our cluster of devices but they may not be powered and they will go in and out of range. "Accordingly, traditional distributed computing frameworks are often ill suited for the environments of current user devices," says Apple.

Apple's proposed solution is what it calls "a flexible framework (or technique or method)" that can "efficiently utilize resources within the environment (e.g. device ecosystem)." If another device is nearby, and it has "greater capabilities such as processing capacity, storage, power resources, etc.," then it may be given some processing to do.

Detail from the patent showing how a network of devices can be made and unmade as available


"For instance, it may not be uncommon for a user to have a personal device such as a smartphone in proximity to a laptop or desktop computer, either of which may be connected to a continuous power source (e.g. plugged into an electrical outlet)," says Apple. "Accordingly, the smartphone may offload processing tasks such as editing multimedia (e.g. photos) to the desktop or laptop computer to utilize a potentially more powerful processor and to conserve battery power."

The patent's 11,000 or so words detail how devices can tell each other that they are available, and what capabilities they have. It also covers how this ad hoc network can cope when a device is unexpectedly removed.

"[It may] recognize that a device has rejoined the environment, and accordingly, may resume computations that were suspended," says Apple.

The patent is credited to Oliver M. Williams. He's previously been listed on multiple patents to do with image recognition, including one on a system for identifying books on a bookshelf from a photograph.

Apple is not new to distributed computing. As well as its previous inclusion in Xcode for developers, Apple used to support it for researchers and general users as Xgrid, part of Mac OS X Tiger.

It was also in Mac OS Server, and the origins of this particular implementation of the idea can be traced back to NeXT. However, Apple dropped Xgrid with Mac OS X Mountain Lion.

Comments

  • Reply 1 of 20
Rayz2016 Posts: 6,957
    And this becomes far easier now that every device from the Apple Watch to the Mac Pro is powered by ASi.

    This is Palpatine-level planning.



  • Reply 2 of 20
    This sounds like a genericization of what they're planning with the AR glasses -- offloading processing to the more powerful nearby device. So maybe instead of being tethered specifically to your phone, as was done with the WATCH, they'll recognize and have their capabilities enhanced by any nearby device you're signed in to, in a transparent way. And then as Apple often does, they'll bring this technology they developed for AR/VR back to their other devices too.
  • Reply 3 of 20
DuhSesame Posts: 1,278
Interesting. Just did some research on distributed computing this weekend to see how I can group some old Macs into a cluster.

I'd like to see this patent become a real product, though that might require your phone to have better cooling for sustained workloads.
  • Reply 4 of 20
GeorgeBMac Posts: 11,421
    Hasn't Apple essentially been doing that for the past 6 years with the iPhone and Apple Watch (and, to a lesser extent, the Apple Watch and MacBooks)?
  • Reply 5 of 20
welshdog Posts: 1,896
    The cluster software Apple made in the past was not very good. Finicky and unreliable, it was a pain to keep running. I hope any new product is "all new" and doesn't use any of the old tech.
  • Reply 6 of 20
Rayz2016 Posts: 6,957
    Hasn't Apple essentially been doing that for the past 6 years with the iPhone and Apple Watch (and, to a lesser extent, the Apple Watch and MacBooks)?

    Not really, no.

I can't think of a case where Apple has used one device to increase the processing power of another by offloading some of the processing to another device.  In the case of the Apple Watch, it sometimes acts as a dumb terminal to the iPhone (when it allows you to control the music on your phone, for example), but that's not the same thing.

edited September 2020
  • Reply 7 of 20
Beats Posts: 3,073
    This is just the beginning for Apple Silicon. Can't wait!
  • Reply 8 of 20
hexclock Posts: 1,240
    Apple also had Logic Node, for distributed processing of Logic workflows. They may have had something like that for Final Cut as well, though I am not sure. It would be nice if all the Pro apps could utilize a shared framework like that. 
edited September 2020
  • Reply 9 of 20
    Mesh networks are awesome if executed properly.
  • Reply 10 of 20
NeXT had distributed computing frameworks since its NeXTstation debuted with NeXTStep 2.0. This isn't new, just updated, and no, this doesn't get easier with an ARM-only ecosystem. We were fully distributed from an agnostic set of frameworks with NS3.1. Nothing has stopped Apple being a ubiquitous distributed ecosystem since the merger of NeXT Inc with Apple Inc. We changed direction to consumer to save Apple. 

    At Apple I watched resources intended to build upon our enterprise NeXT lineage be diverted. XServe never had the resources or focus to develop an enterprise server/client version of OS X, though that’s always been one of the goals. 

    Nothing Ai will produce will be on an EPYC scale focus, but I sure would hope their built out back end data centers interface seamlessly with them to enhance their cloud focused services, including AR. 
edited September 2020
  • Reply 11 of 20
    Beats said:
    This is just the beginning for Apple Silicon. Can't wait!
    There's nothing here that suggests this would be limited to Apple Silicon. Why would it?
  • Reply 12 of 20
    I've been wondering about this type of thing after the news that Octane X would have an iPadOS release that can act as a network render node for macOS. 
edited September 2020
  • Reply 13 of 20
NeXT had distributed computing frameworks since its NeXTstation debuted with NeXTStep 2.0. This isn't new, just updated, and no, this doesn't get easier with an ARM-only ecosystem. We were fully distributed from an agnostic set of frameworks with NS3.1. Nothing has stopped Apple being a ubiquitous distributed ecosystem since the merger of NeXT Inc with Apple Inc. We changed direction to consumer to save Apple. 

    At Apple I watched resources intended to build upon our enterprise NeXT lineage be diverted. XServe never had the resources or focus to develop an enterprise server/client version of OS X, though that’s always been one of the goals. 

    Nothing Ai will produce will be on an EPYC scale focus, but I sure would hope their built out back end data centers interface seamlessly with them to enhance their cloud focused services, including AR. 
    You make it sound like Xgrid — which we know came from NeXT as is stated in the article — didn't exist until 2012.  But of course you got your AMD plug in there, like that has anything to do with anything. 
  • Reply 14 of 20
    hexclock said:
    Apple also had Logic Node, for distributed processing of Logic workflows. They may have had something like that for Final Cut as well, though I am not sure. It would be nice if all the Pro apps could utilize a shared framework like that. 
    It is true, and Compressor has an easy to use Node (xGrid) system in Preferences to create a cluster. I am currently repurposing five Mac Pro 2013 machines into a Cluster Farm for encoding that my Digital Media students can use to offload from laptops. It’s not perfect, but those unused 24 Xeon cores can come in handy in a pinch.
  • Reply 15 of 20
rob53 Posts: 3,239
You have to look a long way back, to the early 2000s, for the UCLA parallel processing array made up of 25 Power Mac G3s and G4s. Dean Dauger, a member of the UCLA team, wrote the Pooch cluster software: http://daugerresearch.com/pooch/top.shtml. This was before Xgrid. 

https://www.macworld.com/article/1021865/appleseed.html I don't know if AI had an article about this. The Appleseed webpage is no longer there.


  • Reply 16 of 20
The big concern is whether or not (a) Apple ships this and (b) commits to it for a few decades. Xgrid seemed a little magical when it first appeared, but five-ish years later it was gone. Catch-22: not enough people using it to bother supporting it, but not enough support for it to get more people to use it. That situation needs to be avoided.
  • Reply 17 of 20
Rayz2016 Posts: 6,957
NeXT had distributed computing frameworks since its NeXTstation debuted with NeXTStep 2.0. This isn't new, just updated, and no, this doesn't get easier with an ARM-only ecosystem. We were fully distributed from an agnostic set of frameworks with NS3.1. Nothing has stopped Apple being a ubiquitous distributed ecosystem since the merger of NeXT Inc with Apple Inc. We changed direction to consumer to save Apple. 

    At Apple I watched resources intended to build upon our enterprise NeXT lineage be diverted. XServe never had the resources or focus to develop an enterprise server/client version of OS X, though that’s always been one of the goals. 

    Nothing Ai will produce will be on an EPYC scale focus, but I sure would hope their built out back end data centers interface seamlessly with them to enhance their cloud focused services, including AR. 
    So you’re saying that not having to deal with different architectures won’t make it easier for Apple to take this technology forward?

Oh wait … you said EPYC, so this is another one of your "butthurt because Apple didn't stick with AMD and tie itself to another x86 company instead of forging its own path" posts, isn't it. 
  • Reply 18 of 20
Rayz2016 Posts: 6,957

    hexclock said:
    Apple also had Logic Node, for distributed processing of Logic workflows. They may have had something like that for Final Cut as well, though I am not sure. It would be nice if all the Pro apps could utilize a shared framework like that. 
    I stand corrected. 
  • Reply 19 of 20
GeorgeBMac Posts: 11,421
    Rayz2016 said:
    Hasn't Apple essentially been doing that for the past 6 years with the iPhone and Apple Watch (and, to a lesser extent, the Apple Watch and MacBooks)?

    Not really, no.

I can't think of a case where Apple has used one device to increase the processing power of another by offloading some of the processing to another device.  In the case of the Apple Watch, it sometimes acts as a dumb terminal to the iPhone (when it allows you to control the music on your phone, for example), but that's not the same thing.


Yes, it does that.
It also takes phone calls for the phone, as well as messages. And no, the watch is NOT acting as a "dumb terminal" there, but rather receiving the data and processing it correctly. (A "dumb terminal" can't do that.)
Meanwhile the phone acts as the central collection point for activities like heart rate, steps, and activity metrics like pace and distance, collecting and reporting them.
Likewise the phone serves as the control center of the watch, maintaining its settings and even managing its updates and what it connects to (such as AirPods).

Likewise, the Watch acts as the logon center for the MacBook.

    And, I suspect, they are doing a lot more behind the scenes that we don't see.  Each does the processing that it does best -- and even shifts it back and forth based on circumstances.

I think the distributed processing going on between the Apple Watch and iPhone or MacBook may be far more extensive than you thought.
    edited September 2020
  • Reply 20 of 20
hexclock Posts: 1,240
    Rayz2016 said:

    hexclock said:
    Apple also had Logic Node, for distributed processing of Logic workflows. They may have had something like that for Final Cut as well, though I am not sure. It would be nice if all the Pro apps could utilize a shared framework like that. 
    I stand corrected. 
I think Logic Node was somewhat limited, as it would only allow processing of Logic's own plugins, not third-party ones. If Apple's new proposed distributed compute model could break that limitation, it would be very exciting indeed. 