Apple's AI plans involve a 'black box' for cloud data
Apple's efforts in AI could pay off in its WWDC announcements, but it is also very keen to protect user data at the same time. Here's how it will get done.
Siri on an iPhone
Apple is expected to make a number of big plays in AI at WWDC. The changes are anticipated to include major additions to iOS 18 and its other operating systems, with app-level features such as audio transcription apparently on the way.
But, with privacy a core tenet of Apple's work, it's doing what it can to protect its users.
According to sources speaking to The Information, Apple intends to process data from AI applications inside a virtual black box. The concept, known internally as "Apple Chips in Data Centers" (ACDC), would see only Apple's own hardware used to perform AI processing in the cloud.
The idea is that it will control both the hardware and software on its servers, enabling it to design more secure systems.
While on-device AI processing is highly private, the initiative could make cloud processing similarly secure for Apple customers.
On-device processing is inherently private, due to not ferrying data away to the cloud. The problem is that it can be a lot slower compared to cloud processing.
However, cloud processing can be a lot more powerful, albeit with the privacy tradeoff. This latter element is what Apple's trying to avoid.
Avoiding use and abuse
Part of the problem is the potential for the uploaded data to be misused, or exposed by hackers. With their reliance on cloud servers, AI services do pose a risk of user data getting out.
By taking control over how data is processed in the cloud, Apple could implement safeguards that make a breach far harder to pull off.
Furthermore, the black box approach would also prevent Apple itself from being able to see the data. As a byproduct, this means it would also be difficult for Apple to hand over any personal data in response to government or law enforcement requests.
The ACDC initiative could be even more beneficial to Apple in terms of future device designs. By offloading AI features to the cloud, Apple could reduce the hardware requirements of its future products, enabling lighter wearables and other devices.
Secure Enclaves
Core to the ACDC initiative, which was detailed earlier in May, is the Secure Enclave. Used on the iPhone to store biometric data, the Secure Enclave is a protected element that holds data like passwords and encryption keys, preventing hackers from accessing that sensitive data even if they compromise iOS or the hardware.
Under the plan, the Secure Enclave would be used to isolate data processed on the servers, former Apple employees told the publication. Doing so means the data can't be seen by other elements of the system, nor by Apple itself.
Comments
NVIDIA is getting a lot of attention today for its chips, but in reality it isn't so much its chips that make AI work, it is its software platform and its ability to integrate with its chips. Sounds like Apple, doesn't it? And in fact, it is.
Apple has an AI opportunity to leapfrog everyone else by utilizing its chips and an ultra-efficient software stack to process Large Language Models (LLMs) on its servers. This will also allow Apple to take the high road in ensuring that only high-quality and LEGAL data is used for its AI LLMs, something no other AI company is currently doing.
WWDC this year was supposed to be all about Spatial Computing (Vision Pro), and I hope that is still the case. But the media seems to think that Apple will be all about AI. My hope is that Apple spends some time on the plans noted in this article and highlights why, longer term, this will be a fantastic opportunity for Apple and its customers, but focuses on the current hardware and software projects.
Apple isn't behind as many would like to believe, and their path will be different from the rest of tech; the resulting geek howl will be hilarious as usual.
https://www.reddit.com/r/hardware/comments/1cwdlak/notebookcheck_apple_m4_soc_analysis_amd_intel_and/
https://beebom.com/early-snapdragon-x-elite-benchmarks-cant-beat-apple-m3/
Second, if Apple is not behind in AI, why are we waiting for WWDC to see their AI/LLM roadmap? That's different from MS and Google, which already have AI/LLM implemented in their products and services.
I thought the problem was a glitch that occasionally deleted the photographic database entry, but not the actual photograph? So when the update happened, iOS found the actual file and re-added it to the database.
Since when has Apple given out details about anything major that wasn't ready to ship? I would prefer Apple keep designing, engineering, and building the M4, M5, and M6 SoCs, but target some of those chips toward servers for itself and the public. Until WWDC no one really knows where Apple is in AI, but we do know where Apple Silicon stands compared to the competition over the last four years: ahead of the pack, with Neural Engines, LiDAR, UMA memory, bandwidth, and plenty of Apple OS support.
Remember what Matthew McConaughey’s ad said: "Data is the new Gold". You create gold by sharing information. Just be aware of who you share it with.
And I think it is nonsense to compare Apple Silicon to AI processors from MS, Google, Amazon, and Nvidia. One is a multipurpose SoC for notebooks and PCs, while Maia, Axion, and Graviton4 / Trainium2 are designed for AI tasks in datacenters. Until Apple has its own AI/LLM datacenter running on Apple Silicon, we won't know how it compares to the competition.