Apple offers Private Cloud Compute up for a security probe


Apple said at launch that Private Cloud Compute's security would be inspectable by third parties. On Thursday, it fulfilled that promise.

The virtual environment for testing Private Cloud Compute - Image credit: Apple



In July, Apple introduced Apple Intelligence and its cloud-based processing facility, Private Cloud Compute. It was pitched as being a secure and private way to handle in-cloud processing of Siri queries under Apple Intelligence.

Along with stating that the service used cryptography and didn't store user data, Apple also promised that it could be inspected by independent experts. On October 24, it offered an update on that plan.

In a Security Research blog post titled "Security research in Private Cloud Compute," Apple explains that it provided third-party auditors and some security researchers with early access. This included access to resources created for the project, including the PCC Virtual Research Environment (VRE).

The post also says that the same resources are being made publicly available from Thursday. Apple says this allows all security and privacy researchers, "or anyone with interest and a technical curiosity" to learn about Private Cloud Compute's workings and to make their own independent verification.

Resources



The release includes a new Private Cloud Compute Security Guide, which explains how the architecture is designed to meet Apple's core requirements for the project. It covers technical details of PCC components and how they work, how requests are authenticated and routed, and how the security holds up to various forms of attack.

The VRE is Apple's first ever for any of its platforms. It consists of tools to run the PCC node software on a virtual machine.

This isn't exactly the same code as runs on Apple's servers, as there are "minor modifications" to make it work locally. Apple insists the software runs identically to a PCC node, with changes only to the boot process and the kernel.

A diagram showing how elements of Private Cloud Compute interact with the new virtual research environment - Image credit: Apple



The VRE also includes a virtual Secure Enclave Processor, and takes advantage of the built-in macOS support for paravirtualized graphics.

Apple is also making the source code for some key components available for inspection. Offered under a limited-use license intended for analysis, the source code includes the CloudAttestation project for constructing and validating PCC node attestations.

There's also the Thimble project, which includes a daemon for the user's device that works with CloudAttestation to verify transparency.
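The attestation flow the post describes can be sketched at a very high level: before a client sends a prompt, it checks that the node's attested software measurement appears in a public transparency log, so only publicly disclosed software can serve requests. The function names, hashing scheme, and in-memory "log" below are purely illustrative assumptions for this sketch, not Apple's actual CloudAttestation or Thimble APIs, which rely on cryptographically signed attestations rather than bare digests.

```python
import hashlib

# Hypothetical transparency log: a set of published software measurements.
# In the real system this is an auditable, append-only log of signed
# releases; a plain set stands in for it here.
TRANSPARENCY_LOG = {
    hashlib.sha256(b"pcc-node-release-1.0").hexdigest(),
    hashlib.sha256(b"pcc-node-release-1.1").hexdigest(),
}

def attest(node_software: bytes) -> str:
    """Produce a measurement (digest) of the node's software image."""
    return hashlib.sha256(node_software).hexdigest()

def client_should_send_request(attestation: str) -> bool:
    """Only talk to nodes whose measurement appears in the public log."""
    return attestation in TRANSPARENCY_LOG

# A node running published software is accepted...
print(client_should_send_request(attest(b"pcc-node-release-1.1")))   # True
# ...while unpublished (potentially tampered) software is refused.
print(client_should_send_request(attest(b"pcc-node-modified")))      # False
```

The point of the design is that the check happens on the user's device, so a compromised or silently modified server build cannot receive user data without first being visible in the public log.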

PCC bug bounty



Furthermore, Apple is expanding its Apple Security Bounty. It promises "significant rewards" for reports of issues with security and privacy in Private Cloud Compute.

The new categories in the bounty directly align with critical threats from the Security Guide. This includes accidental data disclosure, external compromise from user requests, and physical or internal access vulnerabilities.

The prize scale starts at $50,000 for the accidental or unexpected disclosure of data due to a deployment or configuration issue. At the top end, demonstrating arbitrary code execution with arbitrary entitlements can earn participants up to $1 million.

Apple adds that it will consider any security issue with a "significant impact" on PCC for a potential award, even if it doesn't line up with one of the defined categories.

"We hope that you'll dive deeper into PCC's design with our Security Guide, explore the code yourself with the Virtual Research Environment, and report any issues you find through Apple Security Bounty," the post states.

In closing, Apple says it designed PCC "to take an extraordinary step forward for privacy in AI," including verifiable transparency.

The post concludes: "We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time."



Read on AppleInsider

Comments

    mpantone Posts: 2,186
    There has been a lot of hand wringing online by people who wail that Apple is late to the AI party.

    I'd rather see this type of cautious, measured approach rather than toss caution to the wind and push out consumer-facing AI features that are barely early beta quality.

    And yes, I'm talking about every single LLM-powered AI chatbot service for consumers. They are all alpha or early beta quality. There is nothing that remotely resembles release quality software.

    Security should be first and foremost. We have seen a massive brain drain at OpenAI because its current senior management did not prioritize it. We also saw Microsoft delay its Copilot desktop rollout after the disastrous response to the original unveiling.

    There are too many people scrambling to be "First" without taking any care to do things securely.

    One thing is for sure: I will wait until June 2025 to install iOS 18 and Sequoia on my main devices. And then I will make sure to turn off all AI functionality initially. I plan to trial desktop AI on a separate throwaway non-admin account on my Mac.

    I actually bought an iPhone 16 and have yet to make it my primary phone; it mostly sits unused on a counter. I am strongly inclined to do a factory wipe and go through setup using an alternate identity to trial the Apple Intelligence features without putting any of my personal information on the line.

    Trust is earned.


    Apple needs to prove to me (and security experts) that they have been thinking carefully about security in Apple Intelligence.