
How private will Apple’s Private Cloud Compute models be in processing your AI requests?


Apple has long touted itself as a vanguard of privacy, committed to protecting its users’ data. The tech giant’s promises at its 2024 Worldwide Developers Conference (WWDC), held last week in Cupertino, California, US, were no different.

Alongside unveiling its widely anticipated personal intelligence system, Apple Intelligence, as well as ChatGPT integration, Apple shed light on its strategy for processing AI requests securely and privately.

Like Microsoft, Apple is looking to run its generative AI-powered features locally on the device – but with an important caveat. To process more sophisticated AI requests, the iPhone maker said it will rely on cloud services, and further claimed it has come up with a way to bring its on-device security model to the cloud.

“We want to extend the privacy and security of your iPhone into the cloud to unlock even more intelligence for you. So we have created Private Cloud Compute,” Craig Federighi, senior vice president of engineering at Apple, announced at the company’s annual developer conference.

Apple said that it will be shipping its generative AI features with iOS 18 later this year. (Image Source: Apple/YouTube)

But how will Private Cloud Compute (PCC) models offer the same security and privacy as your iPhone? For starters, Apple has said that personal data sent to PCC models will never be accessible to anyone but the user. Furthermore, the data will purportedly be wiped from the private cloud servers once the AI request has been completed.

Prohibiting privileged runtime access, safeguarding against targeted cyber attacks, and allowing researchers to verify the end-to-end security are some of the other privacy guarantees that accompany Apple Intelligence.


However, are these first-of-their-kind AI privacy guarantees by Apple practically enforceable in places like India? Can they lead to legal or political pushback? And what does Apple need to clarify going forward?

Why hasn’t Apple stuck to on-device AI processing?

In a blog post published by its security team, Apple elaborated on its reasons for adopting a part on-device, part private cloud-based strategy for processing AI requests.

“When on-device computation with Apple devices such as iPhone and Mac is possible, the security and privacy advantages are clear: users control their own devices, researchers can inspect both hardware and software, runtime transparency is cryptographically assured through Secure Boot, and Apple retains no privileged access,” the post said.

“However, to process more sophisticated requests, Apple Intelligence needs to be able to enlist help from larger, more complex models in the cloud,” it added.

But how to make it secure?

Apple ruled out complete end-to-end encryption of personal data sent for AI processing, as that would prevent AI models from “performing computations on user data.” Additionally, it said that traditional cloud AI applications come with limitations, such as the possibility of inadvertently logging sensitive user data or running modified server software.
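To see why full end-to-end encryption is incompatible with server-side AI, it helps to look at what such a server would actually receive. Below is a minimal Swift sketch using Apple’s CryptoKit; the scenario and variable names are illustrative, not Apple’s actual pipeline:

```swift
import Foundation
import CryptoKit

// Illustration only: under end-to-end encryption, the decryption key
// stays on the device, so a server receives nothing but opaque bytes.

let deviceKey = SymmetricKey(size: .bits256)   // never leaves the device
let request = Data("Reschedule around my daughter's play".utf8)

// Encrypt on-device before upload.
let sealed = try AES.GCM.seal(request, using: deviceKey)

// Everything an end-to-end encrypted server would ever see:
print(sealed.ciphertext.base64EncodedString())
// Without deviceKey, no model can recover the names, times, or places
// it would need to reason over; hence Apple's decision to let PCC
// process data server-side and then delete it.
```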

In contrast, Apple envisions that its private cloud systems will allow Apple Intelligence to “draw on even larger, server-based models for more complex requests, while protecting your privacy.”

How will Apple Intelligence and PCC Models work?

According to Apple, understanding a user’s personal context is essential for its AI models to do what they’re meant to do. At WWDC, Federighi illustrated this with an example:

“Suppose one of my meetings is being re-scheduled for late in the afternoon, and I’m wondering if it’s going to prevent me from getting to my daughter’s play performance on time. Apple Intelligence can process the relevant personal data to assist me. It can understand who my daughter is, the play details she sent several days ago, the time and location for my meeting, and predicted traffic between my office and the theatre.”

Apple’s Private Cloud Compute models will run on Apple silicon chips. (Image Credit: Anuj Bhatia/The Indian Express)

When you make a similar request, Apple Intelligence will use its semantic index to gather the personal context relevant to your request and feed it to its generative AI models. If the request is complex and requires more computational power, personal data is collected from your device and sent to larger generative AI models hosted on Apple’s private cloud, which runs on Apple silicon chips.

Personal data will be encrypted during its transmission to the PCC model and back, as per Apple.
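Apple has not published a client-side API for this hand-off, but the flow described above can be sketched in Swift. Every type, function, and threshold below is hypothetical, a stand-in for whatever on-device logic Apple actually uses:

```swift
import Foundation

// Hypothetical sketch of the routing Apple describes: gather personal
// context from the semantic index, estimate request complexity, then
// either run the model on-device or escalate to Private Cloud Compute.

struct PersonalContext {
    let snippets: [String]   // entries pulled from the on-device semantic index
}

enum Route {
    case onDevice      // handled entirely by the local model
    case privateCloud  // escalated to larger models on Apple silicon servers
}

func route(for request: String, context: PersonalContext) -> Route {
    // Stand-in heuristic: a real system would use a learned complexity
    // estimate rather than a simple word count.
    let words = request.split(separator: " ").count
        + context.snippets.reduce(0) { $0 + $1.split(separator: " ").count }
    return words > 512 ? .privateCloud : .onDevice
}

func handle(request: String, context: PersonalContext) {
    switch route(for: request, context: context) {
    case .onDevice:
        print("Running local model; no personal data leaves the device.")
    case .privateCloud:
        // Per Apple: data is encrypted in transit and deleted from the
        // PCC servers once the response comes back.
        print("Encrypting context and sending to Private Cloud Compute.")
    }
}

let context = PersonalContext(snippets: ["Meeting moved to 4 pm",
                                         "School play at 6 pm downtown"])
handle(request: "Will I make my daughter's play after my meeting?", context: context)
```

In this sketch the device alone decides the route, which matches Apple’s framing: only escalated requests ever transmit personal data, and then only in encrypted form.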

How do Apple’s AI privacy guarantees square with the law?

Walking the tightrope of accessibility and privacy, Apple guaranteed that personal user data sent to its private AI cloud servers will “never be available to anyone other than the user, not even to Apple staff, not even during active processing.”

It stressed that such data will only be used for processing AI requests and will not be retained even for logging or debugging purposes, meaning that the data will be deleted from Apple’s private cloud servers once the request is completed.

Welcoming the move as a positive step from a privacy point of view, Rohit Kumar, founding partner at public policy firm The Quantum Hub (TQH), told The Indian Express that the AI data processing standards set by Apple don’t clash with India’s Digital Personal Data Protection Act, 2023, since the legislation advocates a consent-based approach and calls for limits on data retention.

On the other hand, the Information Technology Act, 2000, requires companies to cooperate with law enforcement agencies by enabling access to information or offering decryption support for authorised requests.

“But if no user data is retained by Apple’s AI after fulfilling a user’s request, there is not much that can be done,” he said.

However, with regard to compliance, Kumar highlighted a potential run-in with the 2022 cybersecurity directive issued by CERT-In (India’s nodal cybersecurity agency), which requires companies to maintain ICT system logs for 180 days, among other requirements.

Since Apple’s PCC models don’t even include a general-purpose logging mechanism, Kumar said it was unclear how Apple will comply with these directions.

Apple has struck a deal with OpenAI to integrate ChatGPT with iPhones, iPads, and Macs. (Image Credit: Anuj Bhatia/The Indian Express)

How can Apple improve its AI privacy strategy?

Besides shipping its personal intelligence system, Apple announced that Siri will be able to turn to OpenAI’s ChatGPT to answer questions, but only with the user’s permission. This means users will know when they’re using ChatGPT and when they’re using Apple Intelligence.

However, it appears that users will not know when their requests to Apple Intelligence are processed on-device and when they are sent to the private cloud for processing.

Independent cybersecurity researcher Karan Saini concurred that the lack of transparency was a problem, adding that “Apple would have to ensure that most of its efforts would be focused on making the contractual side of this air-tight, and that they would want to inform users that there is a possibility that your data will get processed on the cloud without you being necessarily informed of it.”

Saini further said users should be allowed to opt out of Apple devices’ generative AI capabilities completely, as “some people might just want to use their iPhone as a phone.” “If I update to iOS 18 tomorrow, I would hope that AI processing isn’t enabled from the get-go because I might have agreed to Apple’s privacy policies when I signed up but I might not be comfortable with their integration with OpenAI’s ChatGPT,” he said. “Even if I might want Apple’s generative AI features in isolation, I may not want them to be enabled always,” Saini added.

However, it might be too late for that.

“We have integrated it [Apple Intelligence] deep into your iPhone, iPad, and Mac and throughout your apps, so it’s aware of your personal data, without collecting your personal data,” Federighi said on stage at WWDC 2024.


