Does Apple Intelligence Protect Your Privacy? (2024)

When you ask Siri a question, you really want to hear the computer from Star Trek, or at least something that sounds smart and responsive. What you get is often a disappointing non-answer or a "Here's what I found" with a tepid list of links. In a keynote at its developer-focused WWDC, Apple laid out its plans to give you a true Apple Intelligence experience.

Apple Intelligence will come as part of Apple’s operating systems, so in effect you’re getting it for free. And you know what they say—if the service is free, you are the product. Is Apple Intelligence “creepy spyware,” as Elon Musk declared? Or can Apple make good on its promise to serve up AI-based information without damaging your privacy?


Your Data Stays on Your Device…Mostly

You’ve heard a lot of bad things about Large Language Model generative AI systems. They steal content they don’t own and profit by spewing it back in slightly altered form. They hallucinate, providing information that’s totally contrary to fact. Everything you give them becomes input, even the prompts you type. You might think these AI systems are antithetical to truth and privacy.

One big privacy issue is that your queries and input get sent off to the cloud to be processed by an impenetrable machine-learning algorithm. Apple plans to remedy that concern by having its AI keep the processing local to your own device. That processing takes some serious CPU mojo. If you want to use Apple Intelligence, you’ll need the latest devices: an iPhone 15 Pro or Pro Max, or an iPad or Mac with an M1 chip or later. Your device must run iOS 18, iPadOS 18, or macOS Sequoia. Yes, those platforms aren’t even available until fall, so you have some time to consider unless you opt into next month's beta.

Oh, and that promise to keep the action local to your device? That’s only mostly true. If you want to “run more complex requests” (Apple’s words), your local AI brain will hand off processing to cloud servers running Apple’s new Private Cloud Compute. At first glance, this hardly seems better than querying ChatGPT, Google Gemini, or another AI chatbot whose processing resides in the cloud. What’s different?

What Is Private Cloud Compute?

Why is this server different from other servers? In a detailed blog post, Apple explains exactly what security problems exist when private data is processed in the cloud and exactly how its design process solves those problems.

Apple does have a reputation for building security and privacy into its devices. Macs aren’t immune to malware, but they don’t need antivirus protection as desperately as PCs do. With iOS, Apple built on lessons learned from macOS design to produce the most secure mobile device around. Almost all the hackers and security experts at the Black Hat conference use iPhones—they know. As for server hardware, we learned years ago that Apple locks down new servers by physically destroying the keys that would allow any modification. The first line of defense for Private Cloud Compute is the hardware—specialized servers using Apple silicon and a hardened operating system.


The blog post details how cloud computing fails at privacy and recounts how Private Cloud Compute avoids those pitfalls. For example, “stateless computation” means the data from your query is isolated from any other processing and isn’t retained after the server provides its answer. Your device’s connection to the server uses end-to-end encryption. Specialized access for debugging or other privileged purposes simply doesn’t exist—by design. And the whole system is architected so outside security experts can analyze and verify all security claims.

That doesn’t mean security experts will have to laboriously pry the system apart to evaluate it. Rather, with every release of the server-side software, Apple will make “software images of every production build of PCC publicly available for security research.” Any researcher who finds a flaw in the system could earn a reward by reporting it through the Apple Security Bounty program.

Is It Secure? Is It Private?

So, is Private Cloud Compute as pristine and perfect as Apple claims? It seems we'll get the chance to find out from third-party experts whether it makes the grade. Certainly, there’s no reason yet to go ballistic and ban Apple devices from workplaces. Even when you’ve upgraded to the necessary hardware and Apple has released Apple Intelligence into the wild, you don’t have to use it. If you don’t want to deal with Apple Intelligence, just don’t opt in.

