
Apple Says Its AI Sets a ‘New Standard’ for Privacy and Invites Security Experts to Test It

When it comes to using generative AI, can you trust Apple? 

That was a question the company went to great lengths to answer with its rollout of “Apple Intelligence,” the catchphrase for all of the gen AI functionality it’s promised to bring to iPhone, iPad and Mac users in the next versions of its operating system software this fall.  


Apple CEO Tim Cook and his team, speaking during the keynote address at the company’s annual developer conference on Monday, described Apple Intelligence as a “personal intelligence system” that understands the context of all your personal data so that it can deliver “intelligence that’s incredibly useful and relevant” and thus make your “devices even more useful and delightful.”


To make the meaningful connections required “to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks,” Apple needs you to let it mine and process all the data stored in the software and services you use across its devices. That includes texts, messages, documents, emails, photos, audio files, videos, images, contacts, calendars, search history and Siri conversations.

Then, using its gen AI large language models and custom chips to crunch that information, Apple says it will be able to help you write emails and texts; transcribe and summarize messages; edit your grammar; check messages, emails and calendars for upcoming events; clean up photos; create a memory movie; and get better search results using Siri and the Safari browser.

Genmoji examples that Apple showed off during its keynote. (Apple/Screenshot by CNET)

You’ll also be able to create and share original Genmojis, gen AI-enabled emojis generated from a natural language description you provide (example: smiley face relaxing, wearing cucumbers) or based on photos of your friends and family.

All of that requires you to trust Apple to keep your data private and secure, which is why the company said in its keynote, in its general press release, in a privacy press release and in a post on its security site that it has created a “new standard for privacy in AI.”

Analysts are willing, so far, to give Apple the benefit of the doubt, with one security researcher also countering Elon Musk’s claims, made Monday on his social media site X, that the OpenAI deal may undermine Apple users’ security.

“Apple has made it clear they intend to keep data private both on device and in the cloud,” said Carolina Milanesi, a longtime Apple analyst who is founder of the consultancy The Heart of Tech. “It is clear that they are being very transparent about their technology and they are controlling the end-to-end experience. Most consumers trust Apple and because of the return they will see with Apple Intelligence they will not think about it twice.”

AI privacy is all about trust

Apple Intelligence was at the heart of everything Apple showed off at WWDC. (Apple/Amy Kim/CNET)

To be sure, Apple’s not the only AI company asking you to trust it with all your data. Google, Microsoft, Meta and others aim to offer you new ways of doing things that they say are only possible with gen AI, which likewise will need their LLMs and gen AI chatbots to ingest and digest your data so they can AI-ify it. And they also say they will protect your privacy and not share personally identifiable information with anyone.

But what gives IDC analyst Francisco Jeronimo a bit more confidence in Apple’s approach is that the company’s brand and business model are based on delivering user privacy. Unlike Google and Meta, which make most of their money by delivering lucrative personalized ads to users based on knowing something about their personal preferences (again, they say user data is anonymized and never shared), Apple makes its money from hardware, like the iPhone, and from services including the App Store, iTunes and Apple TV. 

“We all know that Apple doesn’t make money from selling our data, unlike other players. It’s one of their ways to differentiate themselves from their competitors,” Jeronimo said in an interview. “If we can’t trust Apple with all the data, then who can we trust?”

Only using the data needed

Apple Intelligence needs your data, and Apple has a unique plan for protecting it. (Apple/Screenshot by CNET)

Apple’s new standard for AI security is about making sure your data is protected, whether the ingesting, digesting and manipulation of all that data happens on your personal device (also known as on-device or local processing) or whether a complex AI task needs to be handed off to more powerful servers in the cloud running custom Apple chips.

Apple’s promise, as part of its new Private Cloud Compute standard, is that, just as with on-device processing, the company “uses your data only to fulfill your request, and never stores it, making sure it’s never accessible to anyone, including Apple.”
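
Apple hasn’t published the logic that decides which requests stay on the device and which get handed off, but the shape of the design it describes is easy to sketch. Below is a minimal, purely illustrative Swift sketch; every type and function name here is invented for the example, and the routing heuristic is a stand-in:

```swift
import Foundation

// Illustrative types only -- none of this is a real Apple API.
struct AIRequest {
    let prompt: String
    let personalContext: [String]  // e.g. snippets from mail, messages, calendar
}

struct AIResponse {
    let text: String
}

// Stand-ins for the two processing paths. In reality these would invoke the
// on-device model or a Private Cloud Compute node; here they just echo.
func runOnDevice(_ request: AIRequest) -> AIResponse {
    AIResponse(text: "on-device answer to: \(request.prompt)")
}

func runInPrivateCloud(_ request: AIRequest) -> AIResponse {
    // Per Apple's stated promise, a PCC node processes the request entirely
    // in memory and persists nothing: no logs, no database, no user profile.
    AIResponse(text: "private-cloud answer to: \(request.prompt)")
}

// Invented heuristic: Apple hasn't said how it decides which requests stay
// local, only that complex tasks can be handed off to the private cloud.
func handle(_ request: AIRequest) -> AIResponse {
    request.personalContext.count <= 10
        ? runOnDevice(request)        // data never leaves the device
        : runInPrivateCloud(request)  // stateless remote processing
}
```

The point of the design, as Apple describes it, is that the second path should be indistinguishable from the first as far as your data’s fate is concerned: processed, answered, gone.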

In a press briefing with reporters after the WWDC keynote, software chief Craig Federighi discussed Apple’s private-cloud approach and the amount of personal information needed to provide context-based intelligence. 


“Cloud computing typically comes with some real compromises when it comes to privacy assurances because if you’re going to be making requests to the cloud, well, the cloud traditionally could receive that request, and any data included in it, and go right into the log file, save it to a database, maybe put it in a profile about you,” he said, noting that you’re “putting a lot of faith” in companies to protect your information.

“As we move forward with AI, and you rely more and more on more personal kinds of requests,” Federighi added, “it’s essential that you can know that … not anyone else would have access to any of the information used to process requests.”

IDC’s Jeronimo also applauds the company for inviting independent security researchers and cryptographers to inspect the code that runs on Private Cloud Compute servers to assess if it works the way Apple claims. 

“Security researchers need to be able to verify, with a high degree of confidence, that our privacy and security guarantees for Private Cloud Compute match our public promises,” the company said in a security blog post on Monday.

“Hypothetically, then, if security researchers had sufficient access to the system, they would be able to verify the guarantees. But this last requirement, verifiable transparency, goes one step further and does away with the hypothetical: security researchers must be able to verify the security and privacy guarantees of Private Cloud Compute, and they must be able to verify that the software that’s running in the PCC production environment is the same as the software they inspected when verifying the guarantees.”
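
Stripped to its essence, that last requirement reduces trust to a comparison of cryptographic measurements: the software a server attests to running must match, bit for bit, a release that researchers were able to audit. Here’s a toy Swift sketch of that check; all the names are invented for illustration, CryptoKit supplies the SHA-256 digest, and Apple’s actual attestation pipeline is far more involved:

```swift
import Foundation
import CryptoKit

/// A "measurement" here is just the SHA-256 digest of a software image,
/// rendered as lowercase hex.
func measurement(of image: Data) -> String {
    SHA256.hash(data: image).map { String(format: "%02x", $0) }.joined()
}

/// A device would agree to send data to a PCC node only if the measurement
/// the node attests to matches an image researchers were able to inspect.
func shouldTrust(attested: String, inspectedImages: [Data]) -> Bool {
    inspectedImages.contains { measurement(of: $0) == attested }
}

// Usage: the node's attested measurement matches the audited release, so the
// device proceeds; any unreviewed build would fail the check.
let inspected = Data("pcc-os-image-v1".utf8)  // stand-in for a real OS image
let attested = measurement(of: inspected)     // what the node reports
print(shouldTrust(attested: attested, inspectedImages: [inspected]))  // true
```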

‘The hardest problem in computer security’

Matthew Green, an associate professor of computer science who teaches cryptography at Johns Hopkins University, said in a thread on X that he appreciates Apple’s approach but still has questions. They include whether users can opt out of having their requests processed in the “private cloud.” Apple, he says, hasn’t yet detailed its plans.

“Building trustworthy computers is literally the hardest problem in computer security. Honestly, it’s almost the only problem in computer security,” Green wrote after reading through the Private Cloud Compute blog post. “But while it remains a challenging problem, we’ve made a lot of advances. Apple is using almost all of them.”  

We’ll have to wait and see how this unfolds. Apple Intelligence will be available in beta as part of iOS 18, iPadOS 18 and MacOS Sequoia this fall in the US, the company said.

But even if your eyes glaze over when reading about AI, privacy and security, on-device processing and cloud computing, it’s worth knowing something about it all. AI-enabled devices will be the fastest-growing segment for smartphones and PCs, according to IDC. The market researcher believes that AI smartphones will reach 170 million units in 2024 and that AI PCs will account for nearly 60% of all PCs sold by 2027.

AI will be an inescapable part of our next-generation devices — and our daily lives.

Editors’ note: CNET used an AI engine to help create several dozen stories, which are labeled accordingly. The note you’re reading is attached to articles that deal substantively with the topic of AI but are created entirely by our expert editors and writers. For more, see our AI policy.


