Apple touts the $3,499 Vision Pro, arriving on Friday, as the next big thing after the smartphone. When you wear one, you see the world around you with computer-generated images and information superimposed on top. You might be intrigued or think the idea of a face computer is dumb. Regardless, you might want to know this device collects more data than any other personal device I’ve ever seen.
If this is our potential future, then I’ve got lots of questions. At launch, Apple has taken steps to restrict some of the data collected by the Vision Pro, including what people’s eyes are looking at. That’s a very good thing. But there are also new kinds of risks Apple doesn’t appear to have addressed, or might not be able to given how the tech works.
I see a privacy mess waiting to happen. Among the new dilemmas flagged to me by privacy researchers: Who gets to access the maps these devices build of our homes and data about how we move our bodies? A Vision Pro could reveal much more than you realize.
The last time a gadget raised these sorts of societal questions was in 2013 with Google Glass. It contained a small screen and just one camera that people worried might be used to covertly record them. Glass was so reviled that people wearing it were nicknamed Glassholes. Now we have to brace for, perhaps, the Vision Bros.
Most of my Vision Pro concerns are, at this point, speculative. But it matters to all of us if the technology Apple and others are inventing to replace smartphones could end up supercharging online problems like location tracking, the loss of anonymity and data brokers gathering the intimate details of our lives.
“Should we as a society really be going headfirst into virtual reality and augmented reality in our lives before we have strong privacy legislation?” says Cooper Quintin, senior public interest technologist at the Electronic Frontier Foundation. “Data brokers already have way too much intimate knowledge about everything I do. I don’t want them to have this level of knowledge.”
Adding to my concern is that Apple, which has staked its reputation on privacy, wouldn’t answer most of my questions about how the Vision Pro will tackle these problems. Nor has it, to date, allowed The Washington Post to independently test the hardware.
But from Apple’s limited statements, as well as conversations with developers making apps for the Vision Pro, I’ve been able to piece together a picture of its initial privacy strategy — and what it’s not talking about.
Homes and bodies, up for grabs?
I’m pretty sure Apple does not want to be known for creating the ultimate surveillance machine. But to make magical things happen inside its goggles, apps need loads of information about what’s happening to the user and around them. Apple has done more than rivals like Meta to limit access to some of this data, but developers are going to keep pressing for more.
“There’s a tension between having these types of experiences and your privacy,” says Jarrett Webb, technology director at design firm Argo, who has been exploring app development for the Vision Pro. “It has to get this data to get an understanding of the world to invoke these experiences.”
And once developers have data, it’s hard to ensure they don’t also use it for purposes that might feel like a violation.
On some issues, Apple has drawn a line in the sand — at least initially. To combat people being surreptitiously filmed with the Vision Pro, there’s an indicator on the device’s front screen when it’s shooting a photo or video. Apple also isn’t allowing third-party Vision Pro apps to access the camera to capture photos and videos. That would, in theory, also prevent third-party apps from doing creepy things like running facial recognition algorithms on people while you’re looking at them.
But privacy researchers tell me photographs alone aren’t the biggest concern here. We have, since the days of Google Glass, come to terms with the idea that a smartphone could be filming us at any time.
The new problem is what else the device is gathering: a map of the spaces around you. The device needs to know the contours of your surroundings so it knows where to place digital objects in your line of sight.
Understanding what’s in the room around you can be even more invasive than having a photograph of it, says Joseph Jerome, a visiting professor at the University of Tampa and the former policy lead on sensor data at Meta’s Reality Labs.
Vision Pro apps can access this data if a user grants permission — much as an iPhone app asks for your location. These worldview maps might look like just a wireframe mesh to a human, but to a computer they reveal a lot.
On a basic level, the Vision Pro might know it’s in a room with four walls, a 12-foot ceiling and a window — so far, so good, Jerome says. But then add in that you’ve got a 75-inch television, suggesting you might have more money to spend than someone with a 42-inch set. And since the device can understand objects, it could also detect whether you’ve got a crib or a wheelchair or even drug paraphernalia, he says.
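To make that concrete, here is a rough sketch of how a visionOS app asks for this room data, using Apple’s published ARKit interfaces (ARKitSession and SceneReconstructionProvider). What an app does with the mesh after permission is granted is up to the developer.

```swift
import ARKit

// A minimal sketch, using Apple's published visionOS ARKit API:
// the app asks for "world sensing" permission, then receives a
// running stream of mesh anchors describing the user's room.
let session = ARKitSession()
let sceneReconstruction = SceneReconstructionProvider()

func startWorldSensing() async throws {
    // The system shows a permission prompt, much like an iPhone's
    // location dialog.
    let auth = await session.requestAuthorization(for: [.worldSensing])
    guard auth[.worldSensing] == .allowed else { return }

    try await session.run([sceneReconstruction])

    // Each update carries the geometry of part of the user's surroundings.
    for await update in sceneReconstruction.anchorUpdates {
        let mesh: MeshAnchor = update.anchor
        // Nothing in the API itself stops an app from analyzing or
        // uploading this geometry once permission is granted.
        print("Room mesh updated: \(mesh.geometry.vertices.count) vertices")
    }
}
```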
Advertisers and data brokers who build profiles of consumers would salivate at the chance to get this data. Governments, too.
Think of it as an extension of the kinds of issues we know can come from someone tracking your location. A phone alone, Jerome says, might be able to report that you’re generally near a hospital or a strip club. “These devices know where you are down to the centimeter, and then they’re combining it with a bunch of other sensors to know exactly what you’re looking at at the same time,” he says.
Apple didn’t answer my questions about what visibility it has into what apps do with this data, or how it plans to vet them. On a website for Vision Pro developers, Apple warns, “It’s your responsibility to protect any data your app collects, and to use it in responsible and privacy-preserving ways.” So users just have to trust them?
Other privacy researchers say an even bigger risk is that devices like the Vision Pro expose a stream of data about the one thing we can’t change: our bodies.
Information about how you’re moving and what you’re looking at “can give significant insights not only to the person’s unique identification, but also their emotions, their characteristics, their behaviors and their desires in a way that we have not been able to before,” says Jameson Spivak, a senior policy analyst at the Future of Privacy Forum.
Apple has addressed the privacy around one extra-sensitive organ: your eyeballs. The Vision Pro tracks your eyes so you can select things with your gaze like you might move a mouse on a computer. But Apple says it doesn’t share where users look with apps, websites or even itself. Instead, the device only reports what you’ve selected with your gaze after you tap your fingers together, the Vision Pro equivalent of a mouse click.
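Apple’s own developer tools reflect this design. In the sketch below, which illustrates the model rather than Apple’s internal implementation, the system draws the gaze highlight outside the app’s process; the app’s code runs only on the finger-tap “click.”

```swift
import SwiftUI

// A minimal illustration of the model Apple describes: the hover
// highlight that follows the user's gaze is rendered by the system,
// outside the app's process, so the app never sees where eyes wander.
struct GazePrivacyDemo: View {
    var body: some View {
        Button("Open") {
            // The first moment the app learns anything: a discrete
            // selection event (the finger tap), not a gaze trail.
            print("User selected the button")
        }
        .hoverEffect(.highlight) // Drawn by the system, not the app.
    }
}
```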
This is a solid place to start. But what about the rest of the body? Developers tell me apps can get access to a stream of data about users’ movement, right down to the wiggle of a finger.
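Based on Apple’s published hand-tracking interfaces (HandTrackingProvider and HandAnchor), here is a rough sketch of what that stream looks like. The specifics are illustrative, not any particular app’s code.

```swift
import ARKit

// A sketch of the movement stream developers describe: with permission,
// visionOS delivers continuous hand-pose updates, down to single joints.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func streamHandMotion() async throws {
    let auth = await session.requestAuthorization(for: [.handTracking])
    guard auth[.handTracking] == .allowed else { return }

    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let hand: HandAnchor = update.anchor
        // The index fingertip alone updates many times per second,
        // producing exactly the kind of fine-grained motion trace
        // this column is worried about.
        if let tip = hand.handSkeleton?.joint(.indexFingerTip) {
            print("\(hand.chirality) index tip: \(tip.anchorFromJointTransform)")
        }
    }
}
```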
Researchers at the University of California at Berkeley blew my mind when they explained just how revealing data about how your body moves while dancing could be.
Last year, they discovered they could uniquely and consistently identify about 55,000 different VR users based solely on data about the movement of their heads and hands. It’s as useful as a fingerprint, maybe more so.
And in another study, they used head and hand motion from a game to guess some 40 different personal attributes of people, ranging from age and gender to substance use and disability status.
What’s to stop Vision Pro apps from doing the same? “In cases where that motion data is being streamed to the cloud … even Apple has very little visibility into what is happening to it after it leaves the device,” said one of the researchers, Vivek Nair. “Because this data can’t be entirely eliminated from most applications, our suggestion would be to develop a privacy-preserving tool for VR motion data.”
I asked Apple what it was doing to protect this kind of data. Its response: Crickets.
Mixed-reality devices are “very exciting with huge potential,” says Berkeley computer science Professor James O’Brien. “But I also think that privacy considerations need to be primary design criteria, not afterthought.”