Your face and images helped build ChatGPT and Lensa. Is that fair?




This article is a preview of The Tech Friend newsletter. Sign up here to get it in your inbox every Tuesday and Friday.

Sure, that drunk selfie you posted on Instagram might be personally embarrassing. Now imagine that selfie is also training fuel for an artificial intelligence system that helps put an innocent person in jail.

Welcome to the age of artificial intelligence. What you do with your face, your home security videos, your words and the photos from your friend’s art show are not just about you. Almost entirely without your true consent, information that you post online or that is posted about you is being used to coach AI software. These technologies could let a stranger identify you on sight or generate custom art at your command.

Good or bad, these AI systems are being built with pieces of you. What are the rules of the road now that you’re breathing life into AI and can’t imagine the outcomes?

I’m bringing this up because a bunch of people have been trying cool AI technologies that are built on all the information we’ve put out into the world.

My colleague Tatum Hunter spent time evaluating Lensa, an app that transforms a handful of selfies you provide into artistic portraits. And people have been using the new chatbot ChatGPT to generate silly poems or professional emails that seem like they were written by a human. These AI technologies could be profoundly helpful but they also come with a bunch of thorny ethical issues.

Tatum reported that Lensa’s portrait wizardry comes from the styles of artists whose work was included in a giant database for coaching image-generating computers. The artists didn’t give permission for their work to be used this way, and they aren’t being paid. In other words, your fun portraits are built on work ripped off from artists. ChatGPT learned to mimic humans by analyzing your recipes, social media posts, product reviews and other text from everyone on the internet.

Beyond those two technologies, your birthday party photos on Facebook helped train Clearview AI facial recognition software that police departments are using in criminal investigations.

Being part of the collective building of all these AI systems might feel unfair to you, or amazing. But it is happening.

I asked a few AI experts to help sketch out guidelines for the new reality that anything you post might be AI data fuel. Technology has outraced our ethics and laws. And it’s not fair to put you in the position of imagining whether your Pinterest board might someday be used to teach murderous AI robots or put your sister out of a job.

“While it’s absolutely a good individual practice to limit digital sharing in any case where you don’t or can’t know the afterlife of your data, doing that is not going to have a major impact on corporate and government misuse of data,” said Emily Tucker, executive director at the Center on Privacy and Technology at Georgetown Law. Tucker said that people need to organize to demand privacy regulations and other restrictions that would stop our data from being hoarded and used in ways we can’t imagine.

“We have almost no statutory privacy protections in this country, and powerful institutions have been exploiting that for so long that we have begun to act as if it’s normal,” Tucker said. “It’s not normal, and it’s not right.”

Mat Dryhurst and Holly Herndon, artists in Berlin, helped set up a project that lets people search popular AI training databases for their artwork or personal photos. Dryhurst told me that some AI organizations, including LAION, which maintains the massive image collection behind Lensa’s portraits, are eager for people to flag their personal images if they want to yank them from computer training data sets. (The website is Have I Been Trained.)

Dryhurst said that he is excited about the potential of AI for artists like him. But he also has been pushing for a different model of permission for what you put online. Imagine, he said, if you upload your selfie to Instagram and have the option to say yes or no to the photo being used for future AI training.

Maybe that sounds like a utopian fantasy. You have gotten used to the feeling that once you put digital bits of yourself or your loved ones online, you lose control of what happens next. Dryhurst told me that with publicly available AI systems such as DALL-E and ChatGPT getting lots of attention but still imperfect, this is an ideal time to reestablish what real personal consent should look like in the AI age. And he said that some influential AI organizations are open to this, too.

Hany Farid, a computer science professor at the University of California at Berkeley, told me that individuals, government officials, many technology executives, journalists and educators like him are far more attuned than they were a few years ago to the potential positive and negative consequences of emerging technologies like AI. The hard part, he said, is knowing what to do to effectively limit the harms and maximize the benefits.

“We’ve exposed the problems,” Farid said. “We don’t know how to fix them.”

For more, watch Tatum discuss the ethical implications of Lensa’s AI portrait images:

Your iPhone automatically saves copies of many things on your phone, including your photos and your gossipy iMessage group chats, to Apple’s cloud. Apple said this week that it will start giving iPhone owners the option of fully encrypting those iCloud backups so that no one else, including Apple, can access your information.

Encryption technology is controversial because it hides the information of good guys and bad guys alike. End-to-end encryption stops crooks from snooping on your video calls or stealing your medical records saved in a cloud. But the technology can also shield the activity of terrorists, child abusers and other criminals.

Starting later this year, Apple will let you decide for yourself whether to encrypt the backups saved from your iPhone. If you’re privacy conscious and willing to run beta software, you can turn on this feature now.

First you need to sign up for the Apple Beta Software Program, which gives you access to test versions of the company’s next operating systems while Apple is still tinkering with them. After you sign up, you must download and install the test software on all your Apple devices. You will then have the option to turn on fully encrypted iCloud backups.

One downside: You might encounter hiccups using test operating software that isn’t ready for release to every iPhone or Mac.

Also, read advice from Heather Kelly about how to keep your texts as private as possible.

Brag about YOUR one tiny win! Tell us about an app, gadget, or tech trick that made your day a little better. We might feature your advice in a future edition of The Tech Friend.


