
How AI is already shaping young people’s future, careers and goals


The Washington Post spoke to young people about how artificial intelligence is shaping their future

Photo illustrations by Sam Cannon for The Washington Post.

A year into the artificial intelligence boom that began with ChatGPT, there’s no shortage of bold predictions about how the technology will change our lives. The way we interact with computers will be upended. Entire movies will be made by single people. Millions of jobs could be eliminated. Robots will soon rule the world.

Regardless of how AI develops, young people will feel the most impact. This GPT generation is being bombarded with declarations about how AI will shape their lives from education to entertainment, but their voices are often missing from the conversation. The Washington Post spoke to some people in this generation to ask how they see AI shaping their future.

She gave up Wall Street wealth for the AI dream.

Sarah Chieng, 22, had it all figured out. A senior at MIT, she’d move to New York after graduating, live near friends and enjoy the trappings of a nearly $400,000-a-year quantitative finance job.

But one thing kept nagging her: She wasn’t going to change the world with that finance job. Chieng said she knew what kind of work would: artificial intelligence.

After graduation, she wrote a long note to her parents telling them her new plan. She’d be turning down the lucrative job offer in New York in hopes of finding an AI job.

She was nomadic for a while, bouncing from Austin to New York before landing in San Francisco, where she got a short-term contract to work at a company building an AI-powered search engine.

Though she’s making far less in her current full-time role at the company than she would’ve in her finance job, Chieng says she feels she’s doing the right thing. Since search engines are the way people get information, she finds power in her work. “I don’t regret it,” she said. “I was giving myself a chance to do something great — like, not great — but something actually impactful.”

She thinks AI art has no heart.

This year, conversations about AI began taking over Hannah Minnix’s classes. A 21-year-old art student at Virginia Commonwealth University in Richmond, Minnix hadn’t thought much about the tech before, but the advent of generative AI tools like OpenAI’s DALL-E, which can create images from simple prompts, suddenly made it relevant to her and her classmates.

“A lot of people are using it as a tool and I have a lot of classmates who hate the idea of it,” Minnix said.

To Minnix, who says the main goal of her art is to tell a story, using AI robs artists of their own potential. AI art has inspired her to use new colors, but overall she finds most of the images messy and less interesting.

“I don’t hate AI, I don’t think it’s destroying the art world,” she said. “A lot of people use it as a tool to create more art but I think it diminishes the emotional depth that they could be having.”

AI is changing the law. He’s studying it anyway.

Craig Dawdy, 24, was in his senior year of college this past winter when generative AI swept his campus in Hamilton, Ontario. A member of student government, Dawdy worried school administrators might unfairly penalize students amid a panic over cheating. He’d read articles about people being accused of — even reprimanded for — using AI without solid evidence.

“Going through anything like that is a traumatic experience,” he said. “This grand institution is accusing you of all these things. It can be horrible.”

In the end, the school’s policies “ended somewhere in the middle,” between a complete ban and an embrace.

Now, Dawdy is a first-year law student at Dalhousie University in Halifax, Nova Scotia, and AI is still a big topic on campus. During orientation week, a school presenter brought up an infamous case of a lawyer using ChatGPT to write a legal brief that was littered with made-up cases.

He’s largely calm about the idea that AI systems might soon replace many positions in the legal field. Dawdy has used AI to write emails and says he feels the tech doesn’t capture nuances very well, something that’s key for legal arguments.

“The real truth is lawyers and law students … think the best work will always be produced by themselves,” he said. “No one wants to rely on a machine.”

He wants climate-friendly AI.

Sathvik Redrouthu is spending his high school days trying to make artificial intelligence climate friendly.

Redrouthu, a senior at the Thomas Jefferson High School for Science and Technology in Virginia, said the AI boom is fueled by high-powered computing chips, and those chips consume large amounts of electricity.

That inspired the 18-year-old and a few of his classmates to create computer chips powered by light, which are more energy-efficient. They have discussed their prototype with several prominent AI leaders, such as OpenAI’s Sam Altman, Redrouthu said.

Redrouthu and his friends have received $50,000 in funding from a Peter Thiel-backed venture-capital fund.

Redrouthu said he has applied to college but is focused on his business. He knows that any time not spent working on his company could spell doom for his start-up.

“A lot of the problems with AI, they’re so important that they have a lot of people that are coming to try and solve them,” he said. “So if we spent too long doing something else, then we’ll have a competitor that goes and solves problems before us.”

AI called her a cheater. She wasn’t.

Jessica Zimny, 20, isn’t too skilled in artificial intelligence. But her political science professor didn’t agree.

A student at Midwestern State University in Wichita Falls, Tex., Zimny turned in a short, 302-word essay for a summer class. When software flagged it as 67 percent AI-written, her professor promptly gave her a zero.

Zimny implored her professor, the head of the school’s political science department and a university dean to reconsider her grade — to no avail.

To ensure she has proof she didn’t cheat, Zimny says she now takes videos of her screen as she does her homework. She isn’t excited to be a student in the era of ChatGPT.

“I’m an artist,” she said. “I don’t like the idea of people thinking that my work is copied, or that I don’t do my own things originally.”

AI has become a family pastime.

Hassan El Mghari had no experience building artificial intelligence models. But the 25-year-old software engineer knew how to get computer programs to talk to each other.

With a few days of work, El Mghari was able to build RoomGPT, a tool that allowed anyone to upload a photo of a room and experiment with different styles of interior design. Within a week, he had 500,000 users who were amazed at what the AI could do.

He built nine more AI side projects, all open source, that got him 8.5 million unique visitors. In October, El Mghari started working on generative AI full time.

His father, whose background is in accounting and finance, has also caught the generative AI bug. Since learning to code a couple of years ago, he has been partnering with El Mghari to build tools to help El Mghari’s mom, a researcher who studies special education. Her process is typically very manual, El Mghari said: She interviews teachers, goes through the transcripts, pulls out significant excerpts and identifies themes — a process that can be automated most of the time with AI.

“My mom found it pretty useful, as a starting point at least,” he said. He and his dad plan to fine-tune a couple of models and potentially sell the tools to other researchers.

He thinks AI is for everyone, not just math geeks.

Rafael Perez, 18, felt like an outsider in his computer science class at Lowell High School, a predominantly White and Asian magnet school in San Francisco, because he didn’t start coding until he was a sophomore. So Perez, whose parents are from Nicaragua, signed up for classes in Python, JavaScript and game design at Mission Bit, a local nonprofit that works to get young people of color into STEM, where he later joined the student advisory board.

For Perez, discovering ChatGPT as a high school senior was a gateway to AI. At the University of California at Irvine, he was intrigued by informatics, a major that looks at the intersection between AI and the world, like deploying the technology in health care or education.

Without ChatGPT, “I probably would have stuck to computer science engineering,” said Perez.

The heavy math in his engineering classes could be alienating, leaving him feeling ill-suited for the field despite his natural ability for developing games and apps. ChatGPT, on the other hand, made Perez feel certain that AI would eventually touch everyone’s lives in a way that is more welcoming than code. “It’s branded for all of us,” he said.

He wants to use generative AI to help cure cancer.

Arnav Shah wanted to be a professional soccer player until he was diagnosed with a rare form of blood cancer at 10 and told to avoid physical activity. He decided to spend his time focusing on his career and learning computer programming. Shah taught himself math and machine learning by studying textbooks for classes at Stanford, MIT and the University of Toronto. To balance out the theory, he used his software knowledge to try to implement machine learning algorithms from research papers.

By the time he was 15, he felt he’d reached the limit of independent study. So he got in touch with Altman, the chief executive of OpenAI.

If Altman gave him a chance, Shah told the tech exec, he would do everything in his power to become the best researcher Altman had ever seen. Altman replied, introducing him to OpenAI research scientist Joshua Achiam, who mentored Shah for a year, as the two worked on a battery model to help with climate change. Shah moved on to work on AI at another start-up, HelixNano, which is developing mRNA vaccines to treat cancer.

Shah met Achiam in person for the first time at a party after OpenAI’s DevDay. “I don’t know if you can infer from my personality and interests and stuff, I’ve never been to a party before,” Shah said dryly.

Earlier, at DevDay, Shah said he hoped all the top engineering talent in the audience would consider more meaningful uses for generative AI, like applying it to other fields. “People looked at the amazing achievement that is GPT-4 and they said, ‘How can I apply this technology to solve some consumer-facing problems so I can make money out of it?’” Shah said.

He thinks AI is inspiring, but dangerous.

As a high school student, Okezue Bell, 17, was fascinated with machine learning but worried about the potential negative impact of the technology he wanted to build. When a student-detection camera malfunctioned and prevented him from taking a test, Bell began to devote more time to working on responsible AI.

Fidutam, the microfinancing nonprofit he started while in high school, is now focused on getting diverse feedback on AI public policy from more than 1,500 members in about 50 countries, he said.

When ChatGPT came out, Bell decided to incorporate it into regular online workshops he hosted in places like Nigeria and Zambia, where students were studying for their national exams, he said. Using tests from the online education start-up Udemy, the students appeared to improve in math and science, said Bell. In workshops in the United States, he used ChatGPT to help students learn about menstrual health and climate change, topics that weren’t necessarily available to them, he said.

With the frenzy around generative AI, Bell hasn’t hosted a workshop in a while, but hopes to again soon. “One of the biggest beauties about it was just seeing how magical it seemed to the students — that they could ask a question over and over again repeatedly in different ways and get explanations that suited their process of thinking,” he said.

She’s leading a new generation of tech watchdogs.

Dinner table conversation at Sneha Revanur’s house was lively growing up. Her family lived in San Jose, the biggest city in Silicon Valley. Both her parents worked in tech and her sister, who studied computer science and its use for social good, now works as head of AI at a start-up developing AI-powered tools to search and summarize documents. When Revanur grew disillusioned with the idea of working in tech, instead deciding to fight algorithmic bias, it took everyone by surprise.

“We’ve got so many jokes about how she’s building the AI and I’m bringing out the red tape,” said Revanur, who founded the youth-led nonprofit Encode Justice in her sophomore year in high school.

At first, Revanur’s parents worried that her interest in activism around the risks of technology would leave her “branded as a Luddite,” Revanur said.

Instead, skepticism toward Encode Justice helped Revanur prepare for a whirlwind of AI policy debates on Capitol Hill, where she worked as a summer intern for the Center for AI and Digital Policy and represented Encode Justice in high-profile briefings with the likes of Vice President Harris.

Revanur’s first campaign helped oppose California’s Prop 25, which sought to replace cash bail with a risk assessment algorithm that has been shown to discriminate against Black people. Some were dubious of the campaign, viewing the software as an efficient tool that could improve on biased human judges.

When Revanur speaks on panels about AI policy, experts still argue that technology’s increased efficiency is better than the status quo, regardless of the risks.

“I have talked to so many people who think what I’m doing is all great. But at the end of the day, it’s slowing down innovation that is going to dramatically improve the human condition,” she said. “It’s a lot of the same stuff that I heard when I was first starting out.”

Revanur has now seen firsthand how having more young people in the room has helped lawmakers view issues like digital surveillance in schools as a concrete problem.

“Even if the people who are building it today don’t necessarily experience its impact, it’s our generation that’s going to bear the brunt,” she said.

Editing by Monique Woo, Alexis Fitts, Yun-Hee Kim and Karly Domb Sadof.
