
California Builds the Future, for Good and Bad. What’s Next?



While the task force hasn’t set an exact figure on how descendants of enslaved people might be compensated for overpolicing, mass incarceration and housing discrimination, the economists who advise it estimate that the losses suffered by the state’s Black residents could amount to hundreds of billions of dollars. Whether compensation will actually be approved is yet to be determined.

The reparations conversation shows that California has a unique ability to reckon with its troubled history. But that thinking doesn’t always extend to the future. Artificial-intelligence systems are being used to moderate content on social media, evaluate college applications, comb through employment résumés, generate fake photos and artworks, interpret movement data collected from the border zone and identify suspects in criminal investigations. Language models like ChatGPT, made by the San Francisco-based company OpenAI, have also attracted a lot of attention for their potential to disrupt fields like design, law and education.

But if the success of A.I. can be measured in billion-dollar valuations and lucrative I.P.O.s, its failures are borne by ordinary people. A.I. systems aren’t neutral; they are trained on large data sets that include, for example, sexually exploitative material or discriminatory policing data. As a result, they reproduce and magnify our society’s worst biases. For example, facial-recognition software used in police investigations routinely misidentifies Black and brown people. A.I.-based mortgage lenders are more likely to deny home loans to people of color, helping to perpetuate housing inequities.

This would seem to be a moment when we can apply historical thinking to the question of technology, so that we can prevent a repeat of the injustices that followed previous paradigm-altering changes. In April, two legislators introduced a bill in the State Assembly that would prohibit algorithmic bias. The Writers Guild of America, which is currently on strike, has included limits on the use of A.I. in its demands. Resistance to excess also comes from inside the tech industry. Three years ago, Timnit Gebru, a co-lead of the Ethical A.I. team at Google, was fired after she sounded the alarm about the dangers of language models like GPT-3. But now even tech executives have grown wary: In his testimony before the Senate, Sam Altman, the chief executive of OpenAI, conceded that A.I. systems need to be regulated.

The question we face with both reparations and A.I. is in the end not that different from the one that arose when a Franciscan friar set off on the Camino Real in 1769. It’s not so much “What will the future look like?” — although that’s an exciting question — but “Who will have a right to the future? Who might be served by social repair or new technology, and who might be harmed?” The answer might well be decided in California.


Laila Lalami is the author of four novels, including “The Other Americans.” Her most recent book is a work of nonfiction, “Conditional Citizens.” She lives in Los Angeles. Benjamin Marra is an illustrator, a cartoonist and an art director. His illustrations for Numero Group’s “Wayfaring Strangers: Acid Nightmares” were Grammy-nominated.
