But in conversations with Amazon executives after the launch event, two concerns lingered: Alexa 2.0 appears to be very much a work in progress. I watched it repeatedly get questions wrong. And can we trust it in the places we use smart speakers at home, like children’s bedrooms?
Amazon founder Jeff Bezos owns The Washington Post and interim CEO Patty Stonesifer sits on Amazon’s board. But I review all tech with the same critical eye.
Here’s what Amazon showed during its launch event: Say, “Alexa, let’s chat,” and an Echo smart speaker enters a special conversational mode. In this mode, Alexa acts like a speaking version of a chatbot, bantering back and forth about a wide range of topics. Amazon showed people asking it for advice about travel and up-to-date information, as well as writing, asking it to draft stories and emails. It showed that people can interrupt and redirect the AI mid-sentence.
“You can have near-human-like conversations with Alexa,” said Dave Limp, Amazon’s senior vice president of devices and services.
A version of that “let’s chat” function will be available to owners of Echo smart speakers sometime this year, the company said.
Unfortunately, Amazon didn’t make the new Alexa available for journalists to probe and chat with. Perhaps this is why: The live portion of Limp’s demonstrations of it didn’t quite live up to the “near-human” description.
When Limp was presenting onstage, the AI sometimes paused awkwardly before responding. Amazon says that in normal conditions, the new Alexa should be about five to 10 times faster than ChatGPT.
During a later interview I had with Limp, the AI repeatedly struggled to get the right answer to questions Limp posed live. One time, when Limp asked Alexa for advice on a museum to visit in Washington, D.C., the AI recommended a museum in another part of the country.
In response to a question I suggested about why there was smoke in San Francisco that day, Alexa said there was no “smoke,” then offered a non sequitur about the city’s high air quality index. (Smoke from fires in Oregon had blanketed the city for the past day.)
And when I suggested Limp ask whether I should get the new coronavirus vaccine, the new Alexa was unaware that one existed.
Limp acknowledged that his team had work to do but said Amazon had a large opportunity to bring generative AI technology to consumers’ homes. As of 2022, 71.6 million Americans used Alexa monthly, according to Insider Intelligence. While a majority of American adults have heard of ChatGPT, only 14 percent have actually tried using it, according to a survey conducted in March by the Pew Research Center.
“What we’ll launch this year behind ‘let’s chat’ will be fun to play with, but it’s not the endgame,” Limp said, because it only has access to a fraction of all the capabilities built into today’s Alexa. A more complete remake of Alexa would happen “through the end of next year,” Limp said.
“We’re going to have this remarkable assistant that’s going to be much more proactive,” he said. “It’s going to be unbelievably personalized because we’re going to have context on you.”
For example, he said people who use Alexa to control a smart home will see immediate benefits with the AI’s ability to understand their commands. You could just say “turn on the new light” and it would know what you mean.
But putting the capabilities of a chatbot into a smart speaker also raises another problem: trust. Millions of children use Echo smart speakers — and chatbots have a reputation for sometimes going off the rails, or “hallucinating.” Testing generative AI products this year, I’ve found them drawing on questionable sources and giving out bad advice on sex, drugs and even eating disorders.
“The reason I would trust it is we are not a start-up with a couple hundred people,” Limp said. Over the last nine years, he said, Amazon has learned a lot about how people interact with Alexa and how to keep the content appropriate. (For example, Limp said, try to swear to Alexa or ask an inappropriate question and it will just respond with a sound — it won’t even attempt to answer the question.)
That’s one reason Amazon moved slowly into this sort of product, Limp said. “We didn’t want to hallucinate in your home that much.”
Limp said Amazon has been working on guardrails for the new Alexa. For one, he said, Amazon would work to keep the new Alexa from disclosing personal information that it might have gleaned from training data, most of which comes from the web and doesn’t include private information from Amazon customers.
Amazon also announced a separate new conversational Alexa specifically designed to answer questions for children about topics like animals in a “safe” manner. That product, which is slated to debut before the holiday season, has been trained on a more limited set of sources and works to keep conversations on track. “We wanted to make sure we had a safe sandbox,” Limp said.
Still, it seems like any child could just say, “Alexa, let’s chat,” and launch the new general-purpose Alexa. Would Limp put it in the bedroom of a 10-year-old?
“I think so, from the experience that I’ve had,” he said. “We’re still making some mistakes where sometimes we’re not retrieving exactly the right answer — that’s going to get better over the next 30 and 60 days. But I don’t see a lot of examples where we’re getting to a hateful situation. But if we did, then we’ll certainly be cautious about it.”