One news publication had an AI tool write articles. It didn’t go well

New York (CNN) — News outlet CNET said Wednesday it has issued corrections on a number of articles, including some that it described as “substantial,” after using an artificial intelligence-powered tool to help write dozens of stories.

The outlet has since hit pause on using the AI tool to generate stories, CNET’s editor-in-chief Connie Guglielmo said in an editorial on Wednesday.

The disclosure comes after CNET was publicly called out, first for quietly using AI to write articles and later for errors in those stories. While using AI to automate news stories is not new – the Associated Press began doing so nearly a decade ago – the issue has gained new attention amid the rise of ChatGPT, a viral AI chatbot tool that can quickly generate essays, stories and song lyrics in response to user prompts.

Guglielmo said CNET used an “internally designed AI engine,” not ChatGPT, to help write 77 published stories since November. She said this amounted to about 1% of the total content published on CNET during the same period, and was done as part of a “test” project for the CNET Money team “to help editors create a set of basic explainers around financial services topics.”

Some headlines from stories written using the AI tool include, “Does a Home Equity Loan Affect Private Mortgage Insurance?” and “How to Close A Bank Account.”

“Editors generated the outlines for the stories first, then expanded, added to and edited the AI drafts before publishing,” Guglielmo wrote. “After one of the AI-assisted stories was cited, rightly, for factual errors, the CNET Money editorial team did a full audit.”

The result of the audit, she said, was that CNET identified additional stories that required correction, “with a small number requiring substantial correction.” CNET also identified several other stories with “minor issues such as incomplete company names, transposed numbers, or language that our senior editors viewed as vague.”

One correction, which was added to the end of an article titled “What Is Compound Interest?” states that the story initially gave some wildly inaccurate personal finance advice. “An earlier version of this article suggested a saver would earn $10,300 after a year by depositing $10,000 into a savings account that earns 3% interest compounding annually. The article has been corrected to clarify that the saver would earn $300 on top of their $10,000 principal amount,” the correction states.
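For readers who want to sanity-check the math behind that correction, here is a minimal, illustrative Python sketch of annual compound interest using the figures CNET cited ($10,000 principal at 3% for one year). The function name and structure are purely illustrative and have nothing to do with CNET’s internal tool.

```python
# Minimal sketch of the compound-interest arithmetic behind CNET's correction.
# The figures ($10,000 principal, 3% annual rate, 1 year) come from the correction
# itself; the function below is illustrative only, not part of CNET's AI engine.

def interest_earned(principal: float, annual_rate: float, years: int) -> float:
    """Return the interest earned (not the total balance) with annual compounding."""
    balance = principal * (1 + annual_rate) ** years
    return balance - principal

print(f"${interest_earned(10_000, 0.03, 1):,.2f}")  # $300.00 earned, on top of the $10,000 principal
```

The error the AI made was reporting the ending balance ($10,300) as the amount “earned,” rather than subtracting the principal to get the $300 in interest.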

Another correction suggests the AI tool plagiarized. “We’ve replaced phrases that were not entirely original,” according to the correction added to an article on how to close a bank account.

Guglielmo did not state how many of the 77 published stories required corrections, nor did she break down how many required “substantial” fixes versus more “minor issues.” Guglielmo said the stories that have been corrected include an editors’ note explaining what was changed.

CNET did not immediately respond to CNN’s request for comment.

Despite the issues, Guglielmo left the door open to resuming use of the AI tool. “We’ve paused and will restart using the AI tool when we feel confident the tool and our editorial processes will prevent both human and AI errors,” she said.

Guglielmo also said that CNET has more clearly disclosed to readers which stories were compiled using the AI engine. The outlet took some heat from critics on social media for not making it overtly clear to its audience that a “By CNET Money Staff” byline meant the story was written using AI tools. The new byline is simply “By CNET Money.”
