Bloomberg's AI-Generated News Summaries Had At Least 36 Errors Since January (nytimes.com)
- Reference: 0176866593
- News link: https://news.slashdot.org/story/25/03/30/1946224/bloombergs-ai-generated-news-summaries-had-at-least-36-errors-since-january
- Source link: https://www.nytimes.com/2025/03/29/business/media/bloomberg-ai-summaries.html
Bloomberg announced on January 15 that it would add three AI-generated bullet points to the top of articles as a summary. Since then, "The news outlet has had to correct at least three dozen A.I.-generated summaries of articles published this year." (This Wednesday it published a "hallucinated" date for the start of U.S. auto tariffs, and earlier in March it claimed President Trump had imposed tariffs on Canada in 2024, while other errors have included incorrect figures and incorrect attribution.)
> Bloomberg is not alone in trying A.I. — many news outlets are figuring out how best to embrace the new technology and use it in their reporting and editing. The newspaper chain Gannett [2]uses similar A.I.-generated summaries on its articles, and The Washington Post has a [3]tool called "Ask the Post" that generates answers to questions from published Post articles. And problems have popped up elsewhere. Earlier this month, The Los Angeles Times removed its A.I. tool from an opinion article after the technology described the Ku Klux Klan as something other than a racist organization.
>
> Bloomberg News said in a statement that it publishes thousands of articles each day, and "currently 99 percent of A.I. summaries meet our editorial standards...." The A.I. summaries are "meant to complement our journalism, not replace it," the statement added....
>
> John Micklethwait, Bloomberg's editor in chief, laid out the thinking about the A.I. summaries in [4]a January 10 essay, which was an excerpt from a lecture he had given at City St. George's, University of London. "Customers like it — they can quickly see what any story is about. Journalists are more suspicious," he wrote. "Reporters worry that people will just read the summary rather than their story." But, he acknowledged, "an A.I. summary is only as good as the story it is based on. And getting the stories is where the humans still matter."
A Bloomberg spokeswoman told the Times that the feedback it had received on the summaries had generally been positive — "and we continue to refine the experience."
[1] https://www.nytimes.com/2025/03/29/business/media/bloomberg-ai-summaries.html
[2] https://www.theverge.com/2024/5/16/24158531/gannett-ai-generated-overviews-usa-today-memo
[3] https://www.washingtonpost.com/ask-the-post-ai/
[4] https://www.bloomberg.com/news/articles/2025-01-10/8-ways-ai-will-transform-journalism
Only 36? (Score:3)
The real question: did the summary AI make only 36 errors, or did only 36 errors get published? The AI could be making a lot more errors, with a human editor accepting or rejecting each generated summary and incorrectly accepting 36 that contained errors.
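The point can be made concrete with a little arithmetic: the 36 published corrections only bound the *product* of the AI's error rate and the editors' miss rate, not the AI's error rate itself. A minimal sketch, with all numbers purely hypothetical (none are Bloomberg figures):

```python
def published_errors(summaries: int, ai_error_rate: float, editor_miss_rate: float) -> float:
    """Expected number of erroneous summaries that slip past human review."""
    return summaries * ai_error_rate * editor_miss_rate

# Two very different AI error rates can yield the same published count.
# Assume 36,000 summaries in the period (hypothetical):
no_review    = published_errors(36_000, ai_error_rate=0.001, editor_miss_rate=1.0)
with_editors = published_errors(36_000, ai_error_rate=0.10,  editor_miss_rate=0.01)

print(no_review, with_editors)  # both come out to ~36
```

Under the first scenario the AI is wrong once in a thousand summaries with no review at all; under the second it is wrong 10% of the time and editors catch 99% of the mistakes. The published record cannot distinguish the two.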
How does this compare to human error rate? (Score:2)
Just curious...
Re: (Score:2)
Well, humans at least have editors and fact checkers (remember those people?). The whole point of having AI write the stories is not having to pay any staff. Errors are minor details when you're saving the company money.
Re: (Score:2)
It's not like the general public actually fact-checks the news it consumes anyway. My guess is thousands of people consumed this garbage without batting an eye until someone really dug into it.
Re: (Score:2)
Very little fact checking gets done these days. Journalism used to be a solidly middle-class job, but now it's something people do to pay the rent between jobs. The number of people actually doing investigative reporting as a career is vanishingly small. Trusting your news article to be correct and fact-checked is no longer a reality outside a handful of places like the WSJ and NYT.
Google (Score:2)
Just a couple of days ago my tablet gave me a notification about a famous singer retiring and cancelling the rest of his tour dates — but of course it left the name out of the notification. (Standard clickbait.) I clicked through and saw it was an 89-year-old guy whose name I did recognize but who wasn't really that famous (and I've since forgotten who it was).
But I'm sure you're familiar with Google searches — beneath the main results they have "People also asked ...", one of which was when the singer died.
Where does the ai get their info from? (Score:2)
If the AI is getting its up-to-date facts from the major news outlets, and the major news outlets are using AI, I foresee a problem.
Re: (Score:1)
Now he has said that he doesn't care if manufacturers raise prices. And somehow this all culminates in America becoming the motherland of all automobiles.