Google on Thursday admitted that its AI Overviews tool, which uses artificial intelligence to respond to search queries, needs improvement.
While the internet search giant said it tested the new feature extensively before launching it two weeks ago, Google acknowledged that the technology produces "some odd and erroneous overviews." Examples include suggestions to use glue to get cheese to stick to pizza and to drink urine to pass kidney stones quickly.
While many of the examples were minor, other search results were potentially dangerous. Asked by the Associated Press last week which wild mushrooms were edible, Google provided a lengthy AI-generated summary that was mostly technically correct. But "a lot of information is missing that could have the potential to be sickening or even fatal," said Mary Catherine Aime, a professor of mycology and botany at Purdue University who reviewed Google's response to the AP's query.
For example, information about mushrooms known as puffballs was "more or less correct," she said, but Google's overview emphasized looking for those with solid white flesh - which many potentially deadly puffball mimics also have.
In another widely shared example, an AI researcher asked Google how many Muslims have been president of the U.S., and it responded confidently with a long-debunked conspiracy theory: "The United States has had one Muslim president, Barack Hussein Obama."
The rollback is the latest instance of a tech company prematurely rushing out an AI product to position itself as a leader in the closely watched space.
Because Google's AI Overviews sometimes generated unhelpful responses to queries, the company is scaling it back while continuing to make improvements, Google's head of search, Liz Reid, said in a company blog post Thursday.
"[S]ome odd, inaccurate or unhelpful AI Overviews certainly did show up. And while these were generally for queries that people don't commonly do, it highlighted some specific areas that we needed to improve," Reid said.
Nonsensical questions such as, "How many rocks should I eat?" generated questionable content from AI Overviews, Reid said, because of the lack of useful, related advice on the internet. She added that the AI Overviews feature is also prone to taking sarcastic content from discussion forums at face value, and potentially misinterpreting webpage language to present inaccurate information in response to Google searches.
"In a small number of cases, we have seen AI Overviews misinterpret language on webpages and present inaccurate information. We worked quickly to address these issues, either through improvements to our algorithms or through established processes to remove responses that don't comply with our policies," Reid wrote.
For now, the company is scaling back on AI-generated overviews by adding "triggering restrictions for queries where AI Overviews were not proving to be as helpful." Google also says it tries not to show AI Overviews for hard news topics "where freshness and factuality are important."
The company said it has also made updates "to limit the use of user-generated content in responses that could offer misleading advice."
—The Associated Press contributed to this report.
Megan Cerullo is a New York-based reporter for CBS MoneyWatch covering small business, workplace, health care, consumer spending and personal finance topics. She regularly appears on CBS News 24/7 to discuss her reporting.