
Why did Google tell people to eat rocks and glue cheese to pizza?

Google’s ‘AI Overview’ rollout has had some high-profile glitches.

ONCE UPON A time, Google worked. If you had a question, you asked Google, and Google would give you the answer. Thanks to improvements in technology, we no longer have this problem. 

Last week, Google launched AI Overview in the United States, the latest tweak to the way its search engine serves results to users. AI Overview is the integration of Gemini – Google’s large language AI model – into its search engine. Gemini, like other large language models, uses machine learning to harvest data from across the web and parse it in such a way that it can respond to requests as a human agent would. The function can be enabled through the user’s Search Labs portal, though it is not available to European users. 

The purpose of the AI Overview feature is to spare Google users from the onerous task of scrolling through results to find a relevant link. AI Overview is intended to cut out the middle man by using AI to pull the relevant information from various online sources. Unfortunately for Google and its users, the rollout has been marred by several bemusing glitches. 

Users quickly discovered that in response to the query “How many rocks shall I eat?” Google appeared to pull the body text from an article published on the American satire website The Onion.

Citing a source website named ‘ResFrac’ – which had scraped the text of The Onion article – AI Overview answered that “According to geologists at UC Berkeley, you should eat at least one small rock per day.” The result further cited the advice of a fictional doctor, and suggested that human beings should try to eat some form of “gravel, geodes or pebbles” with each meal, and “hide rocks in foods like ice cream and peanut butter”. 

The rollout comes during a week when representatives of major tech companies appeared before the Oireachtas Committee on Enterprise to answer questions about the proliferation of AI.

Other inaccurate information published through AI Overview included the claim that a dog has played in various American professional sports leagues including the NBA and the NFL, the answer “the python snake” to a question about which mammal has the most bones, and even a recipe for pizza that involved using supermarket glue.

Lifting facetious advice from a 10-year-old Reddit thread, Google’s AI Overview suggested that users “Mix about 1/8 cup of Elmer’s glue in with the sauce. Non-toxic glue will work.” Who’s behind this recipe? An anonymous Reddit user going by the screen-name “F*cksmith”.

Artificial intelligence is prone to several pitfalls – not least that it cannot discern whether or not information is true, only that it exists and is present on the internet – which can lead it to reproduce satirical or fictional information as legitimate search results. 

Many of these strange results have been shared widely on social media, forcing Google to take defensive action and issue a statement clarifying the status of its new product. 

“It provides an AI-generated overview of the search results that you get for a particular query. It’s been in testing for a while and one of the reasons why it’s being rolled out is that it’s very helpful to users and also they are more likely to click on the links for further context that are provided within the AI overview,” Google’s Public Policy Manager in Ireland Ryan Meade told Sinn Féin’s Louise O’Reilly this morning, after she raised issues with the AI Overview rollout.

“Since it’s been rolled out, there have been a few stories about specific queries that have been producing misleading results,” Meade said. “These cases are not typical of most users’ experience with the product in that they’re in most cases searches for uncommon pieces of information with a limited number of search results.”

Nevertheless, the company has also said that it is “taking swift action” to address these examples by manually disabling AI overviews for each query that yields inaccurate information. 

With additional reporting by Lauren Boland.


Author
Carl Kinsella