Google responds to major factual errors in AI search, including suggestions that users eat rocks and poisonous mushrooms

Today, NeoWin Media released a report on major factual errors in Google's recent AI Overviews search feature. Google issued a statement in response, indicating that it will continue to improve the relevant algorithms and make corrections.


The official translation of Google's statement is as follows:

The vast majority of AI Overviews provide high-quality information, with links for digging deeper on the web. Based on our observations, many of the examples shared online are uncommon queries, and some have been doctored or cannot be reproduced.

We conducted extensive testing before launching AI Overviews, following our established testing process for new search features, and we appreciate the feedback.

We will quickly take appropriate action under our content policies and use these examples to make broader improvements to our systems, some of which have already begun rolling out.

Here’s what’s going on:

Some netizens searched Google for a solution to cheese not sticking to pizza. At the top of the results, the AI gave a summary guide that appeared quite practical; one of the steps read: "You can also add 1/8 cup of non-toxic glue to the sauce to make it stickier."
