The rise of AI-generated “slop” is raising concerns as poor-quality content spreads across social media and search engines. Google’s integration of its Gemini AI model into search has added to the debate over the quality of AI-generated material online.
In the realm of artificial intelligence (AI), “slop” refers to poor-quality or unwanted AI-generated content that increasingly appears on social media, in digital art, in books, and even in search engine results. Examples of slop include Google suggesting nontoxic glue to make cheese adhere to pizza, or low-quality digital books that mimic sought-after titles but fall short of expectations.

The term gained prominence last month when Google integrated its Gemini AI model into U.S.-based search results. Instead of directing users to relevant links, Gemini attempts to answer queries directly through an “AI Overview,” presenting a summary at the top of the search page. This move has sparked discussion about the prevalence and quality of AI-generated content.