You may not know exactly what “slop” means in relation to artificial intelligence. But on some level you probably do.
Slop, at least in the fast-moving world of online message boards, is a broad term that has gained traction as a descriptor for shoddy or unwanted A.I. content in social media, art, books and, increasingly, search results.
Google suggesting that you could add nontoxic glue to make cheese stick to a pizza? That’s slop. So is a low-price digital book that seems like the one you were looking for, but not quite. And those posts in your Facebook feed that seemingly came from nowhere? They’re slop as well.
The term became more prevalent last month when Google incorporated its Gemini A.I. model into its U.S.-based search results. Rather than pointing users toward links, the service attempts to solve a query directly with an “A.I. Overview” — a chunk of text at the top of a results page that uses Gemini to form its best guess at what the user is looking for.
The change was a response to Microsoft's incorporation of A.I. into its Bing search results, and it had some immediate missteps, leading Google to announce that it would roll back some of its A.I. features until the problems could be ironed out.
But with the dominant search engines having made A.I. a priority, it appears that vast quantities of information generated by machines, rather than largely curated by humans, will be served up as a daily part of life on the internet for the foreseeable future.
Hence the term slop, which conjures images of heaps of unappetizing food being shoveled into troughs for livestock. Like that type of slop, A.I.-assisted search comes together quickly, but not necessarily in a way that critical thinkers can stomach.
Kristian Hammond, the director of Northwestern University’s Center for Advancing Safety of Machine Intelligence, noted a problem in the current model: the information from A.I. Overview is being presented as a definitive answer, rather than as a place to start an internet user’s research into a given subject.
“You search for something and you get back what you need in order to think — and it actually encourages you to think,” Mr. Hammond said. “What it’s becoming, in this integration with language models, is something that does not encourage you to think. It encourages you to accept. And that, I think, is dangerous.”
Giving a problem a name can help in confronting it. And while slop is one candidate, it remains an open question whether the term will catch on with a mainstream audience or end up in the slang dustbin alongside cheugy, bae and skibidi.
Adam Aleksic, a linguist and content creator who uses the handle etymologynerd on social media, believes that slop — which he said has yet to cross over to a broader audience — shows promise.
“I think this is a great example of an unobtrusive word right now, because it is a word we’re all familiar with,” Mr. Aleksic said. “It’s a word that feels like it’s naturally applicable to this situation. Therefore, it’s less in your face.”
The use of slop as a descriptor for low-grade A.I. material seemingly came about in reaction to the release of A.I. art generators in 2022. Some have identified Simon Willison, a developer, as an early adopter of the term — but Mr. Willison, who has pushed for the phrase’s adoption, said it was in use long before he found it.
“I think I might actually have been quite late to the party!” he said in an email.
The term has sprung up on 4chan, on Hacker News and in YouTube comments, where anonymous posters sometimes signal their proficiency in complex subject matter by using in-group language.
“What we always see with any slang is that it starts in a niche community and then spreads from there,” Mr. Aleksic said. “Usually, coolness is a factor that helps it spread, but not necessarily. Like, we’ve had a lot of words spread from a bunch of coding nerds, right? Look at the word ‘spam.’ Usually, the word is created because there is a particular group with shared interests, with a shared need to invent words.”
In the short term, the effect of A.I. on search engines, and the internet in general, may be less extreme than some fear.
News organizations have worried about shrinking online audiences as people rely more on A.I.-generated answers. Data from Chartbeat, a company that researches internet traffic, indicates that there was an immediate drop in referrals from Google Discover to websites in the first days of A.I. Overviews. But that dip has since recovered, and in the first three weeks of the overviews, overall search traffic to more than 2,000 major websites in the U.S. actually went up, according to Chartbeat.
Mr. Willison, who described himself as an optimist about A.I. when it is used correctly, thought that slop could become the go-to term for junky machine-generated content.
“Society needs concise ways to talk about modern A.I. — both the positives and the negatives,” he said. “‘Ignore that email, it’s spam,’ and ‘Ignore that article, it’s slop,’ are both useful lessons.”