Australia’s Macquarie Dictionary has announced “AI slop” as its Word of the Year for 2025, chosen by both its committee and the public.

Macquarie defines AI slop as “low-quality content created by generative AI, often containing errors, and not requested by the user.” Most tech-savvy readers will know AI slop when they see it, but look up ‘Shrimp Jesus’ for a case in point. Wired has even declared Donald Trump the “first AI slop President”. The term beat other shortlisted contenders like ‘clanker’, ‘medical misogyny’, and, to the delight of children globally, ‘six-seven’.

Macquarie’s committee and the public agreed on the same winning word for the second year running and only the fourth time in history. The choice is telling. AI slop is a problem. After all, it is central to the broader ‘enshittification’ of the internet (which was coincidentally Macquarie’s Word of the Year for 2024).

The question is, how did we get here?

Generative AI has dramatically lowered the barrier to creating content, a shift ignited by OpenAI’s release of ChatGPT around three years ago. This in itself is a great thing. Creative tools like image and video editing are no longer the preserve of specialists. Ordinary folk can turn their creative ideas, which otherwise may never have seen the light of day, into reality.

Generative AI not only makes it easier to create content in the first place but also makes it possible to do so rapidly and at scale. This too has clear benefits. For example, game developers can use AI-generated assets to create immensely rich worlds. However, it has inadvertently paved the way for the onslaught of AI slop on the internet, most visibly on social media and in search engine results.

Social media platforms reward attention. This has long driven content farming, where creators churn out low-effort posts to game algorithms and make money, such as from ad revenue or affiliate links. AI-generated content takes this to another level. Even LinkedIn, the bastion of cringe-free critical thinking, is not immune. Originality.AI claims that over half of all long-form LinkedIn posts are likely AI-generated.

Meanwhile, AI slop is clogging up search results. Graphite found that in November 2024, the number of AI-generated articles on the web surpassed the number of human-written articles for the first time.

The cost of AI slop

Now, let us be clear: not all AI-generated content is AI slop. Generative AI can assist us in producing genuinely compelling work. However, there is no doubt that a sizeable chunk of AI-generated content is indeed AI slop.

AI slop is ultimately bad because it degrades our online spaces, filling them with low-quality junk. It becomes dangerous when nefarious actors use it to spread misinformation, political propaganda, and conspiracy theories.

Graphika has extensively detailed how sophisticated AI-powered influence operations (IOs) aim to manipulate target audiences by amplifying polarising content. The Russia-linked CopyCop IO, for example, has spawned hundreds of websites posing as Western media outlets that churn out AI-generated pro-Russian and anti-Ukrainian slop.

While we will never be able to rid the internet of AI slop entirely, efforts to crack down on it need to be stepped up. Platforms must ultimately discourage and demote such content. Some efforts are already under way; YouTube, for example, has implemented policies restricting the monetisation of unoriginal content. But there is still a long way to go. Let us hope Macquarie’s Word of the Year for 2026 is something more positive. I will not hold my breath.