“AI slop” sounds like one of those daft online phrases you are supposed to ignore. It is not. It has become a mainstream term for low-quality AI-generated material produced at scale, to the point that Merriam-Webster chose “slop” as its word of the year in 2025 and defined it in the AI sense as poor digital content churned out in bulk.

That matters because once a silly-sounding phrase escapes internet in-jokes and enters mainstream language, the underlying problem is usually already out in the real world.

For UK SMEs, AI slop is not mainly a culture-war issue. It is a business issue. More specifically, it is a trust issue. It means more time checking whether reviews are fake, whether marketing copy is thin machine mush, whether a supplier proposal has any real thought behind it, whether a developer’s codebase is maintainable, and whether an agency is doing skilled work or just feeding prompts into a model and invoicing you for the privilege.

Slop is no longer somebody else’s problem. It is arriving through the same tools, suppliers and workflows UK SMEs are already buying into.

The real cost of slop is a trust tax

This is the clearest way to think about it.

AI slop creates a trust tax. It forces businesses to spend more time verifying things that used to be easier to trust at first glance. Is that review genuine? Is that article worth reading? Is that image real? Is that code sensible? Is that proposal tailored to my business or just generic machine filler with my company name dropped in?

Every extra minute spent checking rubbish is a hidden cost. For a small business, hidden costs matter because time is already scarce.

The wider web environment is moving in the wrong direction. The internet is becoming more machine-saturated, more synthetic and harder to trust at a glance.

“The danger is not only that AI can generate rubbish. It is that AI can generate rubbish cheaply, endlessly and convincingly enough that the burden of sorting signal from noise shifts onto the buyer.”

That is the first key point for SMEs: the danger is not only that AI can generate rubbish. It is that AI can generate rubbish cheaply, endlessly and convincingly enough that the burden of sorting signal from noise shifts onto the buyer.

Code slop is where this gets expensive

For SMEs, one of the starkest examples is software.

There is now credible software-engineering research showing that AI-generated code can create maintainability and technical-debt problems, especially when it is produced quickly and not properly understood by the humans shipping it. Recent papers explicitly discuss GenAI-induced technical debt, uncertain code quality and AI-generated code maintainability as real issues rather than theoretical worries.

The plain-English version is simple: code can “work” today while still becoming a costly mess tomorrow because nobody really understands why it was written that way.
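To make that concrete, here is a contrived sketch. The VAT calculation, function names and figures are invented purely for illustration, not taken from any real project or paper: both functions below return the same answer, but only one explains itself to the next person who has to change it.

```python
# Opaque version: correct today, but the magic number 1.2 and the dense
# one-liner give a future maintainer nothing to work with.
def total_v1(items):
    return round(sum(p * q for p, q in items) * 1.2, 2)


# Maintainable version: identical behaviour, but the intent is explicit.
UK_STANDARD_VAT_RATE = 0.20  # assumption for this example: everything at the standard rate


def invoice_total(items):
    """Return the VAT-inclusive total for a list of (unit_price, quantity) pairs."""
    net = sum(unit_price * quantity for unit_price, quantity in items)
    return round(net * (1 + UK_STANDARD_VAT_RATE), 2)


if __name__ == "__main__":
    basket = [(19.99, 2), (5.00, 1)]
    # Both give the same total, so a test suite cannot tell them apart.
    print(total_v1(basket), invoice_total(basket))
```

A test suite passes either version. The difference only shows up later, when VAT treatment changes or a new developer has to work out what 1.2 meant, which is exactly the gap between "functional enough to ship" and "produced with enough care to be maintainable".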

That matters enormously for SMEs buying software from agencies or freelance developers. If your supplier is leaning heavily on AI to generate code, you may get a faster first version of the system. You may also inherit a codebase that is harder to diagnose, harder to modify and more dependent on the original builder’s workflow.

That is not just a purist complaint from grumpy programmers. It is a commercial risk. If the codebase is odd, opaque or stitched together in ways that do not align well with ordinary engineering practice, changing supplier later becomes harder and more expensive.

The lock-in risk is worse than it looks

If an SME buys software built by a developer or agency using AI heavily, there may be a double lock-in.

First, you are dependent on the supplier who understands the system. Second, you may also be indirectly dependent on the AI workflow, model family or coding patterns used to create it. If future changes are easiest to make by continuing with the same tool-assisted approach, your freedom to switch becomes narrower than you realised when you approved the original build.

That is where the word “slop” becomes useful. It is not just about bad taste. It is about the difference between something produced with enough care to be maintainable and something that is merely functional enough to ship. SMEs can survive clunky software. What they struggle with is clunky software that also traps them.

“It is not just about bad taste. It is about the difference between something produced with enough care to be maintainable and something that is merely functional enough to ship.”

The human cost is real, but keep it in proportion

There is also a softer human angle here, and it is worth handling with some care.

Consider one developer’s widely discussed account of being “oneshotted” by coding AI: dependent on the easy button, unable to review AI-generated code properly, and detached from the result. On its own, that is anecdotal, not proof of an industry-wide collapse in morale. But it lines up with wider evidence.

There is growing evidence that heavy reliance on coding AI can produce deskilling, weaker ownership of the work, and a more detached relationship with the software being built. For an SME, that matters because detached builders often produce detached systems. If nobody feels much ownership of what was made, future support is usually uglier.

Fake reviews are slop with legal consequences

This is one of the clearest and most important risks for SMEs.

AI makes it easier to generate fake praise and fake complaints at scale. Regulators know it. In the UK, the CMA has issued specific guidance on fake reviews, and the newer consumer-law framework bans fake or misleading consumer reviews. In the US, the FTC’s final fake-reviews rule and its case against Rytr show regulators are already treating AI-generated review abuse as a real enforcement issue.

From an SME point of view, there are two risks here. The first is obvious: you can be harmed by fake negative reviews or by a review environment that customers trust less than they used to. The second is more awkward: some businesses will be tempted to fight filth with filth and use AI-generated review padding or reputation manipulation themselves.

That is a rotten idea. Even if you dodge enforcement, you are still feeding the same sludge economy that makes customers suspicious of everyone.

SEO slop is making online visibility worse, not better

There is a similar warning for content marketing.

If your agency is churning out dozens of AI-generated blog posts, location pages, product summaries or thought-leadership pieces with barely any real expertise in them, you may not be buying modern marketing. You may be buying slop.

This creates what can fairly be called a quality inversion. It becomes easier for low-effort operators to flood the web than for careful operators to stand out. That does not mean good content is dead. It means the effort required to distinguish good content from synthetic filler gets higher.

SMEs paying for content should think about that very carefully, because a pile of AI sludge with some keywords stuffed into it may not just fail to help — it may make your brand look cheaper, lazier and less trustworthy.

Even Microsoft got dragged into the slop conversation

A brief aside, because this one is too on the nose to ignore.

The term “Microslop” has been used online as a mocking response to Microsoft’s aggressive Copilot push and wider irritation at AI being rammed into everything. That is not proof that Microsoft is automatically producing slop, and it would be unfair to claim that. But it is useful as a cultural signal: users are annoyed enough by forced AI insertion and AI fatigue to invent slang for it.

That matters to SMEs because the same irritation can apply to agencies, developers and SaaS vendors. If a supplier is using AI because it genuinely improves the work, fine. If they are slapping AI on everything because it is fashionable, or because it lets them pump out more output with less care, then “slop” stops being a meme and starts being a perfectly sensible business label.

The dead internet theory is nonsense — except for the bit that isn’t

The strong version of the dead internet theory — the idea that the web is basically fake and humans have quietly left the building — is not supported by serious evidence.

But the weaker version, the one ordinary users actually feel, is much harder to dismiss. More bots. More synthetic engagement. More junk pages. More fake reviews. More machine-written filler. More content produced for ad systems or platform incentives rather than people.

It does not matter whether the most dramatic version of the theory is true. What matters is that the web is becoming more synthetic, more polluted and more expensive to trust.

Political slop and video slop are real, but they are side symptoms here

There is enough evidence to say these are real problems. Academic work has shown that AI-based persuasion can influence political attitudes under some conditions, and major platforms are expanding deepfake detection tools because synthetic media risk is now taken seriously.

They help show that slop is not confined to business content. The same pattern is spreading across the wider digital world.

The conclusion for SMEs is simple

Used properly, AI can absolutely improve work. That is not the problem.

The problem is what happens when output replaces judgement.

That is when you get slop: code that works but nobody really understands, reviews that look real but are fake, articles that sound fine but say nothing, marketing that scales but does not connect, and supplier relationships that become harder to trust because you are no longer sure where the human work ends and the machine mush begins.

For UK SMEs, the practical lesson is not “avoid AI”. It is “be pickier about where you tolerate slop”. Ask agencies how much of the work is AI-generated. Ask developers how maintainable the code is and who can support it if they walk away. Treat suspicious review patterns seriously. Be wary of SEO sludge sold as strategy. Assume the internet is getting noisier and that somebody, usually you, will pay the trust tax.

AI slop is not just ugly. It is expensive.

Richard Lowe — Founder of Small World Solutions, helping UK SMEs navigate IT infrastructure, security and AI adoption.

If you want to discuss how to spot AI slop in software, supplier proposals, content marketing or your wider digital estate, use the contact page.


References

  1. Anthropic (2025) How AI is Transforming Work at Anthropic.
  2. Axios (2026) reporting on the “AutoBait” network of AI-generated slop sites.
  3. Boston Consulting Group (2025) Managing the Evolving Dynamics of Digital Platform Lock-In.
  4. Competition and Markets Authority (2025) fake reviews guidance and Google undertakings on fake reviews.
  5. Federal Trade Commission (2024–2025) fake reviews rule and Rytr enforcement materials.
  6. Google Search Central guidance on scaled content abuse and people-first content.
  7. Imperva (2025) Bad Bot Report.
  8. Merriam-Webster (2025) Word of the Year: Slop.
  9. Office for National Statistics (2026) Business insights and impact on the UK economy.
  10. PC Gamer (2026) reporting on the “Microslop” Discord moderation incident.
  11. Reuters Institute for the Study of Journalism (2024) AI-generated slop is quietly conquering the internet.
  12. Science (2025) political persuasion by artificial intelligence.
  13. Stack Overflow (2025) Developer Survey 2025.
  14. Technical papers on AI-generated code maintainability and GenAI-induced technical debt, including “TODO: Fix the Mess Gemini Created” and related 2026 work.
  15. YouTube and platform reporting on deepfake detection expansion.