Type: Article -> Category: Smoke & Mirrors

[Image: Person surrounded by AI-generated images and videos, struggling to identify real from fake content]

When Humans No Longer Believe Anything

AI, Synthetic Reality, and the Cost of Removing Friction


Last Updated: 27th March 2026

Author: Nick Smith, with the help of ChatGPT


Introduction: The Simulation We Built Ourselves

For years, the idea that we might be living in a simulation has captured the imagination of technologists and philosophers alike. Popular culture, particularly films like The Matrix, framed this as a future in which machines construct an artificial world around us.

But reality has taken a different path.

We are not trapped inside a machine-generated illusion.

We are living inside a human-generated distortion of reality, amplified by artificial intelligence.

And unlike science fiction, there is no central system controlling it.

There is only us.



The Acceleration of Synthetic Reality

In a remarkably short period of time, AI has crossed a threshold that changes the nature of information itself.

Systems such as GPT-4 and Sora have made it possible to generate:

  • Photorealistic images
  • Convincing human voices
  • Video footage of events that never occurred
  • Entire digital identities that can interact in real time

This is no longer confined to research labs or production studios. It is accessible, scalable, and increasingly indistinguishable from reality.

The question is no longer “Can AI fake reality?”

The question is:

What happens when faking reality becomes trivial?


From Information to Noise

The early internet promised access to knowledge.

AI has delivered something else entirely: infinite content.

At first glance, this seems like progress. But scale changes everything.

We are now seeing the rise of what many refer to as AI slop:

  • Low-effort generated media
  • Endless novelty content
  • Mass-produced visuals with no informational value

This is not simply harmless entertainment.

It creates three structural problems:

1. Signal Collapse

When everything can be created instantly, meaningful content becomes harder to identify, and easier to dismiss.

2. Economic Dilution

Creators producing genuine work are forced to compete with content that costs almost nothing to produce.

3. Infrastructure Waste

Behind every generated image or video sits:

  • GPU-intensive computation
  • Expanding data centre demand
  • Increasing energy consumption

We are, quite literally, burning real-world resources to generate disposable digital noise.


The Trust Breakdown

The more serious issue is not content quality.

It is trust.

When any image, video, or voice can be fabricated convincingly, the foundation of verification begins to erode.

  • A video can no longer be assumed to be evidence
  • A voice recording can no longer be assumed to be authentic
  • A profile can no longer be assumed to represent a real person

This has profound implications:

Justice

If visual evidence can be dismissed as synthetic, the threshold for proof becomes unstable.

Conflict

AI-generated media can be used to manipulate public perception in real time.

Commerce

Products can be misrepresented at scale with near-zero cost.

Human Relationships

Identity itself becomes questionable in digital spaces.

Over time, this leads to something far more dangerous than misinformation:

A society in which people no longer believe anything at all.


AI Is Not the Cause, It Is the Amplifier

It is important to be precise.

AI did not create distrust.

Distrust was already growing across institutions, media, and politics.

What AI has done is remove the cost of deception.

And when deception becomes cheap, it becomes abundant.


The Missing Ingredient: Friction

One of the defining features of modern AI systems is ease of use.

  • No infrastructure required
  • No identity verification
  • No cost barrier at entry level

This accessibility has driven innovation, but it has also removed an important constraint:

Friction.

Historically, creating convincing media required:

  • Skill
  • Time
  • Resources
  • Accountability

Today, it requires none of these at scale.

And that changes behaviour.


A Controversial but Necessary Idea: Reintroducing Friction

If we cannot stop the misuse of AI (and we cannot), then the question becomes:

Can we make misuse harder, slower, and more accountable?

Two potential directions emerge:

1. Localised Compute for High-Volume Generation

Requiring large-scale or high-frequency AI generation to occur on local hardware introduces natural constraints:

  • Hardware cost becomes a barrier to mass production
  • Energy usage becomes visible to the user
  • Casual, low-value generation decreases

When someone has to invest in a capable machine to generate content, behaviour changes.

Not entirely, but meaningfully.
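The economics behind this can be roughed out in a back-of-envelope sketch. Every figure below is an assumption chosen for illustration, not a measurement: the point is simply that local generation carries an upfront hardware cost and a visible energy cost, while hosted free tiers present a cost of zero at the point of use.

```python
# Back-of-envelope sketch of how local hardware reintroduces friction.
# All figures are illustrative assumptions, not measurements.

GPU_COST = 1600.0              # assumed upfront cost of a capable consumer GPU (GBP)
GPU_LIFETIME_IMAGES = 500_000  # assumed images generated over the card's useful life
KWH_PER_IMAGE = 0.004          # assumed energy per image: ~15 s at ~1 kW draw
ELECTRICITY_PRICE = 0.30       # assumed price per kWh (GBP)

def local_cost_per_image() -> float:
    """Amortised hardware cost plus the electricity the user can see on their bill."""
    hardware = GPU_COST / GPU_LIFETIME_IMAGES
    energy = KWH_PER_IMAGE * ELECTRICITY_PRICE
    return hardware + energy

HOSTED_COST_PER_IMAGE = 0.0  # assumed: free tier, cost invisible to the user

if __name__ == "__main__":
    print(f"Local:  £{local_cost_per_image():.4f} per image, after a £{GPU_COST:.0f} outlay")
    print(f"Hosted: £{HOSTED_COST_PER_IMAGE:.4f} per image at the point of use")
```

The per-image cost is small either way; the friction lies in the upfront outlay and in making energy use visible to the person generating the content.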


2. Validated Access for Powerful Systems

We do not allow unrestricted, anonymous access to other powerful systems.

  • Financial systems require identity
  • Infrastructure systems require oversight
  • Communication networks enforce accountability

AI, arguably one of the most powerful tools ever created, remains unusually open at the point of use.

Introducing validated accounts for advanced capabilities could:

  • Reduce anonymous large-scale abuse
  • Increase traceability of harmful content
  • Encourage more responsible usage patterns
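The idea above can be sketched as a tiered gateway, where generation volume scales with the level of identity validation. The tier names and quotas here are invented for illustration; no real service works exactly this way.

```python
# Minimal sketch of "accountability matching capability": a hypothetical
# gateway that ties daily generation volume to identity validation.
# Tier names and quotas are illustrative assumptions.

from dataclasses import dataclass

# Assumed daily generation quotas per validation tier.
QUOTAS = {
    "anonymous": 10,               # casual, low-volume use stays easy
    "verified_email": 200,
    "verified_identity": 10_000,   # high volume requires a traceable identity
}

@dataclass
class Account:
    tier: str
    generated_today: int = 0

def request_generation(account: Account, count: int) -> bool:
    """Allow the request only if it fits within the account tier's daily quota."""
    quota = QUOTAS[account.tier]
    if account.generated_today + count > quota:
        return False
    account.generated_today += count
    return True

if __name__ == "__main__":
    anon = Account(tier="anonymous")
    print(request_generation(anon, 5))    # small-scale use: allowed
    print(request_generation(anon, 500))  # anonymous mass production: refused
```

The design choice is the asymmetry: anonymous use remains possible, but scale is reserved for accounts that can be held accountable.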

This is not about control.

It is about accountability matching capability.


The Reality: There Is No Perfect Solution

These approaches are not foolproof.

  • Open-source models will continue to exist
  • Workarounds will be developed
  • Determined actors will always find a way

But perfection is not the goal.

Friction does not eliminate abuse.
It reduces its scale.

And in systems as large as ours, scale is everything.


A Mirror, Not a Machine

It is tempting to frame this as a technological problem.

It is not.

AI is not introducing new flaws into humanity.

It is exposing and amplifying the ones already there:

  • The desire for attention
  • The pursuit of profit over value
  • The willingness to manipulate perception

AI is simply the most efficient tool we have ever built for expressing them.


Conclusion: Choosing What We Optimise For

We are entering a world where:

  • Reality can be simulated effortlessly
  • Trust must be earned, not assumed
  • Information must be verified, not consumed

The risk is not that AI will deceive us.

The risk is that we become so accustomed to deception that we stop caring about truth altogether.

Reintroducing friction, through cost, identity, or effort, is not a step backwards.

It is a recognition that:

Power without constraint does not create freedom.
It creates noise.

And if we allow that noise to dominate, we may find ourselves in a world where nothing is believed, not because reality disappeared, but because we chose to drown it out.


