There is no ethical case for staying on X
Image: This image is a metaphor – the harm it represents is not. (Credit: Adobe Photos)

The sense of outrage many people feel about X right now is real – and justified. I feel it too. Not as a fleeting reaction to the latest controversy, but as a growing recognition that this platform has crossed a line it cannot uncross.

This is no longer a debate about moderation thresholds, free expression trade-offs, or whether regulation has gone too far. It is about harm – predictable, repeatable harm – enabled by design, excused by rhetoric, and left unchecked by those with the power to stop it.

This is a fast-developing story. Events, statements, and political positions may well change in the coming days. Threats have already been made. Pressure is being applied.

But whatever happens next, the core issue does not change: X has become ethically indefensible.

What has changed – and why this matters

For years, many of us have tolerated X despite its problems. We justified staying because it was useful, because audiences were there, because “being present” felt professionally necessary.

That was how I felt right up until I called it a day in 2023 and stopped using X.

Today, the justification to stay no longer holds.

The enabling of image creation and manipulation through the Grok AI chatbot – including the sexualisation of women and girls without their consent – is not an unfortunate side effect. It is a foreseeable outcome of choices made and safeguards not put in place.

When a platform allows people to create or manipulate images of women and girls and present them in sexualised settings or worse, without permission, consent becomes irrelevant. Dignity becomes collateral damage. And responsibility is waived away under the banner of “free speech.”

At that point, the question is no longer “How bad is this platform?” It becomes “Why are we still here?”

The false comfort of “We need to stay and monitor”

I keep hearing the same argument: we have to stay on X to monitor it. That leaving would be counterproductive. That abandoning the platform hands it over to bad actors.

Tim Davie, who quit as Director-General of the BBC in November 2025, firmly believes the BBC should remain on the platform. I couldn’t disagree more strongly.

Indeed, I don’t buy it at all.

Monitoring does not require participation. Observation does not require endorsement. And staying active on a platform like this confers legitimacy whether we intend it or not.

Being present says: this is still an acceptable place to do business.
Posting says: this is still a channel we are comfortable using.
Paying says: this is worth funding.

There is nothing neutral about any of that.

💡
Reach without responsibility is not a virtue. And visibility achieved by tolerating harm is not something communicators should be proud of.

Why withdrawal is not censorship – it is accountability

Closing an account is not silencing speech. Cancelling a subscription is not banning ideas. Walking away is not an attack on free expression.

It is a refusal to be complicit.

If you are a paying subscriber, your money directly supports the platform’s direction. If you are an organisation posting regularly, your presence signals acceptance. If you are a government using X as an official channel, your choice carries weight far beyond convenience.

Ethics are revealed less by what we criticise and more by what we refuse to support.

Governments and institutions must meet a higher standard

This is where the argument becomes unavoidable.

Governments have a duty not only to communicate, but also to model ethical behaviour. That is why the open letter a few days ago from the Public Relations and Communications Association calling on the UK government to stop using X matters so much. It reflects a growing recognition within the profession that continued use is no longer defensible.

  • Using X while condemning its harms is incoherent. Staying for “reach” while acknowledging abuse is a contradiction.
  • There are alternative channels. None is perfect. But some are not actively enabling harm at scale.
  • Public service credibility erodes when convenience consistently overrides principle.

Pressure, threats, and the politics of intimidation

Unsurprisingly, pressure is already being applied.

Donald Trump has reportedly threatened sanctions if the UK moves to block X. Elon Musk has framed accountability as censorship and regulation as an attack on free speech.

This is a familiar playbook: escalate loudly, recast harm prevention as oppression, and dare governments to blink.

But free speech arguments ring hollow when the issue is non-consensual sexualised imagery. This is not about dissent or unpopular opinions. It is about consent, safety, and dignity.

💡
Democracies should not shape their ethical boundaries around the comfort of billionaires or the threats of politicians. Giving in to that pressure does not defend free speech – it rewards bullying.

There is strength in collective resolve. Acting alongside European partners would reframe this as standards, not spite. But even acting alone is preferable to doing nothing.

Banning X is only half the answer

Even if the UK were to block X tomorrow, that would not resolve the deeper issue.

Regulation matters. Enforcement matters. But cultural change does not come from fines alone. It comes from people leaving. From organisations withdrawing support. From institutions deciding that some lines, once crossed, matter.

Outrage without action changes nothing.

What ethical withdrawal looks like

This does not require performative gestures or moral grandstanding.

It looks like:

  • Individuals closing accounts and cancelling subscriptions.
  • Organisations stopping posting, archiving accounts, and explaining why.
  • Governments ceasing use entirely and directing citizens elsewhere.

None of this is easy. But ethics rarely are.

A line, not a slippery slope

This is not about purity. It is not about being flawless. And it is not about chasing the next platform controversy.

It is about recognising when a system has become so compromised that continued participation does more harm than good.

This is that moment.

Whether the UK bans X or not, the moral test remains the same. Bullying works only when it is allowed to succeed. And legitimacy is not removed by outrage alone – it is removed when people walk away.

Principles, after all, are only principles when they cost something.

Neville Hobson

Somerset, England
Communicator, writer, blogger from the beginning, and podcaster shortly after that.