The critique comes up often enough to be worth addressing directly: building privacy infrastructure is an immoral act. You’re helping people hide things. Hiding things is what bad actors do. A moral society wants transparency.
The argument sounds principled. It isn’t.
The Category Error
The critique confuses visibility with virtue. They are not the same thing.
Morality — real morality, not compliance — requires a genuine inner life. You cannot have conscience without interiority. When someone does the right thing only because they are being watched, that is not moral behavior. It is performance. Kant made exactly this point: the moral worth of an action comes from the will behind it, freely chosen, not from external constraint.
A society of total surveillance does not produce moral people. It produces people who are very good at appearing moral. That distinction is everything.
What Surveillance Actually Does
The evidence here is not theoretical.
After the Snowden revelations in 2013, researchers documented a measurable, immediate drop in traffic to Wikipedia articles on terrorism-related topics. Not from terrorists — from ordinary curious people who suddenly felt watched. PEN America surveyed writers and found one in six had avoided writing or speaking on certain subjects due to surveillance concerns.
This is not people hiding wrongdoing. This is the contraction of the space of thought itself.
The mechanism is well documented in psychology. When people know they might be observed, they stop asking “what is right?” and start asking “what will be approved?” They internalize the watcher’s gaze. The question stops forming before it becomes conscious. The chilling effect operates below the level of deliberate self-censorship.
A society that stops thinking certain thoughts is not a moral society. It is a conformist one.
East Germany Is the Case Study You Cannot Argue With
At its peak, the Stasi had roughly one informant for every 63 citizens — the densest surveillance apparatus in history. What did it produce?
Not a moral population.
It produced a deeply traumatized, atomized society where trust collapsed at every level — between neighbors, between spouses, between parents and children. After reunification, people discovered their closest family members had been filing reports on them for years. The psychological damage outlasted the regime by decades.
The point is not merely that the Stasi was evil. The point is that comprehensive surveillance destroyed the social fabric that morality depends on. You cannot have genuine moral community without trust. Surveillance systematically destroys trust.
When Payment Systems Become Gatekeepers
The practical record of increasingly intermediated payment systems is instructive and recent.
WikiLeaks was subjected to a payment blockade — major card networks and processors cut off donations — before any court had found the organization guilty of anything. Operation Choke Point saw regulators pressure banks to deny accounts to legal businesses deemed politically inconvenient. In authoritarian contexts, the mechanism is even more direct: journalists, activists, and protesters have had their ability to transact removed quietly, not through criminal prosecution but through administrative exclusion.
When the ability to transact is contingent on approval from whoever controls the ledger, economic freedom is conditional. And conditional economic freedom has a way of becoming no economic freedom at all — gradually, then suddenly.
The Asymmetry Nobody Talks About
Surveillance — whatever one thinks of its stated rationale — is never neutral in practice. It flows downward.
The powerful have lawyers, shell companies, offshore structures, and legislative protection. The architectures of visibility that exist in most countries fall hardest on communities that are already vulnerable — the poor, minorities, dissidents, the politically inconvenient. The powerful are rarely the ones being watched.
The practical effect is not a more moral society. It is existing power structures made more entrenched and harder to challenge. That is the opposite of moral progress.
What China’s Social Credit System Reveals About the Mechanism
When you observe behavior and attach consequences to it, people do not become more virtuous. They become better at optimizing their score.
They learn what the system rewards and they perform that, regardless of what they actually believe or value. The metric colonizes the behavior. You end up with a population that is extremely good at appearing moral by whatever the current criteria happen to be.
That is almost the precise opposite of a moral society. A moral society needs people capable of recognizing when the current criteria are wrong. Pervasive observation eliminates that capacity.
The Civilizational Stakes
The costs here are larger than they first appear.
The first-order cost is obvious: data gets misused, individuals get hurt. That is bad enough. But the second-order cost is civilizational.
Every major moral advance in history came from someone willing to ask a question that the consensus of their time said was dangerous — abolitionists, suffragettes, dissidents behind the Iron Curtain. They were not operating in the open. They needed private spaces, literal and figurative, to think, organize, and build conscience.
A society that cannot protect private thought cannot protect moral progress. The capacity to correct its own errors — to recognize injustice, organize around a better idea, and act on it — quietly atrophies. What replaces it is conformity: doing what the group does, what the camera expects, what the algorithm rewards.
Conformist drones do not make moral progress. They reproduce the current consensus, whatever it happens to be, however wrong it is.
The Principle That Threads the Needle
Privacy by default. Accountability when there’s cause.
This is not a radical position. It is the foundational logic of every free society that has ever functioned. Probable cause. Warrants. Due process. Presumption of innocence. These are all the same principle applied to different domains. The legal tradition worked this out centuries ago. The problem is that technology made mass observation so cheap that societies drifted away from it without ever consciously deciding to.
The principle also answers the hardest objection cleanly: what about criminals?
The answer is not that criminals deserve privacy. The answer is that when there is cause, accountability exists and should be pursued. The system was never supposed to watch everyone. It was supposed to watch people when there is a specific, articulable reason to. Mass surveillance inverts this — it watches everyone, all the time, and sorts out the bad actors from the data afterward. Which requires treating every person as a suspect by default. That is not a safety architecture. It is a presumption of guilt with better branding.
There is something else the principle does that is easy to miss. Accountability when there’s cause forces someone to make a judgment, justify it, and be themselves accountable for that judgment. The warrant system does not only protect the suspect. It forces the state to articulate why it is watching someone and have a third party agree. That constraint on power is the feature, not the bug. Remove it and you do not get more accountability. You get less — because the watchers answer to no one.
Privacy by default protects the innocent. Accountability when there’s cause pursues the guilty. The two are not in tension. One makes the other legitimate.
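The principle reads naturally as a data-access pattern, and a short sketch makes the inversion concrete. This is a hypothetical illustration under assumptions of my own (the `Warrant` fields, the `PrivateStore` class, and its rules are invented for this example), not a description of any real system:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Warrant:
    """A disclosure authorization: who asks, why, and who approved."""
    requester: str
    cause: str      # a specific, articulable reason
    approver: str   # an independent third party, not the requester

@dataclass
class PrivateStore:
    """Records are sealed by default; disclosure requires a warrant."""
    _records: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)

    def put(self, key, value):
        self._records[key] = value

    def disclose(self, key, warrant: Warrant):
        # Privacy by default: no stated cause, no access.
        if not warrant.cause.strip():
            raise PermissionError("disclosure requires a stated cause")
        # The watcher is watched: the approver must be independent.
        if warrant.approver == warrant.requester:
            raise PermissionError("approver must be independent of requester")
        # Accountability cuts both ways: every disclosure leaves a trace.
        self.audit_log.append((key, warrant))
        return self._records[key]
```

Note where the cost falls: the legitimate-access path exists, but it is the requester who must articulate a cause, obtain independent approval, and leave an audit record. Blanket access has no path at all.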
The Conclusion
Privacy is not a preference. It is not a feature. It is the soil that morality grows in.
Free and moral people require privacy as a baseline — the same way oxygen is not a feature of human flourishing but the condition that makes it possible at all.
The road to the panopticon has been paved by transparency advocates. The people who built the architectures of visibility — the ad networks, the compliance regimes, the payment graphs — many of them genuinely believed they were making the world safer. Good intentions, confidently executed. What they built was infrastructure for control, available to whoever ends up holding the keys.
The privacy builder has a clear moral theory: people deserve sovereignty over their own lives, concentrating information is concentrating power, and concentrating power ends badly — consistently, across history, without exception.
The surveillance builder has an assumption: that whoever ends up holding the keys will be good.
That is not a moral theory. That is a prayer.
The Morality Series
SatsRail is non-custodial Bitcoin payment infrastructure. We built a payment rail with a minimal data footprint — processing payment data only, with no content visibility and no buyer identity collected by default. Learn how it works.