
Privacy Is Not About Hiding. It's About Freedom.

The 'I have nothing to hide' argument isn't just wrong. It misunderstands what privacy is for. Privacy is not a shield for wrongdoers. It is the precondition for everything a free person does. Here's why that matters more right now than at any point in recent history.

Privacy // Surveillance // Civil Liberties // Palantir
March 18, 2026 // 9 min read // Noctis Privacy

When someone says 'I have nothing to hide,' they're not making an argument. They're revealing an assumption: that privacy is a tool for concealment, something only criminals or the paranoid need. That assumption is wrong. Here's why it matters right now.

I started Noctis because I got tired of watching people I knew hand over their data without realizing what they were agreeing to. Not because they were careless. Because the systems collecting it were designed to be invisible. The permissions are buried. The data flows are opaque. The consequences show up years later, in ways you can't trace back to the checkbox you ticked in 2019. This isn't paranoia. It's what these companies actually do, and it's documented.

//01. Privacy Is Not About Secrecy

People who say they have nothing to hide routinely close the bathroom door. They don't read their email aloud in public. They use passwords. They don't let strangers read their medical records. None of that is because they're doing anything wrong. It's because they understand, instinctively, that some information is theirs. The content of your thoughts, your medical history, your relationship problems, your financial situation: you share these selectively, with people you trust, in contexts where sharing makes sense. That selective control is not secrecy. It's the basic mechanics of being a person.

Privacy is about power. Who gets to know what about you, when, under what circumstances, and for what purpose. The moment that control shifts from you to someone else, your relationship to every institution changes. You become legible to systems that were not built with your interests in mind.

Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.

Edward Snowden — yes, you've seen this quote before. It still lands.

//02. The Chilling Effect Is Real

Surveillance changes behavior. This is not speculation. It is one of the most replicated findings in behavioral research. When people know they are being watched, they self-censor. They become more conformist. They avoid topics that might be misinterpreted. They stop asking certain questions. They don't read certain things. They don't say what they actually think.

In 2013, PEN America surveyed writers after the Snowden disclosures. One in six said they had avoided writing or speaking about certain topics because of surveillance concerns. These are professional communicators whose job is to engage with difficult ideas. If surveillance affects them, it affects everyone who does anything publicly or digitally.

This isn't hypothetical. It's already happening: the active suppression of thought, speech, and association, produced not by law but by the awareness of being watched. A society under comprehensive surveillance is not a free society, even if no law has changed.

//03. What Palantir Actually Does

In early 2025, the Department of Government Efficiency began negotiating access to federal data systems across agencies including the Social Security Administration, the Department of Health and Human Services, and the IRS. Palantir, the data analytics company founded by Peter Thiel, was brought in to build the integration infrastructure.

Palantir's core product is data fusion: it takes information from disparate sources (medical records, financial records, immigration records, law enforcement databases, social media) and builds unified profiles. It is used by intelligence agencies, law enforcement, and immigration enforcement. Its software is what ICE uses to locate and track undocumented immigrants, what the UK's NHS used to build patient profiles during COVID, and what multiple police departments use for predictive policing.

The access requested under DOGE represented something different in scale: the potential for a single integrated view of the financial, medical, and personal records of nearly every American citizen, managed by a private company with government contracts and no public accountability framework. Federal employees who raised legal objections were dismissed. Court orders slowed some of the access. The fight is not over.

Why this matters to you specifically

You don't have to be undocumented, politically active, or under investigation to be affected by this kind of infrastructure. You just have to exist in a database. Everyone with a Social Security number, a tax return, a Medicare claim, or a federal student loan does.

//04. Data Collected Today Can Be Used Against You Tomorrow

This is the argument that cuts through most of the 'nothing to hide' thinking. Even if you accept that you have done nothing wrong by current standards, you cannot know what will be considered wrong in the future, or by whom, or under what legal framework.

Location data sold by data brokers to advertising networks in 2019 was purchased by law enforcement agencies in 2022 to identify people who had attended abortion clinics in states where abortion became illegal. The people whose location data was sold had done nothing wrong in 2019. The legal landscape shifted. The data was permanent. It was already in the broker's database.

Clearview AI scraped billions of photos from social media and built a facial recognition database used by over 3,000 law enforcement agencies. The people whose faces are in that database never consented to it being built. Most of them don't know their face is searchable. The photos were public. The aggregation and deployment for law enforcement was not something they anticipated when they posted a photo in 2014.

Think about who this affects in practice. A gay person in a country that hasn't criminalized homosexuality yet. A journalist whose sources become politically inconvenient. Someone who attended a protest, donated to a cause, or asked their doctor a question that gets flagged by an algorithm they'll never see. Someone whose medical history gets used by an insurer to make a decision they're told is just policy. The data was always there. The rules shifted.

The problem with 'nothing to hide' is that it assumes the rules are fixed and the interpreter is neutral. Neither has ever been true in human history.

//05. Privacy Is Legally Recognized as a Human Right

This is not an opinion. Article 12 of the Universal Declaration of Human Rights, adopted in 1948, prohibits arbitrary interference with a person's privacy, family, home, or correspondence. Article 8 of the European Convention on Human Rights guarantees the right to respect for private life. The UN General Assembly has passed multiple resolutions affirming that the same rights people have offline must be protected online.

The EU's General Data Protection Regulation, widely misunderstood as a technical compliance framework, is fundamentally a human rights instrument. It proceeds from the premise that personal data belongs to the person it describes, not to whoever collected it. That's a meaningful legal commitment, even if enforcement is uneven.

The US has no comprehensive federal privacy law. That's not an accident. It's by design. The legal protections that do exist are sector-specific, fragmented, and inconsistently enforced. The companies whose business models depend on collecting your data have spent decades lobbying to keep it that way. It has worked.

//06. The Corporate Surveillance Layer

Government surveillance is the most dramatic version of the problem, but it isn't the only one. The infrastructure of commercial surveillance is older, larger, and in many ways more intimate.

Google processes roughly 8.5 billion searches per day. Those searches are a direct log of what people are thinking about, worried about, curious about, and planning. They are tied to accounts that also hold email, calendar, location history, voice recordings, and document contents. Meta has built behavioral models on over 3 billion people. Those models are used to predict and influence behavior at scale, originally for advertising, but the capability exists for any purpose the company or a government chooses to apply it to.

The data broker industry, which most people have never interacted with directly, trades profiles on the overwhelming majority of American adults. These profiles include inferred political beliefs, health conditions, financial stress indicators, religious affiliation, and relationship status, assembled without your knowledge or consent from the residue of your digital activity. These databases are purchased by advertisers, employers, landlords, insurers, and law enforcement.

The aggregation problem

No single piece of information about you is particularly sensitive. Your name is public. Your employer is on LinkedIn. Your neighborhood is guessable from your area code. Your gym is tagged in an Instagram post. Individually, none of it matters. Combined, these fragments create a profile more detailed than anything you've ever deliberately shared with anyone. Surveillance works through aggregation, not through individual disclosures.
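The mechanics of aggregation are easy to demonstrate. Here is a toy sketch (all data invented, the attribute counts chosen arbitrarily): a town of 10,000 people, where each of three fragments, employer, neighborhood, gym, is individually shared by hundreds of people, yet their intersection identifies exactly one person.

```python
# Toy illustration of the aggregation problem. Each fragment alone leaves
# you hidden in a crowd of hundreds; the three together pick out one person.
# All data here is invented.

population = [
    {"id": i,
     "employer":     f"Company{i % 37}",   # 37 employers in this toy town
     "neighborhood": f"Area{i % 25}",      # 25 neighborhoods
     "gym":          f"Gym{i % 12}"}       # 12 gyms
    for i in range(10_000)
]

def matching(people, **fragments):
    """Everyone consistent with the fragments known so far."""
    return [p for p in people
            if all(p[key] == value for key, value in fragments.items())]

target = population[1234]  # the person being profiled

# Each fragment alone leaves a large anonymity set...
print(len(matching(population, employer=target["employer"])))          # 270
print(len(matching(population, neighborhood=target["neighborhood"])))  # 400
print(len(matching(population, gym=target["gym"])))                    # 833

# ...but the intersection of all three is a single person.
combined = matching(population,
                    employer=target["employer"],
                    neighborhood=target["neighborhood"],
                    gym=target["gym"])
print(len(combined), combined[0]["id"])  # 1 1234
```

This is the same effect behind the well-known finding that ZIP code, birth date, and sex alone re-identify most Americans: quasi-identifiers multiply, and anonymity sets shrink fast.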

//07. Privacy Protects Everyone, Especially the Vulnerable

Privacy protections are not equally valuable to everyone. They are most valuable to the people with the most to lose from exposure: abuse survivors who don't want to be found, LGBTQ people in unsupportive families or hostile jurisdictions, immigrants navigating uncertain legal status, activists and journalists in countries where dissent is prosecuted, people with stigmatized medical conditions, anyone belonging to a group that is currently, or could in the future be, targeted.

When you argue against privacy protections because you personally don't need them, you are arguing that the people who need them most should go without. The logic only holds if you assume your circumstances represent the universal case. They don't.

//08. What Actually Helps

Privacy is not binary. You won't go from fully surveilled to invisible overnight, and you don't need to. What you can do is reduce the surface area. Make it harder to build a detailed profile. Move your most sensitive communications off platforms that mine them. Run an operating system that isn't built on advertising revenue.

Start with Signal for conversations that matter. It collects almost no metadata and encrypts everything end-to-end by default. That's not a setting you enable. It's just how it works. Then look at your phone's OS. Your search history, your location, your app usage patterns: most of that is collected at the operating system level, not the app level, which means swapping apps doesn't actually fix it. GrapheneOS removes Google from that layer entirely. It's the only mobile OS that does this while still running on hardware with a meaningful security foundation.

Get a real VPN, not one you found through a YouTube ad. Mullvad doesn't ask for your email address when you sign up. In 2023, Swedish police showed up with a warrant and left with nothing, because there was nothing to take. That's the kind of proof that matters. Use unique passwords everywhere and a password manager like Bitwarden to keep track of them. Credential reuse is how profiles get linked across breaches. And if you want to go further, look into opting out of data broker databases. It takes a few hours and it's worth it.
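The cross-breach linking problem is concrete enough to sketch. In this invented example (fake accounts, fake services), two unrelated breach dumps contain accounts with different usernames, but a reused password, stored here as an unsalted hash, as older leaked databases often are, ties them to the same person:

```python
# Sketch of why password reuse links profiles across breaches.
# Two invented dumps from unrelated services: usernames differ, but a
# shared recovery email plus an identical (unsalted) password hash is
# enough to merge the two accounts into one profile.
import hashlib

def weak_hash(pw: str) -> str:
    # Unsalted MD5, still common in older leaked databases.
    return hashlib.md5(pw.encode()).hexdigest()

forum_breach = [
    {"user": "night_owl_88", "email": "alex@example.com",
     "pw_hash": weak_hash("hunter2")},
    {"user": "gardenfan",    "email": "sam@example.com",
     "pw_hash": weak_hash("sunshine1")},
]

shop_breach = [
    {"user": "alex.r",  "email": "alex@example.com",
     "pw_hash": weak_hash("hunter2")},      # same password, reused
    {"user": "samular", "email": "sam@example.com",
     "pw_hash": weak_hash("tr0ub4dor&3")},  # unique password: no link
]

# Link accounts whose email AND password hash match across dumps.
links = [(a["user"], b["user"])
         for a in forum_breach for b in shop_breach
         if a["email"] == b["email"] and a["pw_hash"] == b["pw_hash"]]
print(links)  # [('night_owl_88', 'alex.r')]
```

A password manager breaks exactly this join: with a unique random password per site, no two dumps ever share a hash, and the accounts stay unlinked.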

None of this requires you to believe you're being targeted. It requires you to understand that you are a node in a data collection system that was not built for your benefit, and that the cost shows up later, in ways you won't be able to trace back to the moment you handed it over.

* * *

Privacy is not about hiding. It is about maintaining the conditions under which a person can think freely, associate freely, and live without being permanently legible to institutions with interests that may not align with their own. That is not a niche concern for the paranoid. It is the basic infrastructure of a free life. And it is eroding, quickly, in ways that are visible if you choose to look.

ready to act

Take back your phone.

Every device ships pre-configured with GrapheneOS, bootloader locked, and verified boot enabled. Just turn it on.