
Data privacy: how to counter the "I have nothing to hide" argument?

I know data privacy is important, and I know that big corporations like Meta have become powerful enough to manipulate even elections using our data.

But when I talk to people in general, most don't seem to worry because they "have nothing to hide"; they're only concerned about their passwords and banking apps, and not much else.

So, why should people worry about data privacy even if they have "nothing to hide"?

307 comments
  • One thing I often see is people not understanding the difference between secrecy and privacy. They ask why it matters if you're not doing anything wrong. A UK government minister actually said "if you have nothing to hide, you have nothing to fear", then backpedaled when someone pointed out they were quoting Joseph Goebbels. The analogy I've seen is simple: I'm sure you don't do anything illegal in the shower, but I'm also pretty sure most people would be uncomfortable with a law that required a police officer to stand in your bathroom with a video camera and record you showering, just in case.

    The other thing is the assumption that any information the government has about you will only be used against you if you commit a crime, in which case you'll deserve it - and if you're not a bad person then it's fine. This is a double fallacy.

    First, we've seen that information can be used to do all sorts of things regardless of wrongdoing - if someone knows enough about you, they can use it to manipulate you. I don't mean blackmail or the like, although that's an option. I mean that with a clear enough picture of your preferences, biases, and habits, someone can tailor their actions and information to your psychology and steer you toward whatever they want you to believe.

    Second, it assumes that you won't ever commit a crime because crimes are bad things and you're not a bad person. This overlooks the possibility of you being mistakenly accused while innocent, but more importantly it overlooks the possibility that the government will change into something that holds different moral values to yours. Even in the modern world we've seen places outlaw abortions, or criminalise homosexuality, or pass laws on what religions you're allowed to follow. If that happens in your country and you find yourself on the wrong side of whatever arbitrary line they've now drawn, you may regret giving them so much information about you - information that lets them identify you, prove that you broke their new rules, and ruin your life in so many ways.

    The default principle of any exchange with governments, businesses, or any other entity taking your information should be to give as much information as is required for them to perform the operation you're requesting, and no more - and, wherever possible, to engage only with entities you trust to hold that information; a trust they earn through a verified and unbroken track record of ethical and trustworthy behaviour.

  • You may have nothing to hide now, but what if your (political) opponents reach a point where they have access to your data and the (political) power to use it? What happens if they don't like the opinions you (think you) don't have to hide now?

    My opinions mostly align with the current general consensus in my country, and since I'm not politically active I am rather unlikely to be harmed for them in the foreseeable future (unless I call someone "1 Pimmel" - German internet slang for "a dick"). But there are certain troubling developments, and there are people who don't like what I've said on the internet (duh). Now, I'm not exactly anyone important, and realistically there are far more important targets than me personally. But still, it's not unthinkable that the things I've said (things I've looked at on the internet, things I've bought, things I've liked/upvoted) might be used to my detriment if certain people came into a position where they had access to any stored data on me.

    This applies regardless of your political leanings. If data exists, no matter how harmless it may seem, there's always the possibility of people who REALLY don't like it getting access.

  • Maybe you don't think you have anything to hide today, but what about the future? Millions of women gave period-tracking apps deeply personal data while Roe was in effect, because at the time states couldn't use it to prosecute women who miscarry or get abortions. Now that Roe is gone, that data is out there and can't be recalled.

    By the same token, everyone who went out and got a 23andMe genetic test gave their genome to a private company that can legally sell that information to insurance companies, which can use it to hike premiums or terminate policies if they think your genes predispose you to some expensive-to-treat condition. Also, those family trees don't lie about whose kids are the product of adultery, hahahahaha

    You do have things to hide in the sense that they're nobody else's business.

    Also, some countries have established digital privacy as a right (in particular, EU countries), and that's not just about protecting your dirty stinky secrets - it's also about preventing social media from being weaponized as a vector for political or information warfare based on private information obtained without your consent. (The same profiling used to target relevant commercial ads at you can also be used to target information warfare and propaganda at your susceptible relatives, and they vote in addition to giving racist rants at holiday dinner.)

    In other words, your privacy is intrinsically valuable - if it weren't, exploiting your private information wouldn't be a multi-billion-dollar industry.

  • They are appealing to the fallacy that hiding things means bad behavior.

    Not true. There are plenty of good reasons to hide things. Social security numbers, income, bank account info, even personal preferences.

    Privacy != bad

  • I mean, if you have nothing to hide, then surely you don't need window blinds or a bedroom door? It should also mean that it's okay for guests to rifle through your closet and dresser drawers, right?

  • Nothing to hide until a person has something to hide. An attitude of "I don't have anything to hide" may catch up to a person. No one knows what the future holds. One day they might start tracking private information a person does not want tracked, for example financial or medical data. So better to put the fence up now than try to put it up during a stampede.

    Personally, I keep my data private with a reasonable amount of effort. I try to keep a small internet footprint, and there's stuff I won't do for the sake of privacy. For some years the only social media I engaged with was Reddit, until I came here to Lemmy. These are anonymous mediums. It blows my mind that so many people are willing to completely splay out their lives non-anonymously on social media.

  • "Having nothing to hide" sounds like worrying about getting in trouble from data. But you can also get yourself and others into trouble being tracked or manipulated without consent.

    A big problem is that data does not usually go away (even if you erase or delete it, or forget you shared it).

    Any data you reveal can build up over time. The more data available on you, the easier it is to triangulate it and find you specifically (there's a small sketch of how that works at the end of this comment).

    And patterns emerge over time. More data on your habits makes it easier to predict what you do, and easier to manipulate you - not just with advertisements or insurance rates, I mean outright scams. For example, my grandfather got conned out of $5k by a scammer who could impersonate my cousin based on the cousin's Facebook, LinkedIn, and public records.

    We also have very little insight into how much data we generate. Especially online, we can't imagine the amount of logged activity and data generated. This makes it hard to meaningfully say "I don't have a problem with how somebody uses my data" because we can't even grasp the scale of the data and how it can be used.

    I also second another poster who mentioned that you may not have anything to hide now, but times change. You can't go back and protect data once it's used against you! I have firsthand experience with that in Texas, USA. I worked with a company that realized in July 2022 that they should NOT record whether people were pregnant in a huge database. We didn't want to hold data on pregnancies that might not work out, for whatever reason, because in Texas it could be used against people.
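
    To make the "triangulation" point concrete, here's a minimal sketch of a linkage attack in Python. It joins two datasets that each look harmless on their own; every name, field, and record is hypothetical, invented purely for illustration:

        # Sketch of a linkage attack: two "harmless" datasets re-identify
        # people once joined on shared quasi-identifiers (ZIP, birth date, sex).
        # All names, fields, and records below are made up for illustration.

        # "Anonymized" purchase history leaked by a hypothetical retailer:
        purchases = [
            {"zip": "02139", "birth": "1985-03-14", "sex": "F", "item": "pregnancy test"},
            {"zip": "60601", "birth": "1990-07-02", "sex": "M", "item": "garden hose"},
        ]

        # Public voter roll with names attached (hypothetical records):
        voters = [
            {"zip": "02139", "birth": "1985-03-14", "sex": "F", "name": "Jane Doe"},
            {"zip": "60601", "birth": "1990-07-02", "sex": "M", "name": "John Roe"},
        ]

        # Join on the quasi-identifiers; each extra attribute shrinks the set
        # of possible matches until only one person is left.
        index = {(v["zip"], v["birth"], v["sex"]): v["name"] for v in voters}
        for p in purchases:
            name = index.get((p["zip"], p["birth"], p["sex"]))
            if name:
                print(f'{name} bought: {p["item"]}')

    The point isn't the code itself - it's that each dataset alone reveals little, while the join reveals exactly who bought what.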

  • As A Cypherpunk's Manifesto says:

    Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know. Privacy is the power to selectively reveal oneself to the world.

    ... An anonymous transaction system is not a secret transaction system. An anonymous system empowers individuals to reveal their identity when desired and only when desired; this is the essence of privacy.

    People can desire privacy for privacy's sake. Wanting privacy doesn't necessarily mean they're criminals who need anonymity or secrecy to cover up illegal/immoral acts; it just means they're human.

    For an offline example, consider that you're a cis girl in a women's locker room. Everyone there knows you have certain body parts, so you have nothing to "hide" in that sense, but you still don't want to be stared at as you peel off your swimsuit.
