Hacker News

> You say "somebody" would need to already know a lot about the user to make use of your database. But from my perspective (the perspective of the user), that "somebody" could just as well be your company.

Well yes, noticing problems in these databases and providing consulting towards their mitigation (ideally before the pain shows up) is more or less what we're selling. Some degree of analysis must remain possible because otherwise it would be unclear what the user is paying us for.

My point is just that different situations call for different privacy postures. "Anonymization = Untrustworthy" glosses over this in a way that doesn't help us find ways to improve the situation.




> My point is just that different situations call for different privacy postures.

True. Privacy is a special case of security, and it's only useful to talk about security in terms of what threats are of concern.

> "Anonymization = Untrustworthy" glosses over this in a way that doesn't help us find ways to improve the situation.

If your concern is to be as anonymous as possible, then "Anonymization = untrustworthy" is not an unreasonable stance. The only way I know of that data can be collected and handled in an anonymous way is to aggregate the collected data immediately and discard all individualized data.
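As a minimal sketch of the aggregate-and-discard approach (hypothetical event names and fields; not any particular product's pipeline), the idea is to fold each record into a running aggregate at ingestion time and never store the individualized data at all:

```python
from collections import Counter

def record_event(counts: Counter, event: dict) -> None:
    # Only the coarse category is folded into the aggregate; the
    # user-identifying fields (user_id here) are read once, used for
    # nothing, and never written anywhere.
    counts[event["category"]] += 1

counts = Counter()
for event in [
    {"user_id": "alice", "category": "login"},
    {"user_id": "bob", "category": "login"},
    {"user_id": "alice", "category": "purchase"},
]:
    record_event(counts, event)

print(counts["login"])  # the aggregate survives; who logged in does not
```

Once the loop finishes, there is nothing left to de-anonymize: the only retained state is the count per category.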

Replacing data with something like a hash is useful in many ways, but for the purpose of actual anonymity it isn't much help.
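To illustrate why hashing is pseudonymization rather than anonymization, consider this sketch (assuming a toy "555-NNNN" phone-number space for brevity): an unsalted hash is deterministic, which is exactly what makes it useful for joins and deduplication, and exactly what makes it reversible when the input space is small enough to enumerate.

```python
import hashlib

def pseudonymize(phone: str) -> str:
    # Unsalted SHA-256 of the identifier: deterministic, so the same
    # person always maps to the same token (handy for joins/dedup).
    return hashlib.sha256(phone.encode()).hexdigest()

token = pseudonymize("555-0123")

# But phone numbers are low-entropy: anyone holding the hashed data can
# enumerate the whole input space and invert every token by brute force.
rainbow = {pseudonymize(f"555-{n:04d}"): f"555-{n:04d}" for n in range(10000)}
print(rainbow[token])  # -> "555-0123"
```

Salting helps only if the salt is kept from the attacker; if the data holder is itself the party you distrust, the hash buys the user essentially nothing.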


If their goal is to be as anonymous as possible, why did they give you the data in the first place? If they're aware of your practices and still in the loop, then presumably their relationship with you is such that some degree of trade-off is acceptable.

I think the better focus for the article would be transparency. I didn't write the script because I was conforming to some well-defined policy about what's in bounds and what's out. I wrote it because somewhere in my heart I felt a vague duty to improve the situation. I'm getting a little pushback on my defense of the script (which is fair), but it would be better to scrutinize the policies under which I had to ask my gut about it in the first place.


> If their goal is to be as anonymous as possible, why did they give you the data in the first place?

Often, it's because there is no other choice. When I'm revealing personal data to companies and such, that's usually why. Very few companies that I'm aware of are actually trustworthy on these matters, but I can't live without doing business with many of them regardless.

I absolutely do decide to reveal data to some companies that are optional, though. Like with all security matters, there's a cost/benefit calculation to be done here.

> I wrote it because somewhere in my heart I felt a vague duty to improve the situation.

Which is entirely laudable, and I think a worthwhile activity. It is, however, an effort that mitigates some risk rather than making things safe. People like the EFF are focused more on the goal of safety. It's almost certainly not a goal that can be fully achieved, but I'm glad that they're shooting that high anyway.


I understand your point and I agree with it within its scope, but I want to reinforce a point often missed by data handlers: you're talking about the trade-offs involved in doing the thing, whereas we (the users) are saying, don't do the thing. Anonymization may be a spectrum of trade-offs, but for the side the EFF represents here, the two issues are: 1) companies do the thing that makes anonymization a meaningful concept in the first place, and then 2) they use one end of the spectrum to sell it to the public while actually sitting on the opposite end of that spectrum.





