Every single time something claims to be untraceable or anonymous, it seems to hold for a while, and then there's a "<X> is not as safe as we thought" headline.
Seems to be a good rule to just not trust anything.
Or assume the government has unlimited resources to throw at a problem, and not commit crimes.
Most of these headlines describe something that would require nation-state resources to crack. If you wanna hide paying for something legal, it's probably still sufficient.
“Not commit crimes” is a good maxim when you’re living in a liberal democracy (and even then …). It’s less easy when the state outlaws things unjustly. Like being gay. Or being (a)religious. And to give just two examples that apply to (otherwise) liberal democracies, most people would include “personal, recreational drug use” in the same category. And there are Western democracies that outlaw certain sexual acts between consenting adults (e.g. Germany, which outlaws any incest, even between consenting adult siblings), which also rubs many people the wrong way.
In sum, “not [committing] crimes” isn’t always straightforward.
I was with you every step of the way, then suddenly you were defending incest and I was like “how the hell did I end up here?”
Also, while “many” is a subjective term, I don’t think you’re using it correctly here. The idea that you should not regulate sex between consenting adults in general is very popular, but most countries carve out exceptions for sex work. Not saying they should, but many do. And I definitely think the incest legalization lobby must be very, very small, even though it probably has a sympathizer in the Oval Office.
> then suddenly you were defending incest and I was like “how the hell did I end up here?”
… Which nicely illustrates what different people find acceptable. But I agree that sex work would have been a better example. Either way, you felt it necessary to add the qualifier “in general”, and many (…) people, though certainly a relative minority, would fundamentally disagree with this qualifier (while talking about consenting adults without a power imbalance in their relationship).
If you take the US as an example (I assume you're American), then did you know that the US Supreme Court doesn't actually know how many laws apply at a particular time to a particular person[1]? The US code contains an immense number of laws that are all equally legally binding, but are nowhere near as "obvious" as "don't murder people". If a US Supreme Court justice cannot be sure what laws apply in any given case, what chance does a layperson have of understanding the tens of thousands of federal crimes in the US?
For instance, 16 U.S.C. § 3372 (the Lacey Act):
> It is unlawful for any person [...] to import, export, transport, sell, receive, acquire, or purchase any fish or wildlife or plant taken, possessed, transported, or sold in violation of any law, treaty, or regulation of the United States or in violation of any Indian tribal law
So if you have ever bought or been gifted a fish, animal or plant that at any point broke Indian tribal law (even if you didn't know about it, even if it wasn't the law where you received it, and even if the plant or fish is legally farmed and sold in another area) you have broken a federal law and you're now a criminal.
In short, "don't commit crimes" is not as simple as you may think in all cases. There are even cases where the US government has retroactively applied new laws (in violation of the charter of human rights) to things that were not crimes at the time, such as "copyright infringement" for a work that used to be in the public domain. Very few people have been tried under these strange laws, but the acts they cover are still just as illegal as more common crimes.
OK, but this is theory. Can you point to any practical examples of people who were charged with a crime under the Lacey Act for a common, harmless transaction?
As with crypto for personal communications or mobile devices, I think it's reasonable for people to expect things claiming to be secure to be secure, or to more clearly explain the attack surface area. Someone using a technology should be able to determine what sort of protection they have if they use it for something opposed by a nation state.
As for using Monero to break laws, it's probably pretty low risk to buy some MDMA to have a very good weekend, and fairly high risk to receive millions of dollars in payment for a ransomware attack against a government.
There's a place I often visit in Shoreditch called Ziferblat[0], and I think that they have something of a solution to this (they have other places too).
Essentially, you are charged for your time and everything else is free. Presently I think it's about 7p per minute for the first hour, and then 4p a minute afterwards - there's also a cap.
While in there, coffee, snacks, etc, are all "free".
Perhaps this sort of model could work? Obviously if you're not willing to kick out all the remote workers. I guess this turns it into more of a coworking space though.
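For the curious, that billing model is simple to write down. A minimal sketch, assuming the rates above; the cap amount is a placeholder I made up, not Ziferblat's actual figure:

```python
def ziferblat_charge(minutes: int, cap_pence: int = 1500) -> int:
    """Time-based billing: 7p/min for the first hour, 4p/min after,
    capped at cap_pence (the cap value here is a guess)."""
    first_hour = min(minutes, 60) * 7
    overtime = max(minutes - 60, 0) * 4
    return min(first_hour + overtime, cap_pence)
```

So a quick half-hour coffee costs £2.10, while a long remote-working session runs into the cap rather than growing forever.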
This doesn't seem like it's a solution to the problem, it's just a different type of venue with the opposite problem. If I go in and grab a coffee and a crumpet they're not going to be very happy.
Pretty much. Perhaps billing per minute if you're going to stay for longer than an hour, or per item if you're staying for less. Though that seems overly complex and a logistical nightmare.
The attacker isn’t a nation-state adversary. The attacker is a vengeful ex. The chance of such an attacker even being aware of what hashing is is close to zero.
Whoever designed this system is probably heavy on security, but low on product.
And when the attacker Googles "how to upload images banned by Facebook" won't they find that article from that one vengeful ex who understood the technology well enough to build a tool?
This is in general an unsolvable problem, and the only possible security is through obscurity. Images live on a spectrum, so in order to hash an image you must quantize that spectrum. Between any two images there is a continuum of mixed images. Using bisection, you can identify the point where the mix goes from matching to not matching, then upload the non-matching variant. Because of the way hashing works, this will be indistinguishable from any other non-matching image. This creates an easy method to bypass Facebook's filter while sacrificing minimal quality. As long as Facebook matches against hashes, there is no way to prevent this kind of bypass. All they can do is make it as difficult as possible.
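The bisection attack described above can be sketched as a toy model. `matches` here is a stand-in for the platform's hash check (in reality the attacker only observes whether an upload is blocked), and "images" are just lists of pixel values:

```python
import random

def blend(banned, cover, alpha):
    """Pixel-wise mix: alpha=1 gives the banned image, alpha=0 the cover."""
    return [alpha * b + (1 - alpha) * c for b, c in zip(banned, cover)]

def find_boundary(banned, cover, matches, iters=30):
    """Bisect on the blend weight to find where the filter stops matching.

    Invariant: blend at lo evades the filter, blend at hi is caught.
    """
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if matches(blend(banned, cover, mid)):
            hi = mid  # still caught: move toward the cover image
        else:
            lo = mid  # evades: move toward the banned image
    return lo  # closest tested blend to the banned image that slips through

# Toy demo: the "hash" just thresholds mean pixel distance to the banned image.
rng = random.Random(0)
banned = [rng.random() for _ in range(64)]
cover = [rng.random() for _ in range(64)]
matches = lambda img: sum(abs(a - b) for a, b in zip(img, banned)) / 64 < 0.1
alpha = find_boundary(banned, cover, matches)
```

Each iteration needs only one upload attempt, so ~30 tries pinpoint the boundary to a billionth of the blend range, which is the "minimal quality sacrifice" point.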
> We use obscurity every day and its a completely valid layer of security.
Not sure I agree with that; most of the time when we do that, it's because we don't want to spend the time to have better security. And then we get burned. Take your example of the $100 bill: at my parents' home with the car parked in the garage? No problem at all. Out on the street in SF? No. I don't trust my glass enough as a security measure. But then I don't leave money in the car at all, which is not security through obscurity.
But we are going OT. The problem being raised is that they necessarily need security through obscurity. And we have two problems:
- How really robust are these algorithms? How long before we will see people abusing them?
- Have you thought hard enough about how this system could work? E.g.: have partial hashing done client-side and the final step on the server? Or a situation where the server code is open-sourced without the model used to calculate the hash? That would allow for external review without disclosing the hash. Yes, you still need guarantees that Facebook is actually running that code, but you could have a trusted third party certifying the program.
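A minimal sketch of what that client/server split could look like. Both function names are hypothetical: `client_partial_hash` stands in for a real perceptual hash (a plain SHA-256 here, which a real system would not use), and the keyed server step is my assumption about how you might keep the full matching function secret even with the server code open-sourced:

```python
import hashlib
import hmac

def client_partial_hash(image_bytes: bytes) -> bytes:
    """Client-side step: a public digest of the image, computable by
    anyone. (A stand-in for a real perceptual hash.)"""
    return hashlib.sha256(image_bytes).digest()

def server_final_hash(partial: bytes, secret_key: bytes) -> str:
    """Server-side step: keyed finalization with HMAC, so the complete
    matching function stays secret even if this code is published."""
    return hmac.new(secret_key, partial, hashlib.sha256).hexdigest()
```

Under this split, reviewers can audit both functions, but without the server's key they cannot precompute which images will match, which is roughly the "review without disclosing the hash" property being asked for.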
My point is, the person who designed this program didn't really understand the problem. The problem is not revenge porn. The problem is Facebook's reputation. And this solution is totally tone-deaf to that.
Facebook is pretty low on the list of companies I would trust. On the other hand, we know from Snowden that NSA employees with the highest level of security clearance used their ability to intercept everything to stalk their exes and pass around their exes' nude photos. Clearly nobody can be trusted, and as you write, a solution to this must incorporate that fact.