>You should engage with the arguments the other side makes.
The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes them is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
For what it's worth, the anti-encryption/anti-privacy laws have caught terrorists in the UK. My company provides data storage for their dragnet and handles various requests, and I've seen first-hand four different instances where the UK gov watching everyone's internet activity led to terrorists being caught.
This number by itself means nothing as the other variables are unknown.
How many terrorists were not caught by these systems?
How many would have actually done these actions instead of just talking about it? How many could have been caught with just standard police work?
Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I wouldn't go as far as saying it means nothing, but I agree that the story certainly isn't simple. Was just pointing out that "catch terrorists" isn't a purely emotional argument. Would the terrorists have been caught anyway? We'll never know, but there's no way you can say for certain that they would. Personally I don't think catching a few terrorists is worth giving up privacy, but other people disagree.
> Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I don't think we can ever figure this out, since no one is willing to run an RCT when it comes to counter-terrorism.
> anti-encryption/anti-privacy laws have caught terrorists
This is undoubtedly so; but much turns on trust in government. In the U.S., the president, himself a documented profligate liar, just invited an equally untrustworthy unelected person into the halls of government to vacuum up whatever data he pleased. Maybe trust in the UK government is higher.
Collecting data is often not the problem. The problem is how to evaluate it and use it to direct the use of finite law enforcement or counterintelligence resources.
But to your point, let's not forget congressional Republicans rushing a SCIF on Capitol Hill with their mobile devices out, in clear violation of policy (and common sense). I am relieved by the fact that Trump and Musk do not seem to understand what they can use sensitive information for (other than perhaps to sell or give away to foreign governments and businesses).
I think my point is that good intelligence comes from stitching together numerous data points, and often traffic analysis is as good as (or better than) content analysis. And maybe that the overwhelming majority of elected officials have no conception of how intelligence is collected and evaluated.
Low hanging fruit. The smart ones likely aren't being caught now.
Moreover, it's only a matter of time until the criminal fraternity all catch up and are on the same wavelength. That's when all but the dumbest know exactly what not to do or say on the net.
The Internet is still comparatively young, and like everyone else, those with evil intent are still learning exactly how it works. I'd bet money that it won't be long before a 'bestseller tome' of definitive what-not-to-dos circulates amongst this mob.
The question is at what level will law enforcement's catch have to fall before it has to turn to other methods.
Let's not ignore the full history here. That is a bad faith argument. It was a crime to use strong encryption 30 years ago, but a lot of decisions were made to allow it. Today, every single one of those old caveats about child porn, drugs, money laundering, terrorism (both domestic and international), and criminal acts in general all have stories where weaker encryption would have saved hundreds and hundreds of lives. We have to recognize this or we're just arguing past each other.
I'm not sure how this answers my question at all.
How many whistleblowers would have been killed without a secure way to blow the whistle? How many journalists and journalist sources would have been killed? Etc. These people aren't using the USPS for good reason.
Point being, you are only doing one side of your calculation and presenting it as a full argument. But it's just a bad argument unless you calculate both sides.
That's a different question. You asked how many people would have been harmed if weaker/no encryption was the standard. The USPS is a message system where federal employees are able to intercept suspicious content, and there is no built-in encryption for mail. Voting by mail is a great example of how a critical message can be sent without relying on encryption. Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
I don't really want to hash this same thing out for the... At least hundredth time. You're not going to convince me, I'm not going to convince you, and we'll both just leave less happy if we keep going.
>Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
The only thing I want to highlight for your consideration is that the USA is not the entire world. The USPS, even if it were perfect, does not exist in the overwhelming majority of the world. People talk to people across borders.
(Also, with some of the proposed laws, encrypting the USB would be illegal)
And no service offering encryption has existed since 1775? Because that is required for your argument. Otherwise you simply send unimportant stuff via USPS and sensitive/secret/important stuff via non-USPS.
My vote, my taxes, my REAL-ID driver license, passport, credit cards, phone SIM, checks, 401k statements, etc. have very recently been sent via USPS. Do you consider this unimportant stuff?
A bit of a nit-pick. 30 years ago was 1995. It was not a crime to use PGP in the US in 1995. What Zimmermann was investigated for was exporting the encryption technology (or allowing it to export itself by putting the code on an anonymous FTP server). The Bernstein case was similar in that it was the export of the machine-readable code the government objected to, not its domestic distribution. The right of researchers to publish text describing algorithms had earlier been recognized by the court (which is why Martin Gardner could describe the RSA cryptosystem in Scientific American in 1977).
>> You should engage with the arguments the other side makes.
> The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
> Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes those arguments forth is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
Oh come on. Why do you think "purely emotional arguments" are illegitimate? Are you some galaxy brain, coldly observing humanity from some ivory tower constructed of pure software?
Nearly all positions people take are, at their core, "emotional." And the disagreements that result in "arguments" are often really about differing values and priorities. You might value your "freedom" more than anything and are willing to tolerate a lot of bad stuff to preserve strong encryption, some other guy might be so bothered by child sexual abuse that he wants to give it no encrypted corner to hide in. You're both being emotional.
Those are both reasoned arguments. The emotional argument would be "some guy is so bothered by sexual abuse that he wants to ban lightbulbs because he once heard about a lightbulb in the context of an abuse case". The "solution" is not really a solution, but the emotional person doesn't really care about solutions; he's too emotional to think straight.
> The GP was using "emotional" to dismiss the kind of arguments
I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening. Such as the three examples I gave, which are, in almost every case, used to win an argument without having to consider any possible nuance of the situation.
Often, it's a completely thought-stopping appeal, because everything is simply countered with "so you don't care about children". Or, in your case, subtly alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
Apparently that makes me galaxy-brained or whatever, though. ¯\_(ツ)_/¯.
> I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening.
My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
> Or, in your case, subtly alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
That's not what I was doing. I was giving an example to show it's a trade-off driven by priorities and values. But if you want to be super-logical about it, supporting strong privacy-preserving uses of encryption necessarily implies a certain level of tolerance for CSAM, unless you support other draconian measures that are incompatible with privacy. Privacy can be used for good and bad.
>My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
There is a distinct difference between a person having emotions while arguing, and using an appeal to emotion as a rhetorical tactic. I do not agree that "pretty much all arguments" contain an appeal to emotion (again, as a purposeful fallacious rhetorical tactic), even though all arguments obviously will have people feeling some sort of emotion.
Even looking through this entire thread, most of the disagreements here do not contain appeals to emotions.
I'm sure that any book on logic and rhetoric from the last few centuries would explain it better than I can. The wiki page has some good explanations and examples as well.
> Clearly the pressure on government to write these laws is coming from somewhere
Software surveillance vendors.
> Chat control: EU Ombudsman criticises revolving door between Europol and chat control tech lobbyist Thorn
> Breyer welcomes the outcome: “When a former Europol employee sells their internal knowledge and contacts for the purpose of lobbying personally known EU Commission staff, this is exactly what must be prevented. Since the revelation of ‘Chatcontrol-Gate,’ we know that the EU’s chat control proposal is ultimately a product of lobbying by an international surveillance-industrial complex. To ensure this never happens again, the surveillance lobbying swamp must be drained.”
The problem is LEOs (and associated industry) claiming that enforcement is impossible without the ability to obtain cleartext.
This is a lie: obtaining cleartext just makes enforcement vastly easier and more scalable. If crims have encrypted mobile phones, you can still point a microphone at them.
Honestly, I had always assumed LEO wanted access to decrypted message content so they could sell it to advertisers. I mean sure, you could catch a criminal or two, but with all that non-criminal data, just imagine how much off-the-books revenue you could accrue by selling it to the AdWords guys.
The other side being, for instance, the surveillance lobby that pushes for chat control laws in the EU? The "arguments the other side makes" are pretty clear at this point, and have little to do with "think of the children", really; not sure engaging with them is the point.
> Something is a crime if society determines that it should be so. Nothing more.
According to The New Oxford Companion to Law, the term crime does not, in modern criminal law, have any simple and universally accepted definition.
Society also determined it was ok to use a firehose on black people, so I think the best we can say is that the term Crime has nothing to do with Morality, and people who conflate the two need to be looked at with suspicion.
> You should engage with the arguments the other side makes.
I don't. I think most arguments about crime require one side to act in bad faith. After all, the author doesn't actually mean that encryption isn't a crime in some jurisdictions; they mean that it shouldn't be. You know this. I know this. And yet you really think someone needs your tautological definition of crime? I don't believe you.
The arguments are mostly that they dislike what can be accomplished via math. “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia” isn't exactly an 'argument' so much as an insistence.
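To make the "it's just math" point concrete: the underlying operation can be a few lines of standard-library code. Here's a toy one-time pad (not a practical tool, and no substitute for vetted software like GPG; the function names are just for illustration), which is information-theoretically secure when the key is truly random and never reused:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR each byte with a fresh random key of equal length.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at the usual place"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message  # round-trips back to the original
```

No law can make XOR stop working; all a ban can regulate is who is allowed to ship or use software that does it.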
The article does address the flaws in some of their arguments (encryption inconveniences law enforcement, think of the children) by pointing out that the average person and children are kept safe from criminal elements by encryption.
The arguments from the other side are of the "think of the children" and "tough on crime" variety. They are purely emotional and if you try to dispute them they just respond with "so you don't care about children?". It's like trying to argue with a religious person on matters of faith, you're just not very likely to convince them.
Clearly the pressure on government to write these laws is coming from somewhere. You should engage with the arguments the other side makes.