It's not literacy. They don't care. They need control, and if establishing control means increased risks for you, it's not something they see as a negative factor. It's your problem, not theirs.
The government wrote restrictions into the Investigatory Powers Act against using certain powers to spy on members of parliament (unless the Prime Minister signs off, section 26), so I think they're just oblivious to the threat model of "when hackers are involved, the computer has no way of knowing the order wasn't legal".
No, it shows they're thinking of computers like they think of police officers.
Computer literacy 101: to err is human, to really foul up requires a computer.
They don't understand that by requiring this capability in order to go after domestic criminals, they've given a huge gift to their international adversaries' intelligence agencies. (And given this is about a computer vulnerability, "international adversaries" includes terrorists, and possibly disgruntled teenagers, not just governments.)
They understand. Signal Foundation's president, Meredith Whittaker, among many other tech leaders, has made it abundantly clear to both the UK and the EU.
I personally campaigned at the time the law was being debated. Met my local MP, even.
If I'd known about the idea of "inferential gap" at the time, my own effort might not have been completely ignored… though probably still wouldn't have changed the end result as I still don't know how to show lawmakers that their model of how computers and software functions has led to a law that exposed them, personally, to hostile actors.
How do you even explain to people with zero computer lessons that adding a new access mechanism increases the attack surface and makes hacking easier?
The politicians seem to see computers as magic boxes, presumably in much the same way and for much the same reason that I see Westminster debates and PMQs as 650 people who never grew out of tipsy university debating society life.
(And regardless of whether it is fair for me to see them that way, that makes it hard to find the right combination of words to change their minds).
> How do you even explain to people with zero computer lessons that adding a new access mechanism increases the attack surface and makes hacking easier?
You literally tell them that. That's it. As prominent tech leaders have been doing. They either choose to believe the experts or disbelieve them. Or they could go get a CS degree. They chose option #2: they ostensibly disbelieve the experts because what they're hearing does not mesh with what they want.
But let's be honest with ourselves; it's not that they disbelieve them, or don't understand. It's that they don't care. You are giving these people way too much benefit of the doubt. They have the tools at their disposal to remove any ignorance.
> You literally tell them that. That's it. As prominent tech leaders have been doing.
As it's not working, QED it's not "that's it".
> You are giving these people way too much of a benefit of the doubt.
They're hurting their own interests in the process. If they were just hurting my interests, I'd agree with you. But this stuff increases the risk to themselves, directly. I may have even told them about https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-0204 given the timing.
> Neither is underestimating your enemy or making excuses for their behavior.
Indeed. I do neither, which is why I left the UK.
It would be underestimating them indeed to have remained there — I foresaw, even then, that a story equivalent to this very headline would eventually emerge.
And it would also be over-estimating myself to think that I could change them after the Act when I could not change them before the Bill.
Absolutely not. MPs are not too stupid to process the concept of “a back door is a back door”; they simply want this power and do not care about the security or privacy of non-MPs. Everyone who voted for this needs to be thrown out of politics, but that will obviously not happen.
They don't even need control. They want control. Why? Either they're idiots who think they need control or they are tyrants who know they'll need control later on when they start doing seriously tyrannical things.
It's natural for the government to want control. It's literally what it is optimized for: control. More control is always better than less control. More data about subjects is always better than less data. What if they do something that we don't want them doing and we don't know? It's scary. We need more control.
> they'll need control later on when they start doing seriously tyrannical things.
You mean like when they start jailing people for social media posts? Or when they are going to ban kitchen knives? Or when they're going to hide a massive gang rape scandal because it makes them look bad? Or when they would convict 900+ people on false charges of fraud because they couldn't admit their computer system was broken? Come on, we all know this is not possible.
I used to think it was illiteracy, but when you hear politicians talk about this you realise that more often than not they're not completely naive and can speak to the concerns people have. Fundamentally, their calculation is that privacy doesn't really matter that much, and when your argument against breaking encryption is based around the right to privacy, you're not going to convince them to care.
You see a similar thing in the UK (and Europe generally) with freedom of speech. Politicians here understand why freedom of speech is important and why some people oppose blasphemy laws, but that doesn't mean you can just burn a bible in the UK without being arrested for a hate crime, because fundamentally our politicians (and most people in the UK) believe freedom from offence is more important than freedom of speech.
When values are misaligned (safety > privacy) you can't win arguments by simply appealing to the importance of privacy or freedom of speech. UK values are very authoritarian these days.
Well, it’s important that the argument is correct. They view ending end-to-end encryption as a way to restore the effectiveness of traditional warrants. It isn’t necessarily about mass surveillance, and the implementation could prevent mass surveillance but allow warrants.
I oppose that because end-to-end encryption remains available to anyone with something to hide; it is trivial to implement. I think governments should just take the L in the interest of freedom.
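To put some weight behind "trivial": here's a minimal sketch of my own (using the off-the-shelf PyNaCl library; the key is assumed to be exchanged out of band, and nothing here is from any real product), which is roughly the whole job for two parties who already share a key:

```python
# Symmetric end-to-end encryption in a handful of lines with PyNaCl.
# The shared key never touches whatever service carries the message.
import nacl.secret
import nacl.utils

# Both parties hold this key; nobody else does.
key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)

# Sender encrypts on their own device...
ciphertext = nacl.secret.SecretBox(key).encrypt(b"meet at the usual place")

# ...and only someone holding the same key can recover the plaintext.
assert nacl.secret.SecretBox(key).decrypt(ciphertext) == b"meet at the usual place"
```

Banning that from mainstream messengers only removes it from the people who were never the target in the first place.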
> They view ending end-to-end encryption as a way to restore the effectiveness of traditional warrants.
Traditional warrants couldn't retroactively capture historical realtime communications because that stuff wasn't traditionally recorded to begin with.
> It isn’t necessarily about mass surveillance and the implementation could prevent mass surveillance but allow warrants.
The implementation that allows this is the one where executing a warrant has a high inherent cost, e.g. because they have to physically plant a bug on the device. If you can tap any device from the server then you can tap every device from the server (and so can anyone who can compromise the server).
They shouldn’t be able to tap any device from a server. I’m guessing they would have to apply for a warrant and serve it to Apple, who would review the warrant and provide the data.
Putting the panopticon server in a building that says Apple or Microsoft at the entrance hasn't solved anything. Corporations are hardly more trustworthy than the government, can be coerced into doing the mass surveillance under gag orders, could be doing it for themselves without telling anyone, and would still be maintaining servers with access to everything that could be compromised by organized crime or foreign governments.
Which is why the clients have to be doing the encryption themselves in a documented way that establishes the server can't be doing that.
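Concretely, the shape that makes that claim checkable looks something like this sketch of mine (PyNaCl again; the "relay" list is a stand-in for any server, not anyone's actual protocol):

```python
# Key pairs are generated on the devices; private keys never leave them.
# The relay only ever stores opaque ciphertext, so a warrant served on its
# operator can yield nothing but ciphertext.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()   # lives only on Alice's device
bob_sk = PrivateKey.generate()     # lives only on Bob's device

relay_mailbox = []  # everything the server ever sees

# Alice encrypts on-device to Bob's public key, then uploads the blob.
relay_mailbox.append(bytes(Box(alice_sk, bob_sk.public_key).encrypt(b"hello")))

# Bob downloads the blob and decrypts on-device with his private key.
assert Box(bob_sk, alice_sk.public_key).decrypt(relay_mailbox.pop()) == b"hello"
```

Whether the shipped client actually behaves like this is something published client code and audits can establish; "we review warrants carefully" is not.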