Hacker News

If any of those governments were attempting to track down accomplices to a known mass murderer and terrorist??? At least a little part of me would support it.



And when China or Russia or Syria decides that an American is a mass murderer or a terrorist? Do you want them to compel Apple to unlock Francis Gary Powers' iPhone?

I'll be honest, if somehow Apple could sign a software update that merely disabled encryption for phones belonging to actual mass murderers and terrorists, and didn't work on the rest of our phones, I'd be supportive. It's just that there's no possible way for such a thing to exist. The best you can do is a software update that works on everyone's phone, in the hands of people who promise to only use it on mass murderers and terrorists. That's far riskier.


And when China or Russia or Syria decides that an American is a mass murderer or a terrorist? Do you want them to compel Apple to unlock Francis Gary Powers' iPhone?

In this case we're talking about someone who walked into an office party and killed 14 people. So ya, I think it's in all of our best interests if we see what's on his phone. Just like I think it's in our best interests if we see what was in his house or in his car. Or do you think we should have just locked the front door to his apartment and waited for the movers to come take everything away?

So if your question is "is there ever a case in which compelling Apple to unlock a phone is OK" my answer is "you bet your ass". There is a right way to do this. It involves warrants and transparency (not secret FISA nonsense). It should involve oversight and be an extraordinary step (just as searching someone's home is). But if we catch one terrorist on his way to blow up downtown San Francisco, well it'd sure be handy to have a way to know that his buddies are on their way to blow up Seattle too.

Now I know exactly what's coming, because it always comes, so I'm just going to head it off now. No, I don't support the unlimited power to search people's phones. No, I don't support the idea of implanting remotely exploitable backdoors into phone operating systems to make this process easy for the government. No, I'm not a government shill. No, I'm not suggesting that we trade liberty for security. And no, I'm not arguing that if you have nothing to hide you shouldn't care about privacy.

NOWHERE in my argument did I take any of those positions. I'm arguing that when the technology exists and the circumstances call for it, backed by transparent functions outlined in the constitution, we should be able to get at data that might save actual lives.

I think China, Russia, or Syria should have that same right. I think that if Apple is going to distribute their phones in those countries they should play by whatever rules those countries have. If Apple really doesn't like it, don't cash the checks.


The FBI isn't asking for Apple to give them the update to apply themselves. They're asking for Apple to apply the update. To brute force another phone, the FBI will have to go back to Apple.


Sure, but the legal precedent will make this a rubber stamp in future, and will mean that Apple can't reasonably resist such orders from other governments.


The precedent is already set (the FBI can demand information about a person from a company).

In the future, Apple could make a phone that is actually secure. For example, requiring the phone to be unlocked before updating firmware.
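The unlock-before-update proposal can be sketched in a few lines. This is a hypothetical model, not how any real iPhone works; the class, passcode, and version strings are all illustrative:

```python
# Hypothetical sketch: a device whose firmware-update path refuses any
# new image unless the user has already unlocked it with the passcode.

class SecureDevice:
    def __init__(self) -> None:
        self.unlocked = False
        self.firmware = "v1"

    def unlock(self, passcode: str) -> None:
        # Illustrative passcode check; a real device rate-limits and
        # verifies against a key derived inside the secure enclave.
        if passcode == "0000":
            self.unlocked = True

    def apply_update(self, image: str) -> bool:
        if not self.unlocked:
            return False  # locked phones reject all updates
        self.firmware = image
        return True

phone = SecureDevice()
print(phone.apply_update("v2"))  # False: still locked
phone.unlock("0000")
print(phone.apply_update("v2"))  # True: update accepted after unlock
```

Under this design, even a vendor-signed update could not be pushed onto a locked phone, which is exactly the property the FBI's request depends on not existing.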


What are you talking about? Apple has no information about Farouk.


What are you talking about? Where did I say Apple has information about Farouk?


You said that the FBI can demand information about a person from a company. What person are we talking about?


This case does not establish any precedent. This is a well-established power that has already been used to access iCloud data and extract unencrypted data off iPhones running older versions of iOS.


Almost all legal experts who don't work for the FBI who have expressed an opinion on this disagree with you.


Such as? For an example to the contrary, see rayiner's comments on this issue. The only people I'm seeing that think this case is precedent-setting are tech bloggers who don't know any better and lawyers who were given a short time to respond and didn't know what the FBI was actually requesting.

The only restriction on this law that Apple could claim (aside from legally out-there options like equating writing software to forced speech) is "unreasonable burden," but this is something that almost any of us could hack up in a day given access to Apple's release build system.


This simply isn't true.

if (deviceUid == 12345) { openBackDoor(); }

Cryptographically sign code. Done. We know it's secure and will only work on device 12345 because we trust cryptography.
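The claim above can be sketched concretely. This is a minimal illustration, using a symmetric HMAC as a stand-in for Apple's real asymmetric code signing (the key, function names, and firmware string are all placeholders): the signature covers both the firmware image and the target device UID, so the same signed image fails verification on any other device.

```python
import hashlib
import hmac

# Placeholder secret standing in for the vendor's private signing key.
SIGNING_KEY = b"vendor-private-signing-key"

def sign_update(firmware: bytes, device_uid: int) -> bytes:
    # Sign over (firmware || uid): the UID is baked into the signed message.
    message = firmware + device_uid.to_bytes(8, "big")
    return hmac.new(SIGNING_KEY, message, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes, device_uid: int) -> bool:
    # Each device recomputes the signature using its OWN uid; a mismatch
    # in either the firmware or the uid makes verification fail.
    message = firmware + device_uid.to_bytes(8, "big")
    expected = hmac.new(SIGNING_KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(signature, expected)

firmware = b"unlock-assist-build"
sig = sign_update(firmware, device_uid=12345)

print(device_accepts(firmware, sig, device_uid=12345))  # True: target device
print(device_accepts(firmware, sig, device_uid=99999))  # False: any other device
```

Of course, this only constrains where the image runs, not whether the vendor can be compelled to sign another image for a different UID tomorrow, which is the counterargument the thread keeps circling.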


There could be an exclusion in the EULA: this software may not be used in the furtherance of terrorist, antisocial, or illegal activities. Or any activities that might be deemed illegal in the future.

Then the EULA police would have the ability to really nail them, like Al Capone for tax evasion.



