Why Apple Is Right to Challenge an Order to Help the F.B.I. (nytimes.com)
418 points by doe88 on Feb 19, 2016 | 311 comments



From here (4th para from the end): http://www.nytimes.com/2016/02/19/technology/how-tim-cook-be...

"Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity."

If this is true, it sort-of implies that Apple would have done it, but secretly, and they were forced to take their very public stance because of the FBI's posturing.


I don't think so. Apple would prefer not to fight this particular battle over this particular case in the public sphere, but the FBI would. The general principle Apple is fighting for here is very important, but the FBI knows this is the weakest point in that principle: the locked phone of a dead terrorist who killed several people. Politically and possibly legally, this is a winner for the FBI. Technically it's a loser if you believe strong encryption is important at all. Apple wants to win the political argument, so it would not want this request to be made public. Anyway, why should it be, when next to no other such requests are made public?

Anyway, in a legal battle, you contest every relevant point. The fact that they contested it doesn't mean they would have behaved differently after, except that they wouldn't have had to make a public statement under less than ideal circumstances.


Are you suggesting the court would take public opinion under consideration as being at all relevant to the law? That's very worrying to me, considering how important it's been in the past for the court to ignore public opinion in favor of justice.


There's also a longer term play - no matter what the court rules, if public reaction is strong enough then elected representatives will be under pressure to change the law.


If judges were the type to rule solely on provable facts and rigid logic, they probably would have been better off pursuing a degree other than Juris Doctor.

Judges absolutely factor public opinion into their rulings, especially where they must periodically stand for public retention-or-dismissal votes on election day.


Are any federal judges elected? I was under the impression they are all appointed and can't be removed without impeachment and conviction.


Of course they do. They shouldn't, but the law doesn't work that way. In some cases the court may even decide to enshrine the public's opinion as a valid tool for determining legality (such as what speech counts as obscenity).


I didn't mean the courts (although that's an interesting discussion). The political play I mean is that the FBI is trying to get Congress to change the laws surrounding this topic.


Public opinion is plainly part of the definition of justice. Cultural norms are frequently cited in decisions.


Cultural norms are not the same thing as public opinion. Or do you want the FBI to run attack ads against Apple? Your tax dollars at work taking care of that pesky "public opinion".


Out of curiosity, does anyone know how much Apple would be compensated for their time and effort to help the FBI? I assume they would be compensated somehow. But how much? I worry that this is going to be a way for the government to get free (or greatly cost reduced) resources to do its own work.


The article states that this occurred after Apple turned over all information about the phone that resided in iCloud. Which means that the government already has access to whatever data it wanted in the phone that was iCloud synced, and is misrepresenting its case as Apple stonewalling (while Apple is downplaying its willingness to cooperate for PR reasons).


I don't think "downplaying it's [sic] willingness to cooperate" is at all a fair summary.

They run iButt which is an entirely different matter as far as complying with subpoenas goes, especially when the subject of the subpoena is dead and the subject of an investigation.

What are they going to do? Tell the FBI and a judge that they're refusing to grant them access to unencrypted photos taken by the dead subject of a mass shooting investigation? There's no way they can do that.

It's quite another thing to be forced to develop exploits for hardware that's someone else's property just because you're the manufacturer of that hardware.


iButt? Seriously? Do you also still type M$?


Sorry about that. Like aylons suggested in a side comment, I have the "Cloud To Butt Plus" extension installed. This is the first time it has gotten me into trouble.

FWIW it doesn't change the submitted text directly, but I edited a spelling error in my comment which changed "iCloud" to "iButt".

There's also an extension that replaces "God" with "Nicolas Cage" that I recommend :)


Haha, that was confusing. I also have the extension, and I can't tell who is actually using Butt and who isn't in this thread.


That's really most of the fun you get from the extension. You're never sure whether you're reading the landing page for some new data storage gimmick or a novel new sex toy.


That's very very funny - carry on


He probably has the Cloud to Butt extension[1], and it changed its text before submitting:

[1] https://github.com/panicsteve/cloud-to-butt


I assumed that was a reference to (or maybe caused by? I don't know if the extension affects textareas) Cloud to Butt Plus.

https://chrome.google.com/webstore/detail/cloud-to-butt-plus...


I'm actually a little surprised that the FBI would go so public with the request? Anyone have any insights here?

Did they think perhaps there would be public outcry the other direction meaning Apple would look like it was harboring terrorist information if they didn't comply to a public request?


> Did they think perhaps there would be public outcry the other direction meaning Apple would look like it was harboring terrorist information if they didn't comply to a public request?

This is exactly what is happening. It's a political Hail Mary to get the easiest-to-see use case of forcing privacy violations on American corporations and users legally protected in court precedent. Of course, under this interpretation, there's the disturbing implication that they had to wait this long (after the revelation about PRISM) to find a case that so well matched their political agenda.

It's also a little worrying that they are going public, because normally that's not necessary for court verdicts. Perhaps they're preparing for legislative action if the case fails?


>a case that so well matched their political agenda.

It's almost perfect, isn't it? It's prima facie reasonable, there's no immediate privacy violation, Apple - by the looks of it - could comply fairly trivially, and failure is sure to lead to attempts to legislate. Don't like the motivation, but you have to give them points for planning and execution.


The same sort of play was made after the Oklahoma City bombing:

https://www.philzimmermann.com/EN/essays/WhyIWrotePGP.html


They do. Such is the case with the "patriot" act, which was prepared and waiting for a convenient event to justify the power grab. http://www.globalissues.org/article/342/the-usa-patriot-act-...


Any insight into which presidential candidates have the most privacy-friendly views here?


That are left?

Bernie has probably the strongest record, being both for net neutrality and against mass surveillance and the Patriot Act. I'd recommend taking a look through his policies, as they are so well thought out in the cases he addresses. He doesn't address everything. On the other side of the fence, Rand Paul was the last sane voice (in terms of privacy), explicitly coming out in favor of net neutrality and strong, backdoor-less encryption.

This is in stark contrast to Clinton—well, just google "clinton privacy" and see the massive number of problems she's had to deal with that demonstrate her technological eagerness seems to outweigh her understanding. She's certainly been an eager advocate of EXACTLY the type of action we're seeing here, arguing that we need backdoors for law enforcement. Marco Rubio has explicitly advocated protecting the existing mass surveillance techniques.

I would be highly surprised if Trump or Cruz were for privacy—the former advocated for special IDs, the latter doesn't seem to have a great grasp on technology, much less how to avoid using it to bludgeon people[1].

1: https://twitter.com/sentedcruz/status/531834493922189313


H. Clinton has a very good record on defending her own privacy. Thus, I find it hard to believe that her record in terms of the privacy of the public stems from ignorance or misunderstanding. She ran her own secured, private e-mail server while serving as a major public official, after all, whether the technical details were delegated to underlings or not.

I very much doubt that any establishment-backed candidates in either of the establishment parties would be willing to make any public statements directly applicable to privacy. They are far more likely to come in soft from the other side, and chat up "tools to combat terrorism and other forms of extremism," while eliding over the implication that organized domestic protest groups like Black Lives Matter, Occupy Wherever, or "Y'All-Qaeda" militiamen would also be targeted, in addition to overseas groups.

I'd rather not elaborate further on anyone specific, as I don't want to shove my political biases down any throats on HN.


I don't know why you're being down-voted. The US is in the midst of an election cycle that may very well pertain to the "legislative action" of the parent comment (insofar as the executive has an effect on the legislature). A recent article [0] states (summarizing):

Pro-Privacy: Cruz (R), Paul (R), Sanders (D)

Pro-Surveillance: Trump (R), Rubio (R), Bush (R)

Moderate: Carson (R), Clinton (D)

[0] https://nakedsecurity.sophos.com/2016/02/02/where-do-us-pres...


The evidence cited for Cruz is a little weird; he's come out against NSA surveillance, but he's also for forcing Apple to comply with the court order to backdoor the phone. He is a bit of a constitutional nerd, though, so I suspect he's more similar to Obama in this respect than you might think, considering how different they are in most respects. I'd put him on the moderate side, or at least as a wild card.

Also, I would put Clinton in the rabidly anti-privacy camp if just for this stance she took after San Bernardino: http://www.nbcnews.com/tech/security/could-hillary-clinton-s.... She clearly has no desire for privacy from governments except through legal constraints. Furthermore, her understanding of technology, as demonstrated by her stances on her email servers, on encryption, and on video games, is outweighed by her desire to either (ab)use technology regardless, or attempt to intervene in it. Not great for an executive, which you would hope would be conservative (slow to move) OR knowledgeable OR defer to a strong cabinet.


> I don't know why you're being down-voted.

Off topic, but I am going to make a Hacker News bot that looks for this phrase, waits 1 hour and sees if the comment is still downvoted.

Or, I'm going to write a copy-and-paste paragraph explaining to people why it appears that a comment has been downvoted. This would include all the mechanics involved (how karma works on HN, voting algorithms, mod flagging, time passed, and so on) as well as the human psychology at play. So, the above phrase is either honest or dishonest.

For example, in this case I would posit that yes, you actually do know why they were being downvoted, but that by writing that you were actually saying "I disagree with the others who downvoted you". There is a difference.

It could be similar to when people say "I don't know why anyone voted for Bush". They are not expressing ignorance about the reasons; they are saying that they disagree with people's choices. In this case, it would be dishonest. Unless the person was actually ignorant of others (it is possible), in which case it is honest.

Off-topic, as mentioned.

Edits - I do know why I am being downvoted.


On topic reply to an off topic comment. :) Could it also be because people use "don't know" and "don't understand" to mean the same thing in many contexts? With "don't understand" being longer to type and speak and also being more "formal" in communication than "don't know", perhaps "don't know" is used as a substitute?

In the Bush example, someone saying "I don't know why anyone voted for Bush" probably means "I don't really understand why (and with what thought process) anyone voted for Bush after everything we know about the matter."


I don't know why this always works - at least on a large enough demographic - this strategy of being literal, candid and self-aware to leave the audience with an impression of total honesty and transparency, in contrast to the rest of the surrounding dishonest, double-speaking world.

Oh wait...


You're leaving out Gary Johnson, the libertarian candidate. He's already come out in full support of Apple and privacy.

In the recent GOP town hall, Cruz did not come across as supporting privacy or Apple at all. Rubio took the most measured stance and seemed at least to understand the issue the best. He did not support Apple, but said he understood that breaking into the phone would make it available to all and that the government needs to work more closely with SV for a solution. Of course that could just mean PRISM V2.


It's a political move. The FBI wants this capability for all iPhones. But this particular case is very difficult politically to argue with to the general public. The FBI realized they had a strong political force available to them, and they seized it. They are attempting to either force Apple's hand, make them give in to public pressure, or--best of all for the FBI--convince lawmakers to pass a law requiring companies to provide backdoors.

The FBI will gladly keep sealed its requests that Apple unlock phones in more ambiguous cases, because it would be easy for Apple to publicly refute. But this case is just too good for the people at the FBI who want keys to everyone's data.


>I'm actually a little surprised that the FBI would go so public with the request? Anyone have any insights here?

The simplest explanation is that the FBI wants a legal precedent more than they want the information on the phone. If they get precedent they can request a "hack" into a phone whenever they feel the need, legally. The worry in this case is the troubling precedent it would set.

https://lawfareblog.com/not-slippery-slope-jump-cliff

Quotation from "Not a Slippery Slope, but a Jump off the Cliff" by Nicholas Weaver:

The request to Apple is accurately paraphrased as "Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target's phone, cryptographically sign that malcode so the target's phone accepts it as legitimate, and run that customized version through the update mechanism". (I speak of malcode in the technical sense of "code designed to subvert a security protection or compromise the device", not in intent.)

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn't yet in law enforcement's hand. So the precedent the FBI seeks doesn't represent just "create and install malcode for this device in Law Enforcement possession" but rather "create and install malcode for this device".

Let us assume that the FBI wins in court and gains this precedent. This does indeed solve the "going dark" problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say "push out an update to this target". Once the target's device starts running the FBI's update then encryption no longer matters, even the much stronger security present in the latest Apple devices. So as long as the FBI identifies the target's devices before arrest there is no problem with any encryption. But at what cost?
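To make the "cryptographically sign that malcode" point concrete, here's a toy sketch of the trust model (using Apple's CryptoKit; the names and data are made up, and this is obviously not Apple's real boot chain). The device-side check only asks "does this image carry a valid vendor signature?", never "what does this image do?":

    import CryptoKit
    import Foundation

    // Stand-in for the vendor's signing key pair. On a real device only the
    // public key is present, baked into the trust store.
    let vendorKey = Curve25519.Signing.PrivateKey()
    let trustedPublicKey = vendorKey.publicKey

    // Device-side check: accept any firmware image whose signature verifies
    // against the embedded vendor key. Nothing here inspects the behavior
    // of the image itself.
    func deviceAccepts(firmware: Data, signature: Data) -> Bool {
        trustedPublicKey.isValidSignature(signature, for: firmware)
    }

    let normalUpdate = Data("ordinary iOS update".utf8)
    let weakenedBuild = Data("update with retry limits removed".utf8)

    // If the vendor (or anyone holding its key) signs the weakened build,
    // the device accepts it exactly as readily as a normal update.
    let sig1 = try! vendorKey.signature(for: normalUpdate)
    let sig2 = try! vendorKey.signature(for: weakenedBuild)
    print(deviceAccepts(firmware: normalUpdate, signature: sig1))   // true
    print(deviceAccepts(firmware: weakenedBuild, signature: sig2))  // true

Which is why the precedent, rather than any single build, is what matters: the signature machinery will bless whatever the vendor can be compelled to sign.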


Even if we ignore the privacy cost, there's an economic cost - no one wants pre-owned software :)


Unless you go full RMS, you are using pre-owned software. The fact that it is only legal precedent preventing this from happening shows the software is already pre-owned.


Establishing a Manichean media narrative framing Apple (and by extension all crypto providers) as enemies of America who hinder the righteous war on terror. Then if their legal case fails, public outrage, Something Must Be Done, legislative action rendering the AWA moot, achievement unlocked: all your phone are belong to US.

Or is that too tinfoil hat?


They might have; it's certainly been a theme in some comments by supporters of the unlocking. The public move may be more about setting clear legal precedent - if Apple is forced to do this, the government certainly will rely on this decision again in the future. You cannot set clear legal precedent under seal (outside the FISA court, anyway).


They think they can win it and/or they think it will cost Apple a significant enough hit to the stock price to make it painful not to bend over. The FBI people who pushed it this far are arrogant assholes, and I hope they end up being well-known assholes.


I took it to imply exactly that, but then I realised that I'm missing some important information about the US court system. Had the FBI 'issued its application under seal', would Apple have been able to appeal under the same reporting restrictions, thus initiating the same chain of legal process but without the messy publicity?

Either way it looks horribly like the feds just took away the neutral ground in crypto war 2.0


Except that this still relates to an older version of the technology. In the Dune series, Leto bred the Duncan Idaho ghola over many generations because, although some persons were invisible to prescience, he could still see their footprints - but the footprints of the ghola faded from sight. Apple has been working towards making it impossible to extract the keys, and I'm sure that in crypto war 3.1 they will work further towards this, since it appears that the Secure Enclave may still have some weaknesses (a firmware update doesn't wipe the keys).

Whether the regulatory mechanism allows them to create something proof against the regulators - now that's going to be an interesting battle. They are still a US company, after all.


Yes, a document or entire case being under seal doesn't affect the legal process itself, only what the public gets to see.


Are you suggesting that Apple would've done this particular hack provided it wouldn't set any precedent? If so that sounds quite sensible to my ears.


And that's why transparency is important.


The best way to put this for the other side is:

"How about we start with FBI mandated remote control gun disablers given that it's guns that killed these people? Oh, you're concerned that someone will figure out how to bypass it, and it won't just be the FBI disabling your gun?

Congratulations. You now understand my position."


Why don't we have remote take-over capability built into airplanes? People have certainly crashed them intentionally.

Why don't we have remote kill switches in cars? Think of the high-speed chases that could be prevented.

Why don't we have surveillance cameras in private homes? Think of all the crimes that could be solved.

Even if these are all off by default, and the law requires a court order to turn them on, Americans would not accept them. Encryption is no different, except that most Americans don't understand it well enough to see the parallels.

But these are just arguments against backdoors.

The even more fundamental argument is about the All Writs Act. Can the FBI require companies to develop new capabilities just with a court order? Shouldn't that require legislation? Why did we bother to pass CALEA, if the government could have just said "All Writs Act" and gotten what they wanted that way?


It is easier to understand if the same task is rephrased.

"Apple ordered by Chinese government to disable lock on phone seized from undercover CIA agent."

The act is the same. Being a multinational company, the result is no different for Apple. A government is legally requiring Apple to make decryption possible, but suddenly the people crying for Apple to unlock this phone will take an extremely different stance.


Everybody I've seen here saying that Apple should assist the FBI in unlocking the device takes the same stance in both cases. If Apple doesn't like it, they should either secure the device in such a way that it isn't easier for themselves to compromise than it is for a third party or stop doing business in that country if they believe the requests are unreasonable.


Unlike telephone operators affected by CALEA, Apple is free to develop a phone with security that is as hard for them to circumvent as it is for the FBI. When they do, the All Writs Act won't apply to them.


Your assumption that someone will understand your position based on rational arguments is flawed. It's flawed for the same reason we're currently arguing over whether Apple has the right to challenge whether the FBI can compel them/us to write software for the government so they can "protect us better" from both real and fictitious threats.

This type of reasoning leads to cognitive dissonance (believing two logically conflicting things at the same time) in the general population, which is caused by people applying double bind[1] arguments to conversations. Those double bind arguments end up creating additional cognitive dissonance in others and the vicious cycle is repeated, all the while being accelerated by our ever increasing connectedness. There's a going theory that these types of meme-based arguments are actually thought viruses[2].

One way to stop these types of arguments is to take a quiet moment and identify arguments or language structure that lean on speculative unknowns and which encourage equally speculative rhetoric which is dissonant in nature. In other words, let's all start to learn how to identify dissonant speech and reject it as a truth when we hear it. Let's tell others we think the conversation may be dissonant in nature to help stop the cycle.

One thing to consider while approaching these complex problems is setting the goal of removing rules of conduct as technology becomes more complicated. All this talk of creating new rules (and seeking new rulings to establish them) is creating an extremely complicated house of cards for all of us to thoughtfully manage. This is because we're basing our arguments on a fundamentally false premise: that it's the government's job to completely remove all suffering from society. That's actually not their job, nor is it achievable, so perhaps it's time to question why they think it is and figure out a solution that isn't in disagreement with itself.

[1] https://en.wikipedia.org/wiki/Double_bind

[2] https://en.wikipedia.org/wiki/Viruses_of_the_Mind


> Your assumption about someone understanding your position based on rational arguments is flawed.

I'm not attempting a rational argument. I am attempting to phrase the argument in terms that a set of authoritarian advocating individuals can relate to.

To most of them, their guns are more important than preventing terrorists from killing people. By driving home the fact that this precedent will eventually apply to their guns, you get their attention.

Otherwise, they just fluff it off as a "whining from a bunch of liberal, elite pansies".


It is a hard fact there is no "most of them" with which you can have this argument. You can't make a meta group of people holding the same thought virus "listen" to an argument that is an equivalent thought virus. That's why you find yourself simulating what others might say to you if you didn't say it just right:

> Otherwise, they just fluff it off as a "whining from a bunch of liberal, elite pansies".

You can't simulate what you might say and then simulate what they might say and make any of it make any sense (or be reasonably efficient in whatever systems you plug that stupid logic into).

Stop the cycle and solve the real problem. That's all I'm asking: https://www.youtube.com/watch?v=rE3j_RHkqJc


And to you, your encryption is more important than being able to come after the bad guys using it. Can I similarly argue that the precedent of government control over firearms will eventually apply to encryption?

Otherwise, you just fluff it off as "whining from a bunch of conservative, mouth-breathing gun nuts."


It's funny to me because I feel the same way about this as I do about mandatory "smart guns".

Basically, not only do I not trust the government but I don't trust that the information won't fall into even worse hands than theirs.


Given the original motivation for the Second Amendment, I think that even just the FBI being able to remotely disable guns would be pretty concerning to some.


I think a case should be made for a right to strong encryption based on the second amendment. If anything, encryption is far more relevant today to personal defense than gun ownership and does far more to protect your individual rights against both government excess and private attacks.

A gun won't stop a government from watching your every move by converting every device in your house into a surveillance device and then using "parallel construction" to persecute you if some political apparatchik doesn't like your views or opinions. A gun also won't stop a criminal from emptying your bank account or stealing your identity, attacks that can today steal far more from you than a home invasion. A gun won't stop a stalker from hacking your home devices to secretly record your kids playing, or hacking your e-mail account to spy on you, etc. Encryption on the other hand can stop all these things. Cyber-attacks today are potentially even more damaging than non-violent physical invasion (burglary, etc.).

Not only that, but we are very close (if not already there) to a world where you can kill someone by hacking a device. Self-driving cars, Internet connected buildings, etc., could all be used to conduct a homicide remotely. If, for example, the signing key to a car's firmware were to be leaked, then hackers could remotely flash a person's car to murder them on the highway and make it look like a hardware failure or a freak accident. A government could even do this and have almost no chance of detection. The opportunity for corruption both public and private is fairly large given how clandestinely these kinds of attacks could be conducted.

Crypto is the new "well armed militia" and the new right to personal self-defense.


The "well regulated militia" is still the same.

Crypto is just another weapon in the arsenal.


Um, exactly?


That's putting a backdoor into phones, which is not what's requested in this situation.

It's more like "the manufacturer of this safe has the ability to reprogram the safe to make it crackable, we should hand them the safe and demand that they reprogram it so we can access it".


The request is to provide alternative software that bypasses normal security features of the system; that qualifies as a backdoor:

https://en.wikipedia.org/wiki/Backdoor_%28computing%29


But they aren't asking Apple to install it on all phones. The example above is only appropriate if they were asked to put it on everyone's phones.


That doesn't matter! Apple's entire argument, which I completely agree with, is that if they make this firmware, then it exists. It doesn't matter that they were only instructed to put it on one phone; once it's out there, there's almost no way they can make sure that firmware doesn't get picked up by other people and placed on other phones, without users' consent, without court orders.

There's no such thing as one-off and then it disappears in the world of software. Once it exists in the world it's only a matter of time before everyone has it.


But it exists already! The fact that Apple can write a backdoor means there already is a back door. Apple is just refusing to open it, which I'm not sure they have a legal standing to do.

The fact that they would have to write some software or whatever is irrelevant. It's no different than if they already had a button they could press to make the phone hackable.


Does the potential existence of a backdoor justify its use against users?

Having to write software is very different than using existing software. Apple developed their software precisely to prevent situations like this---the backdoor is _undoing_ those developments.


>It doesn't matter that they were only instructed to put it on one phone; once it's out there, there's almost no way they can make sure that firmware doesn't get picked up by other people and placed on other phones, without users' consent, without court orders.

I already wrote a rebuttal of this argument in a different subthread. See https://news.ycombinator.com/item?id=11134257

If you have something to add to that, respond there.


And you are equally as wrong there.

China could steal Apple's key, but it doesn't immediately get them anything. They would also have to build up a completely parallel OS. That would be really expensive.

However, once Apple builds the malicious software, NOW you now have two very clear, very desirable targets.

Do you really think that China, Russia, etc. wouldn't bribe, steal, threaten, kidnap or blackmail their way to getting a copy of those?


Do you really think it would be that hard to modify the iOS software to make a phone hackable (by brute force, like the FBI plans) without the source? It's definitely doable. If China had Apple's keys, they could trivially do it.
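For scale: once the auto-erase and escalating delays are gone, the brute force itself is a back-of-the-envelope exercise. Apple's security documentation puts the hardware key-derivation cost at roughly 80 ms per passcode attempt (treat the figure as approximate), so a 4-digit numeric passcode falls in minutes:

    // Rough arithmetic only; the ~80 ms per-attempt figure is approximate.
    let passcodeSpace = 10_000.0      // 4-digit numeric passcodes
    let secondsPerAttempt = 0.08      // ~80 ms hardware-enforced key derivation
    let worstCase = passcodeSpace * secondsPerAttempt
    print("Worst case: \(worstCase / 60) minutes")   // ~13 minutes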


>They would also have to build up a completely parallel OS. That would be really expensive. However, once Apple builds the malicious software, NOW you now have two very clear, very desirable targets.

The same applies to the source code and key now.

The actual software can be locked to the specific phone, so having it wouldn't help. Only having the source code would help, but having the source of iOS would also be very easy to turn into a backdoored version for anyone with the resources to steal it.

Do you not think China etc are capable of modifying source code?

Edit: sure, modifying source code to change the phone ID is a bit simpler than changing source code to not erase the phone on an incorrect password. But the difference seems orders of magnitude smaller than the "getting source code and Apple private keys" difficulty level.


How? How does the software get locked to a specific phone?

And while it's certainly possible that someone could figure out what Apple's signing keys are, it's incredibly unlikely given how big the numbers we're dealing with are. Which means, sure, somebody else could modify the source code, but getting it on to devices is an entirely different question.

Based on what I've read of what you've written, it seems like you genuinely believe that software is some magic cure-all that can do anything and everything you want, regardless of the practicality and effort required, and actual known constraints.


>How? How does the software get locked to a specific phone?

They take the UDID of the phone, and refuse to run if it doesn't match the UDID of the iPhone 5C in question. This was specifically mentioned in the court order.
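Concretely, the device lock could be a check as simple as the following (a sketch only; the identifiers and function names are invented, not Apple's code):

    import Foundation

    // The one device named in the order (placeholder value).
    let authorizedUDID = "TARGET-IPHONE-5C-UDID"

    // Placeholder: on a real device this would be read from the hardware.
    func currentUDID() -> String {
        return "SOME-OTHER-DEVICE"
    }

    // Gate every security-bypassing code path behind the identity check,
    // so the build refuses to do anything useful on any other phone.
    func enablePasscodeBruteForceAssist() {
        guard currentUDID() == authorizedUDID else {
            print("UDID mismatch: leaving passcode protections in place.")
            return
        }
        print("UDID match: retry limits and delays disabled.")
    }

    enablePasscodeBruteForceAssist()

(As noted below, the real question is whether the signing key and the signed build stay put.)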

>And while it's certainly possible that someone could figure out what Apple's signing keys are, it's incredibly unlikely given how big the numbers we're dealing with are. Which means, sure, somebody else could modify the source code, but getting it on to devices is an entirely different question.

You seem to have misunderstood. The threat model is not someone figuring out the key, the threat model is someone stealing the key. And, as I've been saying all along, the attacker needs to steal the key regardless of whether Apple complies with the court order or not. Modifying the source is not enough.

>Based on what I've read of what you've written, it seems like you genuinely believe that software is some magic cure-all that can do anything and everything you want, regardless of the practicality and effort required, and actual known constraints.

I'm not sure what I've written that gives that impression. Which specific known constraint have I suggested software can get around? https://blog.trailofbits.com/2016/02/17/apple-can-comply-wit... makes it sound like it would not be difficult for Apple to comply.


I'm not sure which "other side" you're targeting with your argument, but as a gun rights advocate its' completely consistent to be a privacy rights advocate-- and basically a human rights advocate.

That it doesn't hurt another, do what thou wilt. Thus gay married farmers should be able to defend their marijuana crops with fully auto AK47s. Marijuana, guns and gay sex don't do bad things to society. Only people can misuse them.


> I'm not sure which "other side" you're targeting with your argument, but as a gun rights advocate it's completely consistent to be a privacy rights advocate-- and basically a human rights advocate.

I can't speak for the parent, but I'd imagine s/he has observed a correlation in that the political "right" seems to be more in favor of an encryption backdoor than the "left". The "right" is also unequivocally more enthusiastic in its defense of 2a rights.

To support this notion - at least insofar as being an indicator of public sentiment - I would point to the Republican primary field, which has almost unanimously spoken out in opposition to Apple's decision. Their two Democrat counterparts have thus far been more reserved about the matter.


From my perspective, most of the crowing I see about how Apple is terrible for not immediately capitulating is coming from the American right wing, a political bloc traditionally associated with gun rights.


What if the FBI asked a gun manufacturer to build a disabler for one gun the FBI already possesses? That's more analogous here.


This is a confusing point. At the moment, there is no software or tooling to do what the FBI wants, so the subpoena asks them to "develop new capabilities" in order to meet it. However, once they have done that for this one phone, they will no longer be able to say truthfully "We can't do that," and any time a phone needs to be unlocked the FBI will get a judge to issue a subpoena; if Apple doesn't comply, with the proof out there that they could comply, then they are in contempt.

It is that follow-on that is the back door. So now Apple is in the position where they have to unlock phones on demand, and not doing so will subject them to fines and imprisonment (you cannot easily appeal an order to do something in a case in which it is clear you have the capability to do it).

What is worse, if they are forced to comply and develop this capability, and then design the next generation of phone so that they once again don't have a capability to break into it, that case then is even harder. The FBI can sue Apple for intentionally obstructing justice.

Don't be fooled, this isn't a case about one phone and one set of people who shot a bunch of innocent people. This is very much a calculated strategy on the Justice Department's behalf to get back doors into phones and create legal precedents with which they can compel those back doors.

That will force the group that doesn't want back doors to get Congress to create a law that specifically forbids them. And the Justice Dept knows that would be very hard for folks to achieve.


But we already know they'd be lying if they said they can't "do that" now. That's why they aren't saying "we can't do that." The threat to privacy is no different whether Apple complies or not. No precedent would be set, because this isn't different from any other time the FBI has demanded documents about a person from a company.

I'm not a lawyer, so I don't know if a company has to comply when the FBI demands information for a person. But if they do, this shouldn't be seen as any different.


If Apple can build the software and tooling in question (and as far as I've seen nobody disputes that they can, the question is simply whether they will), then they are already unable to truthfully say "we can't do that."


Let's put it a slightly different way. Let's say you make safes for protecting valuables, and you make your safe so that any attempt to break into it without the combination would destroy the contents, or so that the only way to open it, like cutting into the steel with a torch, would destroy the contents.

The FBI can (and does) hire locksmiths and safe crackers to get into safes where they believe there is valuable information inside, but they don't compel the manufacturer of the safe to create a capability for bypassing its safeguards. If they did, criminals would seek it out so they could rob those safes, and customers would stop buying that safe, because they knew criminals could probably get their hands on the same tools the government could use to break in and defeat its countermeasures.

So if the government is able to order a safe maker to build a safe that can be broken into, the safe maker loses customers and business, because their customers don't want a safe that can be broken into; they want one that protects their valuables from everyone.


In this case, the government isn't ordering a safe maker to build a safe that can be broken into. The safe maker already built a safe that can be broken into. The government is just ordering the safe maker to take advantage of that fact and break into it.

Many people equate this to demanding that Apple create a security vulnerability, and I don't get it. The vulnerability is already there. The iPhone 5c is already insecure against an attack by Apple. The government is just ordering Apple to exploit that existing insecurity.

There are compelling problems with that too, but it's not the same as requiring Apple to build something insecure.


Perfect. To fix your analogy and make it more accurate: it's more like the safe-making company has the ability to make a device that can disable the self-destruct capability for a single safe.

If the safe company simply hadn't built in that capability, there would be nothing the FBI could demand of them, because it wouldn't be possible.


For the 5c and possibly the 6. What if they design the 7 so that they're not able to create the tooling? Would Apple be in violation of precedents set earlier?


I don't see how. "Break into this phone." "OK." Then later: "Break into this other phone." "Can't." How does that violate anything?


The fundamental question is whether a federal court can require a private company to do new work, to assist a federal investigation.

The answer now is no. Companies must provide access to information if they possess it, but that's not the same as writing new software to enable access to information they don't possess.

Once that Rubicon has been crossed, it is unclear what the legal test would be to limit the work that can be requested. If a court can require Apple to write some piece of software, why couldn't it require Apple to write some other piece of software? Or take any other affirmative action that might assist the FBI?

I know this might sound crazy but this is how common law works. Court decisions create precedents, which are often in the form of heuristics for constraining future decisions.

A very famous example is the Supreme Court precedent on abortion: before a fetus is viable, the government does not have a legal basis for considering it to have independent rights. Whether or not one agrees with it, the rule is clear.

Right now the rule is clear for what companies must do under court order. If the FBI succeeds in applying the All Writs Act in this case, that clear rule will be gone. What will take its place?

Edit to add a concrete example:

Apple announces that iOS 10 cannot be broken even by Apple. The FBI goes to court and says that they are tracking known terrorists who are using iPhones. If those terrorists get iOS 10 as announced, the FBI will be impeded. So Apple must alter iOS 10 to ensure that they can deliver retroactive access to phone contents in some way.

This would be the smartphone equivalent to CALEA, which requires network operators to build their networks in a way that permits wiretapping. The big difference is that CALEA was passed by Congress, whereas the FBI would only need a court order for smartphones!


I understand the idea of setting a precedent, I just don't see how the precedent set here could translate into preventing Apple from building secure devices.

The current request is for Apple to build a special version of their OS which can be loaded onto the suspect's phone. It seems to me that the precedent this would create would be for similar actions: providing custom software. The gulf between that and altering the public builds of iOS is vast, and I don't see how the former precedent could possibly lead to the latter. In fact, the order in question explicitly states that the software provided by Apple must be tied to the specific iPhone the FBI wants to crack.

The precedent of providing custom software for a specific device could very well be dangerous in and of itself. This comment does a good job of discussing the possibilities there:

https://news.ycombinator.com/item?id=11120036

In short, the precedent set here could very well lead to requiring Apple to turn on suspects' microphones remotely, or loading software that scrapes data as the user uses it. That could certainly be bad. But it's a huge leap from that to somehow preventing Apple from ever building and selling secure systems.

Edit: thinking about this a bit more, here's what I see as the really simple version of it all. The FBI's order is compelling Apple to build something, and the worry is that it sets a precedent for building more. But forcing Apple to build insecure devices in the future is not the same; that would be preventing Apple from building secure software and hardware, not compelling them to build something insecure.


The way I view it is that it sets the stage for the FBI to request from the court that Apple retain the ability to build software that can break into all future iPhones. IMO, that gulf is much smaller to cross than altering public builds in the future. It is more like, "don't add this feature."

Basically, the FBI can argue that Apple did this before and must keep doing it in the future. Making the device such that they cannot respond to a request could be viewed as going against the court. "I'd love to give you that document again, but now I burn them all so you cannot have them."


I don't understand why requiring an action which is feasible today implies that this action must be kept feasible in the future. If the FBI demands that you open a door for them, does that imply you're not allowed to brick up the entrance later on?


It implies that, because the power to require Apple to take action today is on the same side of the Rubicon as requiring them to take some other action in the future.

Once we permit the FBI to task private companies via court order, there's no way to predict how far that tasking will be permitted to go.

> If the FBI demands that you open a door for them, does that imply you're not allowed to brick up the entrance later on?

The FBI is not demanding that Apple open a door, it is demanding that Apple build a custom software system for them. If the FBI can tell Apple how to build software today, then the FBI can probably tell Apple how to build software tomorrow.


The FBI isn't quite telling Apple how to build software in general. They're trying to make Apple build a specific piece of software for the FBI.

If they do this today, they can probably do it tomorrow, yes. But "this" doesn't mean compromising all iPhone hardware or software. It means building a special copy of the software to attack already insecure hardware and software.

Let's say Apple makes the iPhone 7 completely invulnerable to attack even from Apple. How does this precedent play out? The FBI goes to Apple and says, "Do that thing you did before." Apple replies, "Sorry, but we literally can't do it for this phone." How does the existing precedent cause a problem?

This whole line of argument feels too much like, "The FBI can demand something, this is something, therefore the FBI can demand this." People keep saying there's a connection and I just don't see it. When I ask what it is, people just reiterate that it's there.


Let's say the FBI wins the first case.

Then they request a warrant that requires future iOS versions to permit similar access.

By what argument would Apple fight that warrant? The previous case would have already established that such access is necessary, that Apple can provide it, and that the court has the legal power to force them to.

In terms of the burden on Apple, the staff time burden to write software that will run on 50 million iPhones is no greater than software that will run on one iPhone. Code is code.

In terms of damage to the Apple brand, the FBI has already argued that that should not matter:

http://www.nytimes.com/2016/02/20/business/justice-departmen...


That warrant isn't targeted and affects Apple customers who are unrelated to any criminal investigation. I see no reason why it would be granted.

Consider more traditional warrants. The police ask for a warrant to search your house. They get that warrant. Then afterwards they ask for another warrant, this time to enter everybody's house. Does the precedent set by the first warrant make a judge likely to grant the second one? I don't think so. Certainly I've never heard of any such warrant being issued by an American court (excluding the ridiculous FISA court, anyway), even though police would surely love to have one, and the precedent of search warrants for individual houses is extremely long standing.


I don't buy that.

"By what argument would Apple fight that warrant?"

By what argument can Apple fight this current demand? They clearly can break into the phone.


Complying with any request for anything by the FBI costs money and time. There's no difference between that and making a trivial backdoor for one phone.


So this debate may end with the FBI asking Congress to pass a law, and Congress certainly will. :-(


The standard is "without undue burden".


How much is "undue burden"? In the past, the govt paid companies for ther wiretapping efforts. Would that make the burden less undue?


We need to find out who at the FBI is literally responsible for this request and require that they lose their job. (Impeach? How do you, as a society, demand that someone in the position to make this request be removed?)

Remember the attorney general who molded the torture stance for the Bush admin? John Yoo and Alberto Gonzales? We need to be able to make these people accountable and have them removed when they do stuff like this.

Those both are basically war criminals.


Which in turn created a template that can easily be adapted and called upon to unlock anyone else's "gun".


What's this "other" side?


What bothers me most about this article is the following statement which is accepted without contest.

"Law enforcement agencies have a legitimate need for evidence, which is all the more pressing in terrorism cases."

What makes a terrorism case more pressing? How many domestic terrorist attacks have had related followup attacks? How many domestic terrorist cases have been linked to other domestic terrorist attacks? How many domestic terrorist attacks have been carried out by the same set of individuals or groups?

The reality is that domestic terrorist attacks are not common or frequent, and there is no urgency in investigating them because they do not lead to followup attacks. They're coordinated events, not a series of related events, so there's no pressing urgency.


What? Sure, terrorist attacks are not common or frequent, but they generally involve murder on a larger scale than your average criminal. There is of course a lot of urgency in investigating them, as they could easily lead to follow-up attacks. Look at San Bernardino and the Boston bombings. Both pairs of terrorists clearly had plans for follow-up attacks.


What terrorists plan, and what happens, are very different. Terrorists often die in their attacks, so followup plans are often moot.


In those cases, had the terrorists escaped custody and carried out their subsequent attacks would law enforcement having the ability to instantly access their phones have made any difference?


I've been explaining this case to others and I've come up with a good way to make them understand.

"Instead of the FBI making this request, how would you feel if the Government of China were asking? or Russia, or Syria? Do you want them to have the ability to read your encrypted data off your iPhone?"


If the person with whom you are debating is ex-LEO, ex-military or a conservative, you can easily compare this erosion of the 4th amendment to the erosion of the 2nd amendment. Both the 2nd amendment and the 4th amendment were designed to protect the American people against abuse of power by the government. Supporting the 2nd but not the 4th, therefore, makes little sense.

The argument boils down to "if your family members were killed in San Bernardino, you would not support Apple".

The rebuttal is that first we should go after firearms, because that is what the SB shooters actually used to commit the crime. Encryption never directly killed anyone.


It's closer than you think. Until 2000, strong crypto was classified as a munition under US export rules.

https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...


There is no reason why China, Russia or India cannot make this request of Apple in order to allow them to legally sell their product in those markets. And as sovereign nations they have a right to demand such, provided that's their law and it in turn violates no international law they've subscribed to.

Countries in the Middle East and India[1] demanded such from RIM/BBM.

[1] http://www.wired.co.uk/news/archive/2013-07/11/blackberry-in...


And knowing that once they have that ability, there would be nothing stopping them using it on any iPhone in the entire world? Including yours? You're OK with that?

After what Snowden has told us, we know with certainty the FBI/NSA/CIA will use this ability on the phones of citizens of other countries, so I think it's safe to assume other countries would do the same.


They're willing to hand the phone over to Apple and have the entire process take place on Apple's premises, locked to this specific phone.

That's what the court order says.


With a warrant, the government can look through my sock drawer. What do I care if they can look through my phone?


I've honestly never met an actual "you don't have to worry if you have nothing to hide" person.

Under which religious or political umbrage do you reside?

What if you had something on your phone that isn't illegal now, but is tomorrow? (3D printed gun plans, source code that is found to have been published without consent, an ebook on sexual positions that are still illegal depending on which state and county you are standing in?)

15ish years ago, certain states might have pressed charges if you had a dildo in your sock drawer.

What is in your sock drawer?


You didn't just totally miss Rayiner's point, but also wrote an extremely creepy comment that would have been inappropriate for civil conversation even had you been right about what he was saying.


Framing mainstream sexuality as "creepy" is outdated, as evidenced by the following:

Over three years ago, a newspaper as everyday as USA Today reported "CVS, Walgreens, Kroger, Safeway, Target and Walmart are among major national chains that now include vibrators on store shelves" [1].

IOW, saying the word dildo no longer qualifies as creepy.

[1] http://usatoday30.usatoday.com/news/health/wellness/story/20...


No, that wasn't the creepy bit.


> Under which religious or political umbrage do you reside?

Very vanilla and mainstream, religiously and politically.

> What if you had something on your phone that isn't illegal now, but is tomorrow?

I'd delete it? But I probably wouldn't have it in the first place, for the reason above.

> 15ish years ago, certain states might have pressed charges if you had a dildo in your sock drawer.

And that would be wrong. But it'd be the law against the dildo in the sock drawer that would be wrong, not the general principle that the government has the right to search your sock drawer so long as it obtains a warrant based on probable cause. That principle has worked pretty damn well for 200+ years and I see no reason to change it now.


> But it'd be the law against the dildo in the sock drawer that would be wrong

So you'd have yourself and people you care about hang out in prison, cheerfully thinking to yourself, "We're only here because the law is wrong."

> worked pretty damn well for 200+ years and I see no reason to change it now.

That hasn't been running on its own automatically. It's been a massive bulwark of principled people maintaining it against attacks. And there are too many examples to count of misleading evidence used to mischaracterize, defame or prejudice opinions against individuals.

Consider our own government's treatment of blacks up to and including now. The government doesn't always know what is right and needs principled citizens to amend it.

Think back to that recent demo of predicting race by zip code. Think back to predatory lending practices, the KKK, job interview biases, unequal prison sentencing, racial bias by police, etc., and consider what happens in less-principled countries like Turkey and Germany. If I were of a non-mainstream race, I would want as few people to know as possible, and I would want protection from society's baser instincts.

Otherwise, it's too easy for people with an axe to grind to abuse their power by sweeping through, say, all phones and singling out by language those that appear to be used by an ethnic or cultural group who may be prejudged to be a threat.


We're not talking about sweeping through everyone's phone just as we're not talking about sweeping through everyone's sock drawer. We're talking about a search pursuant to a warrant where there is a metric fuck ton of probable cause.


I'm basically with you.

However, as a German, I find it interesting why you mention Germany as "less-principled". Less-principled than the US? Do you talk about current Germany or about former east Germany or even Nazi Germany?

Of course, there are enough problems here as anywhere else regarding privacy, etc. But I have difficulties seeing current Germany as "less-principled" than the US wrt. privacy, human rights etc.

Just curious what your impression is based on.


I think his point is that if you want it to be legal to have a dildo in your sock drawer then change the laws to make it legal rather than relying on "people not finding out".

Another implication is why would a court order grant access to a sock drawer but not a phone. What makes one unlike the other, regarding lawful court approved search warrants.


Because they are going after the manufacturer of the sock drawer to help them get into your sock drawer.


It's not unheard of for the government to request that manufacturers or service providers aid them in obtaining information about third parties under investigation. Take a safe manufacturer: they might ask for blueprints. Or they might ask a gun seller for a list of people who bought a particular firearm from them (if they kept such records; disregard background checks).


Or a somewhat different example, the Nazi use of census records (seemingly innocuous) to find and track Jews (horrific).

http://www.amazon.com/Nazi-Census-Identification-Control-Pol...

In fact, I would strongly push for anyone working with "big data" to read up on this. Just because we can, does not mean we should.

Data does not have a "use this only this way and only for good" button.


Let's remember what this discussion is actually about. The FBI isn't asking Apple to help them make a Jew list. They are asking for access to the telephone of a mass murderer.


They are asking for access to the work phone. The likelihood is high that there is not much of use on the phone.

However, as multiple sites have opined, this is about setting precedent and the actual data on the phone is secondary.


There is no precedent being set here. The government has this ability already. If Apple doesn't comply, then, as with Lavabit, the government has the ability to set more onerous terms.



What's your point? China can already ask for the same thing, and it makes sense for the government to investigate ways to circumvent security, just like they might do with a combination safe. Crucially, the government isn't asking for the companies to build products with broken security — they are asking to circumvent already broken security.

From a legal perspective, this is no different from asking Apple to circumvent the passcode protection to extract unencrypted data off a device, which Apple has already done many times.


Socks.

And potentially incriminating secrets wrapped in metaphorical socks, apparently.

Probably one of those socks should contain a big machine bolt with a bunch of heavy hex-nuts screwed onto it, so that anyone who comes around to sniff at my socks can get a proper beating (along with the regular intrinsic olfactory penalty for sniffing my socks).

But I feel like you're missing something. The government cops aren't always going to be looking for incriminating evidence or actionable intelligence. They might just be using that as a pretext to look for cash or valuables that they can slip into their own pockets. You kept it a secret, after all, so how would you prove that they stole it? That's just one of the ways in which power corrupts.


The point, which you missed, is that the FBI already can look through your sock drawer with a warrant. And they already can get your private information from a company's server. If you want that not to be the case, it's an entirely different argument.


Since you have nothing to hide, please provide SSN, CC # and salary information. Please also stop wearing clothes. Sounds ridiculous, doesn't it?

People do care about privacy, but the use case varies. However, just because you claim not to care doesn't mean that becomes the minimum standard everyone else has to adhere to.


Umm, this isn't an "I have nothing to hide" argument. It's a "the government has a warrant to search a house" argument. It would be pretty dumb to argue that the FBI can't search a house if they have a warrant. In this case, they have a warrant and Apple is refusing to open the door.


For purposes of discussion I have no opinion on it. I'm only stating the obvious possibilities. Just like I can discuss the possibilities of a meteor striking Earth, I don't have to like it to discuss it and entertain the idea of what could happen.


How would that give them the ability to use it on any iPhone in the world? You think they put the same software on all phones in the world? Same hardware?


Yes, I will be fine. If you want security, don't buy an Apple iOS device. Buy a device you can properly secure. Don't base your security on someone refusing to cooperate with law enforcement.


China and Russia already have this ability. They use it to make iCloud data requests. In what way is that different? If a government is too shady to deal with, Apple has the option of not doing business in that country, which is what those countries would enforce if Apple didn't comply.

The other option is to make security that even Apple can't circumvent. They might have achieved this with their newer phones, but it has now become clear that they did not with the iPhone 5C.


China and Russia make iCloud data requests the same way the FBI already has in this case. Nothing different. The main thing is the FBI wants a free back door to anyone's phone.


No, they want to circumvent security on this phone with a court order, just the same as the 70 phones that Apple circumvented security on in the US previously.


From my understanding in the previous cases the passcode only limited access to using the phone, not to the data inside. This is the same situation as having a password protected PC but not having your disks encrypted. You need to know the password to be able to use the computer, but anyone that has physical access to your machine can just take the disks and read your data using another computer.

From reading the article this is how Apple has helped law enforcement previously.

Now in more up to date versions of iOS the passcode limits access to using the phone, in addition to limiting access to the data contained inside. This limits what Apple can do to help law enforcement in the previous scenario, since any data they could copy off the phone would still be encrypted.


That technical description does not change Apple's legal obligation, which is exactly the same in both cases.


The other 70 phones stored unencrypted data. This phone stores encrypted data. The FBI wants a way to _brute-force_ the key (there's just 10000 of them), not a way to _circumvent_ it.


I think if China required Apple to let them read data off iPhones, Apple would do it secretly.


If there is a subpoena against that phone, I don't really have an issue with the government gaining full access to it. But there needs to be a subpoena and that backdoor can't be activated without one.

It's a similar, albeit stronger, idea to "probable cause": giving law enforcement a temporary waiver of rights that regular citizens are usually afforded.


The thing is, as long as a backdoor exists and can be activated at will, the fact that a subpoena is required to compel someone to use it doesn't mean it can't be hacked by someone else for more nefarious purposes.

Deliberate backdoors can't be allowed to exist; Apple needs to put its foot down and make that clear.


There is so much misinformation floating around.

The FBI wouldn't get a backdoor to all phones. It would literally only work on this one phone.

Or put another way, the backdoor is already there and Apple is being compelled to open it for this phone.


You completely misunderstand. It's not about this phone, or even this singular backdoor.

All backdoors are poison. End of story. If the FBI wanted access to the data on the phone they would have it already. But they don't. Probably because they already concluded the attackers were acting alone and not part of a terrorist cell or network, and they no longer need the rest of the information, or they really just don't care enough about it.

However they do want to use this case to get precedent for industry to produce new backdoors for them into previously secure-ish systems in the name of "terrorists". This very idea is a cancer, a cancer that will spread to other technological systems they feel aren't easy enough for them to tap into whenever they want. A cancer that will spread across the world as it sets a precedent for nation states to force private companies into deliberately weakening the security in their products to please whatever government finds its citizens' access to encryption reprehensible.

So no, it's not mis-information.

You and anyone else that thinks this is ok just don't grasp the gravity of the situation.


I don't misunderstand at all. I just have a different opinion than you do. However, a lot of misinformation is being spread.

The FBI isn't asking Apple to create a backdoor. End of story. They're compelling Apple to open a backdoor that Apple already created. Apple made an insecure system. They can, and I think should, fix it, but on this phone at least the backdoor is there.

So, wait to use a slippery slope argument when it actually becomes applicable. You're so trigger happy to jump on the privacy, precedent setting bandwagon you didn't stop to think what this case actually is.

Basically, as long as you make insecure systems, the government can (by the fourth amendment no less) demand you let them in if they have probable cause and a warrant. If you make something impossible to get into, and the government starts demanding you stop making devices like that, then we have a problem. But this particular case doesn't get us any closer to that outcome. Not even symbolically.

I think you and everyone else overreacting to this case grasps gravity that isn't there. You're tilting at windmills.


>If there is a subpoena against that phone, I don't really have an issue with the government gaining full access to it.

Say the year is 2000 and the US government has a subpoena to access the phone because they want to collect evidence to prosecute a homosexual couple for having consensual sex (which back in 2000 was illegal in some states). The reasoning behind the subpoena matters, and 'OMG TERRORIST' has been so abused that I take issue with it by default.


Then take issues with illegal subpoenas. Siding with Apple on the current Apple/government discussion is trying to cure a symptom instead of a cause.


>Then take issues with illegal subpoenas.

There was nothing illegal in my example. Maybe now it would be after Lawrence v. Texas, but not before. Saying one should support the law because it is the law is supporting the tyranny of the law.


I never said that, I recommended to address the cause (unjustified subpoenas) rather than the symptom (the government asking for access to phones owned by criminals or alleged criminals).


Aren't there lots and lots of things that you would feel comfortable with the US doing but not those other governments? What makes this particular issue different from all of those other things?


If any of those governments were attempting to track down accomplices to a known mass murderer and terrorist??? At least a little part of me would support it.


And when China or Russia or Syria decides that an American is a mass murderer or a terrorist? Do you want them to compel Apple to unlock Francis Gary Powers' iPhone?

I'll be honest, if somehow Apple could sign a software update that merely disabled encryption for phones belonging to actual mass murders and terrorists, and didn't work on the rest of our phones, I'd be supportive. It's just that there's no possible way for such a thing to exist. The best you can do is a software update that works on everyone's phone, in the hands of people who promise to only use it on mass murderers and terrorists. That's far riskier.


And when China or Russia or Syria decides that an American is a mass murderer or a terrorist? Do you want them to compel Apple to unlock Francis Gary Powers' iPhone?

In this case we're talking about someone who walked into an office party and killed 14 people. So ya, I think it's in all of our best interests if we see what's on his phone. Just like I think it's in our best interests if we see what was in his house or in his car. Or do you think we should have just locked the front door to his apartment and waited for the movers to come take everything away?

So if your question is "is there ever a case in which compelling Apple to unlock a phone is OK" my answer is "you bet your ass". There is a right way to do this. It involves warrants and transparency (not secret FISA nonsense). It should involve oversight and be an extraordinary step (just as searching someones home is). But if we catch one terrorist on his way to blow up downtown San Francisco, well it'd sure be handy to have a way to know that his buddies are on their way to blow up Seattle too.

Now I know exactly what's coming, because it always comes so I'm just going to head it off now. No I don't support the unlimited power to search peoples phones. No I don't support the idea of implanting remotely exploitable backdoors into phone operating systems to make this process easy for the government. No I'm not a government shill. No I'm not suggesting that we trade liberty for security. And no I'm not arguing that if you have nothing to hide you shouldn't care about privacy.

NOWHERE in my argument did I take any of those positions. I'm arguing that when the technology exists and the circumstances call for it, backed by transparent functions outlined in the constitution, we should be able to get at data that might save actual lives.

I think China, Russia, or Syria should have that same right. I think that if Apple is going to distribute their phones in those countries they should play by whatever rules those countries have. If Apple really doesn't like it, don't cash the checks.


The FBI isn't asking for Apple to give them the update to apply themselves. They're asking for Apple to apply the update. To brute force another phone, the FBI will have to go back to Apple.


Sure, but the legal precedent will make this a rubber stamp in future, and will mean that Apple can't reasonably resist such orders from other governments.


The precedent is already set (the FBI can demand information about a person from a company).

In the future, Apple could make a phone that is actually secure. For example, requiring the phone to be unlocked before updating firmware.


What are you talking about? Apple has no information about Farook.


What are you talking about? Where did I say Apple has information about Farook?


You said that the FBI can demand information about a person from a company. What person are we talking about?


This case does not establish any precedent. This is a well-established power that has already been used to access iCloud data and extract unencrypted data off iPhones running older versions of iOS.


Almost all legal experts who don't work for the FBI and who have expressed an opinion on this disagree with you.


Such as? For an example to the contrary, see rayiner's comments on this issue. The only people I'm seeing that think this case is precedent-setting are tech bloggers who don't know any better and lawyers who were given a short time to respond and didn't know what the FBI was actually requesting.

The only restriction on this law that Apple could claim (aside from legally out-there options like equating writing software to forced speech) is "unreasonable burden," but this is something that almost any of us could hack up in a day given access to Apple's release build system.


This simply isn't true.

if (deviceUid == 12345) { openBackDoor() }

Cryptographically sign code. Done. We know it's secure and will only work on device 12345 because we trust cryptography.
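For concreteness, a minimal sketch of that idea (Python, standard library only; the key, device IDs and field names are invented, and HMAC stands in here for Apple's real asymmetric code-signing scheme):

    import hashlib, hmac, json

    # Hypothetical signing key; stands in for Apple's private key, which never leaves Apple.
    APPLE_SIGNING_KEY = b"hypothetical-key-held-only-by-apple"

    def sign_image(payload):
        # The target device ID is part of the signed payload.
        blob = json.dumps(payload, sort_keys=True).encode()
        sig = hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).hexdigest()
        return {"payload": payload, "sig": sig}

    def device_accepts(image, my_device_id):
        blob = json.dumps(image["payload"], sort_keys=True).encode()
        expected = hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, image["sig"]):
            return False  # payload was edited, so the signature no longer matches
        return image["payload"]["target_device"] == my_device_id  # the device lock

    image = sign_image({"disable_retry_limit": True, "target_device": "UDID-12345"})
    print(device_accepts(image, "UDID-12345"))  # True: the one phone named in the order
    print(device_accepts(image, "UDID-99999"))  # False: every other phone refuses it

Re-targeting the image means editing the payload, which breaks the signature; only the holder of the signing key can produce a replacement.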


There could be an exclusion in the EULA: this software may not be used in the furtherance of terrorist, anti social or illegal activities. Or any activities that might be deemed illegal in the future.

Then the EULA police would have the ability to really nail them, like Al Capone for tax evasion.


And my answer would be: but it's not the government of China, or Russia, or Syria? It's our elected government of the United States asking for help with the investigation of terrorists.


You shouldn't be down voted because you're wrong :) and what you say is in no way detrimental to this discussion.

That being said, you are wrong because if the ability to backdoor these devices is required by US law, that backdoor will be exploited by anyone with the cash for the exploit, which means Syria, China, Russia, or Republicans.

It'd be better to remove the encryption from these devices entirely, so at least you'd have a guarantee about the safety of your data (i.e., that there is none).

There is nothing about this device that hasn't already been swept up by mass surveillance programs, which I guess is fine because it's done by our elected government and it's for catching terrorists, and not, say, abortion providers or clinic bombers.


We're not talking about backdooring everyone's device. We're talking about Apple helping the FBI exploit a weakness already on the device, to access data on a specific device pursuant to a warrant.


I think the important part is the "unreasonable burden." Apple's position is that it's an unreasonable burden to be required to dedicate finite technical resources to a problem that directly and very publicly undercuts one of their marketing stances. They can pretty easily draw parallels to US-based cloud companies that have lost European revenue post-Snowden. Maybe it's reasonable to request $50k in engineering time from one of the largest companies in the world, but what about $10 million in lost future sales? $100 million? more?


Apple is a very rich company. This is a federal investigation of a mass murderer, with potential national security implications.

Where is their sense of civic responsibility?

There is no way in hell they'd lose $100 in sales for aiding in this investigation. Being melodramatic doesn't help anybody.


"Apple is a very rich company"

I'm sorry, but this strikes me as wrong. If they were not rich, would they not be compelled to do this, or would they simply not be able to argue against it as visibly? (I personally think the second is more likely.)

Civic responsibility? What is next? Agreeing to some idiot's religious views and blocking anyone from seeing nudity?

I am sorry, but they are a company, not an elected group. If they are not breaking a law, well, you can take those arguments about civic responsibility and shout them at a tree for all that it matters.

I am trying to resist the urge to bring up some hyperbolic argument against this, purely because I am so shocked someone on HN would try an argument this ridiculous.

It's akin to "someone think of the children", in my mind, and as baseless as that sort of argument.


That particular logic is not wrong; relative hardship is something courts routinely take into consideration.


Once enough people have access to the software, it could eventually leak. Then everyone's device will essentially be backdoored.

Right now we assume no one has developed the capability to exploit this weakness. By forcing Apple to develop this capability, the security of everyone's device is compromised. This assumes that Apple is uniquely positioned to develop the exploit, or at least they are the only ones who can do it quickly/cheaply enough to make it worth it.

At the very least, forcing apple to do it publicly demonstrates the feasibility of the exploit to third parties.


By that same logic, Apple's source code and software update key could eventually leak, too, which would have an even worse effect. The FBI is handing the phone over to Apple to apply the modification, not asking Apple to give them the means to do it themselves. If Apple was really worried about this leaking, they could just delete it when they're done (though it would be more work for them to recreate it the next time the FBI comes with a warrant to search another iPhone).


For a similar situation sure. Why not? The court isn't asking to give the FBI access to all phones. Just one phone they have physical control over.

Do you really think Apple would tell the Chinese government to fuck off if the Chinese threatened to ban Apple unless Apple opened a single phone? I have ten billion reasons to believe they would do it.

If those countries were using it to crack down on political prisoners then I'd suggest Apple pull out of the market rather than comply.

But generally corporations should not be allowed to pick and choose what laws to follow. If a company doesn't want to subject itself to the authority of a government, don't do business there.


Doesn't the fact that Apple is capable of unlocking the phone mean that a backdoor already exists? Then it's just a question of a lone wolf telling the FBI how to do it, maybe along with using some secret keys that could be stolen if Apple is hacked like Sony was. I would like a phone that is unhackable even by its creator. Anything else is just a matter of time before it gets broken into.


It all depends on how you define a backdoor. The 5c doesn't have the secure enclave†, so all of its encryption routines are done in the CPU, as software/firmware.

That software doesn't contain a backdoor, but the fact that a new software load could be created and installed could be viewed as the backdoor itself. But if that software doesn't currently exist, is the mere potential of it a backdoor? Somewhat of a philosophical question.

If Apple prevails and doesn't write the software the FBI is demanding, then the backdoor is not there.

The later iPhones with the Secure Enclave may truly be unbreakable if they protect their secrets with hardware in the chip.

But my understanding of it is that Apple is NOT capable of unlocking the phone right now. The FBI is demanding that they develop that capability. So I guess they're capable of being capable...

† At the risk of being an armchair SE expert...


How would a backdoor look? Some secret additional encryption key capable of unlocking the encrypted data?

Such a key exists, and it is whatever key Apple uses to sign the iOS updates. Only Apple can sign a new iOS release, so only Apple can update the code to remove all security features. This looks a lot like a backdoor to me, just that the implementation hasn't been written. But all you really would need, in theory, is the signing key.


From the Judge's request and Tim Cook's response, it looks like the updated OS with the backdoor would no longer wait after each passcode try, and would allow inputting a passcode through WiFi. That allows remotely bruteforcing the passcode.

I would guess that that passcode encrypts the phone, which is why they can't decrypt it.

There's probably more to it (otherwise, they could copy the encrypted data elsewhere to bruteforce it, in which case it would all be a show from the FBI and their real target is other phones).

I'll agree that the ability to update the firmware without the phone being unlocked is poor design on Apple's part. Given the security flaws they've had in the past, this one seems genuine.
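To make the earlier point concrete, here is a toy sketch of the lock-screen behaviour being described (not Apple's code; the delay schedule is invented). The escalating delays and the optional auto-erase are exactly what the requested firmware would strip out:

    import time

    # Illustrative delays (seconds) after the Nth wrong guess; iOS's real schedule differs.
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}

    def attempt_unlock(guess, is_correct, failures, erase_after_ten=True):
        if is_correct(guess):
            return "unlocked", failures
        failures += 1
        if erase_after_ten and failures >= 10:
            return "wiped", failures             # keys destroyed; brute force is over
        time.sleep(DELAYS.get(failures, 0))      # forced wait before the next guess
        return "locked", failures

    # The FBI's requested build would skip the sleep, never wipe, and accept guesses
    # electronically (e.g. over a cable or WiFi) rather than via the touchscreen.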


You make a very good point: why do they even need to update the firmware? Surely, just cloning the phone's encrypted memory should be possible? (I have no knowledge of hardware at all, but I would have thought that the data is stored on a memory chip somewhere in a phone, and it should(?) be possible to copy the data from that chip to process it offline?)


In some of the other threads on this topic there were very many explanations why that is not possible.

At the root of it is that even with a flash memory image, brute-forcing AES-256 is impossible. And the key is fused into the processor, and no software or firmware can read it directly.
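Back-of-the-envelope, to show the scale involved (pure arithmetic, no assumptions beyond the key and passcode sizes):

    aes_keys = 2 ** 256        # possible AES-256 keys
    six_digit_pins = 10 ** 6   # possible 6-digit passcodes

    print("%.2e AES keys for every possible 6-digit PIN" % (aes_keys / six_digit_pins))
    # ~1.16e+71, which is why any attack has to go through the passcode check on the
    # device itself (where the fused key does the derivation), not the raw flash image.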


A Schrödinger's Backdoor? lol.. Err, anyway. Carry on.


They are capable of unlocking the phone. There is no philosophical question about this, as they have the knowledge and the ability and the authority.

That it might take some expertise and effort is not really relevant - some is required in any case even if the firmware already existed, so its just a matter of scale. For example I know several people who could not update a phone firmware without supervision, and probably some who could at a stretch (given massive time and resources) reverse engineer it. Apple firmly occupy the sweet spot in this range, they have the secret knowledge that would otherwise have to be reproduced and it would be a bit of effort to build on this but not huge. Changing a bit of code [to remove the delay] and rebuilding a binary image is not a complicated task for their engineers, it is likely something they do all the time.

Personally, I think they should just do it if they are legally compelled to do so. They have done in the past and this is an older device and in 5 years the issue will be moot, as any firmware they create now would be useless.


>Personally, I think they should just do it if they are legally compelled to do so. They have done in the past and this is an older device and in 5 years the issue will be moot, as any firmware they create now would be useless.

If they do it now, the point will indeed be moot in 5 years, not because of technology, but because it will be required of them to make sure they can do this to ALL iPhones.


> Personally, I think they should just do it if they are legally compelled to do so. They have done in the past and this is an older device and in 5 years the issue will be moot, as any firmware they create now would be useless.

If they lose this lawsuit they will be forced again and again to build backdoors into their own systems. It's a very unreasonable request that does not find the right balance and the reasons are laid out eloquently in Apple's letter.


Only Apple is able to sign iOS updates. The FBI wants Apple to make an Evil Update and sign it with their key so the phone will accept the update. Apple has to hold the signing key so the phone can reject Evil Updates in general.

Does this count as a "backdoor"? It's built into the way signed updates work. Nobody stole Sony's master signing key for the PS4, even when their entire network was pwned. They're probably stored offline in specialized hardware.

(as I understand it, newer iPhones use crypto magic + special hardware to make it arguably impossible to update the phone while it's locked, so they're more secure against Evil Updates, even ones signed by Apple)


Can't the FBI then subpoena the signing key?


Mostly because they would need Apple source code and toolchain, too, and dozens of coders willing to make evil updates. Of course they could get those, too. But imagine what an expensive botch some "security contractor" would make of the project of crafting evil updates for the FBI. They would probably be found out and publicly humiliated, and there would be a terrible trust hit to the whole tech industry.


So, they always bounce back from it, it's not as if they haven't botched things before.


Why don't they also design the iPhone 7 when they are already at it?


For Apple to do it, they have to make some signed software that can then be installed on the phone as an update, which then allows the crack. If this is considered a back door, then everything that will accept signed updates has that same backdoor, which is a lot of things.

Theoretically yes, a lone wolf plus the FBI could do it, but they would need to get the software they make signed by Apple, or steal the key to make it appear so, which would not be simple.

The big issue is that once it is signed by Apple, it could then be used to crack any phone it was installed on as a trusted update, which it is because it was signed.


Except the order notes that Apple should lock the update to only function with that phone. It also allows them control of the hardware if they choose, provided they turn over the data.

The more I've heard about this the more it seems like cynical grandstanding for marketing purposes: Apple doesn't want the popular belief among regular people to become "Apple can always unlock your phone".


As has been said many times over, this is setting a precedent where the FBI can ask Apple to do this again, with the software already created. The "only function with that phone" restriction is a negligible hindrance to using this on another phone.


And it would only work for that specific model of phone. And to do so the FBI would have to have a public warrant for the search, granted by public courts. What exactly is the problem with that power? It's the same as one which lets them search your home.


Apple goes to the effort of designing the software to be secure to the point of being inaccessible even when Apple is asked to work against the user, and that's a selling point. If the FBI is known to be able to get into a "secure" phone, it commercially affects Apple.

Why would it only work for that model of phone? Why would iOS vary its repeated attempt lock out code between models?

Also, another issue is that Apple is questioning whether this warrant was legal to issue in the first place.


In more recent phones Secure Enclave would enforce the timeout, so no change in iOS could remove the timeout.


It appears that the firmware of the Secure Enclave can be updated with the Apple private key. It isn't iOS, but for the purposes of this, it might as well be.


The Evil Update could just refuse to function if it's being used on a phone with a different IMEI. You wouldn't be able to remove the check without breaking the signature, so it wouldn't be possible to use it to hack any other phone.


Sure - until you realize IMEI is a baseband ID and you can desolder the baseband IC and use it on all the phones you need access to. Heck - you can probably just intercept and change the output of the 'get IMEI' command - it's not like the communication between those chips is protected by crypto (only the baseband firmware upgrade uses crypto)


> If this is considered a back door, then everything that will accept signed updates has that same backdoor, which is a lot of things.

Yes.

So.. we should think about that and make it so that the keys are stored in a box which cannot be broken into. I think Apple have been working on this, but it is not clear that they are yet successful.


22 hours. That's the time needed to break it.

The PIN is so short (4 or 6 digits) that if you can remove the 10-tries limitation and the time escalation (Apple can, with a new firmware), it's just a matter of time to hack the phone (22 hours for a 6-digit PIN: 80 ms [1] per attempt * 10^6 PINs). It seems a farce to me that Apple says that they can't decrypt it (or maybe they never said that?). With a stronger password it would be a different story though.

Am I wrong in any assumption?

[1] http://arstechnica.com/apple/2016/02/encryption-isnt-at-stak...


That is correct for a 6-digit PIN; of course, if you used an alphanumeric passcode it would take much longer. Brute-forcing 8 characters would take millions of years, though in practice you could significantly reduce the search space in typical password-cracking ways (word lists, etc.), but nevertheless.

The takeaway is that you should use a much more complex password.


At 80ms per attempt you need about 3 months to brute force an 8-digit PIN. Things get ugly fast, by the way: 9 digits is 2.5 years, 10 digits is 25 years, and so on. We reach millions of years at 15 digits.

edit: I was calculating on the assumption that you can only use numbers as a pin
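A quick sanity check of the arithmetic in this sub-thread, taking the ~80 ms per attempt figure cited above and assuming digits-only passcodes:

    SECONDS_PER_ATTEMPT = 0.080   # ~80 ms per guess

    for digits in (4, 6, 8, 10):
        total = SECONDS_PER_ATTEMPT * 10 ** digits
        print("%2d digits: %12.1f hours (%8.1f years)"
              % (digits, total / 3600.0, total / (3600.0 * 24 * 365)))

    # Roughly: 4 digits ~13 minutes, 6 digits ~22 hours, 8 digits ~3 months, 10 digits ~25 years.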


Apple was probably able to unlock devices previously (prior to iOS 8) because of a lack of encryption on the user partition.


Not unlock, but extract a very limited set of data, comparable to what a non-smartphone would log.


It's a backdoor like a bank vault wall is a backdoor. It is entirely a shield so far, with no actual vulnerability in itself, but in the meta it can be replaced.


Every iPhone has a built-in backdoor: the fact that Apple can sign software that the device will trust.

I am currently with the FBI on this, because their success could spell doom for the walled-garden approach.


The fact that Apple has made this issue very public (with http://www.apple.com/customer-letter/) is almost unprecedented (Lavabit and SOPA had some similarities). I applaud them for taking a stand and not just giving in.

If this legal precedent gets set that the FBI can force US tech companies to break into their own customers' encrypted data, you can bet the industry will lose millions if not billions of dollars worldwide in tainted reputation.

Who's going to buy US companies' devices that claim privacy via encryption if they're easily backdoored at the FBI's request?


> Who's going to buy US companies' devices that claim privacy via encryption if they're easily backdoored at the FBI's request?

Forget about customer buying power; courts in Europe are using the US's laws to show how EU organisations can't legally use US companies for many things.


Is that true? Armed with a court order, the FBI can search a U.S. house, compel banks to turn over financial information, compel accountants to turn over records, etc. Yet people worldwide extensively use those American services.


That's not a similar example to what's going on here.

Your example would have to include a situation in which the FBI makes copies of the keys to the house, copies of the hard drives, etc. Then inevitably makes it embarrassingly easy for other parties to get access to those as well, permanently reducing your net security to almost zilch regardless of the outcome of the case and warrant.

And worse, in the process of gaining access to that one house and hard-drive, they simultaneously gain the potential to access millions of homes and hard-drives that have nothing to do with the warrant, and then those millions of houses and hard-drives see their net security reduced to essentially nothing.


They're offering to let Apple do the unlock themselves and just hand over the data.


And that establishes a precedent. So, in a future case that is less sensational, the court could order Apple to unlock that phone as well and Apple would have to comply. Apple doesn't get to pick and choose which court orders it complies with. That goes for court orders in the US, China, Canada, etc...


Slow down there, you're moving the goal posts way too fast.

If you have an argument, stick with it.

If Apple doesn't want to look bad complying to FBI requests, they shouldn't have put a backdoor in their software.


Apple cannot do the unlock. They're attempting to force Apple write new software that will make it easier to brute force the passcode.

Whether they force Apple to also do the brute force themselves, or the FBI does the brute force, is not really the important part.


No, this would not reduce the net security of anyone at all. The FBI isn't asking for a master key to everything, or a backdoor to all iPhones. They're asking for the data on this one single device.

In other words, if some other party got the modified software, it wouldn't do anything. The software was asked to be device locked, which is trivial to do.

The reduced security you worry about already happened on the iPhone.


Also who's going to ever install software updates anymore?


> It is also theoretically possible that hackers could steal the software from the company’s servers.

It's also theoretically possible for hackers to steal Apple's private key from their servers, in the exact same way. As far as I see, there's no change to the threat model by Apple making a new software version, and signing it using the same process they use to sign other versions of iOS. It's useless if not signed, so the real worry is someone having the ability to sign it, and that applies exactly the same if Apple signs iOS or FBiOS.

This suggests the authors don't understand the technology well enough to know this.

edit: this is assuming the software is locked to a specific device. So the signing doesn't matter to any other device. If the software worked on all devices, then a leak of a signed version would be problematic. Although even if they couldn't lock it to device, they could make it only work for a short time, say a week, so if that signed version leaks later it would have no effects (I'm not sure if you can change the clock on a phone without unlocking it though).


Once the software exists, due to Apple's specialized knowledge, what's to stop them from issuing a NSL for Apple's signing keys + access to the remote update mechanism... then the FISC rubber stamps warrants en masse and Apple devices are used to spy on tens of millions of Americans. It sets a horrible precedent that the government can redirect the development resources of a private company.

How many millions of dollars in engineering salaries will this take? Project management? They're demanding an entire blackhat division of Apple be spun up, with the goal of circumventing other teams' security efforts.


None of that has to do with Apple creating the software, but rather with their handing over the signing keys etc.

The FBI or NSA could probably create their own software if they had the keys and source code.

I doubt the cost to Apple is over 100 man hours. But if it's too difficult, they can argue that and simply hand over the source code.


It would likely take more than 100 man-hours just to provision the code signing; Apple probably has that all under controlled hardware that requires multiple lead-engineer-level crypto tokens across multiple divisions.

This isn't some Node.js box that a junior developer will monkey patch in prod—this is serious cryptography @ a large company w/processes.

"simply hand over the source code" Ohhh, so you're arguing a private company should hand over all of it's private property (it's source code) to a public institution because... terrorists?! Then Samsung starts hiring more FBI agents for some reason... or the source to the iPhone magically shows up online. Also how many hundreds of thousands of hours of engineering time will it take to sanitize that codebase to make sure it's suitable for public dissemination?


You think it takes 100 hours every time Apple releases a new iOS version just to sign it?

They used to do something similar with unencrypted devices, according to https://blog.trailofbits.com/2016/02/17/apple-can-comply-wit...

>Ohhh, so you're arguing a private company should hand over all of it's private property (it's source code) to a public institution because... terrorists?!

I'm not arguing that. I'm arguing that if Apple claims it's too hard to comply, there's a much easier method. It's to Apple's benefit that they're given the option to make the software and sign it themselves.

This isn't my original argument, I took it from http://bloombergview.com/articles/2016-02-17/the-apple-fight...


Yep, that's what I'm arguing. The SHSH blob servers probably aren't trivial + I'm certain they have a lot of process in place to keep someone from "accidentally" releasing a software update, or the release of iOS 9.3 from taking them all down.

You'd be talking about creating both a special version of the OS and updating the SHSH servers to accept that code signature.

What are they gonna do, just hardcode the device's UDID in a sub-routine and distribute it to the entire cluster? What about testing? If they do it wrong and the SHSH blob/update gets in a bad state, they could end up accidentally wiping the phone... so now QE needs to get involved + build test cases.

I'd just spitball a 20 person team with at least nine months, lots of meetings w/leadership/pm/and VPs, hardware being purchased, and several executives having to get involved.

Hell, it takes more than 100 engineering hours at my company just to update some chef cookbooks; let alone schedule a release and get sign off.

I'm guessing you haven't worked at a large company before?

----

They probably did something very different than custom firmware before.


It's trivial to redirect a network to ask your own server instead of Apple's. That's what tinyumbrella did while it still worked. So they wouldn't need to change those clusters, just set up a single machine signing only that device (which is easy enough that tinyumbrella did it without Apple's help), and give it access to Apple's signing key.

>If they do it wrong and the SHSH blob/update gets in a bad state, they could end up accidentally wiping the phone

How many hours does it take to test on a spare phone? 5?

>I'm guessing you haven't worked at a large company before?

No. But this isn't something that's being rolled out to millions of users. The team doesn't need to do anything that affects other users. They can have their own SHSH server offline, and sign everything offline.


The problem is not ability to search the device after it has been cracked, the problem is specifically does the government have the power to force companies to develop backdoors for their own devices?

In this case the cracking capability is for a locked phone in FBI possession. Let's assume the same technique isn't possible on newer phones. So what about the next case where the FBI wants remote access over LTE while the phone is unlocked / in use by the suspect?

If you can use All Writs Act to compel Apple to develop the first backdoor, then surely the same is true for the second.


This is also a great product move by Apple, as if they win it shows even the FBI can't access your locked phone.


Perhaps, but that would be a strong incentive for government to mandate that devices have to have a backdoor in the future, or that they can't use strong encryption so devices can be read without a backdoor. I doubt the FBI see losing this particular battle as the end of the war.

There's also another possibility, albeit less likely, that the FBI know they'll lose but they don't care because they can already read the device and they want any terrorists to believe they don't have the information on it. That's a pretty full-on "tinfoil hat theory" though.


Mandated backdoors are something that law enforcement wants, but something that the national-security complex probably doesn't want (they seem to feel they can get what they want without help, and don't want to make US personnel/corps easier targets for their overseas counterparts). And the national-security complex is likely to win that fight.


As has been pointed out by a study the other day, this would lead to people buying non-US phones. Strong crypto is out there. Worldwide. Subverting it in a single country will not make it go away.


Yup. I just bought a new iphone.


Why didn't you just wipe the data from it and start again?


I got my previous phone three years ago - the battery has gone to pot and Verizon has been telling me I'm ready for an upgrade - but I kept putting off going to the Verizon store to get a new one. I thought, sure, I'll go for the secure phone which is the product itself, and not the one that is created by companies that wish to make me the product.


Are you sure they can't mill down the chip and probe it with some advanced microscope?


It'd be difficult and expensive. There's also a good chance you'd mess up and destroy the data.


What about asserting a write lock on the flash? By piggybacking a copied (but encrypted), identical flash part, or an ICE that allows reads from the original flash and writes to another part, the OS wouldn't necessarily know there was anything nefarious going on.

Besides, if this particular information is that valuable, they would have no problem paying for it.

Which brings us back to: it isn't the device they want cracked, but governmental oversight of (civilian) encryption.


I'm not sure but I think the problem is that the data isn't just encrypted with the user's passcode, it's encrypted with the passcode entangled with the private key of the secure enclave chip (which, presumably, is unknown to anyone).

So they can't just attempt to brute force the encrypted data, as the encryption key would be 256 bits or more rather than just a 4 or 6-digit numeric passcode. That's why they want Apple to open the OS up to brute force passcode attempts.

Edit: there's no secure enclave chip on the phone in question, but it seems that iOS 8 and later encrypt the user data with a separate private key combined with the user passcode.


If a component of the key is the pin the user inputs, probing the chip won't reveal much.


I'm not a crypto expert so take what I say with a grain of salt, but...

The password is tangled with the UID burned into the device at its creation, creating the 'passcode key' that secures the phone[0].

If extracting the UID is somehow possible, it doesn't sound impossible to try all 10,000 combinations (assuming a 4-digit passcode is used) offline, on different software.

[0] - http://blog.cryptographyengineering.com/2014/10/why-cant-app...
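To make the "tangled with the UID" part concrete, here is a rough sketch (standard-library Python; the iteration count and parameters are invented, and this is not Apple's actual derivation, which runs inside the hardware):

    import hashlib, os

    DEVICE_UID = os.urandom(32)   # stands in for the per-device key fused into the silicon

    def passcode_key(passcode):
        # Each guess costs real work, and the result depends on a secret that
        # (ideally) can only be used from inside this one chip.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100000)

    key = passcode_key("1234")

    # With DEVICE_UID in hand, trying all 10,000 four-digit codes offline is trivial;
    # without it, an attacker holding only a copied flash image faces the full
    # 256-bit keyspace, which is hopeless.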


One would hope that the part where this UID is stored at least has all the basic tamper-proof protections, so it self-destructs if tampered with, as e.g. many SIM cards do.


Does anyone feel this whole incident is a carefully staged by the FBI and Apple?

I'm not into conspiracy theories, but I'm wondering on this one.

Why would the FBI, or Apple make this so public? The Apple letter seemed staged?

The federal government always seems to get what it wants in the end, especially if one has a lot to lose?

I imagine the conversation starts off with an indignant, appalled CEO.

"Hell No--I'm not giving you access to my customers data!"

Federal government counters with, "Do you want us to scrutinize your past, and present life?"

"Do you want us to look at every stock trade you ever made?"

"Do you want us to publicize the personal information we have on you already?

"You know we can make your life misserable? You know we can make your companies life misserable?"

No, this isn't Russia, but the law enforcement branch of our federal government scares me, and I'm a nobody. There have been some deaths, especially in tech, that seem suspicious. The drug overdoses--the guy in San Francisco who was about to give a talk on ATM hacking comes to mind.

That tech guy who died in that fiery car crash in Los Angeles.

(I don't want to argue with anyone. I have no evidence. Just a weird feeling. And yes, Tim Cook seems like a choir boy. He comes across as someone who doesn't even jay walk.)


Let iPhone users opt-in and you'll find far fewer proponents of backdoor encryption than will be reflected in Congress. I wouldn't be surprised if most in the LE and Intel communities didn't opt-in. Why? Perhaps because the USGOV has yet to prove they can keep their own data secure. The Office of Personnel Management hack, Clinton official emails on a private server, and many more instances have shaken the faith in USGOVs ability to be an effective steward. I'm in favor of master keys for the FBI after they prove to the public, in a transparent and accountable manner, that those keys can be kept unassailably secure from misappropriation or abuse.


What if Apple's coerced firmware update bricks the phone? That's destruction of evidence they'd need immunity from. What trust and conflicts arise when a company has blanket immunity from such evidence being destroyed?

So many slippery slopes.


Off topic, but: "Apple is doing the right thing in challenging the federal court ruling requiring that it comply"

I am not a native English speaker. Why "it comply" and not "it complies"?


Subjunctive tense. Since it is still not known whether they will comply or not, the verb is left unconjugated; writing "it complies" would be confusing. It's perfectly proper grammar, though it's rare in everyday spoken speech and not used consistently even in written speech, but since this is formal writing, it's used here. It's one thing the NYT gets right.



Nitpick: subjunctive _mood_, not _tense_.

From the Wikipedia article on English subjunctive: "The distinction between present and past is one of tense; the distinction between indicative and subjunctive is one of mood."


Native speaker here; I knew it was correct, but I couldn't answer this question. Thanks.


Future tense.

Read it with the word "should" or "shall" implied.

"ruling requiring that it should comply"


Nope, not the future, and not even a tense. It's the subjunctive mood (or mode, alternately).


In this context, comply means "cooperate with the courts" or "do what the courts ask."


I think you're probably right - it should be "... requiring it to comply" or "... requiring that it complies".

The original is not very wrong though - I don't think most native English people would even notice.


When two nuts with rifles trigger a "National Security" problem I think the problem is with the Nation not the people.

What happens when an entire nation threatens us ?


Couple of thoughts:

- Doesn't an iPhone become completely secure if you prevent it from passively polling for OS updates? If the iPhone would only poll and install updates after the user unlocks it and allows them, then there would be no way to change the necessary software configuration without breaking the encryption. And the encryption can't be broken, thus if you could choose an OS-level default of "don't accept or even check for any updates without user permission" you really would have an unbreakable device. But if they ever did implement this, it would be terrible for Apple's forced upgrades and their platform would fragment into many incompatible versions. I think everyone knows which option Apple will choose between: CompletelySecurePhoneOS or AbilityToForceUpgradesAndPatches.

- Correct me if I'm wrong but didn't Tim Cook initially state cracking this phone was impossible according to Apple's experts? And now it seems it's a quite reasonable issue of Apple signing an OS-update specific to this device's unique ID; so quite feasible. So was that a lie?


I've got a feeling we're being manipulated by someone with an agenda at play.


Are we ever not? It's a political topic, so manipulation goes with the territory.


All they need to do is compel companies (by hook or by crook) to install backdoors in their algorithms or hardware.

https://www.eff.org/deeplinks/2014/01/after-nsa-backdoors-se...

https://en.wikipedia.org/wiki/Dual_EC_DRBG

http://www.cnet.com/news/spy-fears-lead-nuke-lab-to-dump-gea...

Who is to say that other state actors haven't done the same to chips produced by their companies? The truth is, the genie is out of the bottle.

A year and a half ago, I wrote a serious article on this: http://magarshak.com/blog/?p=169


The guy who wrote the two most important 4th Amendment opinions of the last thirty years (both in favor of privacy rights, both by 5-4 votes), and who had the longest record of cracking down on laws being stretched to cover uses never imagined (which is, by definition, how the All Writs Act is being abused here), just died.

It's not good timing....


For those of us not well-versed in these sorts of things, what were the two cases?


Jones and Kyllo. Kyllo held that the government could not use infrared scanners to track individuals inside their house without a warrant, while Jones held that the government could not use GPS trackers on cars, even in public, because it had to trespass to install the tracker. He also dissented from the forced DNA testing of anyone charged with a crime, and from warrantless, suspicionless traffic stops.

http://reason.com/archives/2016/02/16/antonin-scalia-was-a-g...


There is no explicit right to privacy in the Constitution. Some of Scalia's rulings happen to favor privacy while others do not. For example, he also said the Miranda ruling is judicial overreach, so in his view, you have rights (privacy or otherwise) only if you know about them ahead of time. Roe v. Wade established precedent for an implicit right-to-privacy interpretation of the 14th Amendment. Scalia has said that interpretation is wrong.


no, but the 4th amendment (which I referenced above) is the usual proxy. Even the most ardent of abortion backers will admit that Roe v Wade is flawed by every measure of judicial practice (which is why the entire line of thought advanced by Roe v Wade has not gotten any traction in any other ruling). Only the fact that it's tied to such a hot-button issue has kept courts from over-turning it.

The 4th Amendment and the All Writs Act are being broadly interpreted by this judge to give the government the power to effectively enforce a warrant against the entire iPhone-using population of the United States. Scalia would have been all over that like white on rice.


Apple indeed may have a right to refuse the administration. Does the administration have the right to remove Apple Computer from the GSA government purchasing schedule?


Yeah, I'm sure they'd shed a tear over that.


Why doesn't the FBI copy the contents of the terrorist Farook's iPhone to a second iPhone (after all, it's only a hard drive). Make ten attempts on the second iPhone, brick it, then copy the contents again, try the next ten digits, and so on until they hit the combination?


As I understand it, half of the key that decrypts the phone is stored on the chip itself and cannot be read. When you enter a correct password on the lock screen, it is combined with this hardware key, and the combined key is then able to decrypt the phone. If you move the contents of the hard drive to another iPhone, the key on that phone will be different and thus can't decrypt the drive. It's like two-factor authentication on your Gmail account where you are being sent the wrong codes.

Check out this article for more information. It also explains how the fact that the phone is a 5C is significant to the encryption scheme. [0] http://blog.trailofbits.com/2016/02/17/apple-can-comply-with...


OK, good explanation. What about removing the original hard drive from the iPhone, and installing the copy? You could just keep plugging in copies until you crack the code?

Could the chip also not be cloned?


Part of the key for decryption is hard coded into the hardware of the phone from what I understand, so it wouldn't work.


Curious if Apple is breaking any user agreements if they end up doing this. Sharing information on servers is one thing. Sharing a key which enables access to all future communication too is different. Does their EULA cover such scenarios? Can it open them up to possible lawsuits from its users?


How? The phone's owner, the state agency, granted permission, and the actual user died. Who'd have the basis to sue, and under what claims?

If you thought you had a claim, I guess you could sue to block the order, but that's up to the courts to decide, and that would take precedence over an EULA.


One thing I hope can be clarified: Is the FBI asking Apple to patch iOS on this one device, one time only (in a way that can not be reused) ... or are they asking Apple to provide a "reusable" patch / modification that allows future devices to be accessed?


I think their argument is fourfold: one, it provides precedent (which isn't a reason in itself, really); two, they fear this could be reverse engineered by baddies in order to make others' phones less secure; three, it proves to third parties that it can be done, so either someone will take it upon themselves to do it, or some [big market] will require Apple to do the same --which I think will happen regardless; and four, it's a good business tactic [strategically, it's tough to know what a big emerging market might demand from Apple]


> they fear this could be reverse engineered by baddies in order to make the phones of others' less secure

But isn't the procedure already pretty straightforward and well known?

1) Make a build of iOS which has the pin timeout feature disabled.

2) Sign that with Apple's private key.

3) Flash onto the iPhone.

That's more or less it, right?

What's keeping the general public safe isn't some sort of secret or obscure procedure. The general public's safety is in Apple keeping that private key private. And the FBI isn't asking for their private key, they're just asking that Apple use it in private, just like they normally do when they push out normal updates.

Am I missing something?


The specific patch the FBI is asking for is a way to be able to run a mechanized brute force attack on the pin for the phone without triggering the auto-erase.

This would require a new OS to be installed in a way that bypasses what I imagine are merely software blocks to installing OSes (it sounds like if they have possession of the device, they can install the OS to it).

This is a technique and a technique can certainly be replicated. Only problem is next time, Apple can't say, "this is an unprecedented step, and very burden-some," which actually turns out to be a legal basis.


If the government has legitimate warrants that stand up in court, then why shouldn't Apple be doing this on an individual basis?


Because a decision was handed down in 1977 that said an uninvolved 3rd party can't be compelled to aid police. It's mentioned in TFA


Why would Apple not voluntarily comply? They should not need to be compelled here. They should be eager to assist in this investigation.

This isn't them taking a stand in some sort of NSA spying case, as much as Apple fanboys seem to think that's what's happening here. They're refusing to lift a finger in an investigation of mass murder.


Part of Tim Cook's argument was once they release the patch into the wild, it could be reused in other cases


> Part of Tim Cook's argument was once they release the patch into the wild, it could be reused in other cases

This is a common (and completely understandable) misunderstanding of the relevant paragraph:

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. ...

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. ...

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. ..." [0]

Notice how the first sentence of the last paragraph talks about "this tool". "The tool" is the specific version of iOS that the FBI wants Apple to make that would only run on the phone that it wants to unlock.

Notice how the third sentence talks about "the technique". The change in terminology isn't accidental. "The technique" is "the act of demonstrating that Apple can create (and can be ordered to create) a backdoored version of iOS that bypasses tamper protection features of iOS".

The particular software that Apple would create can surely be trivially restricted to run on only a single iPhone. Unless there's a way to make iPhones run unsigned OS code without wiping the device, the only way the image Apple provides the FBI could be modified to run on a different iPhone is if someone got hold of Apple's code signing keys. [1]

The problem to which Cook refers is therefore not that there's a risk someone might steal the image Apple provides to the FBI and use it to pwn more phones... it's that the government will do as it always does and keep coming back over and over again, demanding that Apple produce yet another image that unlocks yet another single phone of interest, regardless of whether they expect the data on that phone to be particularly crucial to their case.

I expect that this would be disastrous for Apple's reputation. It certainly would not be good for society as a whole.

On the one hand, I can see how denying the government's request would be good for the industry and society. On the other hand, if the courts ultimately asserted that the FBI's request is legal and proper, it might spur Apple (and other similar companies) to ensure that the parts of their devices that handle device encryption and unlocking were not upgradable by any means... making generation of software to bypass features of those parts next to impossible.

OTOH, such an assertion would leave software-only privacy tools (like Signal, GPG, WhatsApp, et al.) in a really bad spot.

[0] http://www.apple.com/customer-letter/

[1] If someone gets Apple's code signing keys, many people are going to have many bad days.


Apple used to help the FBI get into iPhones all the time and it didn't destroy their reputation.

> A law enforcement source in the San Francisco Bay Area has confirmed to CNET that Apple has for at least three years helped police to bypass the lock code, typically four digits long, on iPhones seized during criminal investigations.

http://www.cnet.com/news/how-apple-and-google-help-police-by...


That story's relatively old now, and it's only recently that Apple has taken such an aggressive public stance on encryption and privacy. It's willfully ignorant to pretend that the Snowden revelations didn't change market realities in this area.


I don't see the difference.


Ever heard of a precedent?


Precedents are a cop-out for critical thinking. Sure, they seem to have their uses, like when one guarantees victory for the righteous. However, it seems more likely that they're being used as a 'they did it, so can we!' excuse.

It's one thing if you use it as a starting point for discourse, it's another when you use it to beat down the opposition with what amounts to childish antics dressed up in a suit and tie.


Precedent is an actual legal term of art: a case decision can lead to binding (in the hard sense) judgments in subsequent cases.


"One thing I hope can be clarified: Is the FBI asking Apple to patch iOS on this one device, one time only (in a way that can not be reused)"

This must be weasel-word day. The FBI is asking for a patch, hypothetically just for this phone. But only in this post have I seen anyone imagine "a way that can not be reused", since the point raised by the parent article is that such a patch could inherently be reused.


If all Apple produces is a signed update (with logic that activates it only if it's the right IMEI) then it's not useful to anyone hacking any other phone. The software can't be modified to work on them without Apple's cooperation, since any modification breaks the signature and you don't have Apple's signing keys.
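
Roughly, the gating described above could be as simple as the following made-up sketch (not anything Apple has said it would ship): because the check lives inside the signed image, editing the IMEI constant invalidates the signature, so the modified image won't load on any other phone.

  # Made-up illustration of per-device gating baked into a signed image.
  TARGET_IMEI = "358820050000000"  # placeholder value, for illustration only

  def should_enable_bypass(device_imei: str) -> bool:
      """Only relax the PIN protections on the one phone named in the order."""
      return device_imei == TARGET_IMEI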


I think in this case they are saying it would only target this one IMEI, not that the method cannot be recycled. Apple's argument is that this could "leak out" and allow mischief.


That's all window dressing. Once you compel the creation of software through a novel use of the All Writs Act, that opens the door to even more burdensome demands. Can't decrypt an i-device with a Secure Enclave? Build a new chip for us!


Given the power and size of the national security budget, and the NSA's in particular, why do they need Apple? Shouldn't the NSA be able to crack this on their own?

Kinda makes ya wonder what all that money's spent on...


They can get the info they want with more sophisticated techniques from what I've read. Also, Apple already gave them the iCloud backups. The legal precedent of forcing companies to break their own security for the FBI's sake is the prize at hand here it seems.


Well, they'd arguably need the certificate that Apple uses for signing updates. That's well-protected, I trust.

There's also the issue that Apple software is closed-source. But that's presumably less problematic.


So here is a simple question: how will we know if the US succeeds in having manufacturers put in a backdoor? Can it be done to the current 6 models through a software update?


Glad to see this from the editorial board of the NYTimes


I may be wrong, but I have a feeling that technology companies like Apple and Google are developing on-device software to make user data so protected that they can say "I technically cannot crack my own software" even when ordered to by a judge (presumably for a legitimate reason), and the DOJ is using this case to try to prevent that from happening. If that's the case, then personally I am on the DOJ's side, because I recognize this is a less than ideal world (actually I think it is even worse), and this country is technically at war.


Has there been any precedent of (the|a) government forcing someone to sign a piece of code against their will?


I wonder why the FBI can't just hand the phone to Apple: let Apple keep the phone and just hand back the data. We don't need the specifics.


Do you really want that to be the model for American justice? Law enforcement hands your digital electronic records over to a private company who has the power to put anything on that phone? Do you want to entrust your personal legal future to a private entity? I think I'd want a solid chain of custody on anything that may prove my innocence.


The phone was the property of San Bernardino County.

The county has longstanding email and Internet use policies that state, “NO USER SHOULD HAVE AN EXPECTATION OF PRIVACY” - http://ktla.com/2016/02/18/why-didnt-san-bernardino-county-o...


I bet even I, not a lawyer, could reasonably argue that, no matter who "owns" it, a device as personal and as much an extension of the brain as a smartphone could be considered private.


That's a very weak argument. Individuals and companies testify in court all the time. They could lie and plant evidence, but in most cases I'd still trust them more than the police, who kind of have an incentive to find somebody guilty.


There's an article in the New York Times which says that Apple has already been doing this [1], but they seem to have had a change of heart in the past few years.

  Each data-extraction request was carefully vetted by Apple’s
  lawyers. Of those deemed legitimate, Apple in recent years
  required that law enforcement officials physically travel with
  the gadget to the company’s headquarters, where a trusted Apple
  engineer would work on the phones inside Faraday bags, which
  block wireless signals, during the process of data extraction.
[1] http://www.nytimes.com/2016/02/19/technology/how-tim-cook-be...


Besides the chain of custody issues with the evidence if it leaves the Government's hands, Apple would almost certainly refuse to do this just the same.


Because once it is shown how it's done they will push to mandate the capability and force inferior security.


I wonder why the head of the FBI didn't ask Tim Cook first in private. Or did I miss something?


TLDR for what follows: Mandated backdoors must be a red line, but this is not a request for a backdoor and actually seems pretty reasonable. Trying to argue that the tech industry shouldn't help, even in this case, is not only the wrong position in my book, but a sure way to lose the bigger debate.

My views on the general encryption controversy are:

1. Everyone must be free to make their technology as secure as they possibly can. There can be no mandated weakening of security, back-doors, or other requirements to make the information more easily accessible by law enforcement. On newer iPhones, Apple has patched up the flaw that the FBI wants their help with exploiting. They must continue to be allowed to do that.

2. The government must be able to demand, with a court order predicated on probable cause, that companies provide any and all information that they have that could be useful in circumventing their security features. This can be everything from technical specifications and threat-model analyses, to lists of unpatched vulnerabilities and code-signing keys.

3. It seems to me that American companies have a moral obligation that goes beyond the legal obligations in point #2. They should be actively assisting the government in recovering information, especially when concerning issues of national security. In extreme circumstances, like total war, this should definitely be legally mandated. I'm undecided as to what the policy should be generally. On a practical level, it's probably not feasible for the government to, e.g. start hacking around the iOS codebase themselves, so just information might not be enough.

I'm not too troubled by this court order, especially given the particular circumstances. The right to make products as secure as you can, even from yourself and the government, is what's really important to defend. Trying to argue that the tech industry shouldn't help, even in this case, is not only the wrong position in my book, but a sure way to lose the bigger debate.

Apple's definition of "backdoor" is highly suspect. A backdoor is when I ship my product with an intentional vulnerability so that I can hack into it later. Apple isn't being forced to add a backdoor; one already exists, because the security features break down against an adversary that has Apple's private key, at least in the default 4-digit PIN configuration. Now the government is asking them to use their own capabilities to help hack this phone. Of course, Apple didn't create this backdoor for malicious reasons; they just didn't include themselves in the threat model, which greatly simplifies updates and other security features and is central to the walled garden of iOS. Curiously, this directly contradicts their claim for some time now that they have been designing iPhones such that they themselves can't break into them.

Now put yourself in a Congressman's shoes. The FBI has been telling you for years that tech companies are being purposefully antagonistic to their legitimate search and seizure authority. That the tech companies are purposefully designing features with the sole intention of shutting the government out. Now here's a case where there was no mandated backdoor, the government was able to devise an exploit method, and they got a court order from a judge to make Apple use it on a dead terrorist's phone. "Mandatory backdoors would hurt everyone's security", one of the arguments that we've been winning with, now sounds like a bullshit cover for "we are against any government surveillance". Can you smell the legislation coming yet?

Disclaimer: These are obviously my own personal views and nothing else. They do not necessarily reflect the opinions, policies, or practices of anyone but myself.

(Reposted from https://news.ycombinator.com/item?id=11131456 with additions)


I am only going to reply to 3.

Why would any company have a moral obligation? In fact, what if they are saying this is their moral obligation? To not bypass the security they told customers they had put in place?

I think that by using the term "moral obligation", people are trying to negate the need for laws or rulings.

In my opinion: There are no moral obligations for companies, just laws they have to follow.

After it is made law, it is the companies choice to do business there or not.


I wouldn't be surprised at all if Apple already gave a version of the OS to the FBI that enabled them to brute force the password for that one phone with the condition that they publicly put up this show as if they were not cooperating.


That would involve lying to a court. Intelligence folks might do that. Law enforcement? Not a chance; it would be career-ending even to discuss it.


If you truly believe that, I think perhaps you place too much trust in the wrong sort of people.

The reason why I consider the hypothesis unlikely is not because those involved are unwilling to deceive, but because they are unlikely to have sufficient skill to make the attempt and avoid detection. I believe that, considering the number of people that would be necessary, at least one Apple insider with ethical concerns would leak evidence that would expose the true story.


I believe that much more than "law enforcement never lies", but IMO it's hard to leak evidence in this case. A whistle-blower could say that he worked on this project, but there isn't the hard evidence of documents that for example Snowden could leak. The FBI or Apple could just deny it or say no comment.



