It would likely take more than 100 man-hours just to provision the code signing; Apple probably has all of that on controlled hardware that requires multiple lead-engineer-level crypto tokens across multiple divisions.
This isn't some Node.js box that a junior developer will monkey patch in prod—this is serious cryptography @ a large company w/processes.
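For a sense of what "multiple crypto tokens across multiple divisions" could mean in practice, here's a made-up M-of-N approval gate. Apple's actual release controls aren't public, so the quorum size, division names, and signing_allowed() are all invented for illustration:

```python
# Hypothetical M-of-N approval gate in front of a signing HSM. The quorum size,
# division names, and signing_allowed() are invented; this is only a sketch.
QUORUM = 3
REQUIRED_DIVISIONS = {"security-eng", "release-eng", "firmware"}

def signing_allowed(approvals):
    """approvals: (engineer, division) pairs collected from hardware tokens."""
    divisions = {division for _, division in approvals}
    return len(approvals) >= QUORUM and REQUIRED_DIVISIONS <= divisions

# The HSM client would refuse to issue a signature until this returns True.
assert not signing_allowed([("alice", "security-eng")])
assert signing_allowed([("alice", "security-eng"),
                        ("bob", "release-eng"),
                        ("carol", "firmware")])
```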
"simply hand over the source code"
Ohhh, so you're arguing a private company should hand over all of its private property (its source code) to a public institution because... terrorists?! Then Samsung starts hiring more FBI agents for some reason... or the source to the iPhone magically shows up online. Also, how many hundreds of thousands of hours of engineering time will it take to sanitize that codebase to make sure it's suitable for public dissemination?
>Ohhh, so you're arguing a private company should hand over all of its private property (its source code) to a public institution because... terrorists?!
I'm not arguing that. I'm arguing that if Apple claims it's too hard to comply, there's a much easier method. It's to Apple's benefit that they're given the option to make the software and sign it themselves.
Yep, that's what I'm arguing. The SHSH blob servers probably aren't trivial + I'm certain they have a lot of process in place to keep someone from "accidentally" releasing a software update, or the release of iOS 9.3 from taking them all down.
You'd be talking about creating both a special version of the OS and updating the SHSH servers to accept that code signature.
What are they gonna do, just hardcode the device's UDID in a sub-routine and distribute it to the entire cluster? What about testing? If they do it wrong and the SHSH blob/update gets in a bad state, they could end up accidentally wiping the phone... so now QE needs to get involved + build test cases.
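The naive "just hardcode it" version people imagine is roughly the sketch below; the field names and sign_blob() are made up, and the real TSS protocol is far more involved. That toy gate is exactly the kind of change you can't push to a production signing cluster without testing it first:

```python
# Toy sketch of the "just hardcode the device's UDID" idea. Everything here
# (ALLOWED_UDID, the request fields, sign_blob) is invented for illustration.
ALLOWED_UDID = "a1b2c3d4..."  # the single device the order covers

def sign_blob(manifest):
    # Placeholder for the call into the HSM that holds Apple's signing key.
    return b"SIGNED:" + manifest

def handle_signing_request(request):
    if request["udid"] != ALLOWED_UDID:
        return {"status": "refused"}   # every other device gets denied
    return {"status": "ok", "blob": sign_blob(request["manifest"])}
```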
I'd spitball it at a 20-person team for at least nine months, lots of meetings with leadership, PMs, and VPs, hardware being purchased, and several executives having to get involved.
Hell, it takes more than 100 engineering hours at my company just to update some Chef cookbooks, let alone schedule a release and get sign-off.
I'm guessing you haven't worked at a large company before?
----
They probably did something very different from custom firmware before.
It's trivial to redirect the signing request on a network to your own server instead of Apple's. That's what tinyumbrella did while it still worked. So they wouldn't need to change those clusters, just set up a single machine signing only that device (which is easy enough that tinyumbrella did it without Apple's help), and give it access to Apple's signing key.
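A rough sketch of that single-machine setup, in the spirit of what tinyumbrella did (pointing gs.apple.com at a local server): the device identifier, JSON fields, and sign() below are assumptions, since the real TSS wire format is more involved.

```python
# Minimal single-device signing server sketch. The identifier, request fields,
# and sign() are invented; the real TSS protocol looks different on the wire.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

ALLOWED_DEVICE = "0000001234ABCD"   # hypothetical ID of the one target phone

def sign(manifest: bytes) -> bytes:
    # Placeholder for the HSM call that actually signs with Apple's key.
    return b"SIGNED:" + manifest

class TSSHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = json.loads(body)
        if req.get("device_id") != ALLOWED_DEVICE:
            self.send_response(403)   # refuse every device except the target
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(sign(req["manifest"].encode()))

if __name__ == "__main__":
    # Point the restore tool at this host instead of Apple's signing servers,
    # e.g. with a hosts-file entry for gs.apple.com.
    HTTPServer(("127.0.0.1", 8080), TSSHandler).serve_forever()
```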
>If they do it wrong and the SHSH blob/update gets in a bad state, they could end up accidentally wiping the phone
How many hours does it take to test on a spare phone? 5?
>I'm guessing you haven't worked at a large company before?
No. But this isn't something that's being rolled out to millions of users. The team doesn't need to do anything that affects other users. They can have their own SHSH server offline, and sign everything offline.