
If Apple is doing what they say they are, it is in fact better. No maybe about it.

If they’re not, that means they are putting on an act and intentionally deceiving the public security community they are inviting to audit it.

Is that something you actually think is happening? I think we need to be clear here.

Your threat model may or may not be covered by the guarantees they are able to document, but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

Especially when none of the alternatives are even trying.




I don't make predictions about what different companies will do 10 years hence, given that they will be a collection of people, most of whom don't work there currently, doing business under regulations that don't presently exist.

"May" is just correct usage. How are you sure here? How could you convince a skeptic that it is possible to be sure?


Honestly I think this is a disingenuous defense. It's not insane to look at a closed-source project that is being partially audited by cherry-picked organizations and say "that's not a very secure or trustworthy process". There is no reasonable accountability being offered to the community. It's like Ford selecting private safety inspectors to tell customers how safe its cars are while conveniently leaving out the results of its federally mandated crash tests. Is this really helping customers, or is it just blatant, masturbatory marketing?

Apple has worked to deceive the public before, in both small and large ways. They lied about backdooring notifications for the US government when they were asked to[0], so it's not too hard to imagine it happening anywhere else in their systems. They're not taking a traditional approach to software transparency, which is suspicious, and by their own admission their "threat model" has not protected against motivated requests for identifying information[1].

When the Mechanical Turk attempted to fool commoners watching it work, it was imperative to hide every trace of the human inside. The candle used to see inside the machine was masked by smoke from candles placed around the room, the cabinet was locked to avoid accidental opening, and people were told not to touch it because it was apparently 'expensive and fragile'. Looks like Apple is the ringleader this time around.

> but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

But Apple is saying the opposite: "well, maybe we're doing the detailed secure thing, ask these people we hired", and you're praising them for it. If calling out objective and obvious logical fallacies isn't contributing, then how are we supposed to argue inside the Reality Distortion Field? Do we make-believe and assume that Apple's provided preconditions are true, or can we express concerns about the outstanding issues? I don't understand how these real-world flaws are somehow unjustified in conversation. You're allowed to hold Apple to adversarial levels of scrutiny if you take security seriously.

> Especially when none of the alternatives are even trying.

Apple is the largest company in the world and by many metrics (and points of comparison) isn't even doing the bare minimum to manage public trust. Whenever you are shown a whitepaper without the means to validate its contents yourself, you are being fed what is called "marketing" in tech circles. You don't have to feel bad about being tricked, though; it's the same thing that fools investors and overly faithful engineers. Whitepapers are whitepapers, handpicked security audits are handpicked security audits, and code is code. There is no blurring of the lines.

[0] https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

[1] https://www.apple.com/legal/transparency/us.html


> They lied about backdooring notifications for the US government when they were asked to[0]

That’s a bit much. They were compelled by the U.S. government to deny handing over data. Sure, it’s technically a lie, in the same way that a general stating they can “neither confirm nor deny X” is also likely a lie.

But it’s entirely unreasonable to judge Apple for following the legal, mandated instructions issued by the democratically elected government of the nation it operates within. Are you honestly suggesting that companies like Apple should be expected to simply ignore the law whenever you think it’s convenient?


> Are you honestly suggesting that companies like Apple should be expected to simply ignore the law when you think it’s convenient?

No, I am suggesting that none of you know what you're talking about when defending Apple's brand of privacy. We know that they can be compelled to lie to us about their compute architecture, so why accept half-measures in your security? Because Apple is a good company and deserves the respect???


> We know that they can be compelled to lie to us about their compute architecture, so why accept half-measures in your security?

There's a difference between a court order preventing disclosure and compelled speech. The First Amendment prevents compelled speech: they can be forced not to reveal something, but they can't be forced to make false claims.


Distinction without a difference, here. Apple's marketing already promised things they cannot guarantee, and instead of dropping the privacy shtick altogether they deliberately misrepresented themselves to promote sales of Apple devices. The NSA didn't write the lines for them, but they also knew Apple wouldn't stop marketing privacy even if the CCP owned iCloud servers. Lying for marketing purposes is part of Apple's core identity.

Therein lies the problem. If you distort reality to cast a positive light on a service of dubious value, you're only going to drive out the knowledgeable users. This is how Apple killed FCP, Logic, Aperture, XServe, and Metal, and it's how they've driven out security experts too. Everyone serious about security got out of Dodge years ago; the only people left are the sycophants who argue on the merit of whitepapers that cannot be validated. With Apple suing security researchers and neglecting their bug bounty program, it's no wonder we ended up in this situation. Companies like Cellebrite and GrayKey can stock up on exploits because Apple doesn't take their security researchers seriously.


Your concern is the hardware auditors are not trustworthy because Apple hired them?

I mean that’s fair but I don’t think the goal here is to offer that level of guarantee. For example their ceremony involves people from 3 other Apple organizational units, plus the auditor. It’s mostly Apple doing the certification. They’re not trying to guard too heavily against the “I don’t trust Apple is trying to fool me” concern.

What this does protect you from is stuff like a rogue internal actor, software vulnerability, or government subpoena. The PCC nodes are “airtight” and provably do not retain or share data. This is auditable by the whole security community and clients can verify they are communicating with an auditable binary. It’s not just a white paper.
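To make the "clients can verify" claim concrete, here is a minimal sketch of the idea: a client accepts a node only if the attested digest of its binary appears among publicly released measurements. All names and data here are illustrative assumptions, not Apple's actual API or log format.

```python
import hashlib

# Hypothetical set of published binary measurements (SHA-256 digests).
# In a real deployment this would be fetched from a verifiable
# transparency log, not hard-coded.
PUBLISHED_MEASUREMENTS = {
    hashlib.sha256(b"pcc-release-1.0").hexdigest(),
    hashlib.sha256(b"pcc-release-1.1").hexdigest(),
}

def client_accepts(attested_measurement: str) -> bool:
    """Accept a node only if its attested binary digest matches a
    build the public has been able to inspect."""
    return attested_measurement in PUBLISHED_MEASUREMENTS

# A node attesting to a published build is accepted...
assert client_accepts(hashlib.sha256(b"pcc-release-1.1").hexdigest())
# ...while an unpublished (e.g. secretly modified) build is refused.
assert not client_accepts(hashlib.sha256(b"pcc-backdoored").hexdigest())
```

The point of the design is that a node running anything other than a published binary simply cannot serve traffic, regardless of who operates it.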

That’s an enormous step up from the status quo.


I mean they’re making the code public and inviting external auditors. There’s literally nothing else they can do as a private company. What evidence of the integrity could possibly satisfy you?

And again, the benchmark isn’t “theoretically perfect”, because I agree this isn’t. The benchmark is “other cloud providers” and they are either lying through their teeth or categorically better.


They're making the promises public. The code is being shown to selected individuals deemed suitable, who then tell me by way of proxy. That's nonsense; show us the code if it's that easy to give to others. Anything else is suspiciously furtive.

> The benchmark is “other cloud providers” and they are either lying through their teeth or categorically better.

Other cloud providers aren't part of PRISM and generally don't receive the same level of concern from the world governments. They can afford to resist legal demands from countries they don't respect because they have nothing to lose from denying them access. Apple has not demonstrated that they have the willingness to resist this unlawful coercion, even recently. If they lied about the Push Notification security spec, what's to stop them from lying about this one too?


This has a ring of the same arguments made by flat Earthers. You could offer to take one to near space and show them things, but then every other one will stop believing that one person, so you're told that unless you can take all of them to near space, you can't "prove" what you're trying to prove.

Your argument isn't far off from saying that Apple will collude with lots of security researchers, and because you're not invited to the party, nobody can prove that you're wrong. Oversimplification, yes, but basically true.


> that Apple will collude with lots of security researchers, and because you're not invited to the party, nobody can prove that you're wrong.

Or conversely, because you and I aren't invited, we have been deliberately deprived of a legitimate opportunity and excuse to inspect the code. Even Apple's researchers aren't going to be put in a position where they can say with absolute certainty the server isn't backdoored.

This is puppet-show levels of security theater.


> show us the code if it's that easy to give to others

See https://security.apple.com/documentation/private-cloud-compu...

disclosure: work at Apple, opinions absolutely my own.


The fancy security properties they’re talking about rely on a whole lot of closed source code not included there. Though an Apple intern did “donate” some of it to the public years ago.


Why the ever-living fuck do you think I should take Apple's documentation seriously when they lied through their teeth about the Push Notification architecture?

This isn't source code, it's marketing.


It's a link to source code on GitHub. It could be marketing too, but it's absolutely, incontrovertibly source code.


> They can afford to resist legal demands from countries they don't respect because they have nothing to lose from denying them access.

> Apple has not demonstrated that they have the willingness to resist this unlawful coercion, even recently.

These two statements are rather contradictory. A state is perfectly allowed to write and enforce any laws it deems fit, and if companies want to operate within those states, they need to follow those laws.

You also make it sound like companies like Apple are part of PRISM because they want to be, rather than being forced or coerced into participating. Do you honestly believe that Apple has more geopolitical power than the US state? Entire nations have been humbled by the U.S. state; do you honestly believe that a private enterprise is capable of withstanding that kind of pressure while also remaining within the law?

PRISM as a whole may have eventually been determined to be unlawful, but that was only after the fact, and only because U.S. state secrets were leaked. How is an organisation like Apple supposed to prove that the requests it received were unlawful, when an entirely legal apparatus was used to make it effectively illegal to challenge the orders themselves? It’s a perfect catch-22. To resolve this issue you’re demanding that Apple knowingly break the law sometimes, and that individuals within Apple risk their own freedom and liberties for your convenience.


> A state is perfectly allowed to write and enforce any laws it deems fit, and if companies want to operate within those states, they need to follow those laws.

Or you can just leave. Google did it when China demanded unfair censorship and surveillance measures; Apple could too, if they wanted to market themselves as a security defender with an actual backbone. Right now Apple's whole security shtick feels like the theater you get out of BitLocker or McAfee.

> do you honestly believe that a private enterprise is capable to withstanding that kind of pressure, while also remaining within the law?

No, I believe that a private enterprise claiming to respect privacy as, and I quote, a "human right" would be willing to stand up for the rights it believes in. Whether that means disclosing when things are backdoored, apologizing and preventing further backdoors, or outright open-sourcing their code is up to Apple. They have communicated none of those things clearly or quickly, which leads most people to (correctly) assume their obligation to the state supersedes their obligation to individual privacy.

> How is an organisation like Apple supposed to prove that the requests they received were unlawful [...] when entirely legal apparatus was used to essentially make it illegal to challenge the orders themselves.

By not automating the process? Let's break it down: assuming Apple's PRISM compliance is real, we can assume the status quo is that both Apple and the NSA want to keep the surveillance quiet. Being sneaky with their backdoors is mutually beneficial and allows both of them to maintain plausible deniability when a national news story starts breaking.

The NSA has basically no leverage over Apple. The federal government could punish them for refusing to disclose information in the name of national security, but unless they have dirt on Tim Cook, the NSA is mostly relying on cooperation to get what it wants. Apple, on the other hand, has everything to gain from proving their dedication to security and identifying illegal misconduct within their own services. When they don't identify these things, and admit they were compelled to stay silent about blatant dragnet surveillance, it betrays the faith they advertise to those of us in the security community.


> The code is being shown to selected individuals deemed suitable and then they're telling me by way of proxy.

That is incorrect! The binaries are public and inspectable by anyone. The tools for doing it are bundled right into macOS -- like literally on every consumer's machine.[1]

Furthermore the protocol for connecting to a PCC node involves cryptographic proof that it's running a published binary.

[1] https://security.apple.com/documentation/private-cloud-compu...
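For the skeptical reader, the "cryptographic proof" claim typically works like a Certificate-Transparency-style log: the client verifies a Merkle inclusion proof that the attested binary digest appears in an append-only public log whose root is signed. A minimal sketch of that check follows; the hashing scheme here is the generic RFC 6962 style, and the `pcc-*` leaf names are invented for illustration, not Apple's actual log format.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_inclusion(leaf: bytes, proof: list, root: bytes) -> bool:
    """Walk a Merkle audit path from a leaf up to the log root.
    Each proof step is (sibling_hash, sibling_is_on_the_right)."""
    node = h(b"\x00" + leaf)  # leaf hashes are domain-separated
    for sibling, is_right in proof:
        if is_right:
            node = h(b"\x01" + node + sibling)
        else:
            node = h(b"\x01" + sibling + node)
    return node == root

# Build a tiny two-leaf log to exercise the check.
leaves = [b"pcc-binary-A", b"pcc-binary-B"]
lh = [h(b"\x00" + x) for x in leaves]
root = h(b"\x01" + lh[0] + lh[1])

# Proof that binary A is in the log: its sibling is B's leaf hash.
assert verify_inclusion(b"pcc-binary-A", [(lh[1], True)], root)
# A digest the log never recorded fails verification.
assert not verify_inclusion(b"pcc-backdoor", [(lh[1], True)], root)
```

The practical upshot: the client doesn't have to trust the server's word, only the math, because forging a proof for an unlogged binary would require finding a SHA-256 collision.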


Binaries != code. A security professional cannot evaluate a remote service by inspecting the binary that (supposedly) runs on a remote system. Even under ideal conditions, it's a move that proves you still have something to hide by not just showing people the code your architecture is running on. It's as if Apple will do anything to prove their innocence except remove all doubt.


> Other cloud providers aren't part of PRISM and generally don't receive the same level of concern from the world governments.

Um, Microsoft, Google, Meta, Yahoo, YouTube, Skype, and AOL are/were PRISM Service Providers and I’d argue that they all receive(d) equal (+/- 5%) concern and scrutiny from those world governments.

> They can afford to resist legal demands from countries they don't respect because they have nothing to lose from denying them access.

Are you talking about the cloud providers I listed above? From my perspective, those guys all tend to honor the demands of any state that offers a statistically significant percentage of current/potential consumers, regardless of the demand. Perhaps they have some bright spots where they “did the right thing” (like refusing to unlock a device, or refusing to provide access to private data) but by and large they all—including Apple—are subject to the rules of the states within which they operate.

> Apple has not demonstrated that they have the willingness to resist this unlawful coercion, even recently.

Ten years ago, Apple refused demands by the FBI to unlock the iPhones of various suspects. Four years ago they did the same during the Pensacola Naval Base shooting investigation. I would guess there’s plenty of other examples but I’ve not been watching that stuff much over the past couple years. Were those instances just cherry-picked for marketing purposes? Maybe, but until someone shows me compelling evidence that Apple is /not/ acting in good faith towards both their consumers and the governments under which they operate, I see no reason to believe that they’re “lying about this one too”.

I do keep a salt-encrusted spoon nearby when reading about these things but that doesn’t mean I refuse to trust someone who has demonstrated what appears to me a good-faith effort to keep my privacy intact. Maybe what Apple is doing with PCC is just security theater; I doubt it but I also recognize that marketing and technology are often in conflict so we must always be cautious. But the important thing, both to me and GP, is that none of the other cloud providers have offered (whether it be sane and intelligent privacy controls or just snake oil-like scams) any solution beyond “encrypt your data before you upload it to the cloud”.


> Um, Microsoft, Google, Meta, Yahoo, YouTube, Skype, and AOL are/were PRISM Service Providers

Correct. I didn't say all cloud providers aren't part of PRISM, just that many (most?) aren't scrutinized like Apple is.

> From my perspective, those guys all tend to honor the demands of any state that offers a statistically significant percentage of current/potential consumers

I know. It's awful, we don't have to defend it just because "the other guy" does it. Microsoft and Google left markets over this sort of disagreement, but curiously Apple doesn't.

> until someone shows me compelling evidence that Apple is /not/ acting in good faith towards both their consumers and the governments under which they operate, I see no reason to believe that they’re “lying about this one too”.

...do I have to link you the notification thing again, or is that evidence that the government was acting in bad faith and Apple is entirely scot-free for deliberately lying in their security/privacy marketing while being coerced to pretend nothing bad happened?

See, part of the problem isn't just comparing Apple to their competitors, but to their own advertisements. Apple knew their security was compromised but continued to promote their own security and even fabricate entirely misleading documentation for their own supposed system. This is why I will never be satisfied unless Apple nuts up and shows everyone all of the code. They have proven beyond a shadow of doubt that they will exploit anything we take for granted or are told to accept as-written.



