
Other than paying them for hardware that has no discernible advertising revenue for Apple...

My personal favourite example is device encryption. Briefly, I worked in the mobile device management (MDM) space when that was a very new thing, and all the major manufacturers had to start adding device encryption to meet enterprise policy requirements. (At the time, all new Windows laptops used BitLocker.)

So the vendors did that. They added encryption to their devices.

The Android spec sheet added a line:

   Encryption: Yes
Apple had a tech day talk where the head of the device encryption dev team talked for an hour about the four layers of encryption on an iPhone. How it decrypts the bare minimum when booting, and keeps itself in a partially-decrypted mode while locked. How there's a bunch of fine-grained keys so that app-specific data can't leak out. So on, and so forth. The aim was to prevent state-sponsored groups from pulling apart locked-but-powered-on phones and extracting plaintext secrets directly from the flash, cache memory, or whatever.
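To make the "fine-grained keys" idea concrete, here's a toy sketch of a layered key hierarchy of the general shape described above (passcode → master key → per-class keys → per-file keys). This is my own illustration, not Apple's implementation: real iOS uses hardware AES, a Secure Enclave, and proper key wrapping, whereas this uses a deliberately simple XOR "wrap" just to show the structure. Names like `device_uid` are stand-ins.

```python
import hashlib
import os

def wrap(kek: bytes, key: bytes) -> bytes:
    # Toy "key wrap": XOR with a keystream derived from the wrapping key.
    # NOT real cryptography; real systems use AES key wrapping.
    stream = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

unwrap = wrap  # XOR wrapping is its own inverse

# Master key derived from the user's passcode (in real hardware this is
# entangled with a device-unique secret; 'device_uid' is a stand-in).
device_uid = b"example-device-uid"
master_key = hashlib.pbkdf2_hmac("sha256", b"user-passcode", device_uid, 100_000)

# One key per protection class; the most sensitive class keys get evicted
# from memory whenever the device locks.
class_key = os.urandom(32)
wrapped_class_key = wrap(master_key, class_key)

# Each file gets its own random key, wrapped by its class key, so one
# leaked file key exposes only that one file.
file_key = os.urandom(32)
wrapped_file_key = wrap(class_key, file_key)

# While unlocked: unwrap the class key, then the file key.
assert unwrap(master_key, wrapped_class_key) == class_key
assert unwrap(class_key, wrapped_file_key) == file_key

# On lock, the plaintext class key is discarded; only wrapped blobs remain
# on flash, so file contents can't be recovered without the passcode.
class_key = None
```

The point of the layering is that a locked-but-powered-on phone holds only wrapped key material for the sensitive classes, which is what frustrates the flash-extraction attacks mentioned above.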

They had thought of everything. It was as good as encryption could be made without compromising on functionality. I dabble in the Enterprise PKI space also, and the only time I had seen a design this thorough was the internal Ethernet network of the Boeing 787 plane[1].

Afterwards I paid more attention every time there was some fight between a government agency like the FBI and Apple. Each and every time, Apple chose the side of their customers, locking things down further with secure enclaves, anti-hammering protections, in-house security-critical silicon design, and more.

As a reminder, Google's level of encryption is simply "yes". A checkbox tick to meet a requirement, that's all. They really, really don't care about your data security, and it shows in the way they act in practice.

I see similar customer-centric design elements in Apple AirTags. I read the whitepaper on the cryptographic algorithms they used. These very cleverly provide the maximum possible functionality with the minimum possible user data exposure. It's private and useful at the same time.
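A toy sketch of the core idea in that design, as I understand it: the tag broadcasts an identifier that rotates every time period, so passive observers can't link two broadcasts to the same tag, while the owner (who holds the seed) can recognize every one of them. This is my own simplified illustration, not the actual protocol; real AirTags use rotating elliptic-curve public keys, and nearby finder devices encrypt location reports to those keys so only the owner can read them.

```python
import hashlib
import hmac

def rotated_id(seed: bytes, period: int) -> bytes:
    # Derive a fresh broadcast identifier for each time period.
    # Without the seed, identifiers from different periods look unrelated.
    return hmac.new(seed, period.to_bytes(8, "big"), hashlib.sha256).digest()[:16]

# Secret shared between the tag and the owner's devices at pairing time.
seed = b"owner-and-tag-shared-seed"

# The tag broadcasts this period's identifier over Bluetooth.
broadcast_t0 = rotated_id(seed, 0)
broadcast_t1 = rotated_id(seed, 1)

# Identifiers change every period, so a passive observer can't track the tag...
assert broadcast_t0 != broadcast_t1

# ...but the owner, holding the seed, recomputes and recognizes each one.
assert rotated_id(seed, 0) == broadcast_t0
```

That's the "maximum functionality, minimum exposure" trade-off: the crowd-sourced network can relay sightings without anyone in the middle being able to build a location history for the tag.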

Google has never published anything of that sort, ever. I'm pretty sure of it.

If you disagree, please link to a whitepaper outlining privacy-guaranteeing technology that they developed and have included in a product that they sell. Or Epic. Or Facebook. Or anyone else.

[1] Yeah, yeah, Boeing is a bad company with bad leadership, but their engineers know their stuff.



> Google has never published anything of that sort, ever. I'm pretty sure of it.

Android Enterprise Security White Paper

https://static.googleusercontent.com/media/www.android.com/e...

Chrome browser security

https://services.google.com/fh/files/misc/chromebrowsersecur...

Google Cloud security Whitepaper

https://services.google.com/fh/files/misc/google_security_wp...

Facebook

Communicating About Privacy: Towards People-Centered and Accountable Design

https://about.fb.com/wp-content/uploads/2020/07/Privacy-Tran...

By Epic, I'm assuming you don't mean the public interest research center in Washington, DC seeking to protect privacy, freedom of expression, and democratic values in the information age.

Their white paper about the harms of generative AI is here, though: https://epic.org/wp-content/uploads/2023/05/EPIC-Generative-...


It's my fault that I wasn't clear in my request: I was specifically requesting some fundamentally new privacy-centric research, not pre-existing technologies like SELinux or X.509 being added as a checklist item to appease enterprise customers.

I.e.: From what I can tell, Google spends approximately $0 on privacy research such as security technology that only benefits end-users.

You linked to a bunch of security technologies that Google just copy-pasted into their products to compete with Apple and Microsoft, or to meet large customer requirements.

Also note that GCP is one of the few offerings Google has where the "users are the customers" instead of products. Despite this, there's nothing in that whitepaper you linked that impresses me as unique or special about GCP in comparison to AWS or Azure. I never hear anything out of Google that's even vaguely privacy-first, even in such divisions.

Linking to a paper with the word "privacy" in it from Facebook is just hilarious.


Old goalpost ---> New goalpost.


> Each and every time, Apple chose the side of their customers, locking things down further with secure enclaves, anti-hammering protections, in-house security-critical silicon design, and more.

With Apple, you're allowed to have privacy unless, of course, you're trying to promote democracy in Russia [1], organize anti-government protests in China [2], or otherwise communicate privately without the government's approval [3].

Of course, all of that security is theater as far as government agencies are concerned, because Cellebrite, a US-allied firm, can unlock the devices a few months after each iOS release [4].

[1] https://www.nytimes.com/2021/09/17/world/europe/russia-naval...

[2] https://www.bbc.com/news/world-asia-china-65830185

[3] https://www.washingtonpost.com/technology/2024/04/19/apple-w...

[4] https://www.documentcloud.org/documents/24833832-cellebrite-...


> Apple had a tech day talk where the guy in charge of device encryption dev team talked for an hour about the four layers of encryption on an iPhone.

Did they also talk about all the vulnerabilities that let a single iMessage root a phone, without any user interaction?

Yes, multiple ones… because they don't bother (or don't want?) to fix them properly.


You’re being downvoted (by others) because you confused design intention with errors in implementation.

All vendors have bugs in their products, including security bugs.

Apple is one of the few trying to protect the privacy of their users with technology.

Google does not. The entire company exists to drain your personal data into a giant pond for their ad-tech analysts to splash around in.


You are confused. The implementation error is most likely an intentional backdoor that got discovered and reported, which they had to fix, but not too thoroughly, since it got found out and reported again.

Remember when Apple had their own JRE implementation (forked from Sun's) and they were fixing well-known vulnerabilities after months rather than hours, like Sun was doing?

> Apple is one of the few trying to protect the privacy of their users with technology.

Their privacy-protecting effort starts and ends inside their marketing department. No engineering employee is involved in that. And judging by your posts, their strategy is working great: they can mine your data for money like everyone else while convincing you otherwise.



