I know that there is a lot of work being done at the Big Cos to meet various regulations and compliance requirements to make sure that data is encrypted at rest and in transit with customer-supplied keys, no unilateral access, end-to-end audit logging, etc.
You don't win the big hundreds-of-millions government/military/finance/healthcare contracts without these sorts of things. The Big Cos are not going to ignore those sorts of opportunities, and are obviously putting in the work, with hundreds or thousands of engineers making their products provably secure, from supply chain to hardware to software to customer support access.
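On the storage side, the "customer supplied keys" part usually boils down to client-side encryption: the provider only ever holds ciphertext. A minimal sketch, assuming Python with the `cryptography` package (the function names are illustrative, not any particular cloud SDK):

    # Illustrative only: client-side encryption with a key the customer holds,
    # so the provider stores ciphertext it cannot decrypt on its own.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_before_upload(plaintext: bytes, customer_key: bytes) -> bytes:
        nonce = os.urandom(12)                 # fresh 96-bit nonce per object
        ct = AESGCM(customer_key).encrypt(nonce, plaintext, None)
        return nonce + ct                      # store nonce alongside ciphertext

    def decrypt_after_download(blob: bytes, customer_key: bytes) -> bytes:
        nonce, ct = blob[:12], blob[12:]
        return AESGCM(customer_key).decrypt(nonce, ct, None)

    key = AESGCM.generate_key(bit_length=256)  # customer-held key, e.g. in their own KMS/HSM
    blob = encrypt_before_upload(b"patient record", key)
    assert decrypt_after_download(blob, key) == b"patient record"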
PCC is fundamentally more secure than merely encrypting at rest and auditing access; the latter still has a variety of attack vectors, such as a software bug that leaks data.
Apple is unable to access the data even if subpoenaed, for example, and this is provable via binary audits and client verification that they are communicating with an auditable node.
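The client-verification step is roughly: the device checks that the node attests to running a software image whose measurement (hash) appears in a public, auditable log of released binaries, and refuses to send data otherwise. A hand-wavy sketch, not Apple's actual PCC protocol (the allowlist and function names here are made up):

    # Conceptual sketch only, not Apple's real attestation flow.
    import hashlib

    # Hypothetical allowlist; in reality fetched from a transparency log of
    # published node images that outside researchers can inspect.
    AUDITED_MEASUREMENTS = {"<sha256 of an audited node image>"}

    def node_is_auditable(attested_image: bytes) -> bool:
        measurement = hashlib.sha256(attested_image).hexdigest()
        return measurement in AUDITED_MEASUREMENTS

    # A real scheme also needs the measurement signed by hardware (remote
    # attestation), so a compromised node can't simply lie about what it runs;
    # the client only sends data if node_is_auditable(...) returns True.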
How is that any different in either direction? Bugs exist in any and all code. Encrypted data is undecryptable if you don't have the keys.
I don't see that Apple software is any different in that regard (just try using macOS for any length of time, even on Apple silicon, and you run out of fingers to count obvious UI bugs pretty quickly in day-to-day usage). And obviously AWS won't be able to decrypt your data without your keys either.
The people running these huge multi-billion-dollar clouds are not idiots making fundamental errors in security. This is why they all pay mega salaries for highly skilled people, offer five-figure bug bounties, etc. - they take this seriously. Would some random VPS provider or whatever be more likely to make errors like this? Sure - but they are not (and not expected to be) in the same league.
You forget that big companies will gladly offer shit to their customers, but will then offer something better to those who are willing to pay more (id est, governments). Why make things more secure for everyone if not doing so can make more money?