You really cannot. First from a practical point of view: does the thing really do what _you_ want it to do? A typical OS is much too complicated to verify this (and no, theorem provers just move the problem).
But also from a theoretical point of view: suppose I grant that the source does what you want it to do (again: unrealistic). Then you still need to verify that the software deployed is the software that builds reproducibly from that source. At the end of the day you do that by getting some string of bits from some safe place and comparing it to a string of bits that your software hands you. That "your software" thing can just lie!
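The comparison step being described can be sketched in a few lines of shell (the file names here are made up; in reality "published.sha256" would come from a trusted out-of-band channel — and the point above is that the tool doing the comparison, or anything beneath it, could lie):

```shell
# Toy sketch of the bit-comparison step (file names are placeholders).
# "artifact" stands in for the locally rebuilt binary; "published.sha256"
# stands in for the hash fetched from some safe place.
printf 'pretend-binary\n' > artifact
sha256sum artifact > published.sha256   # pretend this came from a trusted channel

# The verification itself: compare the two strings of bits.
sha256sum -c published.sha256           # prints "artifact: OK" on a match
```

Everything here runs on the very machine being verified, which is exactly the circularity the comment is pointing at.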
And yes, you can make that more complicated (using crypto to sign things, etc.), but that just increases the complexity of the believable lie. If your threat model is "I do not trust my phone manufacturer", then this is enough. In practice that's never the threat model, though.
What are you even talking about? We're talking about security, not 100% correctness, which is indeed not achievable. Security as in: the software doesn't contain backdoors. This is much easier to verify, and the very openness of the code will deter many such attempts.
Also, trust need not be all-or-nothing, contrary to what Apple is trying to train its gullible users to believe. Openness is definitely not a silver bullet, but it makes backdoors less likely, thus increasing your security.
> you do [verification of reproducible builds] by getting some string of bits from some safe place and compare it to a string of bits that your software hands you.
The XZ backdoor was completely in the open. It only got found because an engineer at Microsoft was far too good at controlling his environment and had too much free time to track down a half-second slowdown in SSH logins. So... no, you really cannot verify that there is no backdoor. Not against a well-resourced, patient adversary.
I'm not sure what your links are supposed to be proving. I'm neither of the opinion that PCC is useless, nor am I under the misconception that a signature would provide a guarantee of non-maliciousness. All I'm saying is that, if you include Apple as an adversary in your threat model, you should not trust PCC. But not because it's closed source (or whatever) — simply because you fundamentally cannot trust a hardware and software stack that Apple completely controls all interfaces to.
Personally I don't consider this a useful threat model. But people's situations do vary.
Not unless your entire stack down to the bare silicon is also FLOSS, and the community is able to verify it.
There is a lot of navel gazing in these comments about "the perfect solution", but we all know (or should know) that perfect is the enemy of good enough.
On Qubes OS (my daily driver), which runs everything in VMs with strong hardware virtualization, you can use minimal operating systems with a very small number of installed libraries for security-critical actions: https://www.qubes-os.org/doc/templates/minimal/
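For the curious, setting that up looks roughly like this (a sketch, assuming Qubes 4.x and a Fedora minimal template; the version number and the qube name "vault-critical" are examples, not prescriptions — these commands run in dom0 and won't work elsewhere):

```shell
# Install a minimal template; packages follow the
# qubes-template-<distro>-<version>-minimal naming pattern:
sudo qubes-dom0-update qubes-template-fedora-40-minimal

# Minimal templates ship without passwordless root, so installing
# packages inside one requires running as root explicitly:
qvm-run -u root fedora-40-minimal 'dnf install -y qubes-core-agent-networking'

# Base a security-critical qube on the minimal template:
qvm-create --template fedora-40-minimal --label red vault-critical
```

The small package set is the point: fewer installed libraries means less attack surface in the qubes you care most about.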