There are no presubmits that prevent breaking changes from "going into live". When a shared-infrastructure update ships with a breaking change, the merge from live breaks for multiple individual teams, rather than the change being blocked from submission in the first place.
“Merging to live” builds and tests all packages that depend on the update.
So for example, merging a new JDK build into live will build and test all Java packages in the previous live; all of them need to pass their package's tests, and only then will the JDK update be "committed into live".
The only difference is when the checks run: Google runs all the presubmits / "dry run against live" checks in the CL workflow, while Amazon runs them post-CL, in the "merge VersionSet" workflow.
If it is a new major version of the JDK, then no: no existing code has a dependency on that new major version, so nothing in live gets rebuilt against the new JDK build, and it will simply be committed.
If it is a new commit of an existing major version, then yes, existing code in live will be rebuilt before the package is committed to live.
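The rule described above can be sketched in a few lines. This is a hypothetical toy model, not Amazon's actual Brazil/VersionSet implementation: the `Live`, `merge`, and `build_and_test` names are made up for illustration. The key point it captures is that a candidate build is committed only after every package already in live that depends on it has been rebuilt against it and passed its tests, and that a brand-new major version has no dependents, so it commits without any rebuilds.

```python
# Toy model (hypothetical names, not the real Brazil/VersionSet API) of the
# "merge to live" rule: commit a build only if all dependents still pass.
from collections import defaultdict

class Live:
    def __init__(self):
        self.versions = {}                    # package -> committed build
        self.reverse_deps = defaultdict(set)  # package -> dependents in live

    def add_dependency(self, pkg, dependency):
        self.reverse_deps[dependency].add(pkg)

    def merge(self, pkg, build, build_and_test):
        # Rebuild and test every dependent against the candidate build.
        # A new major version has no dependents in live, so this loop is
        # empty and the build is committed immediately.
        for dependent in self.reverse_deps[pkg]:
            if not build_and_test(dependent, {pkg: build}):
                return False  # merge rejected; live is left unchanged
        self.versions[pkg] = build
        return True

live = Live()
live.versions["JDK-11"] = "build-1"
live.versions["ServiceA"] = "build-7"
live.add_dependency("ServiceA", "JDK-11")

# New build of an existing major version: ServiceA must rebuild and pass.
committed = live.merge("JDK-11", "build-2", lambda dep, overrides: True)

# A failing dependent blocks the merge, leaving live unchanged.
blocked = live.merge("JDK-11", "build-3", lambda dep, overrides: False)

# A new major version has no dependents, so it commits without rebuilds
# even though the build-and-test callback would fail if it were called.
new_major = live.merge("JDK-17", "build-1", lambda dep, overrides: False)
```

Whether this gate runs before the change is submitted (Google's presubmit) or after, during the version-set merge (Amazon), the same invariant holds: live only ever contains combinations that built and tested together.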
With an appropriately configured CI pipeline, submitted / pushed code does not go live anyway unless all tests and other checks pass. Unless a test case is missing, which can happen in a monorepo just as well, the code is always checked for the defect.
It's impossible to test for every kind of regression. Concurrency and performance bugs are notoriously problematic. At the scales of large codebases, you can have very thorough tests, but they need to be reasonably fast and behave the same way every time they run.
True, but those notoriously hard-to-find bugs are still notoriously hard to find in a monorepo as well. I don't see anything in the character of a monorepo that would somehow make concurrency bugs easier to find than in a non-monorepo setting.