I'm in casino gaming. We have to send our source and tools to regulatory test labs so they can (hopefully) independently generate the same binary as what we are delivering. Given our tools (C++ and Windows), 'binary reproducibility'[1] is impossible, but we've got a workaround. We do our release builds on a VirtualBox that's all tooled up. When it comes time to deliver to the lab, we export the entire box (with source already synced) as an .ova. Part of our build pipeline is a tool that strips things like timestamps and paths from the PE files. Some people don't go to all this trouble and instead use tools like Zynamics BinDiff to explain away the diffs.
What are the companies that provide this service (reproducing builds)? I haven't heard of this, but sounds interesting.
Depending on how much effort you're willing to put in, even with C++ on Windows you can still write a program to parse the executable and zero out timestamps and other non-deterministic data. I believe this is actually being done for a Bitcoin-related Windows program.
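As a sketch of what such a tool might look like: the COFF file header that follows the PE signature carries a TimeDateStamp field, and zeroing it in place is just a matter of following the e_lfanew pointer from the DOS header. (A real stripping tool would also need to scrub debug-directory timestamps, export-table timestamps, and embedded paths; this minimal sketch only handles the main header, and the function name is my own.)

```python
import struct

def zero_pe_timestamp(data: bytearray) -> bytearray:
    """Zero the COFF TimeDateStamp of a PE image held in memory."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ signature)")
    # e_lfanew at offset 0x3C in the DOS header points to the PE signature
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("PE signature not found")
    # COFF header follows the 4-byte signature: Machine (2 bytes),
    # NumberOfSections (2 bytes), then TimeDateStamp (4 bytes) at +8
    struct.pack_into("<I", data, pe_offset + 8, 0)
    return data
```

Run over every delivered PE file as a post-build step, this removes one of the main sources of spurious diffs between otherwise identical builds.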
How do you generate and verify the VirtualBox? If you send the image over to the test lab, then the obvious thing to do is for someone to attack your VirtualBox, and you have the same problem all over again, just at a different level.
For jurisdictions that don't have their own state-run labs (so not NV, NJ, PA, etc.), everybody uses one or a mix of GLI[1], BMM[2], and Eclipse[3]. Note: I'm only familiar with US gaming.
We do have a tool to zero these parts of the executable files out, but in our testing we still saw differences we couldn't explain unless we built on the same machine from the same sync.
The VirtualBox was generated once (installed Windows, Visual Studio, .NET, some others) and we just continue to use the same base .ova.
The package has to be sent to the lab on physical media where it gets loaded onto an offline machine that we've supplied.
This works for your goal (being able to reproduce the binary build), but in Mozilla's case it's slightly different.
Being FLOSS software, Mozilla's goal is that end users can completely reproduce the builds from source. This includes dependencies, toolchains, AND the build environment. In this scenario, accepting a pre-built binary VM would not be acceptable, since it defeats the spirit of FLOSS.
I used to work in the same industry. We used Linux and GCC, so we could, and did, produce fully deterministic builds. In fact, the output was fully deterministic disk images.
I did one iteration of the build system, mostly making it so that any host could build it deterministically. This was years ago, so it was just a chroot that started with a skeleton plus GCC and procedurally built the things it needed to build the outputs. It was fairly straightforward: an extremely short patch here and there, and a 1000-line Xorg Makefile for staging the Xorg builds. If I were doing it again I'd consider reusing a package manager, but each component's Makefile was pretty concise. My trusty sidekick was a script that xxd'd two files into pipes that it opened with vimdiff.
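That hex-diff sidekick is easy to approximate. Here's a minimal Python analogue (the original was xxd piped into vimdiff; this function and its name are my own) that reports the byte offsets where two build outputs diverge, so you can tell whether diffs land only in known spots like timestamps or somewhere unexpected:

```python
def diff_ranges(a: bytes, b: bytes):
    """Return (start, end) offsets of contiguous runs where a and b differ.

    Bytes past the end of the shorter input count as differing, so a
    length mismatch shows up as a trailing range.
    """
    ranges = []
    start = None
    n = max(len(a), len(b))
    for i in range(n):
        same = i < len(a) and i < len(b) and a[i] == b[i]
        if not same and start is None:
            start = i            # a differing run begins here
        elif same and start is not None:
            ranges.append((start, i))
            start = None
    if start is not None:        # run extends to the end
        ranges.append((start, n))
    return ranges
```

Point it at two builds of the same source and an empty result means byte-identical output; a couple of tiny ranges usually means a stray timestamp or embedded path survived the stripping pass.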
So the regulators have to use the provided virtual machine and tools to build the source, and verify that the resulting binary is the same as provided by your company?
How do they confirm that the toolchain has not been messed with? Surely they can't binary-check the whole OS/compiler/linker/other software in the VM?
[1] https://www.google.com/?gws_rd=ssl#q=binary+reproducibility