I'm building a pretty big service that has four user-facing websites and even more backends (HTTP servers, highly bespoke job queues to run ML workloads, etc.).
Managing this across separate repos was an absolute nightmare. I've finally settled on two monorepos: a Yarn/TypeScript/React frontend monorepo and a Rust/Docker backend monorepo.
Does anyone have any advice on these? I sort of stumbled into this pattern on my own and haven't optimized any of it yet.
For Rust, I'm curious if folks have used Bazel for true monorepo build optimization. I don't want to rebuild the world on every push to master.
Likewise for the frontend: is there any way to avoid triggering Netlify builds for all projects when only one project (or its dependencies) changes?

Would super appreciate any advice.
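The one concrete lead I've found so far on the Netlify side is its build "ignore" command, which lets each site decide whether to skip its own build. Something like this (the apps/packages layout is made up) is what I'm planning to try in each site's netlify.toml:

    [build]
      base = "apps/site-a"
      # Runs from the base directory. Exit 0 = skip the build,
      # non-zero = build. Rebuild only when this site or the
      # shared packages changed since the last deployed commit.
      ignore = "git diff --quiet $CACHED_COMMIT_REF $COMMIT_REF -- . ../../packages/shared"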
If the (web) API surface between your BE and FE is based on a schema (i.e. a typed API, like OpenAPI v3 or GraphQL), then I'd put them in a monorepo. That way you can recompile the FE automatically whenever the schema changes (usually an FE client lib is generated from the API schema), which surfaces breakage at compile time.
If your API is not schema-based, you have no way of knowing something broke without FE/UI testing.
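As a minimal TypeScript sketch of what that buys you (the User interface stands in for whatever your generator, e.g. openapi-typescript, actually emits):

    // Hypothetical output of the schema codegen step, regenerated on every build.
    interface User {
      id: string;
      displayName: string; // was `name` until the last schema revision
    }

    // Hand-written FE code, typed against the generated model.
    async function fetchUser(id: string): Promise<User> {
      const res = await fetch(`/api/users/${id}`);
      if (!res.ok) throw new Error(`GET /api/users/${id} failed: ${res.status}`);
      return (await res.json()) as User;
    }

    // Any component still reading `user.name` now fails to compile,
    // instead of rendering undefined in production.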
Bazel should be smart enough to build only what changed. Is it possible that your CI doesn't cache previous runs? With Bazel I successfully used Google Cloud Build to achieve this, storing the bazel-* folders to Google Cloud Storage as the last step of every build and downloading them as the first step.
The target bucket I use has a very short object-lifecycle setting, so I don't even have to clean up old artifacts manually.
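Roughly, the cloudbuild.yaml is shaped like this (bucket name and the fixed "latest" prefix are made up; in practice I use a subfolder per build, more on that below):

    steps:
      # First step: restore the previous build's bazel-* trees
      # (the "|| true" tolerates a cold cache on the first run).
      - name: gcr.io/cloud-builders/gsutil
        entrypoint: bash
        args: ["-c", "gsutil -m cp -r 'gs://my-bazel-cache/latest/*' . || true"]
      # Bazel then rebuilds only the targets whose inputs actually changed.
      - name: gcr.io/cloud-builders/bazel
        args: ["build", "//..."]
      # Last step: store the updated trees for the next run.
      - name: gcr.io/cloud-builders/gsutil
        args: ["-m", "cp", "-r", "bazel-out", "gs://my-bazel-cache/latest/"]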
I'm using GitHub to run builds. I'll have to investigate your setup, because that sounds perfect; I don't know if GitHub can do that.
What do you do if you need an artifact that has been garbage-collected? Manually force a rebuild of that SHA? Keep everything on continuous deploy and update regularly? I may need better CI/CD practices.
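From a first look, it seems like actions/cache could play the same download/upload role in a GitHub Actions workflow; untested sketch of the relevant steps:

    - name: Cache Bazel outputs
      uses: actions/cache@v3
      with:
        path: ~/.cache/bazel
        # Exact key per commit; new commits restore the newest
        # prior cache via the prefix match in restore-keys.
        key: bazel-${{ github.sha }}
        restore-keys: |
          bazel-
    - name: Build
      run: bazel build //...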
To be honest, I never optimized my setup down to the single-artifact level. The way I set this up, the Google Cloud Storage bucket has a subfolder for each build whose name is monotonically increasing (i.e. by including the time in the folder name, like 20220225_23_49_50/bazel-*). That way I can copy the latest build to the Cloud Build VM and still retain history. The object lifecycle settings I use keep artifacts around for one month, and I've never needed to find something outside that window.
There could be smarter ways to do this tbh, like naming folders <date_time>_<commit sha>, but I haven't needed any of that yet.
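The lifecycle part is a one-time setting on the bucket: a lifecycle.json like the one below (age is in days), applied with gsutil lifecycle set lifecycle.json gs://<your-bucket>.

    {
      "rule": [
        { "action": { "type": "Delete" }, "condition": { "age": 30 } }
      ]
    }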