I'm using GitHub to run builds. I'll have to investigate your setup, because that sounds perfect; I don't know whether GitHub can do that.
What do you do if you need an artifact that has already been garbage collected? Manually force a rebuild of that SHA? Keep things on continuous deploy and update regularly? I may need better CI/CD practices.
To be honest, I never optimized my setup down to the single-artifact level. The way I set this up: in a Google Cloud Storage bucket I have a subfolder for each build whose name is monotonically increasing (i.e., the folder name includes the timestamp, like 20220225_23_49_50/bazel-*). That way I can copy the latest build to the cloud build VM and still retain history. The object lifecycle settings I use keep artifacts around for one month, and I've never needed to find anything outside that window.
There could be smarter ways to do this, tbh, like a <timestamp>_<commit SHA> naming scheme, but I haven't needed any of that yet.
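A rough sketch of what this looks like with the google-cloud-storage Python client, in case it helps. The bucket name and the upload_build/latest_build_prefix helpers here are made up for illustration, not the actual setup; the point is the 30-day lifecycle delete rule and the fact that lexicographically sorting the timestamped prefixes gives you the newest build:

```python
from datetime import datetime, timezone
from pathlib import Path

from google.cloud import storage

BUCKET_NAME = "my-build-artifacts"  # placeholder, not the real bucket

client = storage.Client()
bucket = client.get_bucket(BUCKET_NAME)

# One-time setup: a GCS object lifecycle rule that deletes objects
# older than 30 days -- the one-month retention window described above.
bucket.add_lifecycle_delete_rule(age=30)
bucket.patch()


def upload_build(build_dir: str, commit_sha: str | None = None) -> str:
    """Upload a build tree under a monotonically increasing timestamped prefix."""
    prefix = datetime.now(timezone.utc).strftime("%Y%m%d_%H_%M_%S")
    if commit_sha:  # the optional <timestamp>_<commit SHA> variant
        prefix = f"{prefix}_{commit_sha}"
    root = Path(build_dir)
    for path in root.rglob("*"):
        if path.is_file():
            rel = path.relative_to(root).as_posix()
            bucket.blob(f"{prefix}/{rel}").upload_from_filename(str(path))
    return prefix


def latest_build_prefix() -> str | None:
    """The newest build is the lexicographic max of the top-level 'folders',
    which is exactly why the names have to sort monotonically."""
    blobs = client.list_blobs(BUCKET_NAME, delimiter="/")
    list(blobs)  # consume the iterator so .prefixes gets populated
    return max(blobs.prefixes, default=None)
```

Pulling the latest build onto the cloud build VM is then just a recursive copy (e.g. gsutil cp -r) of whatever latest_build_prefix() returns.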