Not downtime, but some kind of issue, yes [1]. However, GitHub is not much better [2] and gives much less detail and fewer updates. Bitbucket has been more solid recently [3] (but wasn't really better than the others in 2020).
It is usually their shared runners. I experience outages with them a couple of times a month. It isn't a big deal, since we maintain our own private runner pool and I just move jobs there. (GitLab charges for shared-runner CI minutes, so we only use the shared runners for certain jobs anyway.)
I use GitLab for my side project (source control only; don't have any kind of CI set up just at the moment) and it's been years since there was any really noticeable slowness for me. I can't say I use it daily, but fairly frequently.
Yes. They (GitHub) have been doing a lot worse. Last time I checked, they were completely down two days ago [0]. Before that, GitHub Actions was down 14 days ago [1].
At least with GitLab you can set up a self-hosted instance for free as a backup, unlike GitHub (unless you want to pay a lot for GitHub Enterprise). Some teams have 'gone all in on GitHub Actions' and then couldn't push that critical change [2] before the start of the weekend. Oh dear.
Maybe it's time to set up a self-hosted backup VCS and not depend entirely on GitHub/GitLab web.
> Maybe it's time to set up a self-hosted backup VCS and not depend entirely on GitHub/GitLab web.
I've been wanting to do this for some time, purely for git hosting (don't care about CI at all), but wasn't sure how to make failover work...
Personally I'd be happy with a headless git server, but that's not fair on everyone else who wants a GUI to browse and organise things, so I want GitLab etc. to handle that. What would be nice is a headless backup server that lets everyone keep pulling/pushing from the CLI with their existing repos when GitLab is down. I can't see a smooth way of doing that, since it would mean messing with everyone's git remotes, unless the solution is inverted: a single remote pointing at the "backup" server, which then replicates to GitLab. But I don't think GitLab can be configured to change the default remote people get when they clone from the UI. (A rough sketch of the remote-tweaking approach is below.)
I suppose this is why people just end up self-hosting GitLab instead.
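For what it's worth, the messing-with-remotes version can be fairly low-touch: give each clone a second push URL, so every git push lands on both the primary and a backup host while fetches keep coming from the primary. A minimal sketch, assuming a plain git server reachable at the made-up backup.example.com address (both URLs are placeholders):

    #!/usr/bin/env python3
    # add_backup_push_url.py -- make "origin" push to both the primary host
    # and a backup host, while fetches keep coming from the primary.
    # Both URLs below are placeholders; substitute your own.
    import subprocess

    PRIMARY = "git@gitlab.com:team/repo.git"           # existing origin URL
    BACKUP = "git@backup.example.com:team/repo.git"    # hypothetical backup host

    def git(*args):
        subprocess.run(["git", *args], check=True)

    # Set the primary push URL explicitly, then add the backup as a second
    # push URL.  After this, `git push origin` pushes to both servers.
    git("remote", "set-url", "--push", "origin", PRIMARY)
    git("remote", "set-url", "--add", "--push", "origin", BACKUP)

    # Show the result: one fetch URL, two push URLs.
    git("remote", "-v")

It doesn't help with pulls while GitLab is down (you'd still fetch from the backup explicitly), but it keeps two hosts in sync with no server-side configuration at all.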
If you want transparent failover you need to use your own domain for the repo URL. In case of a failure you (or some automatic job) would have to change the DNS record.
GitLab has a repo mirroring feature [1], but of course you'd also need to sync users' public keys.
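If you'd rather script that than click through the UI, GitLab exposes push mirroring through its remote mirrors API. A sketch, where the project ID, token variable, and mirror URL are placeholders (worth double-checking the attribute names against the docs for your GitLab version):

    #!/usr/bin/env python3
    # create_push_mirror.py -- configure a push mirror for a GitLab project
    # via the remote mirrors API.  IDs, tokens and URLs are placeholders.
    import os
    import requests

    GITLAB_API = "https://gitlab.com/api/v4"
    PROJECT_ID = 12345                      # hypothetical project ID
    TOKEN = os.environ["GITLAB_TOKEN"]      # personal access token with api scope

    # GitLab expects the mirror target's credentials embedded in the URL.
    mirror_url = "https://mirroruser:mirrortoken@backup.example.com/team/repo.git"

    resp = requests.post(
        f"{GITLAB_API}/projects/{PROJECT_ID}/remote_mirrors",
        headers={"PRIVATE-TOKEN": TOKEN},
        json={"url": mirror_url, "enabled": True},
    )
    resp.raise_for_status()
    print(resp.json())

For the key half, each user's public SSH keys can be fetched with GET /users/:id/keys if the backup host needs to accept the same keys, as far as I remember.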
Downtimes are so infrequent and relatively short that this isn't worth the effort for me.
You can instead set up a read-only mirror so people can at least still pull and browse the code. Gitea [2] might be a better choice than GitLab for that, since it's much more lightweight and easier to host.
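Even without Gitea, a bare "git clone --mirror" refreshed from cron covers the "still able to pull" part. A sketch of the refresh job (the upstream URL and mirror path are examples):

    #!/usr/bin/env python3
    # refresh_mirror.py -- keep a bare, read-only mirror of a repo up to date.
    # Meant to run from cron; the upstream URL and mirror path are examples.
    import os
    import subprocess

    UPSTREAM = "git@gitlab.com:team/repo.git"
    MIRROR_DIR = "/srv/git/repo.git"

    if not os.path.isdir(MIRROR_DIR):
        # First run: create a bare mirror clone (all refs, no working tree).
        subprocess.run(["git", "clone", "--mirror", UPSTREAM, MIRROR_DIR],
                       check=True)
    else:
        # Later runs: fetch everything and prune refs deleted upstream.
        subprocess.run(["git", "-C", MIRROR_DIR, "remote", "update", "--prune"],
                       check=True)

Anyone who can SSH to that box can then clone or pull from /srv/git/repo.git when the main host is having a bad day; browsing the code nicely is where something like Gitea earns its keep.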
> Downtimes are so infrequent and relatively short that this isn't worth the effort for me.
That's essentially the same conclusion I keep coming to. Occasionally it has hit me when I go to push something, but it has rarely blocked me or anyone else from continuing to work.
> You can instead set up a read-only mirror so people can at least still pull and browse the code
Yeah, this is something I need to do eventually, just for peace of mind, as a more automated backup solution. At least then I don't have to care about failover.
Maybe some kind of pre/post-push hook to automatically push to a secondary repo? I'm not entirely sure if that's possible; I've never written a Git hook before.
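Git has no post-push hook, but it does have a client-side pre-push hook, and a hook can be any executable, so a Python sketch works. Assuming a second remote named "backup" has already been added (the backup URL is made up), something like this saved as .git/hooks/pre-push and marked executable would mirror every push:

    #!/usr/bin/env python3
    # .git/hooks/pre-push -- also push to a secondary remote on every push.
    # Assumes a remote called "backup" was added first, e.g.:
    #   git remote add backup git@backup.example.com:team/repo.git
    import subprocess
    import sys

    # Git passes the remote's name (or URL) as the first argument.
    remote_name = sys.argv[1] if len(sys.argv) > 1 else ""

    # Only mirror pushes aimed at other remotes, not pushes to the backup
    # itself; --no-verify also keeps the inner push from re-running this hook.
    if remote_name != "backup":
        result = subprocess.run(["git", "push", "--no-verify", "backup", "--all"])
        if result.returncode != 0:
            print("warning: mirroring to backup remote failed", file=sys.stderr)

    sys.exit(0)  # never block the real push

On a self-hosted remote you could do the same thing server-side with a post-receive hook, but you can't install those on gitlab.com, so the client-side hook (or the push mirroring mentioned above) is the practical option.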
GitLab was completely down until just now, and something similar happened not long ago. Over the last three weeks we've continuously had issues with the GitLab runners (even non-shared ones) where jobs wouldn't run at all, or only after a second or third retry attempt.
I think the flakiness of jobs succeeding is also related to whether you use their package registry. I can't say I've had similar issues with GitHub.
Like it or not, fortnightly and “once every two weeks” are your only unambiguous choices.
We actually had a programming interview question at a previous job that used "biweekly". It was about thorough requirements definition and agreeing on how things are measured.