For 25-player lobbies and some persistent state, I can't help but feel it's over-engineered. Certainly many games (including popular MMOs) have been made with much less. This seems like they were engineering defensively against the potential of massive success, which ends up slowing everything down at the point where you need to move most quickly (pre-launch game design iteration and testing).
I don't mean to be reductive though, clearly a lot of work has gone into this architecture and they know their problems better than I do. Props to the devs for seeing it through and getting the game launched!
I appreciate when teams provide such a detailed write-up of their tech stacks. The callouts to bad choices and recovery (e.g., Nomad) are also refreshing to see.
Thanks for sharing! I really enjoy reading these types of articles.
I was a bit surprised to see Perforce being used (I'm not a game developer); I have not come across it since the early 2000s. From what I recall, it was very good at dealing with merges.
Perforce and Plastic (now Unity, ugh) are used widely. They are, AFAIK, the only VCSes that can handle large binaries, huge repos (XX TBs), and out-of-the-box diffing of non-text files. Plus a nice client for non-technical folks (artists) to use.
Perforce has been the standard in video games for at least the last 15 years. Probably more; that's just based on my experience. Any game middleware has to support it. Unreal Engine, even though it's public on GitHub, uses Perforce for its development internally; the GitHub repo is a two-way mirror.
It's only during the actual enforcement of a patent that its validity can be called into question. It isn't so unusual for a patent to be struck down in court. Patent offices have a tendency to hand them out 'easily'.