There's some truth to this. I'd say that Git becomes unsuitable above certain thresholds for number of files, number of commits, and size of data, but that size of data (and the fact that certain file types balloon it on every edit) is probably the one most people hit first – it sounds like you may have.
I've heard of git monorepos with >1k microservices, and it sounded like it wasn't a huge problem, but I suspect they were well under 100GB. In fact, I'd guess you need to be committing large binary files to get anywhere near that size.
> I'd say that perhaps Git is unsuitable above certain thresholds on numbers of files
Scott Chacon, in a recent talk about git[0], said that Microsoft uses git to version-control Windows. They have ~3.5 million files in the repo, which is about 300 gigabytes in size. It takes some tweaks to get git performing well at that scale, but still, I'd say it's a pretty impressive threshold.
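For anyone curious what those tweaks look like in practice: I don't know exactly what Microsoft runs internally, but git ships several real config knobs aimed at large repos. A hedged sketch (all of these are genuine git config keys / subcommands; how much each helps depends on your platform and git version):

```shell
# Use the commit-graph file to speed up history walks (log, merge-base, etc.)
git config core.commitGraph true
git commit-graph write --reachable

# Keep the commit-graph fresh on fetch
git config fetch.writeCommitGraph true

# Cache untracked-file scans so `git status` doesn't re-walk the whole tree
git config core.untrackedCache true

# Builtin filesystem monitor (git >= 2.37 on supported platforms):
# avoids stat()ing millions of files on every status
git config core.fsmonitor true

# Only materialize the directories you actually work on
git sparse-checkout init --cone
git sparse-checkout set src/my-component

# Background maintenance: incremental repacks, prefetch, commit-graph updates
git maintenance start
```

There's also `scalar clone`, which ships with modern git and applies most of these defaults (partial clone, sparse checkout, background maintenance) in one step.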