This is great. I do this sort of git-blame accounting to track how much code is written by AI versus humans in each release of my app.
My "blame script" has been slowing down as the repo size increases. I was just about to add caching, like you have.
Have you thought about adding the ability to limit the stats based on a set of file patterns? Perhaps by passing a patterns file that follows gitignore conventions?
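For instance, something along these lines, where the flag name and the patterns file are purely made up to illustrate the idea:

$ git who table --filter-from=patterns.txt   # hypothetical flag, just for illustration

with patterns.txt holding gitignore-style include/exclude patterns.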
If you have a shell that supports recursive globbing (e.g. Bash with globstar enabled), you could do something like:
$ git who table */**/*.go
That works for me using Bash. I believe all that's happening here is that Bash is expanding the globs and passing a long list of individual filepaths as arguments to git who. Git who then passes them to git log so that it only tallies the commits you'd get by running:
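$ git log -- <the same expanded list of .go filepaths>

(The angle-bracket placeholder stands in for however many individual paths the glob expanded to.)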
Git natively supports excludes in all pathspecs, e.g. `git log -- ':!generated/'` to exclude files in the `generated/` folder from showing up in the log.
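The exclude syntax composes with ordinary pathspecs too; for example, the following tallies only .go files while skipping everything under vendor/ (vendor/ is just an illustrative directory here):

$ git log -- '*.go' ':!vendor/'

And assuming git who really does hand its path arguments straight to git log, the same `':!generated/'` pathspec ought to work after `git who table --` as well.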
My "blame script" has been slowing down as the repo size increases. I was just about to add caching, like you have.
Have you thought about adding the ability to limit the stats based on a set of file patterns? Perhaps like this, where the file follows gitignore conventions?
I tried to quickly add this functionality, but unfortunately I don't know Go.