I maintain a similar tool for managing large files in Git [1], but I went the clean/smudge filter route. I think git-media can get into states where it processes files twice even though it shouldn't. From the Git docs:
> For best results, clean should not alter its output further if it is run twice ("clean→clean" should be equivalent to "clean"), and multiple smudge commands should not alter clean's output ("smudge→smudge→clean" should be equivalent to "clean").
So this isn't the fault of clean/smudge filters, just the way they were used with git-media.
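To make the requirement concrete, here's a minimal sketch of an idempotent clean/smudge pair. The pointer format, the `fatassets` naming, and the in-memory store are all illustrative assumptions, not how any particular tool does it:

```python
import hashlib

# In-memory stand-in for the external object store (a real tool would
# use a content-addressed directory or a remote server).
STORE = {}

# Hypothetical pointer format; anything unambiguous works.
POINTER_PREFIX = b"fatassets-pointer-v1\n"

def clean(data: bytes) -> bytes:
    """Replace real contents with a small pointer file."""
    # Idempotence: if the input is already a pointer, emit it unchanged,
    # so clean(clean(x)) == clean(x).
    if data.startswith(POINTER_PREFIX):
        return data
    digest = hashlib.sha256(data).hexdigest()
    STORE[digest] = data
    return POINTER_PREFIX + digest.encode() + b"\n"

def smudge(data: bytes) -> bytes:
    """Restore real contents from a pointer file."""
    # Pass non-pointers through untouched, so smudge(smudge(x)) == smudge(x)
    # and "smudge -> smudge -> clean" equals "clean".
    if not data.startswith(POINTER_PREFIX):
        return data
    digest = data[len(POINTER_PREFIX):].strip().decode()
    return STORE[digest]
```

Because both directions pass already-converted input straight through, running either filter twice is harmless, which is exactly the property the docs ask for.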
We experimented with smudge/clean filters with our own implementation and they just didn't seem like the right solution for fat asset management.
The most frustrating problem was that filters are executed pretty frequently throughout Git workflows, e.g. on `git diff`, even though assets rarely change. The added latency, though small on each invocation, added up to a jarring experience.
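For context, filter-based tools are wired up roughly like this (the `fatassets` name and commands are illustrative). Because the filter is attached via `.gitattributes`, Git invokes it whenever it needs to compare or convert worktree contents, which is why it fires on so many everyday commands:

```shell
# Define the filter commands; %f is Git's placeholder for the filename.
git config filter.fatassets.clean  "fat-clean %f"
git config filter.fatassets.smudge "fat-smudge %f"

# Route large asset types through the filter via .gitattributes.
echo "*.psd filter=fatassets" >> .gitattributes
```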
I'd also be curious how git-bigstore addresses conflicts. It seems like a lot of the filter-based tools out there don't handle them well for some reason.
[1]: https://github.com/lionheart/git-bigstore