This seems like exactly the kind of idea built for git scraping[0].
Have GitHub Actions run a daily or weekly pull of the site in question and try to add the fetched artifact to the repo. If it's identical, no commit is made; otherwise the new content is committed, and you can now trivially diff the changes over time.
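A minimal sketch of such a workflow as a GitHub Actions file (the URL, file name, and schedule here are placeholders, not from the original comment):

```yaml
# .github/workflows/scrape.yml — hypothetical names and URL
name: git-scrape
on:
  schedule:
    - cron: "0 6 * * *"   # daily at 06:00 UTC; "0 6 * * 1" would make it weekly
  workflow_dispatch: {}    # allow manual runs too
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch the page
        run: curl -sSL "https://example.com/page.html" -o page.html
      - name: Commit only if changed
        run: |
          git config user.name "github-actions"
          git config user.email "actions@users.noreply.github.com"
          git add page.html
          # `git diff --cached --quiet` exits non-zero only when staged
          # changes exist, so an unchanged fetch results in no commit
          git diff --cached --quiet || git commit -m "Update page.html"
          git push
```

The commit-if-changed step is the whole trick: git itself does the deduplication, so the history becomes a timeline of only the actual changes.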
[0] https://simonwillison.net/2020/Oct/9/git-scraping/