Hacker News

This seems like exactly the kind of idea built for git scraping[0].

Have a GitHub Actions workflow fetch the site in question on a daily or weekly schedule and attempt to commit the fetched artifact to the repo. If the content is identical, nothing happens; otherwise a new commit records the change, and you can now trivially diff the content over time.

[0] https://simonwillison.net/2020/Oct/9/git-scraping/
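The pattern above can be sketched as a single workflow file. This is a minimal, hedged example in the style of the linked git-scraping post; the URL, file name, and cron schedule are placeholders to adapt, not part of the original comment.

```yaml
# .github/workflows/scrape.yml — minimal git-scraping sketch.
# The target URL and output file name below are placeholders.
name: Scrape latest data

on:
  workflow_dispatch:        # allow manual runs
  schedule:
    - cron: "0 6 * * *"     # daily at 06:00 UTC

jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch the page
        run: curl -sL https://example.com/page.html -o snapshot.html
      - name: Commit only if the content changed
        run: |
          git config user.name "Automated"
          git config user.email "actions@users.noreply.github.com"
          git add snapshot.html
          # `git diff --staged --quiet` exits 0 when nothing changed,
          # so the commit/push only runs when the snapshot differs.
          git diff --staged --quiet || (git commit -m "Latest data: $(date -u)" && git push)
```

The key trick is the last step: committing unconditionally would create noisy empty-diff history, so the workflow checks the staged diff first and commits only on real changes.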

I had no idea you could run GitHub Actions as a cron job. That’s pretty cool.
