
Use case: crawling possibly related files based on string search hints in a repo you know nothing about...
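A minimal sketch of what that kind of crawl might look like, assuming a local checkout of the repo and a plain substring hint (the hint string and root path here are made up for illustration):

    import os

    HINT = "parse_config"   # hypothetical search hint
    ROOT = "."              # root of the repo you know nothing about

    # Walk the tree and collect files whose text mentions the hint,
    # so a human can then eyeball each candidate for relevance.
    candidates = []
    for dirpath, _dirs, files in os.walk(ROOT):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if HINT in f.read():
                        candidates.append(path)
            except OSError:
                continue  # unreadable file; skip it

    for path in candidates:
        print(path)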

Something on the order of 6 seconds a page doesn't sound TOO out of human viewing range, depending on how quickly things load and how fast rejects are identified.

I could see ~10 pages/min, which is 600 pages/hour. I could also see the argument that a human would get tired at that rate, and that something closer to 200-300/hr is reasonable.
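Back-of-the-envelope version of that arithmetic, assuming a constant per-page dwell time and a rough two-thirds fatigue discount:

    # Pages per minute/hour at a given dwell time.
    seconds_per_page = 6
    pages_per_minute = 60 / seconds_per_page     # 10
    pages_per_hour = pages_per_minute * 60       # 600

    # A tired human might manage only a third to a half of that.
    print(pages_per_hour, pages_per_hour / 3, pages_per_hour / 2)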

All of that assumes they're limiting based on human-initiated requests, not the 100x requests actually generated when you click a link.
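If they do count only human-initiated requests, one way to approximate that server-side is to ignore subresource fetches and count only top-level navigations, e.g. via the Sec-Fetch-Mode header. A sketch under that assumption; the window and limit values are made up:

    import time
    from collections import defaultdict, deque

    WINDOW = 60   # seconds (assumed)
    LIMIT = 10    # navigations per window (assumed)
    history = defaultdict(deque)

    def allow(client_ip, headers):
        # Subresource fetches (images, scripts, XHR) don't count
        # against the limit; only top-level navigations do.
        if headers.get("Sec-Fetch-Mode") != "navigate":
            return True
        now = time.time()
        q = history[client_ip]
        while q and now - q[0] > WINDOW:
            q.popleft()   # drop navigations outside the window
        if len(q) >= LIMIT:
            return False  # over the per-window navigation budget
        q.append(now)
        return True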