Unfortunately, places like Stack Overflow don't allow "subjective" questions, so hopefully people here don't mind an open-ended discussion of software architecture.
I'd like to build a download accelerator (a sort of peer-to-peer, torrent-like application that enhances HTTP downloads), but I'm not sure how to go about limiting connection speeds, prioritizing chunks from different hosts, etc. The idea is that there's a mechanism to auto-discover other servers hosting the same content, plus some p2p NAT-punching infrastructure so that you can help serve collections of files.
Does anyone have recommendations for required reading? Any suggestions on how you'd generally handle rate limiting and prioritization across different downloads?
How would you architect a system for downloading content from a bunch of http mirrors?
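One common shape for this (a sketch, not a definitive design) is to split the file into byte ranges and let each mirror pull the next unclaimed chunk off a shared queue, so faster mirrors naturally end up serving more chunks. This assumes the mirrors support HTTP Range requests and that you already know the file's total size (e.g. from a HEAD request); the function and parameter names here are hypothetical:

```python
import threading
import urllib.request
from queue import Queue, Empty


def fetch_range(url, start, end):
    """Fetch bytes [start, end] of url via an HTTP Range request.
    Assumes the mirror honors Range (responds 206 Partial Content)."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()


def download(mirrors, total_size, dest, fetch=fetch_range, chunk_size=1 << 20):
    """Download total_size bytes to dest, pulling chunks from mirrors in parallel.

    Each worker thread owns one mirror and repeatedly claims the next chunk
    from a shared queue -- a simple form of work stealing, so per-mirror
    throughput differences balance out without explicit scheduling.
    """
    chunks = Queue()
    for start in range(0, total_size, chunk_size):
        chunks.put((start, min(start + chunk_size, total_size) - 1))

    results = {}  # start offset -> chunk bytes (dict writes are GIL-safe)

    def worker(url):
        while True:
            try:
                start, end = chunks.get_nowait()
            except Empty:
                return  # no chunks left for this mirror
            results[start] = fetch(url, start, end)

    threads = [threading.Thread(target=worker, args=(m,)) for m in mirrors]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Reassemble chunks in offset order.
    with open(dest, "wb") as f:
        for start in sorted(results):
            f.write(results[start])
```

A real accelerator would add retries, re-queueing chunks from a dead mirror, and streaming chunks to disk instead of buffering them, but the queue-of-ranges core stays the same.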
For the rate-limiting part, a particularly simple algorithm is the leaky bucket: https://en.wikipedia.org/wiki/Leaky_bucket
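As a rough sketch of the idea: the bucket fills as bytes are sent and drains at a fixed rate, and when sending would overflow it, you sleep until enough has drained. The class and parameter names below are illustrative, not from any particular library:

```python
import time


class LeakyBucket:
    """Leaky-bucket rate limiter.

    The bucket drains at `rate` bytes/second and holds at most `capacity`
    bytes, so `capacity` also sets the allowed burst size.
    """

    def __init__(self, rate, capacity):
        self.rate = rate          # drain rate, bytes per second
        self.capacity = capacity  # maximum fill level (burst allowance)
        self.level = 0.0          # current fill level in bytes
        self.last = time.monotonic()

    def _drain(self):
        # Leak out whatever has drained since the last check.
        now = time.monotonic()
        self.level = max(0.0, self.level - (now - self.last) * self.rate)
        self.last = now

    def wait_time(self, nbytes):
        """Seconds to wait before nbytes can be sent without overflowing."""
        self._drain()
        overflow = self.level + nbytes - self.capacity
        return max(0.0, overflow / self.rate)

    def send(self, nbytes):
        """Block until nbytes fit, then account for them."""
        delay = self.wait_time(nbytes)
        if delay > 0:
            time.sleep(delay)
            self._drain()
        self.level += nbytes
```

In a downloader you'd call `send(len(chunk))` around each socket read/write; sharing one bucket across all connections caps aggregate throughput, while per-connection buckets let you prioritize one host over another.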