Decentralization adds a good deal of risk and operational overhead, but it doesn't really change the core problem: people are generating more data than many of them want to pay to store. If you want to store a non-trivial amount of data, someone needs to get paid to maintain storage pools and validate multiple copies.
Take whatever the backups are, say 1TB, split it into 12 data pieces, add as many parity pieces as the user wants (say 6 on average), and then any 12 of the pieces are enough to recover the whole set.
When you add another 1TB, add another 18 peers.
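The arithmetic behind that scheme is worth making explicit. A rough sketch, with the 12-of-18 numbers from above (a real system would use an actual Reed-Solomon library rather than this parameter math alone):

```python
# Erasure-coding arithmetic for the scheme described above: split a backup
# into k data shards, add m parity shards, and any k of the k+m shards
# suffice to rebuild. The k=12, m=6 defaults are the illustrative numbers
# from the comment, not tuned values.

def erasure_params(total_bytes, k=12, m=6):
    """Return per-shard size, peer count, and raw storage overhead."""
    shard_size = -(-total_bytes // k)   # ceiling division
    peers_needed = k + m                # one shard per peer
    overhead = (k + m) / k              # raw bytes stored per byte backed up
    return shard_size, peers_needed, overhead

size, peers, overhead = erasure_params(10**12)  # a 1TB backup set
# 18 peers, 1.5x raw storage, and losing any 6 peers is survivable
```

This is also why "add another 1TB, add another 18 peers" scales linearly: each additional terabyte needs its own set of k+m shard holders.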
Monitor your peers, only trust the ones with a track record of passing challenges, and of course let you whitelist peers you trust (like friends and family). The perfect challenge is just a restore, but a checksum over a range of a blob could be useful as well, and consumes less bandwidth.
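A range-checksum challenge like that can be kept cheap with a keyed digest over a random byte range, so a peer has to actually hold the bytes and can't replay an old answer. A minimal sketch, with all names and parameters made up for illustration:

```python
import hashlib, hmac, os

# Toy storage challenge: the owner picks a random byte range of the blob
# plus a fresh nonce; the peer returns an HMAC over that range. A fresh
# nonce per challenge means cached answers are useless.

def make_challenge(blob_len, range_len=4096):
    offset = int.from_bytes(os.urandom(4), "big") % max(1, blob_len - range_len)
    nonce = os.urandom(16)
    return offset, range_len, nonce

def answer_challenge(blob, offset, length, nonce):
    return hmac.new(nonce, blob[offset:offset + length], hashlib.sha256).hexdigest()

# The owner verifies by computing the same digest over its own copy:
blob = os.urandom(100_000)
off, ln, nonce = make_challenge(len(blob))
expected = answer_challenge(blob, off, ln, nonce)   # owner's copy
reply = answer_challenge(blob, off, ln, nonce)      # honest peer's copy
assert reply == expected
```

The bandwidth cost is one small digest per challenge instead of a multi-gigabyte restore, which is the trade-off the comment is pointing at.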
Drop peers that are unreliable, or ask for too much bandwidth for their restores.
Encrypt the files before adding Reed-Solomon, and use a unique encryption key for each peer that stores 1/Nth of your backup set.
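The per-peer keys can all be derived from one master secret, so the user only has to remember one thing. A sketch of that derivation, assuming HMAC-based key derivation and using a toy XOR keystream purely as a stand-in (a real implementation would use AES-GCM or similar):

```python
import hashlib, hmac

# Derive an independent key for each peer's shard from one master secret,
# so no two peers' shards decrypt with the same key. Encryption happens
# BEFORE Reed-Solomon, so parity shards never expose plaintext.

def peer_key(master_secret: bytes, peer_id: str) -> bytes:
    return hmac.new(master_secret, peer_id.encode(), hashlib.sha256).digest()

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Expand the key into a keystream via counter-mode hashing, then XOR.
    # Illustrative only; not a secure cipher.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

master = b"secret derived from the user's passphrase"
k1, k2 = peer_key(master, "peer-1"), peer_key(master, "peer-2")
assert k1 != k2  # each peer's shard is under a distinct key
```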
That way you can "pay" for your storage by just adding that much more local disk space to trade with your peers.
1. Everything is harder to work with: you have to deal with less reliable networks and storage, computers that aren't on all the time, very slow uplinks, etc. And since these aren't professionally managed systems, you have less visibility: did that node drop offline because the hard drive failed (losing everything), because Comcast is having a bad day, because the owner bought a new machine and wiped the old one without unenrolling it, or because it rebooted and is almost back up?
2. People are selfish: I don't want Netflix getting slow because you decided to retrieve your data, I'll complain if I hit a storage limit on my computer because of your backup data, etc. This forces you to deal with things like traffic shaping and storage rebalancing more aggressively, and those are hard problems to strike a popular balance on. Consider, for example, what happens when someone uses your service and it goes well, but then they run out of space and need to clear some up in a hurry (_especially_ if they put on their cowboy hat and just delete a bunch of large files because they know they aren't the only ones with a copy).
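The standard tool for the "don't slow down my Netflix" half of this is traffic shaping, e.g. a token bucket that caps how fast peers can pull restore data from your uplink. A generic sketch, not any particular project's shaper:

```python
import time

# Token-bucket rate limiter: tokens (bytes of credit) refill at a steady
# rate up to a burst cap; a transfer proceeds only if enough tokens are
# available. The rate and burst values below are arbitrary examples.

class TokenBucket:
    def __init__(self, rate_bps, burst_bytes):
        self.rate, self.capacity = rate_bps, burst_bytes
        self.tokens, self.last = burst_bytes, time.monotonic()

    def try_send(self, nbytes):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # caller should queue the transfer or back off

bucket = TokenBucket(rate_bps=100_000, burst_bytes=64_000)  # ~100 KB/s cap
assert bucket.try_send(64_000)      # initial burst goes through
assert not bucket.try_send(64_000)  # bucket is drained, must wait
```

The hard part the comment alludes to isn't the mechanism, it's picking rate and burst values that both the storing user and the restoring user consider fair.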
3. The solutions to the previous problems make the cost problem worse: storing more copies avoids some of them, but then you need to figure out how to get the network to support, say, 5 copies instead of 2-3.
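A back-of-envelope model shows why flaky home nodes force the copy count up. If each node is independently offline with probability p, all n copies are unreachable at once with probability p^n. The numbers below are assumptions for illustration, not measurements of any real network:

```python
# Smallest replica count n such that p_offline**n (the chance every copy
# is unreachable simultaneously, assuming independence) drops below a
# target. Independence is itself a generous assumption for home nodes.

def copies_needed(p_offline, target_unavailability):
    n = 1
    while p_offline ** n > target_unavailability:
        n += 1
    return n

# Professionally run nodes (offline ~0.1% of the time) hit one-in-a-million
# unavailability with 2 copies; home machines offline 30% of the time need
# about 12 copies for the same target.
assert copies_needed(0.001, 1e-6) == 2
assert copies_needed(0.30, 1e-6) == 12
```

That 2 vs. 12 gap is exactly the "5 copies instead of 2-3" cost multiplier, except worse.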
4. Consider what happens the first time the police bust someone for a major crime and their data is backed up on your computer. Not many people are enthusiastic about going into court to prove the negative assertion that they didn't have a decryption key.
Only peering with trusted systems avoids some of these issues, but not all of them, and the _big_ problem is that the upper bound on what this service is worth is basically the cost of iCloud/Dropbox/S3/Backblaze/etc. The margin between the fixed operational costs and what those services charge is probably not enough to support development.
It's been tried plenty of times. P2P software is really complex, but the use case attracts people who aren't willing to pay, so maintaining the software isn't sustainable.
Check out storj.io