There's been a disappointing trend of companies initially welcoming developers, only to then eat their lunch. Twitter has been one of the worst: it killed its own developer ecosystem.
Apple falls very far from that tree. Lots of money is being transferred to developers even today. It's just hard to see because so much is getting sloshed around.
What's changed? The interface and primitives available for building applications. Rather than having to create a blockchain, we can _use_ a blockchain to do the timestamping.
Our insight is that it's still too hard to get quick, meaningful access to the blockchain. So this service is a wrapper around those blockchain/cryptography primitives, making it easy to create and look up timestamps. It's the indexing part that can make the difference between an app that's useful in theory and one that's useful in practice. In theory anyone could read the blockchain and build their own index from the data on it... in practice that's more work than most will do... hence this kind of service.
Marginal costs can be upheld through protection, self-regulation, and enforcement -- essentially of human rights.
I think people who are able to make a living doing such work won't burn all the money on yachts and party planes (like modern Internet "heroes" of entrepreneurship), but will instead put it into creating more pro-bono information work.
I like your statistic about people not keeping their resolutions: it's perhaps the best argument for using a tool. But how do you formulate a realistic goal?