
> they were running entirely on free Heroku dynos. This year they successfully transferred to AWS, but they got to a fairly big scale while running entirely on free Heroku dynos.

I’ve read so many of these stories now that I understand why Heroku is phasing out the free tier.

It’s kind of fascinating to see how some engineers see free tier limits as an optimization target. I wonder how many engineering hours across the industry went into arbitrarily keeping services small enough to fit in the free tier.



> I wonder how many engineering hours across the industry went into arbitrarily keeping services small enough to fit in the free tier.

Why is that wrong? If everyone optimized their services and applications the way it used to be done in the past, it would substantially bring down compute and memory requirements and reduce overall bloat.


If a service is broken down into n microservices such that each of them fits into the free tier, these n microservices are likely to consume more resources in total than a single consolidated service would.


It’s not wrong, strictly speaking? It’s just more of a tragedy of the commons.


Heroku is not "the commons", they're a company and people are playing within the rules that they set out.


This is definitely the "software engineer" way of conceptualizing and solving the problem. You have discrete limitations; you optimize around them.

This is clearly not "working as intended" though, and I'd like to believe everyone knows that, including the people building this system.


This has to be the analog of "the first amendment only protects you from government restrictions on speech" in this conversation.


Heroku can protect itself from abuse of its free tier by changing the rules - which they eventually did.

I'm sure the only reason they kept the free tier as long as they did was that it brought in enough users (who were eventually converted into paying customers) to be worth the cost.


Of course it’s well within Heroku’s domain to structure their services however they choose. The tragedy is that if everyone uses the platform’s free tier to maximum utility in a self-beneficial way, it exhausts the resources available and the common utility ceases to exist. So it’s not ethically wrong, which was my point in response to the GGP. But it’s functionally “wrong” because it exhausts the common resource.


I think it's more a sadness that abuse took it away. It would be nice if only hobbyists used the free tiers and not successful businesses.


One day, compute requirements will be part of a company's eco-statement.

Eg. Amazon will be forced to say 'For every item purchased from Amazon, 370 grams of CO2 are emitted by our servers and other business operations :-P'


> I’ve read so many of these stories now that I understand why Heroku is phasing out the free tier.

To me, those stories seem like good arguments for reducing the usage limits of the free tier, but not getting rid of it entirely.


You don't need to profit off every single customer. Many business models make 90%+ of their revenue from 'whales' and crumbs from the rest. Optimising for consistent margin from every single customer could be a waste of time, a distraction, and could even hurt a business.


Why do all the big clouds have generous free tiers with reasonable limits then?


I like free shit that doesn’t suck.



