I think it is pretty clear that these $20 subscriptions are loss leaders, really only meant to get regular people to start leaning on LLMs. Once they are hooked, we will see what the actual price of using so much compute is. I would imagine right now they are pricing their APIs at cost or slightly below.
I'm sure that there are power users who are using much more than $20 worth of compute, but there will also be many users who pay but barely use the service.
I would've thought "most" people would be firmly on the free tier; are there any stats on how common paying actually is? Personally I've had Gmail for 20 years, Android phones for 13 years, and Home/Chromecast devices, but I've never considered paying for Google One or any other Google services.
Sam Altman said they use about the same amount of power as an oven. So at $0.20/kWh, a $20 subscription buys $20 / $0.20 = 100 kWh, and 100 kWh / 4 kW = 25 hours of compute, or a little over an hour every workday.
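Spelling out the back-of-envelope math (assuming the $0.20/kWh rate, a 4 kW oven-level draw, and roughly 21 workdays a month; none of these figures come from OpenAI):

```python
# How many hours of "oven-level" compute does a $20/mo subscription buy?
monthly_fee_usd = 20.0   # subscription price
price_per_kwh = 0.20     # assumed electricity rate
draw_kw = 4.0            # assumed oven-level power draw
workdays_per_month = 21  # assumed working days

energy_budget_kwh = monthly_fee_usd / price_per_kwh    # 100 kWh
hours_total = energy_budget_kwh / draw_kw              # 25 hours
hours_per_workday = hours_total / workdays_per_month   # ~1.2 hours

print(f"{energy_budget_kwh:.0f} kWh -> {hours_total:.0f} h/month, "
      f"{hours_per_workday:.1f} h per workday")
```

Of course this only covers raw electricity; hardware amortization, cooling, and margins would push the real break-even point much lower.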
When using a single terminal, Pro is good enough (even with a medium-to-large code base). But when I started working in two terminals on two different issues at the same time, I hit the credit limit.