Hacker News

> we used 2048 A100-80GB for a period of approximately 5 months

Do we know how much total energy a human consumes from birth to 20 yo? Something like 2000 calories integrated over 20 years. How does it compare to the GPUs above?

Wolfram Alpha:

- human - 17 MW/h ((2000 calories per day) over 20 years in MWh)

- GPUs - 3000 MW/h ((2048 * 400) W over 5 months in MWh)
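The two figures can be reproduced with a quick back-of-the-envelope script (units should be MWh; the 2000 kcal/day and 400 W per A100 are the assumptions from the comment, and 5 months is taken as roughly 3600 hours):

```python
# Back-of-the-envelope check of the two figures above, in MWh.
# Assumptions: 2000 kcal/day for the human, 400 W per A100, 5 months ~ 3600 h.

KCAL_TO_KWH = 1.163e-3  # 1 kcal ~ 1.163 Wh

human_mwh = 2000 * KCAL_TO_KWH * 365 * 20 / 1000   # kWh -> MWh, ~17 over 20 years
gpu_mwh = 2048 * 400 * (5 * 30 * 24) / 1e6         # W * h = Wh; /1e6 -> MWh, ~2950

print(f"human: {human_mwh:.0f} MWh, GPUs: {gpu_mwh:.0f} MWh")
```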

We still have the edge.

LOL, I'm being downvoted, I wonder why. Some don't like the question.



You have to include our evolutionary history too. A considerable amount of our sophisticated behavior doesn't require specific training, as it is encoded in our genetic and epigenetic systems. We aren't starting from zero.


Then you would need to include our history in the GPU calculation too. GPUs required evolutionary bootstrapping - they didn't materialize alongside the first few hydrogen atoms post Big Bang.


Every human requires the same energy, 20+ years, and training.

The trained computer model can be duplicated and used, requiring much less energy.

None of this matters to me, though.

The goal is to build better models. We can worry about the efficiency later.


exactly. we are speedrunning 200,000 years of intelligent life evolution here.


You mean MWh maybe, not MW/h? (which is what, J/s^2 in SI... "Power rate".)


Right, I used the correct MWh in Wolfram, but for some reason wrote MW/h, I think it was written like that a long time ago on electricity bills.


> We still have the edge.

Depends on what you're doing. A human is much smarter than one of these models, but the model has approximate knowledge of orders of magnitude more things. And the energy costs per word of output are a lot closer.


Don't mix MW/h with MWh.

Anyway, I remember hearing that the brain uses about 60 watts. That's 10.5 MWh in 20 years.
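Sanity-checking that figure (60 W is the commenter's number; estimates around 20 W are also commonly cited for the brain alone):

```python
# 60 W continuous over 20 years, expressed in MWh.
brain_mwh = 60 * 24 * 365 * 20 / 1e6  # W * h = Wh; /1e6 -> MWh
print(f"{brain_mwh:.1f} MWh")  # 10.5 MWh
```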

But, we can't transfer/copy that gained knowledge limitlessly.


It’s because your human math for power output is so far off that it’s hard to know where to start to point you in the right direction.


Please do tell. Or better, provide your own estimate. I just took raw calorie intake, no heating/transportation/lighting/computer usage/....


A thing to keep in mind is that 1 MWh of raw calories takes much more than 1 MWh to produce (fuel for tractors, inefficiency of meat, etc.). The GPU energy is also easier to make renewable.

I did an extremely rough calculation recently that the training of GPT-3 is comparable to one transatlantic flight (all passengers combined) in terms of emissions, heavily dependent on the energy mix of course.
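One way that rough comparison could go. Every number here is an assumption for illustration: ~1300 MWh is a commonly quoted estimate for GPT-3's training energy, and the grid-intensity and fuel-burn figures are placeholders that vary widely in practice:

```python
# Very rough emissions comparison; every number below is an assumption.
training_mwh = 1300            # commonly quoted GPT-3 training-energy estimate
grid_kg_co2_per_mwh = 400      # illustrative grid carbon intensity
training_t_co2 = training_mwh * grid_kg_co2_per_mwh / 1000  # tonnes CO2

fuel_t = 60                    # rough fuel burn, long-haul widebody crossing
kg_co2_per_kg_fuel = 3.16      # jet-fuel combustion factor
flight_t_co2 = fuel_t * kg_co2_per_kg_fuel  # tonnes CO2

print(f"training ~{training_t_co2:.0f} t CO2, flight ~{flight_t_co2:.0f} t CO2")
```

Under these assumptions both land in the hundreds of tonnes, i.e. the same order of magnitude, which is all the comparison claims.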


That's the entire problem. There's so much more energy that goes into a modern human beyond just what they eat. Beyond physical items you've listed like clothing there's also education and healthcare. Those two institutions are critical in making a modern human and they both have their own dependency chains of physical resource, energy, and the input of even more humans.


Your units are bad. Did you mean MWh instead of MW/h?



