The problem with your argument is that it assumes technical progress will continue eternally. For instance, Malthus himself was mostly preoccupied with the idea that the population might increase faster than agricultural output. Through the use of chemical fertilizers there has been a breakthrough in how much food we can produce, so the Malthusian catastrophe was averted. Agricultural output is still growing, but there are physical limits.
Malthus was aware of increasing agricultural progress, but the argument he made was that it would increase arithmetically (linearly, f(x+1) = f(x) + c), whereas population would increase geometrically (exponentially, f(x+1) = f(x) * c).
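To make the contrast concrete, here's a minimal sketch with made-up constants (the specific numbers are mine, purely illustrative; only the shapes of the two series matter):

    # Malthus's two growth modes side by side (illustrative constants only).
    food = 100.0        # arithmetic: f(x+1) = f(x) + c
    population = 10.0   # geometric:  f(x+1) = f(x) * c
    c_food, c_pop = 10.0, 1.3

    for generation in range(1, 16):
        food += c_food
        population *= c_pop
        print(f"gen {generation:2d}: food {food:7.1f}  population {population:9.1f}")

    # Any geometric series eventually overtakes any arithmetic one,
    # which is the core of the argument.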
It's a hugely important, contentious, and complicated issue (here's a thought: populations can't actually grow exponentially even given unlimited resources; there's a loose but strict upper bound set by the speed of light. A population expanding outward at light speed can only fill a sphere whose volume grows with the cube of time, so while the tight upper bound is unknown and probably multifaceted, I don't see how anything beyond cubic growth is possible). Anyway. When speaking of grey whales, for instance, we say that the population is limited by the "carrying capacity." OK. But they aren't inventing new ways of getting fat nearly as fast as we are.
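A quick sketch of why even a modest exponential rate eventually blows past that cubic ceiling (the constants are made up; only the growth shapes matter):

    # Illustrative only: exponential growth vs. the t^3 ceiling implied by
    # light-speed expansion.
    growth_rate = 1.01   # 1% per time step, a modest exponential
    for t in (100, 1_000, 10_000):
        exponential = growth_rate ** t
        cubic = float(t) ** 3
        print(f"t={t:6d}: exponential {exponential:.3e}  vs  cubic {cubic:.3e}")

    # Any fixed exponential rate eventually dwarfs t^3, so indefinite
    # exponential population growth is physically impossible even with
    # unlimited resources in every unit of volume reached.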
It boils down to there being a very real Malthusian limit on population. "History" (which is code for "1929 to 2007 or so", because for many economists history began in 1929, with a vague notion of there being a time before that when Americans lived in some sort of Garden of Eden in which nothing ever happened), "history" proved Malthus wrong. Well, yeah, definitely, the food supply didn't grow linearly, which you figure it could have, nor did population increase exponentially, because it can't. But at any rate, he had a point: there is a Malthusian limit at any given time, and there is a world population level at any given time, and humanity can move them up or down independently. And that's what the article is about: unwittingly moving the limit down by, say, turning whales into butter until there are barely any left. For much of the 20th century we were so successful at moving the limit up faster than the population level that we've come to think there is no such limit. And hey, what about Mars?
Just because we haven't hit the Malthusian limit recently doesn't mean it's not there.
I don't think people here (or anywhere) dispute that there is a hard upper limit on the maximum population the Earth can support at any given time. The dispute is over the proposition that population is fated (or probabilistically likely) to grow faster than agricultural production. That may have been true when pro-natalist dogma held sway and agricultural technology was stagnant, but that's no longer the case--people who advocate having as many kids as possible no matter what are seen as crazy in most parts of the world today, and for good reason.
Birth control is likely to be adopted, and it's effective, if you don't try to actively suppress it.
Population growth is slowing down rapidly; there's even a chance that the global population will peak within the next century (and this is due to people in industrialised countries having fewer than 2 children per woman, not due to any global calamity). We may have one more doubling of the global population, and then it's pretty much flat, even decreasing. Very, very far from exponential growth, that's for sure.
On the other hand, technological improvements aren't exactly stalling. In food alone there are huge benefits yet to be reaped from GMOs and hydroponics, just to mention two. It may not be exponential growth anymore, but there's little to suggest that it will just stop.
So no, the Malthusian limit may very well not be there at all.
At least in the US, the proportion of land needed to produce food is steadily declining.
For lack of a better metaphor, we have a race between galloping food supply and the propagation of the idea that smaller family sizes are a good thing. Given what I know now, I'd bet on human reproduction declining before food does.
The big monkey wrench is political instability. We have much less of an understanding of it than perhaps we should have.
>At least in the US, the proportion of land needed to produce food is steadily declining.
As an aside, I wonder how much more vulnerable this makes the food supply to shocks, coupled with the increased supply chain length and how many additional points of failure that introduces.
Not necessarily in terms of the frequency of those shocks increasing, but rather the compounded impact of those shocks given the number of people being supported by smaller agricultural bases.
It all seems rather unlikely. There are all sorts of shocks, all the time, in agricultural production, but we have a hybrid system composed of subsidies (which cause pretty significant overproduction) and derivatives that lay off price risk. SFAIK, these are now deeply entangled.
It's hardly perfect, but it's been polished and debugged over decades. The film "King Corn" gives a good layman's exposition of at least the subsidy system.
This is the problem with real-world Malthusian predictions. There are (obviously) hard limits on growth and carrying capacity for Earth, but extrapolating from some existing metric like food growth per human is an atrocious way to find them.
A convincing Malthusian argument needs to take one of two forms: either concede that it represents doom without engineering progress, or convincingly argue that it represents a physical boundary rather than an engineering one.
Yes, that's the point. But it's very hard to point to the effective engineering limit.
For example: the ultimate limit to computing power is Bremermann's limit, but that's only reached at the surface of a black hole. The effective limit of computation – how much you can compute with baryonic matter – is going to be far lower. How low is a hard question to answer.
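For scale, a rough back-of-the-envelope sketch: Bremermann's limit works out to roughly m*c^2/h bits per second for a system of mass m. The example masses below are my own inputs, and the effective baryonic-matter limit would sit far below these figures:

    # Order-of-magnitude sketch of Bremermann's limit: ~ m * c^2 / h bits/s.
    C = 2.998e8     # speed of light, m/s
    H = 6.626e-34   # Planck constant, J*s

    def bremermann_bits_per_second(mass_kg):
        return mass_kg * C**2 / H

    print(f"1 kg of matter:      {bremermann_bits_per_second(1.0):.2e} bits/s")   # ~1.36e50
    print(f"Earth (~5.97e24 kg): {bremermann_bits_per_second(5.97e24):.2e} bits/s")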
But computing power is at best a second-order productivity factor.
If you can apply it to a network system, you've got some options for increased efficiency. If you're applying it to an energy system, the improvement is far more likely to be in the 5-25% range.
Example: automobiles have made increasing use of computers since the 1970s. Computer performance has improved on the order of a millionfold. Automobile fuel efficiency has less than doubled. A small compact of the early 1970s could achieve ~30 mpg, or 7.84 l/100km, a more useful measure of efficiency. Current models are in the 40-50 mpg range; taking the upper limit gives 4.70 l/100km.
That's a 40% reduction in fuel use. Even before considering countervailing offsets via the Jevons Paradox.
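The conversion behind those figures, as a quick sketch (the unit constants are standard; the mpg inputs are just the ones quoted above):

    # mpg -> l/100km, plus the resulting percentage reduction.
    LITRES_PER_GALLON = 3.78541
    KM_PER_MILE = 1.60934

    def mpg_to_l_per_100km(mpg):
        return 100 * LITRES_PER_GALLON / (mpg * KM_PER_MILE)

    early_70s = mpg_to_l_per_100km(30)   # ~7.84 l/100km
    modern = mpg_to_l_per_100km(50)      # ~4.70 l/100km
    print(f"{early_70s:.2f} -> {modern:.2f} l/100km, "
          f"a {100 * (1 - modern / early_70s):.0f}% reduction")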
Google has recently made a similar claim of a 15% efficiency improvement in data center energy management, again resulting from massive increases in compute efficiency. That is, the end-point energy-use impacts aren't much changed.
There's more to this, and the story is complicated. But while computing has very high potential increases, the end-point impacts (worker productivity, economic efficiency, even technical capabilities) are often oddly muted.
To put another twist on this: the Apollo lunar missions had a few hundred pounds of computer assembled in a ring around one of the upper stages (IIRC above the 3rd stage). Swapping out that compute capacity for an equivalent mass of modern tech would have very minimal impact on the mass, range, or accuracy of the system as a whole. The main advantage is that an equivalent compute capacity -- sufficient for the mission -- could be provided in vastly less mass, which is critical for Earth-based rocket launches. But the improvement then is based on the reduction in the physical requirements for providing compute capacity, not the increased performance of massively more compute capacity itself.
Precisely, thanks. I can get an upper bound on Earth's carrying capacity with (volume of earth / volume of a human), but it's going to be way harder to calculate the number that will actually matter.
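To put numbers on that tongue-in-cheek bound (both inputs are rough assumptions of mine):

    # Earth's volume divided by a human's volume -- an absurd upper bound.
    EARTH_VOLUME_M3 = 1.08e21   # ~1.08e21 m^3
    HUMAN_VOLUME_M3 = 0.07      # ~70 litres per person, a rough guess

    print(f"Upper bound: {EARTH_VOLUME_M3 / HUMAN_VOLUME_M3:.1e} people")
    # ~1.5e22 people -- clearly not the number that will actually matter;
    # the binding limit is far lower and far harder to pin down.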