
Further reading:

Tainter -- "The Collapse of Complex Societies"

Diamond -- "Collapse: How Societies Choose to Fail or Succeed"

Meadows et al -- "Limits to Growth"

edit: after listening to the podcast there is much in common with Tainter's general argument. I'll paraphrase poorly: as civilisations become more complex and interconnected, this makes them more efficient (e.g. economic specialisation and trade). But the increased complexity also increases the fragility of the system, and also adds ongoing maintenance costs. At some point there is no marginal return for adding complexity, or the return on maintaining the complexity becomes negative, so the complexity is no longer worth maintaining. Collapse.

I did like the observation/claim that as civilisations encounter trouble, and things start to go badly, they respond by continuing to do whatever they are familiar with doing, but with increasing intensity.




> But the increased complexity also increases the fragility of the system, and also adds ongoing maintenance costs. At some point there is no marginal return for adding complexity, or the return on maintaining the complexity becomes negative, so the complexity is no longer worth maintaining. Collapse.

Society collapses because of technical debt?


Why don't societies fluctuate around an equilibrium of maximum complexity maintainable?

Going overboard shouldn't mean collapse but readjustment by simplification. Unless the entire system instantly relies on any newly introduced complexity at a structural level?


Simplification seems very difficult because entrenched interests benefit from the complexity. Look at health care: it's not hard to imagine a vastly better and cheaper system than the US has (indeed many other countries have it), but even minor changes to health care will involve battling politically powerful groups, and the political risk is extremely high. Same thing can be said about the complexity of the tax code. There have even been HN articles recently about how it's not hard to simplify the tax filing process, but Intuit and other tax preparers lobbied against it and killed it.


Re: tax code - there was a good Planet Money episode a few weeks ago [0] about this. It argues that Intuit isn't nearly as big a player as conservative groups who bizarrely view simplifying the tax code as akin to raising taxes.

[0] http://www.npr.org/sections/money/2017/03/22/521132960/episo...


Is that essentially why the Freedom Caucus torpedoed the ACA replacement? "We don't believe this should even exist, therefore we will vote against measures to move it closer but still short of our vision."


Complexity also engenders complexity. Insurance is a great example. Insurance companies have incredibly complicated payment systems because the laws are complicated.

Switching to even a simpler insurance system would still require rewriting all of that.


I call these aspects of systems "ratchets". I.e., there are subcomponents in systems that have a strong resistance to going "backward".


I always think this quote has explained a related topic the most concisely:

"It is difficult to get a man to understand something, when his salary depends upon his not understanding it!" - Upton Sinclair, "I, Candidate for Governor: And How I Got Licked" (1935)


It's called Shield of Complexity and technology professionals have been accused of it too.


There is such a thing as political debt. Cultural feedback loops reward the wrong behaviours and create institutional inertia which makes the entire system hard to upgrade.


Problem is, when exactly does everyone agree to a specific course and execute it? Survey says: Never!

See climate change. It's been long and drawn out, and the major hallmark of this issue is a huge amount of resistance from anyone who would benefit more from the status quo.

I think what we see in the climate change debate projects onto all sorts of other issues. People are short-sighted and stubborn.


As long as people are part of the system it will never be as cut-and-dry as readjusting to simplify. I don't think we can even manage coping with something like replacing long-haul truckers with automated systems.

I'm starting to believe that maybe a sufficiently advanced AI should overthrow humanity.


> Why don't societies fluctuate around an equilibrium of maximum complexity maintainable?

I suppose in many cases if you overshoot it's not possible to go back to slightly below the equilibrium, but rather the overshoot causes some non-linear/exponential/whatever effect that causes a substantial drop (e.g. a collapse). See Lotka-Volterra model for a simple example.
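The Lotka-Volterra dynamic mentioned above is easy to see in a few lines of code. This is a minimal sketch (not from the comment; the parameter values and starting populations are arbitrary illustrative choices) showing how the "prey" population overshoots its equilibrium and then crashes well below its starting level, rather than settling smoothly:

```python
# Minimal Lotka-Volterra predator-prey sketch, integrated with forward Euler.
# Equilibrium here is prey* = gamma/delta = 20, pred* = alpha/beta = 10;
# starting away from it produces boom-and-crash oscillations, not settling.

def lotka_volterra(prey, pred, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5,
                   dt=0.001, steps=20000):
    """Simulate the system and return the prey and predator trajectories."""
    prey_hist, pred_hist = [prey], [pred]
    for _ in range(steps):
        dprey = (alpha * prey - beta * prey * pred) * dt   # prey growth minus predation
        dpred = (delta * prey * pred - gamma * pred) * dt  # predator growth minus die-off
        prey += dprey
        pred += dpred
        prey_hist.append(prey)
        pred_hist.append(pred)
    return prey_hist, pred_hist

prey_hist, pred_hist = lotka_volterra(prey=10.0, pred=5.0)
# The prey population overshoots far past its equilibrium of 20,
# then crashes below its starting value of 10:
print(max(prey_hist), min(prey_hist))
```

The point of the analogy: the system never hovers near "maximum sustainable" levels; the same dynamics that drive growth past the equilibrium also drive the subsequent crash.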


"Why don't societies fluctuate around an equilibrium of maximum complexity maintainable?"

If you look at the last ~10,000 years of history, arguably you can see that. The cycles are longer than you might think, though.

I think technology really throws the cycles for a loop, though. I'm a big fan of looking at history, and an even bigger fan of not thinking we're so special that we're immune to history or the forces it reveals, a delusion that history shows has always been popular and is alive and well today. On the other hand, I think the last 200 years of technology is a fundamental shift, too, and while history never truly repeats itself, there's good reason to think that now more than ever we can't just take past collapses and expect them to match what is currently happening.

People have not necessarily fundamentally changed from historical people, but the relationships between people definitely have, and that's probably going to have some sort of effect. I don't mean anything like "Facebook is a big change", either. I think the most important change is massive speed change in both communication and travel. I think we're well past the point where quantitative change becomes qualitative change there.

I also want to say I'm not necessarily making utopian predictions; I'm just not sure the old disasters are inevitable. In particular I think that in the 21st century there's a not unreasonable chance that, say, the United States Federal government could completely collapse (for the sake of argument, imagine a nuke taking out the whole of Washington DC), but the modern speed of communication could result in effective, functioning emergency governments arising in just a year or two instead of decades. (Probably based on existing state governments, but I'm not sure the US as a whole would reform necessarily. Pieces of it, probably, but I'm not sure the whole would.) If swift collapse is matched by swift reassembly of other governments, will we necessarily see decades or centuries of a "dark age"? If the reassembly can occur before all the technical knowledge dies off?

On the other hand, new nightmare scenarios are on the table too; for instance, "one world government" will always be tempting (because the Powers that Be who think they will be in charge will always push for it), but the possibility that it could create total correlated economic failure across the entire world would be the biggest catastrophe in human history, and the benefits of it basically can't outweigh this possibility. I suppose that'll be a Black Swan to the people pushing for it, though I consider it so obviously inevitable based on history that it probably isn't one for me.

Anyways, the upshot is that while history has things to teach us and we are well-advised to learn as much as we can from it, I also believe that in certain fundamental ways we are breaking new ground, with all that implies.


> Unless the entire system instantly relies on any newly introduced complexity at a structural level?

That's what for-profit science, marketing, and fads do for you...


I guess startups overtaking markets from established companies shows this kind of renewal within the system... :)


Legal systems, for example, just get more complex over time; there is very little simplification going on.


> But the increased complexity also increases the fragility of the system, and also adds ongoing maintenance costs.

Somewhat tangentially related:

https://news.ycombinator.com/item?id=14027705

Abstraction is one possible mitigation to this problem.


> At some point there is no marginal return for adding complexity, or the return on maintaining the complexity becomes negative, so the complexity is no longer worth maintaining. Collapse.

This also seems to happen to software, which is why we keep rewriting the ecosystem every 10-20 years.


That does not sound plausible at all, and does not correspond to the major catastrophes we've been observing, e.g. the Middle East civil wars or World Wars One and Two.


Wasn't WWI directly the result of increasingly complex alliances and military guarantees? (Admittedly, with nations still performing their own calculus and deciding going to war benefited them)


For science fiction fans, Vernor Vinge uses these ideas to great effect in a subplot of _A Deepness in the Sky_.



