> People were saying the same things about OO inheritance and a bunch of other stuff that the industry has since become more skeptical of.
In my experience this is the usual adoption curve of new practices (not necessarily best practices). They arrive as a shiny new solution to an annoying problem and get absorbed by early adopters, who then become evangelists. More people start using these patterns/practices and begin encountering their issues and obstacles; some get disheartened and frustrated and become outspoken against them.
These practices then evolve with time, becoming more nuanced, with "best applications" and "non-use cases" to better judge their applicability. They then become mainstream and an expected practice for modern software development/engineering.
I've seen this with multiple trends: the OOP-isation of everything, UML (birth and death, kinda superseded by C4 diagrams these days), NoSQL, microservices/SOA, Kubernetes, Dockerisation, and so on.
FP and immutability are entering the maturing phase of this. I remember a few years ago when FP started making inroads into FE development: it was adopted, somewhat overused, and now I see it trending towards more cautious adoption.
I believe it's all a matter of understanding. All new shiny things will have early adopters who can cope with the pain points. These get addressed as the practice goes mainstream, and with that comes knowledge of what the pain points actually are in the real world, after years of adoption.
I don't see this pattern changing anytime soon; I've seen it repeat too many times over the past 15 years.
Totally agree. I will add that the hype is typically ignited because a technique has shown real benefits in a particular domain. The problem comes when the technique is treated as a panacea and applied uncritically outside that domain. This eventually causes a backlash, which may even hurt adoption in the original domain where the technique was successful.