I think in tech, 2-3 years is also about how long it takes for one's mental model of how the world works to go stale, usually coinciding with other projects (ones that had been too nascent to use) maturing.
I remember that in my peak webdev years in 2008, you built webapps by designing the HTML, converting it to templates, filling in data with Django or Rails, and then adding judicious interactivity with jQuery. By 2011 the world had moved on to Angular and SPAs, and you built webapps as a single HTML page and a large JS bundle full of components that fetched their data over AJAX. By 2013 the world had moved on to React, and you had all these tools (Gulp, Grunt, Bower, NPM, etc.) to automate packaging and code reuse. In 2015 people were still using React and more mature versions of these tools; what changed was the economic reality that you could make lots of money as a webdev, while the industry itself matured and demand for new apps died down.
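For anyone who missed that era, here's a minimal sketch of the shift (the endpoint paths, element IDs, and field names are all made up for illustration, and the SPA half is shown with the modern fetch API):

```ts
// 2008 style: the server (Django/Rails) renders the HTML; jQuery adds
// judicious interactivity on top. Assumes jQuery (and its type defs) are
// loaded, and the page already contains #comment-form and #comments.
$('#comment-form').submit(function (e) {
  e.preventDefault();
  $.post('/comments/new', $(this).serialize(), (html: string) => {
    $('#comments').append(html); // server sends back rendered HTML
  });
});

// 2011+ SPA style: one HTML shell plus a JS bundle; components fetch
// JSON over AJAX and render it client-side.
async function loadComments(): Promise<void> {
  const res = await fetch('/api/comments');
  const comments: { author: string; body: string }[] = await res.json();
  document.querySelector('#comments')!.innerHTML = comments
    .map((c) => `<li><b>${c.author}</b>: ${c.body}</li>`)
    .join('');
}
```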
I'm witnessing this with my team at work too. We have an "old guard" of leadership that joined the team in its peak years in 2018/2019 and learned (and often wrote) the tech stack as it existed then. Now our team is colliding with infrastructure efforts that started elsewhere in the company: too immature to use last year, but now starting to bear fruit and gain widespread adoption throughout the org. People who were experts in the old way of doing things are finding that their skills and projects are largely irrelevant.
> I think it's specifically a disease of the frontend community to believe that nothing you knew about programming 3 years ago is relevant anymore.
Exactly, and it is a shame. Stability and building on the shoulders of giants is how progress is made. Not by rewriting everything all the time.
Luckily I don't work on frontend UI stuff. I learned UNIX, kernels, cryptography, TCP/IP, SQL and similar technologies in the late 80s to early 90s. While all of these areas have evolved and progressed, it's been a gradual incremental change year to year. A textbook on any of these topics from 1990 is still recognizably relevant, even where details have changed over the decades.
Nah, not really. My second example has to do with internal frameworks inside a FAANG, where the average half-life of code is about a year (i.e. half your code is deprecated or deleted a year after you write it).
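For calibration, a half-life compounds the same way it does in physics: with a one-year half-life, the fraction of code surviving after t years is 0.5^t. A quick sketch (numbers purely illustrative):

```ts
// Fraction of today's code still alive after t years, for a given
// half-life in years. With a 1-year half-life, ~6% survives 4 years.
const surviving = (t: number, halfLifeYears = 1): number =>
  Math.pow(0.5, t / halfLifeYears);

console.log(surviving(1)); // 0.5    -- half gone after one year
console.log(surviving(4)); // 0.0625 -- one sixteenth left after four
```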
I've observed similar technology shifts in backend code (where MySQL was hot in 2003, Postgres in 2006, MongoDB in 2009, Cassandra in 2011, Postgres again in 2015, and now there's this huge explosion of storage solutions) and in platforms (where we were all about the web in 2007, all about mobile in 2010, all about blockchain in 2013, all about smartwatches & VR in 2015, and all about Ethereum & smart contracts in 2018).
> Nah, not really. My second example has to do with internal frameworks inside a FAANG, where the average half-life of code is about a year (i.e. half your code is deprecated or deleted a year after you write it).
That's exactly the same web-dev mindset, only applied to backend code. I find it remarkable that you take a half-life of 12 months to be a given, rather than a screaming red flag.
Yes, the world changes, and code needs to change along with it, but it doesn't change that much. Replacing 50% of your codebase every 12 months, year after year, indicates to me that the organization is just replacing poorly architected code with different poorly architected code, not better code. The codebase is on a random walk, and any forward progress it makes is due to chance and evolutionary pressure, rather than reason and design.
Remember that architecture competency at a FAANG exists in the context of a 2-3 year average tenure, and that average elides a tendency to change teams internally even more frequently. It is unlikely that a senior engineer rated "Exceeds" at design and architecture in Silicon Valley has ever learned from, or been evaluated on, the consequences of their own decisions more than 4 years out. The median is probably one year.
An average tenure hides a good number of people with longer tenure than that. I'm personally a counterexample to your point, for instance, and I know quite a number of others on other teams at my company (one of the FAANGs).
Turnover is definitely not uniform and a 12 month architecture lifespan is not, in my experience, either normal or healthy.
My experience is that architectures usually last 2-3 employee tenures; the point is that the original architect usually doesn't see the outcome of their own work.
I don't think workers' entire skillsets were obsoleted by any of those changes, unless you specialized in trend-chasing (an obvious risk), or maybe if you refused to board the NoSQL train. But even now those 2003 RDBMS skills have a place again! High-scalability companies are back to using traditional DBs where it makes sense.
I agree with this. While the flavor of the year frontend stack might change, since 2008 when I started dabbling in frontend, the fundamentals really haven't changed: understanding DOM, understanding the JavaScript execution model, understanding the layout models, understanding the complexities of dealing with data binding over asynchronous events and simultaneous users, and so on. jQuery didn't change any of that (although it drastically improved ergonomics). Angular didn't. React didn't either.
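As a concrete illustration (a minimal sketch; the element ID is made up): the core DOM Level 2 APIs from around 2000 still work unchanged, and each generation of framework is ultimately a wrapper over them.

```ts
// Raw DOM: find a node, create nodes, attach listeners, mutate the tree.
// This is what jQuery's ergonomics, Angular's bindings, and React's
// reconciler all bottom out in.
const list = document.getElementById('todo-list')!; // hypothetical element

function addTodo(text: string): void {
  const item = document.createElement('li');
  item.appendChild(document.createTextNode(text)); // text node, not innerHTML
  item.addEventListener('click', () => list.removeChild(item));
  list.appendChild(item);
}

addTodo('understand the DOM, not just the framework of the year');
```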
My experience has been that paradigms and tools can move fast and are relatively straightforward to pick up. Deep understanding of the underlying problem domain and the systems underneath the abstractions is harder to pick up, but the concepts move more slowly.
Depends on what you work on. If you are not in pure tech (and only a minority of people are), then as you get closer to the business you gain insight into how things work, how you can change them, and how to make them better. Only part of our job is technical; without the business knowledge you can't push the world you work in further. There will always be pushback from the "old guard", everywhere, just as there will always be a push for the "new thing". Neither is healthy, and both should be evaluated with a cool head.
2008 was probably the pinnacle of web development. Everything that came after was a continuous line of decisions sacrificing usability and user-friendliness for developer convenience.