
Work-life balance is important, but this:

> they realize that to learn how to do the same thing with 167 technologies is not a real development

is quite an arrogant view to take on new technologies. It assumes that the sole reason people use new things is a combination of ignorance and boredom. There may be some of that involved in some cases, but even re-inventions of the wheel usually come with a fresh perspective and incremental improvements over the last time. More importantly, knowing the technology of the day allows you to relate to and collaborate with others, which is most certainly not a waste of time.

Turning down applicants who have the above mindset isn't ageism.




Might be arrogant, but after rewriting the same stuff over Sun RPC, CORBA, DCOM, DCE, XML-RPC, SOAP, WebServices, BPEL, RMI, Remoting, REST, gRPC,.... eventually it gets tiring.

Or deploying stuff over HP-UX Vaults, J2EE/JEE containers, mainframe language environments, VMs, Docker, k8s, lambdas (CGIs just got rediscovered),....
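
To make the churn concrete, here is one way the same one-method service gets re-plumbed across a few of the generations listed above. A rough sketch only; QuoteService and its method are hypothetical, not from any particular codebase:

    // One logical operation: look up a price for a symbol.
    public interface QuoteService {
        double getQuote(String symbol);
    }

    // CORBA era: define it in IDL, compile to stubs/skeletons:
    //   interface QuoteService { double getQuote(in string symbol); };
    // RMI era: extend java.rmi.Remote, every method throws RemoteException.
    // SOAP/WS-* era: annotate with @WebService, publish a WSDL.
    // REST era: GET /quotes/{symbol}, returning JSON.
    // gRPC era: proto3 definition, compiled to stubs yet again:
    //   service QuoteService { rpc GetQuote(QuoteRequest) returns (QuoteReply); }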


I mean, I totally get your point, but these protocols/technologies solve specific problems quite well. In most cases it's not an arbitrary decision.

A different case would be rewriting your Angular app in Vue just because the latter is more shiny.


There's probably no better language on earth to build a data warehouse in than Java, yet I fully expect to see a BI book on JavaScript OLAP technologies within my lifetime. You use what you grow up with.


> There's probably no better language on earth to build a data warehouse in than Java

Slightly OT, but I'm curious why that is?


Because so many data warehouses have been built in it. Most of the significant problems have already been solved for traditional star-schema/snowflake-schema, fact-table-oriented systems, and a change of language isn't going to open new doors that would otherwise remain closed.


Not sure I agree. SIMD instructions, vectorized query execution, and low-level memory management are hard to implement in Java. Plus there's genuine uncertainty about the future of the language. I would not implement a new data warehouse in Java at this point.


Intel and Linaro have contributed SIMD support, although it might be harder than using something like C++.
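
For the curious, a minimal sketch of what that support looks like through the incubating Vector API (jdk.incubator.vector, JDK 16+); the SimdSum class is made up for illustration, and the loop shape is the usual vector-body-plus-scalar-tail pattern:

    import jdk.incubator.vector.FloatVector;
    import jdk.incubator.vector.VectorOperators;
    import jdk.incubator.vector.VectorSpecies;

    // Sum an array with SIMD lanes where possible, a scalar tail for the rest.
    // Needs --add-modules jdk.incubator.vector at compile and run time.
    public class SimdSum {
        private static final VectorSpecies<Float> SPECIES = FloatVector.SPECIES_PREFERRED;

        static float sum(float[] a) {
            float total = 0f;
            int i = 0;
            for (int upper = SPECIES.loopBound(a.length); i < upper; i += SPECIES.length()) {
                total += FloatVector.fromArray(SPECIES, a, i)
                        .reduceLanes(VectorOperators.ADD);
            }
            for (; i < a.length; i++) {
                total += a[i]; // leftover elements that don't fill a full vector
            }
            return total;
        }
    }

It is noticeably more ceremony than a C++ intrinsics loop or auto-vectorized code, which is the "harder than C++" part.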

Not only is the language open source (in the Sun days it was merely free as in beer), there are several contributors: Microsoft has acquired jClarity, started contributing to OpenJDK, and offers parity with .NET tooling on Azure, while Red Hat supports Java on VS Code.

Ah, even Java has had more talks at Build 2020 than F# or VB.

The only uncertainty is among the anti-Java, Oracle-hating crowd; for everyone else, the companies using IBM, Red Hat, Adobe, SAP,.... products, it is business as usual.


The category "RPC framework" solves a specific problem quite well, but it's doubtful that each incarnation is really a sufficient improvement on all those that came before it to justify re-learning and migrating.


The problem being how to sell new tech, books, consulting contracts....


Angular vs Vue is not an arbitrary decision either; the developer experience is massively different.


Try adopting a perspective of curiosity: "How does this new tech work? What new benefits might it provide?" Or at least one of professional duty. All knowledge workers have to keep up with the latest developments in their field; lawyers have to study up on the new case law, doctors have to read medical journals.

If you've decided ahead of time that it's a drudgery, you're only going to make yourself miserable. You don't have to spend all of your free time on it, but you do have to remain flexible and open-minded. You may as well try to find something there to enjoy.


Most of this new tech is created mainly for the purpose of generating new business, that is all.

As Alan Kay aptly puts it, ours is a fashion-driven industry, and naturally one must keep oneself fashionable.

So off one goes, rewriting that working WCF service in gRPC, because fashion.

But it's OK; after all, someone needs to pay those consulting rates and keep projects rolling, keep the book industry happy with introduction and best-practices books, supply new themes for conferences, and, most importantly, feed blog posts with postmortems about technology migrations.


You're just reiterating the cynical mindset that I'm recommending against, because it's:

a) Atrophic to one's career

b) Just a generally miserable way to live as a programmer

I don't believe things are so bleak, and I think people would benefit from being open to that possibility.


> a) Atrophic to one's career

Career growth after entry level rarely hinges on the acquisition of technical knowledge. If anything, technical knowledge is the easiest to acquire, which is why entry-level candidates can claim it without much work experience. Organizational navigation, teamwork, delivering actual results, and knowing how and when to apply those skills (and when not to) mostly come with experience, and those are the determining factors for promotions beyond entry level.

> b) Just a generally miserable way to live as a programmer

I would argue it is more miserable not to have developed those tacit skills, so that keeping up with the latest and greatest is your only competitive advantage for staying relevant. I would also speculate this might be why juniors tend to over-emphasize the latest tech as the greatest tech: they don't think they have anything else to be competitive with in the job market.

Curiosity is not a virtue to apply algorithmically; it doesn't fit every case. It needs to be tempered with a dose of conservatism to deliver results in the real world.


I do enterprise consulting; if it generates new business opportunities I am all ears, regardless of my cynical view.


> Try adopting a perspective of curiosity: "How does this new tech work? What new benefits might it provide?" Or at least one of professional duty. All knowledge workers have to keep up with the latest developments in their field; lawyers have to study up on the new case law, doctors have to read medical journals.

I think part of the problem is that some fraction of "new tech" is substantially just a reinvention of the wheel and/or oscillating fads, but it's all presented as "new therefore obviously superior" to what came before.

Others have probably said it better, but tech needs more study of history.


Like I said, often it's not that simple. Each iteration on re-inventing the wheel usually incorporates some new lessons learned. I won't disagree that more perspective on history would be a good thing, but that's no reason to dismiss it all wholesale.


Very true. In addition, not all new tech is repetitive. The state of the art in computer science is advancing rapidly in many areas. For me, learning involves actively attempting to understand and reproduce some of the latest research in my field. Some of that involves figuring out how to use a new framework or tool, and some of that is repetitive, but the context in which it is needed is not. So the learning is valuable to me personally.

Secondly, I don't learn technology to make myself more valuable in the marketplace; at least, that is not the primary objective. I learn new technology to make myself more efficient and the products I build better. And since I'm in the somewhat fortunate position of owning all the intellectual property I create, it's also in my best financial interest to learn.

Also, as I get older, I find the mental exercise of forcing myself to do things differently is a lot of fun. To be honest, gaining mastery and learning is just plain fun. That's why I do it.


> but even re-inventions of the wheel usually come with a fresh perspective and incremental improvements over the last time

I appreciate what you are saying, and I definitely think that some folks take the "here we go again" attitude towards a new approach that may very likely have a significant and positive impact on whatever they are working on. That said...

That situation strikes me as relatively rare. More often than not, I see people get excited about whatever the new flavor-of-the-month hotness is and forget about real-world issues. One simple example is when the cost in money and (possibly) morale is not taken into account when switching from one technology to another. Sure, you may get a 1% incremental improvement in some metric that saves you X dollars a year, but the switch will cost 5X dollars to implement: a five-year payback before even counting the soft costs, like potential loss of morale and the efficiency lost to unfamiliarity with the new system. I see this type of decision making frequently, and I consider it extremely poor form.

I think it's really important to ask why something needs to be done and to ask what the actual costs of switching are (esp. for folks on the front line). If the answer to why is "it will get someone a promotion for little or no benefit" or "a leader somewhere wants to brag / humble-brag about the new hotness with their peers", then the decision to use a new technology can usually be postponed to a later date. These are not uncommon scenarios, and they frequently cost companies dearly.


We can debate all day about specific tradeoffs made for specific projects - I would almost never advocate a rewrite just for the sake of using a newer technology unless it has more-than-incremental benefits - but that's not what I'm talking about.

All I'm really trying to argue against is the mindset that "I could technically do this with the technology I've been comfy with for the past decade, therefore this new technology's entire existence is a waste of time and it's a waste of my time to learn about it." Learning != rewriting. And new projects != rewriting. It may end up being as simple as, "I just lost my job and nobody uses the technology I was using there any more, so now I can't get hired."


> "It assumes that the sole reason people use new things is a combination of ignorance and boredom."

Of course it's not the sole reason; fad chasing, herd following, and résumé driven development are reasons as well. :)


I would turn down applicants who don't have this mindset. The last thing I need is a dev who never mastered postgres because they were too busy learning mongo.



