How I Became a Better Programmer (jlongster.com)
439 points by Flollop on March 21, 2017 | hide | past | favorite | 120 comments



There's a lot of value here, but I have to raise concerns about a few things.

First, the title: writing it in the past tense makes it seem like he believes the journey is complete. The article itself doesn't convey that message, fortunately. But I think it's worth noting that a critical skill for a professional is constant education: don't be satisfied that you're already "better"; get better every day.

I also strongly disagree with the suggestion that you should ignore the form of your code. Code is read many, many times more than it is written, so improvements to your own and your teammates' understanding can pay huge dividends.

In particular, the advice to "not worry about" DRY and duplicate freely reveals a common misunderstanding. DRY shouldn't be taken literally; it is a recognition that repetition is evidence of some greater underlying truth. By duplicating freely rather than thoughtfully avoiding duplication, you're just creating opportunities for bugs.
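A toy illustration of the failure mode (the functions and limits here are invented):

```python
# Two hand-copied versions of what started as the same rule. When a fix
# lands in one copy but not the other, you get a bug that code review
# rarely catches, because each copy looks locally reasonable.
def valid_username(name):
    return 3 <= len(name) <= 20 and name.isalnum()

def valid_display_name(name):  # pasted from above, then patched separately
    return 3 <= len(name) <= 30 and name.isalnum()  # which limit is right?
```

Extracting the shared rule once the repetition is noticed keeps the call sites from drifting apart.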


I met James around 9 years ago when I was first starting out in professional development, before he eventually joined Mozilla. I inherited more than one massive project from him, and I'd like to assure you he did not actually ignore the form of his code, nor did he avoid improvements that paid huge dividends. I've always been rather OCD about DRYness, and I never had cause to complain about the quality of his work, nor was there ever much of any duplication.

I believe the spirit of what he's trying to get at is not letting concern for DRYness stand in the way of working things out. I saw this in practice many times, and even approach complex projects in much the same way--I'll typically not worry about DRYness until I've implemented something a couple of times, sometimes in different ways, to get a feel for how I can approach it best. Once I've figured that out, I clean things up.

I've seen many junior devs get so worried about the quality of their code, and how they should do things, and omgosh, DRY, that they become paralyzed and don't just start writing code. I've been mentoring someone for nearly 18 months, and I'm always trying to push him to just start coding and get something working; then we come back together to discuss and evaluate his work.

I think you've misunderstood what James has said, and feel compelled to push back on any suggestion that his advice regarding DRY and duplication is borne of any misunderstanding or lack of recognition of the value of DRY. Nor is he someone who I've ever seen stop learning. I keep up with his work, and I'm always impressed by the different things he picks up just because something about it interests him.


> I've seen many junior devs get so worried about the quality of their code, and how they should do things

TBH, junior devs are often hammered constantly throughout training about this stuff. They are drilled on design patterns before having even coded anything significant.

It's not surprising they can only breathlessly parrot software principles when they begin.


Holy crap Bob, hi! Thank you for the kind words :) (and be honest, I'm sure there were things to complain about from my projects haha)


We all find things to complain about. But quality of your work or unnecessary duplication wasn't one of them. ;)


Responding to your last two paragraphs...

James's suggestions about ignoring code form, DRY, etc. resonate with me because I'm someone who is usually inclined to spend too much time on those things. I consciously try to reduce the effort I put into them and instead focus on getting the code to actually do something!


The time to clean up the code is when it already does what it should.

That is: Make it pretty and readable and DRY once the code works. Until then my code can have several ugly and improvised things I've left for later.


>The time to clean up the code is when it already does what it should.

I'm a bit skeptical of that as a general principle.

I want code to be a consistent representation of my mental model of the solution at all times. I use code to help me think, remind me of my thinking and communicate with others. It's not just a series of commands that make a machine do something.

As long as any ugliness is not in conflict with this purpose it's fine. I can totally live with functions that are a bit messy and probably need cleaning up later.


Yep, I agree. I see it more as a list of priorities, not necessarily sequential steps.

1) Make it work 2) Make it maintainable 3) Make it fast

All three should be on your mind when developing something, but if you have to choose between 2 and 1, go with 1.


And you end up with the 60+ year old TODO this way. Or lack proper reason comments (the why) and end up in a gaggle of legacy code.

My rule is simpler: never write instant legacy code. If it feels like something you won't understand a month later, it likely is legacy code.

If it reads badly (and I'm not talking about this or that brace or whitespace choice), then it also provably is in that category.


If there is actually a trade-off like tshannon notes, then by not making it work first, you may end up with un-finished projects.


The solution to making it work first is to make a prototype with the correct design. You do not have to implement it at once.

(On C2 wiki, a proper SpikeSolution)

If you hit a case your design cannot cleanly handle, it is time for redesign, but often a tiny amount of design constrained by use cases will expose a good design quickly.


They never give me the time for items #2 and #3. As soon as it works most clients and bosses figure "done".


How does that work exactly?

Why can't you just keep working until you consider your task done?


Are you working as a developer? If so, how do I join your team? In my life as a developer I have deadlines that are set by customers and no amount of explanation that I need another week to make the code bomb-proof will shift them.


I've heard about jobs like that, but I would never take one. There are far less insane developer jobs out there, and you could try to find one!

Development time is inherently very hard to predict, so strict deadlines, even ones set by people who understand the technical issues, don't make a lot of sense. There are many books on this, and I won't try to summarize them here.

Still, if you have a deadline, you can use your time however you want before the deadline, right? So if it's better/faster to clean up the code after you made it work than to try to type everything perfectly the first time, what's stopping you?

Or is it that the deadlines are so tight that you only have time to write terrible code?


That's ok assuming that time ever comes; pressure to deliver new features, or just the possibility that you move on and it stops being your code and becomes someone else's, means you may never get a chance to look at those ugly things. I think there's a middle ground that takes that kind of uncertainty into account and is pragmatically wary of 'ugly' and 'improvised'.


I think a good metric for "advanced/senior programmer" is "writes reasonably clean code from the start". Not perfectly clean, but reasonably well-architected--at least easy to read and change. At a certain point, there's really no good excuse for starting out with a big ball of mud.


This is a reasonable expectation when writing something the developer understands and has experience with, with an architecture that seems self-evident early on. Lots of problems are like this.

But when I see developers with the self-expectation of clean code take on a task where the architecture is not self-evident, where the architecture probably won't be right the first time, then they often freeze up. Or they move their efforts into the abstractions they do understand, instead of the task they are trying to accomplish.


I've definitely been there fiddling around with abstractions or whatever. I guess I think you're really starting to get good when you see yourself doing that kind of thing and make yourself stop and figure out what the real goal is and how to proceed. E.g., maybe stop banging on the keyboard and draw a picture, work through an example, talk to the client, etc.


Several people made the same argument, so I'll try to answer it here:

I'm talking about one single commit.

If I spend 4 hours on a story, the last half hour is probably spent cleaning up things. That's when it's easiest to do, because now that the problem is solved, I understand things the best, and the code is not in flux.

I do not have to ask a manager for permission to do this. If you do, you should really try to find a job where you're trusted to make these simple technical decisions.


Can you really clean up a bad design in a bunch of minutes?


I'm mostly talking about minor cleanup like finding the best names, removing duplication, deleting unused code. That can be done quickly, and is often easiest once the dust has cleared and the design is finished and working.

If the code is working but is badly designed, that can take longer.

Occasionally, the new feature requires larger refactorings to redesign preexisting code. That can take days, but is of course essential to do.


The problem is that once the code works there will be external pressure to move on to fix the next bug and add the next feature.


Then don't flag it as ready until you have been through your working code with just some sort of readability/architectural rework focus.

Adhering just somewhat to principles from the beginning shouldn't make this too time consuming.


> The time to clean up the code is when it already does what it should.

1. That's a good way to break it.

2. Typically there is not extra time given to cleanup code, write docs and unit tests.


In TDD you do small cycles: 1. write a failing test, 2. write code to pass the test, 3. refactor. I.e., refactoring is part of the workflow.
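A minimal sketch of one such cycle in Python (the function under test is invented for illustration):

```python
import re

# 1. Write a failing test first: it pins down the behavior you want.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"

# 2. Write just enough code to make the test pass.
def slugify(title):
    # Lowercase, keep alphanumeric runs, join them with hyphens.
    return "-".join(re.findall(r"[a-z0-9]+", title.lower()))

# 3. Refactor with the test as a safety net, then start the next cycle.
test_slugify()
```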


Thanks for providing your perspective here, I'd guess that he has a similar mentality, which is why he wrote that.

In my (admittedly anecdotal) experience, devs who tend to overemphasize formatting and DRY are already self-aware about it, whereas those who give it far too little care and attention regularly shoot themselves in the foot by overlooking simple mistakes that get camouflaged in poor formatting and excessive repetition. My concern is mostly that the latter group would see this article as evidence that they can safely ignore code review comments and such.


I think this is a really good point. I am probably far too guilty of overemphasizing formatting etc., but by the same token, I have seen the chaos that a lack of care in that department can cause, especially on big projects. Coming back to a big pile of code that someone else wrote, which is difficult to read or make sense of, six months after the fact to do maintenance is, like, top three things that make my professional life difficult.

Formatting, DRY, and good code organization don't solve business problems, but they do make solving those problems (and keeping them solved) a lot easier.


There is a good place, somewhere in the middle.


Always, but with regards to writing clean code, it pays to just learn how to do this upfront. An analogy is cooking. I used to be a very messy cook; the food was just as good, but when I was done the kitchen looked like a bomb had hit it. Cleaning becomes intimidating at that point. Learning to be cleaner in my processes didn't change the end result for the food, but made the rest of the evening much more pleasant. Professional cooks and chefs learn to optimize this: they are neat and clean where it is most efficient, and offload cleanup to a plongeur where necessary.

Secondly, I believe that with practice, DRY can become second nature. Back in the mid 90s I wrote a lot of COM code in straight C (for reasons). That ends up being extremely verbose, so repetition carried enough of a penalty that I learned to avoid it, and to carry a good enough mental model of the code base to know when I could.

[edit] plonjeur -> plongeur


I love your use of the cooking metaphor here. Are you expanding on Uncle Bob's discussion of it in The Clean Coder or was that just a coincidence? If I remember correctly, he uses it there to describe the difference between delivering quickly once (and making a mess in the kitchen) and reliably producing over the long term (by taking a little extra time to clean as you go). Adding the spin about the learning process really (forgive me) sweetens it.


I'm thinking more of the "You cannot be Mommy!" scene from the movie Ratatouille


> I also strongly disagree with the suggestion that you should ignore the form of your code. Code is read many, many times more than it is written, so improvements to your own and your teammates' understanding can pay huge dividends.

While he explicitly said to ignore fluffs, he followed it up with `Another way to say this is "use your time wisely"`. So I interpreted it as to just not spend too much time on fluffs, rather than completely ignoring it (like code styles as you mentioned). He created Prettier, so I'm sure he knows the value of code styling.


Simply using the term "fluff" to describe the form of code shows a lack of respect that I find deeply troubling. Those that pay little attention to code style, form, names of things, etc. (including those who rely on the tooling to fix it for them) tend to miss other, arguably more critical, things. Sometimes, but not always, this happens because those errors are lost in a soup of poor formatting and copy-paste repetition.


I had a coworker come to my desk one day, years ago. "Why did you add all that duplicated code to that file??" Huh? Oh, I didn't 'add' that duplication. I took two long files by two different authors, changed them to use the same white space, variable names, comment style and organization strategy. That's all I did. I didn't have the time at that moment to take the next step.

He came back four hours later and said that he couldn't stand it and he had refactored the code and fixed/written some tests.

Perfect. I learned long ago that there are ways you can start cleaning that will induce (some) others to participate.

He spent hours writing tests while I was working on an adjacent part of the code that had worse bugs in it. The work got done and a big piece of the code (this was a project I supported but didn't maintain) didn't, in the end, have my edit history all over it.


I see a lot of articles that tell you how to improve code, and they often seem focused on the wrong area. Python's PEP 8, for example: I see the value in it, but you can follow it to the letter and still have a badly designed application and confusing code.


Aren't style guides supposed to help you write code that is more standard in style, and hence easier to read, because your brain becomes accustomed to that style?

And PEPs cover many things.


But then again, actually writing a code formatter does show quite some respect ;)


Other people's time is also valuable. Try not to write code that's impossible to read.


I think pure style (e.g. what a linter would catch) and DRY could be treated separately. The former should always be followed. As to the latter: I find weak or incorrect abstractions much harder to follow than repetitive code, and I find them much more commonly when they're introduced early rather than late.

There's definitely a balance, and if you know the type of problem you are solving ahead of time you can probably abstract early. But if you're just kind of feeling around in the dark, which seems to be relatively common in complex business applications (for instance), I much appreciate an implement-test-reflect-refactor cycle.


For an actual redesign, you end up with a long, true refactoring iteration during which no new features can be added. There is no guarantee you will not repeat the same mistakes or make it worse.

It is a very tough sell to most customers.

The quick cycle does not scale with project size.


My interpretation was more like "do not focus too much on the form of your code, while you are figuring things out". Which I think is decent advice.

The classic line "Make It Work, Make It Right, Make It Fast"[1] may have communicated this concept in a better way (with each step explained).

[1] http://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast


Yes. There is also "wrong abstraction is worse than code duplication".


> First, the title: written in the past tense makes it seem like he believes the journey is complete.

How should he have formulated the title then? I think it is logically correct.


Anything in the present tense would be preferable. Just a one letter change would be an improvement: "How I Become A Better Programmer". If it's in the past tense, possibly some indication that he's partway along the journey, such as "How I Got to the Level of Programmer I Am Today" (not as punchy, needs editing).


The note about DRY didn't sound like a mandate to ignore it entirely and "duplicate freely". As you said, duplication is a way to ratify that a common pattern exists first before putting something in place that captures the pattern. Many young programmers try to anticipate these concepts and end up with some form of over-design. The goal of DRY is not so much to avoid duplication, but to figure out useful concepts in ground-up work.

"Beta abstraction" is a useful mechanism for this. Wrote a post capturing some discussions in my org. [1]

[1]: http://sriku.org/blog/2016/02/06/beta-abstraction-for-bottom...

edit: I wrote "as you said", but I expanded on what you said instead.
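For anyone who doesn't follow the link, here's roughly the shape of it (my own invented example, not taken from the post): write the concrete copies first, then abstract out the part that varies as a parameter.

```python
# Two concrete versions, written while working the problem out:
def total_price(items):
    return sum(item["price"] for item in items)

def total_weight(items):
    return sum(item["weight"] for item in items)

# The shared shape is now visible; the varying part (the key) becomes
# a parameter -- the abstraction step:
def total(items, key):
    return sum(item[key] for item in items)
```

The originals become thin wrappers over `total`, or disappear entirely.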


If you do not anticipate concerns and rely solely on bottom-up design, you are guaranteed to require expensive rewrites. Plural. It is also as trivially easy to end up with bad abstractions bottom up as top down, usually by not taking measures to decouple components (also known as leaking implementation details through the abstraction).

Properly anticipating issues takes experience.


Thus far, "properly" has been elusive; i.e. we know it when we see it, but not beforehand. It has been easier to instruct people to look at past concerns in similar projects and tick off which ones are relevant based on current constraints and requirements. So it is not a one-time task to set the "properly" needle right, but a continuous one to keep adjusting it as the cases evolve.

The way abstractions are usually laid out is by trying to generalize a concept, usually along simplistic lines without attempting to substantiate it. Beta abstraction provides a substantiatable route which could involve a bit of repeating before the abstraction surfaces. For example, the Pixar cars don't form a class hierarchy with a "car" base class - there is only one class "car" and the variety is generated by configuration (need to pull up ref for this, as I read this long ago). This may not be an obvious step for someone not in the domain.


Same thought here. He makes it sound like he got to the finish line. This is an infinite journey. Good read though.


He explicitly writes:

> Even now, though, I continually doubt myself. The point is that this feeling doesn't go away, so just try to ignore it, keep hacking, and keep building experience.


Those that are complaining about this article: Care to share the experiences that made you a better programmer? Can you link to blog posts yourself or others made that we may find useful?


I've been doing the following for several years, and I think it's a good low commitment way of learning, i.e. there's a lot of bang for the buck.

1. Make a ~/git/scratch repository.

2. Whenever you see a code snippet in an interesting blog post, don't just read it. Copy it into a subdir of ~/git/scratch and run it. Write shell scripts to automate the process of running it. Prove to yourself on your computer that it works.

The first few times, it may be a little onerous. But eventually you will fall into a groove and it will take 60 seconds or less each time.
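For example, a tiny helper along these lines (the paths and layout are just one way to do it):

```shell
#!/bin/sh
# new-scratch: create a dated subdirectory in the scratch repo and drop
# in a stub runner, so every snippet gets a one-command way to execute it.
set -e
dir="${SCRATCH_ROOT:-$HOME/git/scratch}/$(date +%Y-%m-%d)-${1:-snippet}"
mkdir -p "$dir"
cat > "$dir/run.sh" <<'EOF'
#!/bin/sh
# Paste the snippet next to this file and point the line below at it.
exec python3 snippet.py "$@"
EOF
chmod +x "$dir/run.sh"
echo "$dir"
```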

You don't have to understand it on the first pass. Just the act of downloading it and running it gets it into your brain. Half the time you end up hacking on it anyway, and other times you don't understand it, but when you see something related later, the light bulb will go off in your head -- "that's similar to something in my ~/git/scratch repo". And then you can go from there.

I don't know why, but having a running code snippet really makes it feel "ready at hand" and you will learn faster. It somehow primes your subconscious. I feel like a lot of people read Hacker News a bit passively, without retention.

Here's a good blog post along those lines, with code: http://journal.stuffwithstuff.com/2013/12/08/babys-first-gar...

-----

A much bigger commitment, but with correspondingly bigger benefits: something I found helpful was to reimplement like 10 different things I use, in maybe 500-1000 lines of Python each. I like the new "500 Lines or Less" book [1] -- I was doing this 10 years ago!

Once you implement some class of program, you have a very good idea of how the "real" libraries you use are implemented. That helps you build better systems and write better code.

Examples: A template language, a pattern matching language, test framework, protobuf serialization, a PEG parsing language, Unix tools like grep/sed/xargs, an event loop library based on Tornado, a package manager, static website generator and related tools, a web server, a web proxy, web framework, etc.

In addition to writing something from scratch, I also find tiny code on the Internet and play around with it, like tinypy, OCamlLisp, femtolisp, xv6, etc.

[1] http://aosabook.org/en/index.html


I'll take a look at the "500 lines or less" book.


Not complaining about the article, but here are my thoughts.

Write and maintain your own code for a significant amount of time. When you see bugs repeating themselves, you have no one to blame but yourself. If you come back to code that you have written a few months later and you can't make sense of it fairly quickly, it's probably time to refactor that part. Think about the design up front, and the implications of doing things one way versus another.


Very good advice! Also, supporting the code you write in production. Getting woken up to debug your own code really makes you write code that is well tested (as well as possible) and easy to debug.


I like this question!

I think that working in constrained environments early in my career gave me lasting instincts both for efficiency and maintainability/sustainability. I wrote a lot of code on Win16, GeoOS, Epoc32, and a variety of embedded systems. Debugging tools weren't good and the dev-test-repeat cycle could be arbitrarily long, so you learned to avoid a certain sloppiness.

I've been having fun with PICO-8 on a Pocket CHIP. It has some similar constraints (the keyboard is awful and the screen is small), so there are similar incentives, with the added bonus that making a cute game is fairly accessible.


Also, pre-internet, I learned from books. I lived in a provincial town, so I would go to London, buy a book, and then devour it. If I didn't understand something, I had to either keep plugging away until I did, or wait possibly weeks until my next opportunity to find another explanation. Looking at blog after blog until you find an explanation that makes sense is an amazing resource for getting shit done, but it doesn't encourage deep understanding when you really need to internalize something.


Reminded me of this story about Donald Knuth:

https://www.quora.com/How-would-Donald-Knuth-fare-as-a-compe...


Reading every article I could find on Ward's Wiki [0]. Apologies about the current format; though some things have improved, I much preferred the simple old interface.

[0]: http://wiki.c2.com/


> Learn C - Just the basics, if you don't already. I think it's valuable to understand why everyone complains about it.

When you say just the basics, what are you referring to? Syntax? Pointers? Dealing with garbage? I've got a vague idea about it, but never had to really do anything with it so I feel now is a good time to mess around with it.

Great read though, always a pleasure seeing James mentioned (someone that definitely inspires me).


I think most advice to "learn C" really aims to get you to learn how things work closer to the metal: what actually becomes of your code and data, broadly speaking, once your high-level language's interpreter or compiler has had its way with them.

So things like:

- viewing primitive objects, structs and arrays as blobs of bytes arranged between address X and address Y

- a pointer as an address of that blob. Understanding what it means to copy that blob vs. to pass around the pointer: calling by value/reference not in terms of the semantic effect, but in terms of what happens underneath.

- the stack and how it's typically used, esp. in recursion. Local vars vs. heap-allocated vars. How control flow leaves its trace on the stack. How arguments are passed and return values are returned.

- complex data objects viewed as blobs of memory pointing to each other vs. pointers. Concept of "owning" such a blob and how ownership is passed. Understanding how your language's runtime keeps track of the blobs if you don't have to, and what are common pitfalls.

- clear understanding of the difference between int8, int32, int64. Strings as null-terminated or counted arrays. Bitops.

"Learn C" is just a useful way to force you to internalize all of the above, because you can't properly "learn C" without doing that. But it's the above that helps you back in your favorite language. That, and perhaps the fact that C gives you a feeling what it's like when you can look at a line of source code and understand immediately what happens in the machine (broadly) when executing it. No hidden effects. C++ doesn't have that (constructors you don't know about when looking at the line, exceptions etc.) That "local clarity" isn't the most important thing in the world, but if you feel and appreciate it, perhaps you'll strive for local clarity back in your favorite language, too.


> Syntax?

yes, because so many other languages have copied that syntax in superficial ways. being familiar with C syntax (ugly as it is, way too much punctuation imo) automatically makes you able to comfortably read Java, JavaScript (sort of), PHP, C#, etc.

it's kind of a shame because aesthetically speaking its so full of bad choices, but it is still very influential.

> Pointers?

yes, absolutely this. most memory managed languages don't give you pointers at all and let the GC and objects implementation handle that completely. however, understanding the details of reference, dereference, and memory addressing is good for developing insight into how things are implemented at a lower level and what the performance characteristics of those implementations will be.

> Dealing with garbage?

C is a manual memory management language. It doesn't even have smart pointers or optional GC. To avoid memory leaks you have to be extremely diligent about error handling and cleaning up/deallocating on every possible program branch. This will really hammer home a few points about rigor that are easy to gloss over in memory-managed languages.

however, it's a pain in the butt and I'm really really glad I don't have to do manual memory management on a daily basis. still, learning a bit about this has made me a lot more sensitive to possible memory leaks that turn up even in higher level languages.


My personal recommendation would be to learn and understand how pointers work. In the years I spent dealing with C and C++ I can trace many of my major learnings back to how pointers work. You can't really understand pointers until you understand how memory is managed, including all the pitfalls and performance tricks that comes along.


My advice is that whenever you become confident/comfortable in your current work, that's when you have to change team or company. Whenever it gets easy, it means you have stopped learning. I agree with what was said about tools; it doesn't make you a better programmer to know all the latest ES6 features and frameworks. What makes you a better programmer is your ability to adapt and customize your approach to different situations and environments (with different scalability, stability, interactivity, security, time and regulatory constraints). I've been through a lot of different companies (I usually stay 6 months to 1.5 years each on average) and the environments/workflows are usually VERY different each time. After changing a few times, it gets easier to move between different companies - You learn how to produce code that is universally acceptable and not just acceptable within the confines of a single company.


> My advice is that whenever you become confident/comfortable in your current work, that's when you have to change team or company. Whenever it gets easy, it means you have stopped learning. I agree with what was said about tools; it doesn't make you a better programmer to know all the latest ES6 features and frameworks.

It's tough to say. Given a choice between being a JS ninja capable of hammering out http://www.track-trump.com/ in a week, or a generalist with a wide variety of skills, the former is so much more valuable from a monetary standpoint that it'd be hard to turn it down.

It's less satisfying, and you end up less capable in certain respects, but it all depends which axis you want to optimize along.


Offtopic, but I have to say - I love the site you linked. This should be a standard thing done about every politician in power.


It's the brainchild of Sam, Greg, Alec, and Peter. https://twitter.com/sama/status/822494523959877632

Greg is the JS ninja I was referring to. https://twitter.com/sama/status/822500368797966336


It's not necessarily about being a generalist. You can move between different companies within the same industry and get highly specialized knowledge within a specific domain. Going through a few companies trains you to think creatively and consider more different approaches.


is that really a good example of highly focused coding skills? it looks like a very standard web site that any web developer familiar with modern frontend tools could hammer out in a week.

I don't see anything there besides some content presentation stuff. Am I missing something more interesting? When I think of "JS ninja" I think of sites with really complicated UI, like multipart forms with lots of validation, or interesting map layer based tools, or browser games or something like that.


If you do that, you are never at peak productivity. You should rest a bit once it becomes easy. This has 2 benefits:

- you can recover a little from all this learning. Your brain needs rest too.

- you are really productive, and hence are the best professional you can be.

Otherwise you spend your time eternally on projects you are not an expert at, which for your client/employer is not really fair.


> confident/comfortable in your current work, that's when you have to change team or company.

> I usually stay 6 months to 1.5 years each on average

That's right around the point where you can increase your scope and responsibility within the team.

On the medium-large sized projects I work on you won't be making any major changes until the 12-18month mark.

I'd say it's a good strategy for a junior dev but very limiting in the long term.


> On the medium-large sized projects I work on you won't be making any major changes until the 12-18month mark.

It seems like that'd exclude anyone who wants to make a name for themselves, which usually means excluding all the best people.


What do you mean?

The two best developers I know have spent 12 years at Microsoft and 6 years at Google respectively.

What makes you think you need to switch jobs every 12-18 months to make a name for yourself?

And what makes you think the best developers care about making a name for themselves?


We may be using different definitions of "best." In some companies, the best developers are the ones who keep their heads down, do what they're told, and don't question authority, decisions, or direction. These traits are highly desirable precisely because such people reliably do what they're told.

In other companies, creativity is considered an asset. Most of the important work done at my previous company was largely completed by one very productive intern, which was surprising to discover. And at each of the companies I've worked at, I was given freedom to determine direction and implementation of the projects I was given, as long as the results were excellent.

Both approaches have merit, but if I were to bet on one, I'd choose the company that's liberal in granting freedoms but also willing to fire someone if they turn out not to be able to deliver. This has generally been a recipe for success at most startups, for example.

It's more cutthroat, but in a different way: the former is cutthroat politicking, whereas the latter is based entirely on a developer's capabilities. I'd rather be in an environment that rewards effectiveness rather than alliances. And when your effectiveness can only be demonstrated within the very narrow scope and boundaries set by your boss, then people who care about being effective tend to migrate elsewhere.

Most of this can be summed up as "The idea of paying your dues is anachronistic."


We are speaking past each other.

I'm not talking about "paying your dues".

It takes time to get up to speed and to build trust when you are performing high impact work on a medium-large project especially if you don't have prior domain knowledge.

> Most of the important work done at my previous company was largely completed by one very productive intern

> liberal in granting freedoms but also willing to fire someone if they turn out not to be able to deliver.

I suspect we simply deal with different sized systems.

The idea that an intern could do most of the important work implies you work for small companies.

In comparison my current project has ~90 devs.

Likewise you have this idea that you can quickly distinguish a good choice from a bad one. I'm still cursing design decisions I made 3 years ago that everyone thought were great at the time. If I had left after 6 months I would still be patting myself on the back.

Again that seems to apply better to small companies.


The company prior to my last had 50,000 employees. We were still able to design and implement a solution for a core aspect of the division. I had to push in the direction of Python + Redis, but when people saw the results, the higher-ups listened to our team.

I think you're right that if a system is massive, there won't be many major changes to it. But that's true regardless of who has authority. It's always possible to solve smaller business problems with self-contained projects that can then be integrated into the larger system. But not if an environment is set up to prevent someone from doing this by forcing them to work within the constraints of the existing monolith.

(It's possible to do this without creating a microservice, in some cases. The important part is simply to be allowed to experiment with alternate solutions, as long as it's not interfering with your main duties.)


How big was your team? How big was your project (kloc + timeframe)? How many other teams did you have to integrate your solution with? What effort did they have to put in to consume your work?

Again I think we are simply dealing with different scales of systems.


> Microsoft and 6 years at Google respectively

Aren't these the end-goal companies anyway? Next is potential start-up?


While it can be nice to learn at work, that isn't the prime aim: the main reason to work is to make money. If you are productive, enjoy it and save your mental energy for after-work projects.

Granted, there are some things that require so much time that you need to do them at work to really get a grasp of them, but this isn't everything.


Not sure I agree with a lot of those recommendations. The number one recommendation I have is to challenge yourself as much as possible, whether that's picking up algorithms or learning new things such as the workings of a compiler. Other aspects include challenging yourself to be more efficient in your workflow: running through your thought process faster, decreasing iteration time by absorbing prior lessons, and so on. This also includes the humility to be willing to learn from anyone, including developers who haven't been in the profession as long or haven't accomplished as much as you. Ego can get in the way of bettering yourself, and maybe that more junior person will indeed become better than you, or has better intuition.

Make sure you get good rest too - all the learning in the world won't help you if you cannot retain what you learn. If you're in a pressure cooker of a job, consider transitioning to a less stressful job to give yourself space to work on yourself so you're better prepared for the long haul/set yourself better up for success.

Lastly, focus on being a problem solver: if you see a point of friction, an inefficiency, or a tricky situation, that is something to be solved. Your success rate in solving these problems is ultimately what drives stable success. So practice applying your mind to these problems as much as possible, and if you fail, be ruthless about figuring out what you did wrong and iterate. Very few people can get away with being excellent here with minimal effort.


> Even now, though, I continually doubt myself. The point is that this feeling doesn't go away so just try to ignore it,

No. This is the advice you'd expect from someone in their early thirties who has been coding and doing nothing else their whole adult life.

Your doubts should fuel your learning and temper your decisions. They are the voice in the back of your head and you can make it work for you instead of against.

Mastery isn't working harder, it's working smarter. It involves retraining your instincts to match your rational understanding of the domain. You get the shape of a problem and you naturally gravitate toward a reasonable solution without having to intellectualize the whole process first. If challenged, you of course have to walk back your intuition and build a step by step case, not just a rationalization, but that's fine because you've trained for that.

Study something else, anything else. Preferably with a teacher. What you're going to learn about the process of learning (and teaching) will be profound. It will make picking up and putting down new things easier, which will keep you from getting in a rut later.

Source: 40-something who spent his precocious twenties only coding. I learned a hell of a lot but I didn't know everything.


When you say study something else, do you mean something academic or hobby-based? I think having deep hobbies is really beneficial, and with mine I like to take a more "organic" or free-form approach to learning.


Generally, I think developers don't see our short 60-year history as the liability it is. We've only begun to figure stuff out. (The 90's and early 00's were punctuated by arguments over Tony Hoare's work in concurrency, work that was all done in the mid 70's, among people who didn't seem aware it was all written by a single person.)

I think specifically it helps to deep dive on something that is much older than programming. A craft, an art, a sport, or a physical pastime (dancing, martial arts, kayaking). People have been teaching that stuff for centuries, and it turns out they actually know a few things that we only pretend to know. Two things in particular stand out.

Cross-training is one. The notion that pushing harder, white-knuckling everything, is a virtue in software development is rarely ever challenged. Not good yet? Just keep doing it.

The second is that we don't get that rules exist in a context. Always do this. Never do that. As you mature it is partly your responsibility to figure out when you are no longer part of that context. When you are ready, you will understand why you had to follow the rule and why you are now free of it. Look at the backlash against DRY. A bunch of people who've been using it for 10-15 years figuring out it's not all roses. See also: 'it depends...'


Ok, I came in thinking this was going to be another "don't do this, don't do that" type of deal, but was pleasantly surprised that it's upbeat and closer to E-Prime. Nice job.



The E-Prime article used to be written in E-Prime. Unfortunately it looks like they reverted it.

It was especially fascinating because the first sentence was something like "E-Prime refers to a version of the English language..." instead of "E-Prime is a version...". The article had all sorts of interesting quirks that really highlighted the diffs.


" (This also applies to the DRY principle. Don't worry about it so much. Feel free to duplicate.)"

The sort of advice followed by programmers when building systems that will accumulate heavy technical debt that someone else will eventually have to sort out.

You will not become a better programmer this way.


Overzealous application of DRY can also accumulate technical debt.

The main thing is communication: duplicate, but put comments in both places noting the duplication, so the copies can be merged if they haven't diverged after some time.

This is also why I dislike "no comments in code" principles. Communication is key, and there are some things only plain English can convey; if all your comments are code, you aren't communicating enough!
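As a hypothetical sketch of that kind of communication (the function names here are invented for illustration, not taken from any real codebase), deliberate duplication with cross-referencing comments might look like:

```python
# Hypothetical example: two row formatters that currently duplicate logic.

# NOTE: duplicated in format_csv_row below -- if these still haven't
# diverged after a few releases, merge them into a single join helper.
def format_tsv_row(fields):
    return "\t".join(str(f).strip() for f in fields)

# NOTE: duplicated from format_tsv_row above (see the comment there).
def format_csv_row(fields):
    return ",".join(str(f).strip() for f in fields)
```

If one format later needs, say, quoting rules the other doesn't, the copies have earned their keep; if not, the notes tell the next reader it's safe to merge them.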


We've got pretty good continuations, call/cc, generators etc in here too. http://pharo.org

If you do want to leave the comfort zone, this is a must.

And we really do not care for the framework du jour, even if we use some.


I think a lot of the cognitive dissonance between what we think very good programmers do and what we ourselves do stems from the fact that the people we perceive to be very good programmers do greenfield projects, while we non-rockstars are stuck maintaining legacy applications, where there's a lot less room to be "very good"; mostly, it's a success just to stick with that work for two weeks before giving up.


He mentions SICP, which is trending on HN right now: http://sarabander.github.io/sicp/


I would shorten it to:

- develop critical thinking

- take time to learn

- force yourself to understand (avoid cargo cult)

- accumulate non-redundant experience


At the risk of sounding like a Buzzfeed headline, number 3 has been key for me. Diving into a new "whatever" and asking how and why it does something makes it so much easier to "get" it. Bottom-up vs. top-down approach.


All of these suggestions are, tbh, fine. But I'd like to say that one of the things that helped me most to get better was writing my own framework. Doing so helps you understand not just how to code, but how to architect for many, many different scenarios.

This lets you write simpler and cleaner code, which is high-quality code.

Lots of coders think clean code means isolating code into many different pieces, but that is far from the truth. Clean code, to me, generally means fewer lines of code, fewer files, and fewer places to look for your stuff. It's elegant and efficient. Put effort into separating less and grouping more. I cringe every time some dev breaks a simple method up into a dozen pieces when there isn't much justification to begin with.

So please try writing your own framework as an exercise. Even in a crowded sea of frameworks, coming up with your own is one of the best learning exercises you can do, imho. It doesn't even have to be a framework, just something that leads you to think outside the box.
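To make the grouping point concrete, here's a hypothetical sketch (an invented example, not from the article) of the same check written both ways:

```python
# Over-split: three one-line helpers, each used exactly once,
# forcing the reader to jump around to follow trivial logic.
def _has_name(user):
    return bool(user.get("name"))

def _has_email(user):
    return "@" in user.get("email", "")

def _is_adult(user):
    return user.get("age", 0) >= 18

def validate_split(user):
    return _has_name(user) and _has_email(user) and _is_adult(user)

# Grouped: one short function, readable top to bottom in one place.
def validate_grouped(user):
    return (bool(user.get("name"))
            and "@" in user.get("email", "")
            and user.get("age", 0) >= 18)
```

Both behave identically; the grouped version simply keeps the whole rule in one field of view, which is the kind of "fewer places to look" the comment is arguing for.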


Excellent practical comments. I haven't attempted the compiler piece yet; will try. Also, taking pet projects and tech to completion has helped me learn tech more deeply, as has running office-wide sessions to share them.


I also have 10 years' experience, but I am in the same job because I can't defeat recruitment agencies. If a new framework, let's call it "Framework9", is announced today, all job listings will instantly want 3 years' experience with it, plus 4-5 years' experience with every other known framework, preferably in all languages, and preferably from a designer who is a master of Photoshop and CSS, with a long history of blah blah blah.

I am really starting to wonder what other careers are possible as this kind of sucks. I am especially bad at bullshitting so I don't really get anywhere.


In my experience that's often a sign of either a job position where they don't really know what they want, or a recruiter casting a wide net, trying to catch people with a bit of each on their resume: playing keyword bingo. I've had much more success looking around on the likes of LinkedIn / Twitter to find the good recruiters: single people with their own shop, or ones within an org; the ones who seem well recommended and seem to know their game. Approach them personally, make friends with them, get to know them. Tell them you are bad at BS. Once they get to know you, they will go to their network. Let them do their job of finding you the position, or they will put you on their books for when a suitable one comes up. I find the quality of positions and work is much higher doing things this way. This took me a long time to work out too.


The bad news is that job descriptions are written in vacuums with no connection to reality.

The somewhat good news is that job listings are not a great way to find jobs anyway.

I have gotten, and have known many others who have gotten, jobs they did not meet the description for at all. It's just a matter of going out and meeting people. If you can meet someone face to face, they won't sweat the details in the job description. No bullshitting required.


And I want a two-dollar Ferrari, but realistically I take what I can get. Recruiters will have to do, too.


Thanks for the advice lads.


>> Take on big projects. Get uncomfortable.

To add a few more points on this: work on side projects. Let me tell you how a side project can help. For example, I was very comfortable using React in my projects. I thought of building a simple preview container (for different sizes, from desktop to mobile to tablets) where the user can drag and drop a few components, position them, etc.

I'm in the initial phase of this, and guess what, it's really helpful. Making yourself uncomfortable at times (in this case, by using React to position components on the fly, which I had never done before) helps a lot.


I agree with the author; I think people are getting too hung up on the "fluff" statement. His point is that there are deeper issues in programming that will make you a better programmer. It's still useful to write clear, concise code, just don't get hung up on it, and certainly don't mistake it for being a good programmer.

My other piece of advice would be to code with other people as much as you can, ideally from very different programming backgrounds.


I'm glad the author mentioned learning new languages. I have been coding Go recently, and even though it's not my main language, it has helped me think about simpler designs when I'm using other languages.

Other than that becoming proficient in C and systems programming (Thanks xv6 book) was a major game changer for me.


Regarding the claim that only Scheme implements continuations: aren't coroutines/generators in Python a type of continuation as well?


Yes, they absolutely are. The author would probably argue that only Scheme natively implements first-class continuations, though, which is more specific and a little different.
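To illustrate the difference: a Python generator can be suspended and resumed, which is continuation-like, but each suspension point resumes exactly once and only from outside the generator, whereas first-class call/cc lets you capture "the rest of the program" as a value and invoke it as often as you like. A small sketch:

```python
def counter():
    # Each `yield` suspends the function; the saved frame acts like a
    # limited, one-shot continuation that next()/send() resumes.
    n = 0
    while True:
        received = yield n
        n = received if received is not None else n + 1

gen = counter()
print(next(gen))     # 0: run until the first yield
print(next(gen))     # 1: resume where we left off
print(gen.send(10))  # 10: resume and inject a value
print(next(gen))     # 11
```

You can't take the suspended state of `gen`, stash it, and re-enter the same suspension point twice; that ability to re-invoke a captured continuation multiple times is what makes Scheme's continuations first-class.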


Step 1: Document all the things.

Step 2: Update the documentation the next time through.

There is no step 3! ;)


> Find people who inspire you, but don't idolize them.

Luckily this is rather easy in the JavaScript world, haha.

Many good devs have bad attitudes, which makes it a bit harder for junior devs to appreciate their skills, but it also prevents them from being idolized too much.

Also, I always think it's a good sign if a dev I look up to says something I find bad, because then I know I still see them as humans and not as infallible idols.

> Don't devalue your work

This is a hard one.

On the one hand, if you work with too many non-technical people, they tend to overvalue your work. I've met quite a few mediocre devs who were sold to me by managers as the best devs ever. They simply always "delivered", which some devs don't. But finishing your work is a bare minimum in my eyes, not the "best thing ever".

On the other hand, if you only work with highly skilled devs, you can start to think you can't do anything right. You have skills worth mad money to non-technical people, but you think you'd never get a job again if you lost your current one.

> Don't feel pressured to work all the time.

This is hard, especially for us devs who think of programming as their hobby.

I started freelancing 2 years ago and have had about 4 weeks of holiday in that time. I worked on many weekends, not because of "crunch time" but because I liked what I was doing, and I found out it really takes its toll :\

Now I try to do 3-6 month projects with a month of holiday after each project, and weekend work only during "crunch time".

> Ignore fluff.

This is really hard, because fluff is fun.

I started with JavaScript by reading "JavaScript: The Good Parts" and "Pro JavaScript Techniques" and I learned a lot of pit-falls before I went in my first big JavaScript project.

But it came at the price of fluff everywhere.

It gives me a nice feeling reading about other devs who just don't get async/await, observables, or destructuring. Not because I think they are idiots, but because I think, "This seems to be hard and I already know it!"

But yes, I probably poured days into learning observables and probably won't be able to use them in my next projects.

> Dig into past research.

This is a nice thing, because most people don't do this.

I got a big book on HCI research of the last 50 years or so, and I always find nice solutions to my problems in it, since many web and mobile problems have already been solved through experiments on research devices that never went mainstream.

> Take on big projects. Get uncomfortable.

Also: Let your life depend on it ;)

If you need to pay the rent with a project, you're much more inclined to "really" finish the thing and "really" learn the hard parts you need to understand before you can implement the solutions you need to "ship".

(Okay, letting your life depend on it isn't that good an idea, but if money is involved it's often easier for me to see things through to the end. You should always have enough money saved up to survive a failed project or two, so you can also rationally justify leaving your comfort zone.)


> I got a big book on HCI research of the last 50 years or so and I always find nice solutions for my problems there. Since many of the web and mobile problems have already been solved with experiments on research devices that never went mainstream.

Please share, which book is this?


The Human-Computer Interaction Handbook

by Julie A. Jacko and Andrew Sears


> Learn C - Just the basics, if you don't already. I think it's valuable to understand why everyone complains about it.

WHAT? C is beauty, C is art, C is clean and concise. What is this guy talking about? Nobody complains about C; it's C++ they complain about, you idiot.


Apparently the author still has a long way to go to become somewhat good, given how this link returns 502 under moderate load for something like a simple blog. I also wonder if this discredits the article itself.


I don't think it discredits the author's coding skills to _not_ be using Cloudflare (or similar) on what is otherwise a very low-traffic blog. You don't get HN-spiked every day.


He writes "find people who inspire you, but don't idolize them".

And yet, his blog repository[1], which is, in his words, "just a stupid simple server that indexes posts off the filesystem", has 1193 stars. Why would anyone star something like that if not because they idolize jlongster?

[1]: https://github.com/jlongster/blog


I don't see the contradiction. He's not asking people to star his repo, and it's not unreasonable to offer this advice if you perceive harmful cults of personality developing in a community. Besides, lots of people star things for lots of different reasons: bookmarking, giving props, or ???.


"or ???", what does that mean?

I didn't point a contradiction, just noted that he has fans that idolize him. The contradiction is theirs.


Having an audience doesn't mean having a cult of personality.


I star GitHub repos just in case I might want to read them later, with no comment implied about what I think of the author or even the content.


As he notes in the README, the old version was more interesting.[1] I suspect that most people starred the repo at that point.

[1]: https://github.com/jlongster/blog/tree/react-blog


His previous blog did React.js server-side rendering. That's prolly it. Plus, maybe it was advice to his followers as well.



