Probably the most important bit of the entire article, and something you can get from many self-help career books (it applies to everyone, not just programmers):
"Ask others to explain why they do things, but don't just accept dogmatic reasoning. Demand examples and inquire about context, so that you can try to imagine what it is like to be in their shoes. Doing this is tremendously valuable because it allows you to see the strengths and weaknesses of ideas in their natural habitats."
One thing I can add: always question. There's a reason kids ask tons of questions: they want to know. Asking questions not only increases your own knowledge, and that of the person answering, but it also creates a culture where complacency is extinct. The one thing I hate more than people who litter is people who say "oh, well, that's the way I've always done it." There's lots more I dislike above that, but... complacency is the death of innovation. Then again, sometimes things just work... but that shouldn't stop you from trying to improve them.
This reminds me of the woman who was teaching her daughter to cook a pot roast the way she remembered her mom doing it. You take the meat, cut four inches off one end and throw that away, then put the rest in the pan in the oven at 350 for 3 hours. One holiday, when the daughter was hosting, grandma came over early to help and saw her cutting the end off the pot roast. She asked why, and the daughter said that was how her mom said she did it. "Oh child, I only did that because our pan was too small!"
A WarCraft III commentator told the story of how he once saw a pro, during the opening, send his Demon Hunter to the healing fountain on a certain map, and he started doing the same with the same timing on that map. One day he met the pro in person and asked why he'd done it, and the pro said, "To see if the opponent's Blademaster was there."
The Demon Hunter and the Blademaster are the two players' respective heroes/champions, essentially a free powerful unit each.
The DH player has the advantage during the early game, so he scouts around the map (the Fountain of Health being a likely location) to find an easy fight.
Sure. Sometimes. Other times, you have to recognize that you aren't an expert, and that the real "why" is beyond your reach. If you're in this situation, you're probably better off going with the herd.
I run across this sometimes with my boss. He asks why we do X, can't we do Y? I respond: X is a common and well-vetted practice, and I don't know the security implications of Y. Then he says: let's use Y unless you can think of a concrete reason not to.
In the meantime, I'm not really qualified to analyze Y in the way X has been analyzed. I would rather stick with X, even if my reason is dogmatic.
> Other times, you have to recognize that you aren't an expert, and that the real "why" is beyond your reach. If you're in this situation, you're probably better off going with the herd.
Yep. And as a society we are great at this, which makes it almost impossible to improve any part of our culture basic enough to be taken for granted.
That’s how we end up with a base ten number system, a weird irregular calendar, really silly spelling rules, suboptimal mathematical abstractions, inflexible shoes, office furniture and computer input devices which cause serious injuries, roughly the same computer architecture since the 60s, or the marvelous absurdity that is the modern web software stack, not to mention bloody centuries-old feuds between neighboring tribes.
Also the main reason advertising and propaganda work so well. “Everyone else does it, so you should too” is an incredibly effective way to shortcut reasoning.
I think you're conflating cargo culting with pragmatism though. The general reason everything is messy is that it's too much work to gather all the knowledge, consider all the angles, and do a big reset, and even when you do, you run into unforeseen issues that may end up being just as bad (see Gall's Law).
Consider a pet peeve of mine: The Turkish Problem. Specifically, when Turkish was latinized almost a century ago, they had the brilliant idea to use lowercase i and capital I as separate letters, and then introduce their counterparts İ and ı as new letters. This now means you can't upcase or downcase the letter i/I unless you know the language.

Note that it took 11 years for Firefox to handle this correctly (https://bugzilla.mozilla.org/show_bug.cgi?id=231162), and even then you have fractal complexity if you're dealing with an internationalized site. I had the pleasure of working on a film site that was localized to Turkish, where I realized that it is not enough to know the language of the document; you must indeed know the language of every string in the site. In the case of film titles, for instance, many films in Turkey are known by their English title, and in that case you must use normal capitalization rules, so now every film title has to have its original language flagged. This is not a small issue, because if you are a Turkish speaker it just looks wrong if you don't get it right.
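To make the casing trap concrete, here is a tiny sketch in TypeScript (toLocaleUpperCase/toLocaleLowerCase are the standard locale-aware API; the example strings are just illustrative):

    // Root-locale casing uses the generic Latin mapping:
    console.log("istanbul".toUpperCase());              // "ISTANBUL" -- wrong for Turkish
    // Locale-aware casing keeps dotted and dotless i distinct:
    console.log("istanbul".toLocaleUpperCase("tr"));    // "İSTANBUL"
    console.log("DİYARBAKIR".toLocaleLowerCase("tr"));  // "diyarbakır"
    // And an English film title on a Turkish site must NOT use the "tr" rules:
    console.log("titanic".toLocaleUpperCase("en"));     // "TITANIC", not "TİTANİC"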
Why do I go into this digression? Because from a programmer's perspective, this is insane. In retrospect it would have been infinitely better to just invent a new letter, or use an accent or something that fit the alphabetical semantics of other Latin languages. But they didn't. This decision went into effect in 1929, when the concept of character encodings was something only maybe a handful of engineers and mathematicians had even considered, and the concept of Unicode would have been beyond science fiction; it probably seemed a perfectly reasonable hack at the time. If we were to reverse it now in order to ease technical concerns, think about all the fallout with all the existing documents and printed materials. It's just too big a price to pay. The world is full of these things; there's just not a big enough ROI to justify overhauling these old systems.
That's very different from the more actionable problem of people who go through the motions without ever questioning how they do things.
Most of the things you mention are not things engineered by small -- or even large -- teams, but are evolved products of whole societies. And for all their absurdities they also embed a lot of knowledge about how to get things done in the real world.
Dogma is the roadblock to a lot of progress, not only software development.
> Asking questions not only increases your own knowledge
Acknowledging that you are wrong or in doubt is a prerequisite for asking a question. Our current approach to education discourages being wrong - although it is difficult to imagine any other system with the number of children that need to be educated today.
I am missing an option for the guy who couldn't care less whether a tomato is a fruit, because he knows it does not matter at all for his purposes. IMHO the real expert knows what knowledge is relevant and what knowledge is just for forum discussions used to kill time.
If you don't have any arguments you just downvote, you stupid troll?
You don't have to know that tomatoes are fruit for that? Your reply makes no sense at all. I say "you don't need to know X" and your reply is "but you do need to know Y"! Not to mention that your reply is specifically something only worth knowing when one does know that tomatoes are fruit, and not when one doesn't.
I have taken - just for fun - over 70 courses completely outside my own field of expertise (CS) on edX and Coursera. EVERYBODY knows almost nothing; that is unavoidable. Knowledge is infinite, your brain is very, very finite. So knowing what is actually WORTH knowing - and that is different for every single person - is extremely valuable. Simply calling "you must know this!", "you must know this too!" and combining all those calls from all people does not work.
Trying to know everything is as stupid as it gets. Trying to tell others they have to know everything is even more so. How much do you know?
I have taken several neuroscience classes and know quite a bit about the brain. I'd say that is a lot more valuable than being able to put categories on plants. Yet I would never dream of claiming "you MUST know that!" and downvoting anyone who disagrees with me, as you did, Dick from the Internet.
Please reflect before answering:
What do you really need the knowledge that "Tomato is not a fruit" for? What does it help you with? And don't use arguments so meta that they can be used for any knowledge. Be precise. I'm really curious what all those people who don't know that are missing in their lives.
If you don't have any arguments, this is what you have to come up with. You show that you are just a trolling loser with no substance. Care to react to my arguments? Of course not. Maybe you should spend more time getting an education instead of browsing Internet forums; you don't seem to have anything of value to contribute if you have to go for personal attacks.
First of all, people do put non-fruit in their fruit salads, such as honey, nuts, coconut-milk jelly, and dairy products like cottage cheese.
Secondly, there is no law saying that I have to make any given named recipe the same way that you and your friends make it. What if I really like tomatoes? Or what if I don't like them at all, but the doctor says I must eat them, so I want to put them into something that tastes yummy to cover the tomato flavor?
Tomatoes are a fruit; specifically, they are a berry. Some varieties of tomato are quite sweet, so sweet that some people hate those varieties. Sweet tomatoes work fine in a fruit salad, and if you served it to your friends without warning, most of them would likely eat it and not complain.
Tomatoes are not the only fruit/vegetable that is sometimes used as a fruit and sometimes as a vegetable.
Thailand/Malaysia/Singapore probably, but insisting that a tomato is a fruit these days is like saying anti-Semitism is discrimination against Arabs too.
"Fruit" is a biological classification, "vegetable" is a culinary one. Tomatoes are both. "Fruit" also has a colloquial meaning which generally contrasts with vegetable, hence the discrepancy.
I'm maybe too cynical, but I always think the main motivation of such articles is to present the author as a very good programmer, with the implicit assumption that "if you have an opinion about what makes a good programmer, you must necessarily be a very good programmer yourself".
My criticism of the article would be that there isn't really anything to disagree with. Most STEM people already understand that analytical ability is more important than knowledge, and the people struggling with this are unlikely to gain that understanding from an article.
From that perspective, I think you might be right. The article won't accomplish much except make the writer look good. But you're also right that this is cynical. If we're optimistic, this article could serve to give people already possessing some analytical ability a friendly nudge to learn and dissect a new concept.
Most STEM people should know that analytical ability relies a lot on knowledge. You cannot analyse how something is made up unless you are able to recognise the parts, their relationships, and their dependencies. You need those chunks and patterns in your brain; only then can you see them elsewhere and understand the workings of the whole.
A lot of hiring gets done on no more than the candidate's knowledge of programming language XYZ.
It might seem obvious that it's more important to know whether there is any point to writing the current feature -- but a team has to be working fairly well before it can even recognise that.
I would add that I fail to see how much of said advice is unique to programmers rather than to human beings in anything they do. Why would you want to be a proficient programmer and not just proficient, period? And if so, are programmers really the best source of wisdom?
You may see a framework programmer try to apply their framework when it is not necessary. Look at the JavaScript questions on Stack Overflow: sometimes people recommend jQuery for problems where it isn't required, or even optimal, to use jQuery. That doesn't mean jQuery is always bad; it just means people are not thinking critically about their own tools.
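To make that concrete, here's a tiny sketch (the class names are made up) of the jQuery reflex next to the plain-DOM equivalent:

    // The jQuery reflex: $(".menu").addClass("open");
    // The same thing with the platform's own APIs, no dependency needed:
    document.querySelectorAll(".menu").forEach((el) => {
      el.classList.add("open");
    });

Neither is wrong on its own; the point is knowing what the dependency actually buys you before reaching for it.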
What if it's the applicable (if not "optimal") tool that the developer is most familiar and productive with?
What do you even mean by "optimal"? Loads fastest? Fewest LOC? Lowest client CPU impact? Can be designed, built, tested, and deployed in the shortest amount of time?
Salience is a big part of proficiency. If you're productive and familiar with a tool, and it wouldn't be terribly important to your actual needs to optimize that aspect of things... then even in the situation where there might be a better tool out there, it'd be fine to stick with what you know.
But on the other hand, if you struggle with a tool because you're using it in the wrong context, and you're either not aware of that because you're missing the bigger picture... or you're aware of it but refuse to step away from dogmatic best practices to customize for the situation at hand... then that's a bad thing.
So, the question of whether or not to go and find "the best tool" is a matter of whether it's salient in the context of the problem you're solving (and, a bit more generally, the kinds of problems you solve day to day).
So proficiency involves to some extent choosing what not to learn as much as it does what to learn.
It's the guy who knows that he should throw a NotFoundException when his REST API hasn't found anything, but doesn't know that it translates to a 404 status code; and because his framework doesn't allow him to send anything else with that exception, he thinks you can only put a body on a "non-exception" response.
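For the curious, here is a minimal sketch of the point that guy was missing (assuming an Express-style app and a made-up in-memory store, neither of which is from the article): a 404 is just a status code, and it can carry a body like any other response.

    import express from "express";

    const app = express();

    // Hypothetical in-memory store standing in for a real database.
    const users = new Map<string, { id: string; name: string }>([
      ["1", { id: "1", name: "Ada" }],
    ]);

    app.get("/users/:id", (req, res) => {
      const user = users.get(req.params.id);
      if (user === undefined) {
        // What a framework's NotFoundException boils down to:
        // an HTTP 404 status -- with a body, which is perfectly legal.
        res.status(404).json({ error: `no user with id ${req.params.id}` });
        return;
      }
      res.json(user);
    });

    app.listen(3000);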
This is where the use of "older" IT people would help. Take all the new ideas and innovation and filter them through the 60-year-old IT guy (the knowledge expert).
Explain your ideas well, with examples. Learn how the problem has been solved before. Don't build new ideas on old code. Find shortcuts to the results.
I've worked with some older folks with a ton of experience (think: programmed in the 80s or earlier) before, and I have huge respect for those folks who can see a disaster coming miles away, or have been around the block enough times to have formed solid opinions on many subjects.
---
Although, while more often than not true (perhaps partly due to survivor bias), I do think that it's not a given that old programmers are wonderful. Some older folks do seem very set in their ways and unwilling to experiment in a way newer programmers aren't.
The natural example is that you hear a lot of "ah it's like <blah> all over again" where blah is something like winforms or activeX or CORBA or lisp or OSGI or what have you, but it's frustrating to come into everything with that baggage.
Yes it might be the same idea, but it might be executed quite a bit better! Perhaps much wider platform support, perhaps lower total cost of ownership, perhaps increased viability under current conditions, perhaps much larger community of people and tools!
The cycle turns much quicker than it used to, and using older, more mature but inferior technology sometimes does have a big effect on the bottom line or provides a disadvantage over competitors in a way that I don't think it did as much earlier on. Granted, this isn't always the case and knowing when it is also matters.
I think "strong opinions loosely held" is a great motto, and a lot of these folks follow that motto to wonderful effect.
---
It is also entirely possible to do essentially the same project over and over in a multi-decade career, and learn nothing material from it all, and I've seen that happen too.
---
In conclusion, I think older programmers can be a great asset to your team, and are often overlooked for no good reason. But they're also people like everyone else.
> The natural example is that you hear a lot of "ah it's like <blah> all over again" where blah is something like winforms or activeX or CORBA or lisp or OSGI or what have you, but it's frustrating to come into everything with that baggage.
This is why I have changed my thinking on this. I have used technology, I have seen technology fall apart at certain points. So I just poke at pain points. If I can't poke holes, that's good. Less worries on my mind. But otherwise...
"Ok when we used this we had problems with X when doing Y, your're using X and you're doing something that's very similar to Y, but you're not pushing it as hard as we did. Have you tested this?" Aww, your stuff fell apart. 10 minutes well spent, hah.
"It is also entirely possible do essentially the same project over and over in a multi-decade long career, and learn nothing material from it all, and I've seen that happen too."
I remember interviewing someone like that. He came to the interview with 11 years of exclusive experience in a language that's 11 years old, yet he struggled with the basic patterns and terminology that should be rote memorization for anyone who's worked in it since the beginning.
As I struggled through the interview, I wondered if the guy just outright lied on his resume, and if I should end the interview early.
Then, at the end, I got to my question that's basically writing boilerplate data access code, and he zoomed through it. I realized that this guy spent 11 years writing mindless data access code and never really learned how to program!
An historical perspective, which older programmers at their best bring to a project, is important in my view. When you look at the history of software systems you see that the better technology rarely wins in the marketplace. A lot of things contribute to the success of one system over another--technical excellence is not one of them.
Also, the significance of fad and novelty cannot be overstated. When I see yet another new language or framework that is only trivially different from the existing ones, yet touted as the next big thing, I can only grumpily roll my eyes.
We all have hopes that new things like languages will make things better or easier. Mostly they do. Today's hardware beats the 4004 CPU I started with.
Technology will move forward, but building new tech on top of outdated tech is a bad idea.
Electronics, assembler, system / language / application development, data structures and flow design, protocol design, UI design and now working on AI.
Please excuse my curiosity: would you consider your work to be more hands on or managerial or both? Have you ever felt overwhelmed with the sheer amount of 'topics' out there?
Overwhelmed, no. More excited by the advancement of knowledge. I'm more frustrated by the number of subjects that could be investigated but go unexplored.
How many years would it take to master one topic, say to the top X percent? How many of these intervals fit in a 40-year career? Worth keeping in mind that if we are talking technology, there is usually overlap between fields, so you are unlikely to start at zero every time.
I used a simpler analogy in a C++ class I used to teach.
Some people might argue that setting a variable to 0 or something is pointless if you know you're about to use it. And while technically it could add overhead, the chance of bugs forming from a future refactor is decreased. Anyone who has tracked down sporadic uninitialized-value bugs should know why it's almost always better to just initialize values, rather than implicitly trusting that the compiler will catch it and complain, or that a future, less knowledgeable developer won't accidentally do something silly.
Many practices in programming are formed from habit, and we're told not to do things (1 entry and 1 exit, no goto, etc.), but few can back it up with why, and when it might be a worthy edge case.
Way back when I went for an interview. That was when Java started to get popular. The interview involved some coding, and I set a variable to 0 at declaration. The interviewer asked me why I did that. I said I know the language already initialized it to 0 but I like to clearly document the init state of the variable so others can have an easier time to understand. He said others should have known the language well enough to know the initial state of a variable. We went back and forth for a few minutes on the importance of code reading and code efficiency. At the end I realized I didn't want to work with people who were so pedantic on minute detail of a language. I later declined the offer, not that it was a good offer anyway.
I dislike that for another reason: It prevents the compiler from telling you that you're using an undefined value. Just setting vars to 0 doesn't mean that that's the right value.
There's a worse problem, which I've seen in the wild:
    int some_function(...);

    int some_other_function(...)
    {
        int r = -1;
        /* imagine a bunch of other locals defined here */
        r = some_function(...);
        if (-1 == r)
        {
            return -1;
        }
        ...
    }
Now imagine someone changes the return type of some_function():
    char *some_function(...);

    int some_other_function(...)
    {
        int r = -1;
        /* imagine a bunch of other locals defined here */
        char *s;
        s = some_function(...);
        if (-1 == r)
        {
            return -1;
        }
        ...
    }
Oops! Every other reference to r has been removed except the declaration and the test. Now some_other_function() will always take the early exit, and because r is initialized in its definition the compiler does not complain. (In real life, hilarity ensued.) If it was 'int r;' with no initializer this would have been caught by the compiler.
Modern gcc is great at finding uninitialized and unused variables. -Wall -Wextra -Werror FTW.
EDIT: referred to some_function() where it should have been some_other_function().
I'm pretty sure static analysis like Clang would warn you about such a thing too... So maybe proficient programmers just need to keep their linters and tools up to date! ;-)
In C and C++ you're right: the value isn't predefined unless you zero it yourself (e.g. with calloc or an initializer). In Java there are default values for fields: for an int it's 0, and for a String it's null (not an empty string). There is even a Checkstyle violation for this style choice.
Or in Swift, you either initialize a variable with a value or have to define it as an optional, in which case you are eventually forced to deal with its possible nil state.
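(TypeScript's strictNullChecks gives a similar discipline, for what it's worth; a minimal sketch, assuming strict mode is enabled:)

    // With strictNullChecks on, a possibly-undefined value must be
    // handled before use -- much like unwrapping a Swift optional.
    function greet(name?: string): string {
      // return "Hello, " + name.toUpperCase(); // compile error: 'name' is possibly undefined
      return "Hello, " + (name ?? "stranger") + "!";
    }

    console.log(greet());      // "Hello, stranger!"
    console.log(greet("Ada")); // "Hello, Ada!"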
But I digress
Despite the reception this article seems to be getting here I find it speaks to me pretty directly and summarizes some observations I've been making lately.
I consider myself an intermediate developer (just under 3 years out of school, at my first job). For the first year or two I increased my competence rapidly (and with it, thankfully, my salary), but I'm now at a point where, without branching out into other areas, it will be more difficult to keep acquiring pure competence, or I will do so at a flat or decreasing rate because of the increasing complexity of the concepts left to improve upon (within my day-to-day work, not including side projects and academic pursuits).
On top of that I work with a small team of other developers who are all highly competent but only some of whom fit the description of proficient developers in the article. I think the article describes really well the differences between the two types, and I have tried to nail down what the proficient developers do and how I can acquire the same skills and instincts they have.
The main things I have noticed are:
a) as the article discussed, always being able to debate the pros and cons of an approach beyond just best practices, and without regurgitating things that renowned developers have espoused, and
b) being able to drill into problems in the code base or our process, even if it's due to technical debt that other competent developers are willing to work around, but more importantly knowing when it is appropriate to go down a rabbit hole and when it isn't.
I'm lucky to be able to see the distinction in action and overall I think the article helps to describe a distinction that I haven't been able to verbalize.
> But the place where the intermediate programmer tends to get stuck ... is thinking that the difference between a beginner and an expert can be measured in how much stuff you know.
> And at its essence, proficiency is about "why you do things a certain way"—It's the difference between understanding each of the parts of a problem individually, and understanding how the parts fit into the whole.
So I guess it is about stuff you know.
The article then goes into further specifics about how you have to know when to apply what you know.
I'm sure everyone has worked with someone who has memorized a lot, but doesn't know how to do much. Their productivity is limited.
There are also people with innate "know how", who can get things done without much background knowledge. They just know how things work.
My $0.02 is that experts have both of the above skills. They know a lot more than average people, and have better techniques for doing things.
My experience has been that once you reach a certain level of expertise, it's better to "know how", than to know things. For the software I've written, I just forget about the minor details, and have to look them up again. The details aren't important, and just aren't worth remembering.
In contrast, remembering the problems and solutions is much more important. The solutions can be applied to multiple problems in a way that simple "memorization of facts" cannot.
Yeah. I used to work with a guy who mainly knew databases, but also had to do some C# as part of his job. He did loads of MS certifications, just to have them on his CV, but was clueless and usually had to ask me how to do anything more than the most simple programming tasks.
Being an expert means having the ability to recognise certain patterns and having the skills to apply them. So yes, details don't matter much if pattern recognition works fine (and its ability to work depends to some degree on ignoring non-essential details). Which details don't matter is also learned through experience.
There is a danger of miscategorizing if one gets too confident and sloppy.
The interesting point is the distinction between skill and talent.
Skill is close to competence. You can solve well-defined problems using standard tools and practices.
Talent is close to proficiency. You can not only improvise good solutions for novel problems, but you can also prioritise problems intelligently to achieve strategic goals, because you have a clear view of how the codebase fits into your business or project.
Without talent you can waste a lot of time by picking the wrong tools, or using tools in the wrong ways. Even though your code may be perfectly correct, the solution built with the code can still be completely wrong or useless.
In fact talent is somewhat orthogonal to skill. You can have an untrained version of that overview talent, but it's not so useful if it isn't grounded in skill.
When you have both, you have real proficiency - good code doing the right things to solve the right problems.
I think you're splitting hairs. The parent and OP are saying there is a qualitative difference, and by saying "just X and better X" you're trying to say that the difference is merely quantitative. In my experience, when someone makes a claim of qualitative difference, and someone else insists that the difference is merely quantitative, it's because the latter person has not themselves seen the quality of the difference. This applies in many fields.
And that's compatible with my personal understanding here as well. As someone who has been programming for 30+ years and is generally considered proficient, I agree with the parent that the difference is qualitative. This explains a lot of the times when I look at what senior developers are doing and think "yes, that tool is generally useful, but not this time". And it gets frustrating then when they start arguing about "best practices" but I'm trying to make a larger point about "this project".
You make a good analysis, and I don't disagree with you. The crux of my argument is more like this:
Programming is a knowledge field, so any proficiency is just deeper know-how. "Stuff you know." Is that quantitative? Not exactly. I certainly don't think someone who knows 50 languages is a better programmer than someone who knows 2 languages, by virtue of the quantity.
Some things are memorization: how to write a class, for instance. And some things are not: how to model a situation into an appropriately-designed class.
One requires application of knowledge, I guess you could say. Qualitative.
But you can't apply your knowledge of different class designs without knowing why they're appropriate. That's not purely qualitative.
Am I splitting hairs? At this point, probably. I just think the arguments I responded to are poorly worded. Yours is not, and I can't disagree with it.
> Talent is close to proficiency. You can not only improvise good solutions for novel problems, but you can also prioritise problems intelligently to achieve strategic goals, because you have a clear view of how the codebase fits into your business or project.
Calling that aspect talent and not just another skill is somewhat elitist I think. It can be learned. It's just another skill. (Haven't read the link.)
I interpret "stuff you know" as easily-recallable facts. Obviously at the highest level view knowledge is what separates a beginner from an expert. Knowledge is much more than "stuff you know", though. It is about understanding why, knowing how to handle non-textbook situations, and knowing where to find the facts you need when you need them.
I think he is arguing for having depth as well as breadth of knowledge.
Certainly when I am looking at CVs and see every framework under the sun listed when we are looking for a Django dev, I assume they have done the Django tutorial, maybe a little more, then moved on to the next framework.
On my resume I list specific skills (e.g. AWS, Node.js) under several categories (e.g. ops, programming), and I format each skill bold, italic, or plaintext, with a legend showing that those map to adept, proficient, and working knowledge.
SuccessFactors HR product used to do this (may still do).
You had to drag and drop your skills within a kind of bell curve. It forced you to prioritise only one or two as your super skills, then a larger number that were your next top skills, and finally a whole bunch of stuff at the lower level.
It worked well, though it was in no way the final word on this kind of very challenging problem.
As a programmer I always write code with the thought that if a new guy wants to take over my code, he should be able to understand and use it with as little complication as possible. Things should be simple enough that when a new guy takes over your code, he can be onboarded quickly and doesn't have to ask someone to explain the magic pattern or secret sauce behind the team's code.
I am somewhat the opposite. The person who comes after me should be able to read and understand what I wrote without line-by-line documentation. If they can't, they are not fit for the position. Although, if there is a quirk within the code where something needs to be explained, it will have documentation.
I think you're agreeing with each other. Parent never specifically mentioned writing documentation. I took their post to mean that the code should be easy to understand.
What the article describes as "competence" sounds like total incompetence to me.
If you learn a pattern from a book or a class or a blog post, and then you apply it in a project just because "the book said so" you are not a competent programmer.
This might be an artifact of English not being my first language, but I would reverse the definitions.
Proficiency means knowing how to use something.
Competence is knowing how to do something properly, which implies having a deep _understanding_ about what is going on.
That seems to be how people use these terms anyway (regardless of the dictionary definition).
Proficient in jQuery: knows the ins and outs of most jQuery functions and plugins.
Competent in jQuery: a nonsensical statement.
Proficient in git: knows most useful commands and arguments.
Competent in git: almost a nonsensical statement.
On the other hand:
Competent front end developer: understands how to develop a web based application UI from scratch without being tied to a specific product domain.
To me the terms are roughly interchangeable if taken in their general meaning, or at least easy to confuse with one another.
But when I wrote this article, I hinged it on the specific definitions assigned to these terms in the Dreyfus Model, which, although not exactly commonplace, is an established bit of literature (for example, Pragmatic Press uses the Dreyfus Model to indicate the expected skill level for readers of their books).
So I stuck with their definitions and attempted to restate those definitions (informally) in the article.
But I'd be fine with you calling these Thing 1 and Thing 2, as long as they mapped to the definitions provided.
What the article refers to as competence (Thing 1) is something programmers generally recognize as valuable, because it is valuable! Essential, in fact.
What the article refers to as proficiency (Thing 2) is something programmers probably also recognize as valuable, however... many do fall into the trap of thinking this: Thing 2 is what you get from being really good at Thing 1 in many areas.
In truth, Thing 2 is what you get from being halfway decent at Thing 1 in many areas, and then with a view of the bigger picture, getting really good at specific aspects of Thing 1... in the context of your actual work, goals, etc.
In other words, Thing 2 = Thing 1 × TheBiggerPicture.
Now if I were to use my own way of thinking about this... I actually think of Thing 1 (competence) as tactics, and Thing 2 (proficiency) as strategy.
But I think that could end up leading to an even more confusingly overloaded set of terms.
(1) Novice... does not understand anything but basic knowledge in the abstract sense
(2) Competent... knows how to apply knowledge to solve concrete problems
(3) Proficient... sees how particular solutions fit into the context of different problems (i.e. what tools to use, when, and why)
(4) Expert... picks the right tool for the job without having to do careful analysis... works from intuition by pattern matching against fundamental concepts combined with past experiences.
(5) Master... picks the right tool for the job without even being consciously aware of the fact that they're "doing work" at all.
It's important to remember that all of these levels apply within specific contexts... you can be a master at one thing while being a novice at many other things.
But each step along the path is a gate of sorts: in that you can't really understand what it's like to be an expert in anything until you're an expert in at least one thing, and building expertise is easier when you have done it at least once.
But a big problem (which we're mentally hard wired to be biased about) is evaluating our own skills, as well as the scope of experience of others.
This is why someone with local expertise often thinks they can speak on topics outside of their actual expertise, and why people tend to believe them when they do.
"Thinking, Fast and Slow" by Daniel Kahneman is the best book I know on that topic. Totally worth reading.
> Patterns, principles, idioms, libraries, language features—these are all tools. But a truly proficient programmer fits the tool to the job, not the other way around.
This isn't a unique statement but many people forget it.
We all have our favorite tools and we criticize others that don't have complete parity. Picking a tool to use is often the hardest part considering there are so many nowadays. It's easy to rule out a hammer when you want to tighten a screw but in order to build a skyscraper you need to start with something.
"so many people struggle with high level programming ideas, like design patterns"
That sounds more like the movement from "apprentice" to "journeyman." Is it really expected that professional, experienced programmers should struggle with the appropriate application of known techniques?
As a budding programmer I like the ideas here, but I want to take it with a grain of salt and figure it's best to have a wide breadth of knowledge before going deep. Or rather, gain competence before striving for mastery.
> Pick a small number of specific skills you're simply good but not great at.
I have been struggling with this one since forever. What are the skills a programmer should enumerate through? Like how well they can connect to a DB provider? Or how quickly they can set up a web server? What qualifies as a skill in this context?
Given that the goal of software development is to build tools, toys, etc... it can help to work backwards from projects to skills.
No matter what level of skill you're at, you can start looking at the software you use day to day and think "Could I build that? What parts would be easy? What might be hard? What might be tedious?"
Try that with a few different project ideas. There will probably be some tools/techniques/skills in common between them, some of which might be ones you have some skill with but don't feel fully confident.
From there, you have a topic of study, and can go dig deep into books, exercises, practice projects, etc. (Or find someone to mentor you and give feedback)
You'll know you're doing it really well when either (a) you're fairly certain that no one has a much better way of doing it than what you've already learned... and you've searched far and wide and come up empty or (b) the area of study becomes easy enough and automatic enough that you don't need to give it much thought... it just becomes automatic.
You'll know when you have something that can be improved when it distracts you away from the problem you're solving... repeatedly and not in an "essential struggle" sort of way, but a "I really wish I was better" at this kind of feeling.
Some level of proficiency is necessary to recognize these pain points and fix them. And the solution is usually to develop more competence, so long as it's in an area that really does matter to your work at hand. These two things work best as a spiral, supporting one another.
Ultimately, it doesn't matter. The point of the exercise isn't to get better at those specific skills, but to develop an understanding of high-end proficiency, which you can later apply to other subjects.
Like many (proper) learning sets, the content of the exercise is irrelevant compared to the deeper lesson.
Yes I can see the deeper meaning of the exercise, yet, it helps to know what we're talking about specifically.
I am not proficient yet, and I would like to be, and I don't think it's one of those (When it happens you'll know it) moments, I believe there is a clear set of skills we need to work through first in order to see the forest for the trees.
My question therefore is: what are those skills, exactly?
The context I currently apply this in is the apprenticeship I'm running.
So for example, we work on business applications, and some amount of complex analysis is needed because we work with raw problems (i.e. it's necessary to interview the users/customers directly, and to collect and transform raw data from many sources, before we can even start on a project).
I would start by trying to give my apprentice small chunks of the work we were doing, at a task-based level. She would do fine when I could break the work down into well-defined tasks, but not as well if I just gave her some raw materials and said "OK, implement something that will solve <HIGH_LEVEL_GOAL>".
So we circled around that a lot, and realized part of the issue was the challenge of thinking rigorously and synthesizing lots of details. We dug into the Gilded Rose kata together to practice that a bit...
Then after that, we went back to some more realistic work, talking things through... taking notes... etc. It helped for a little while, but then we were still hitting issues related to rigorous problem solving / analysis.
So then we went back and did some more exercises. This book chapter gives some examples of the sort of "fundamental lessons" that can be pulled from specific exercises, so long as you're looking at them with the goal of seeing the deeper ideas behind the work you're doing:
This is the sort of stuff I've been working on lately, because I don't think there's a direct path towards getting the big picture if you don't have someone guiding you, aside from a ton of hard work and patience combined with continuous practice.
That said... keeping a journal does help. My apprentice has been working with me since the beginning of the year, and she writes notes daily on what she does, what she studies, what she struggles with, etc. Periodically going back through those is how we figure out what to do in practice and study sessions, and I imagine even if working solo... you could try that.
This article is way too long and without a proper table of contents. I don't know what's important or not, so at a first glance I could not gauge opportunity costs.
And also lists are meant for small items, not paragraphs.
Nobody knows everything, so I must admit I don't see the point of the article. Maybe sometimes you get lucky and have the relevant knowledge for seeing the big picture. At other times you won't have that knowledge. Are you proficient?
In short: if you have done it before, you are now a professional who can implement a very similar thing again. For everything else, you are not a professional. That's how I understand the article.
No mention of scrum, agile and pair programming either. Strange, huh?
I'd say *DD is nice-to-have but orthogonal to being proficient.
Or sometimes just counter-productive, if all it is is cargo-culting with zero understanding.