I think nobody's done more damage to the developer hiring market than Joel Spolsky. His inductions about the liquidity of "great developers" seem to have become true mostly because he was the only one writing about it in the '90s.
But a lot of it is anecdote and faulty assumptions. I don't know of many other professions that have 8-hour oral exams testing the sum of all knowledge in the field with such a strong bias against hiring. Or a profession that so strongly assumes ability is innate and cannot be trained.
I've not seen weak candidates that got hired get much better even after extensive coaching and pairing.
It may be that coaching, pairing and code reviews are simply ineffective in teaching people. They are also very expensive, since they sap the productivity of the rest of the team, and make planning and sequencing less predictable.
Overall, in my experience it's a lot less costly to turn down a candidate that might be good than to take one on.
(FWIW, the main metric in our company's hiring after meeting a minimum bar of an online coding test (~15 minutes to solution) is pairing with them on a problem that can be solved in several different ways. Getting a key insight into how to solve the problem isn't the metric; it's communicating different approaches and seeing how much explanation is needed to get the implementation idea across and how fluent they are at turning the idea into code. It's 2 hours of working together; it is not 8 hours of trivia.)
> I've not seen weak candidates that got hired get much better even after extensive coaching and pairing.
I was very much a weak candidate when hired for my first programming job after university. No experience with shipping code or working on large code bases, virtually no experience with C++ or OpenGL/3D programming for an application whose core was doing 3D visualisation in C++, minimal relevant domain knowledge, etc. And I'd really like to think that not only did I get much better in the first 6 months I was there on a temp contract, but that I kept getting better in every aspect of the profession for the rest of the 3 years I was at that job. And even being on the other side of the 'table', I've seen people who could hardly code at all turn out perfectly acceptable code 6 months later.
It doesn't even take "extensive coaching and pairing", just a lot of nudging and pointing in the right direction, some well-timed pieces of good advice and the occasional bollocking when they're being just a bit extra dense.
Everybody's weak the first time they do something for reals; that's not really what I'd classify as a weak programmer, but rather junior.
When interviewing a junior, ideally they'll have been programming since before college and are actually intermediate devs in disguise. But when actually hiring a confirmed junior, you look for a spark of intelligence but also go in on the understanding that (a) their potential is unknown, and (b) there are only so many juniors you can allocate across teams and maintain productivity. Juniors are a speculative investment.
Why is it ideal that a candidate has been coding since before they were in college? I understand wanting people who have an interest in the job, but how many other industries feel that the only good candidate is one who's been involved in it since they were a child? Sports is the only one that comes readily to mind.
> how many other industries feel that the only good candidate is one who's been involved in it since they were a child? Sports is the only one that comes readily to mind
Classical musicians.
Of course there will be relatively few areas where early involvement is even considered in a candidate. How many 9-year-old lawyers do you know?
Sadly true. If you haven't been practicing since before puberty, you will never be a professional classical performer.
But it's an exceptional and unusual field. Many professional non-classical musicians start much later, and still do okay - although it's a much more brutally selective field than software development.
Edit: I see no evidence that starting late makes it impossible to be a good developer, and plenty that an early start is irrelevant. Any reasonably intelligent graduate should be able to learn to be at least averagely competent at basic code grinding. It might take a few years, but there's nothing inherently magical about the process.
> Sadly true. If you haven't been practicing since before puberty, you will never be a professional classical performer.
This is correct about 99.999% of the time. The exception is when extreme levels of talent and interest are simultaneously involved.
Barry Tuckwell, one of the great virtuoso French hornists of the 20th century, started playing at 14 and was playing professionally within six months.
Hermann Baumann, another virtuoso French horn soloist, started playing when he was 17.
Anyway, I think that interest in programming before college is a likely indicator of independent interest and self-directed learning. I know a lot of folks majoring in CS or trying to get into development simply because it's a lot more lucrative than many other professions. Many (not all) of these folks don't have a genuine interest in programming, which makes it unlikely that they will be effective developers.
I started programming around the beginning of high school and then stopped at the end of high school and became a car salesman. 8 years later I went back to school and got a CS degree and am thriving. So, I did have some early experience, but given the gap I feel like I'm more closely aligned with a late starter.
Now that there is a push to get all kids coding in school, this criterion will have to be tightened: I wouldn't be surprised to see "had first PR merged into a GitHub project before college".
Of course these arbitrary criteria to filter junior programmers add no value, but are another symptom of how broken the entire business is.
Right, it's along the lines of people declaring that the only good programmers are the ones who have outside projects, regardless of whether 8-12 hours a day of programming at a job is enough to sap anyone of creativity.
Why is it ideal that someone has been coding before college? Simply because they have that many more years of experience than everyone else. You may even be able to argue that coders are among the only people who have a non-negligible likelihood of starting the practice before college; I could believe that's the case.
I can also tell you that most of the lawyers, bankers, consultants and doctors at my university will have been doing work experience + shadowing from around 14-15 years old. The even more ambitious ones will have been doing slightly more enterprising things (entering school/local/national entrepreneurial competitions), or reading around their interests, or joining/running societies.
When looking at the most successful people I know around me, what becomes apparent is that they started way earlier than everyone else. Look at Zuckerberg: he wrote Synapse when he was 16. It's unlikely you'll hire a Zuckerberg, but when someone started coding is a pretty reliable heuristic for how dependable they'll be in the job. Think of it as hiring a senior analyst to do a junior analyst's role, if you will...
> Why is it ideal that a candidate has been coding since before they were in college?
Imagine if civil engineering students were taught arithmetic for the first time at college. Firms would favor applicants with an interest from a much younger age, no?
The problem with software engineering is that understanding it well is as fundamental as understanding arithmetic. We teach arithmetic from the age of around six, yet we expect that starting to code only at college is somehow equivalent.
This is a quite unintelligent response - people can learn if they put their minds to it.
Case in point: I'm a PhD dropout from a top-15 math program in the US, and before entering the profession I did not really code to any serious degree. After pouring in a lot of time and effort, even having started late, I became very good at what I do. I have encountered plenty of people in the profession who have become solid to very good developers, including a former grad school colleague. These people have come from all walks of life, from college dropouts to sales.
Also, by contrast, I have worked with not-great developers who have CS degrees. A degree does not tell me whether you are passionate about what you do, or whether you retained any knowledge from schooling; nor do various indicators that you have coded prior.
You seem to have assumed that I exclude anyone who did not code before college as incompetent, but I never said anything like this. I only assumed the context of the regular "attempted to become competent through college" route.
> ...people can learn if they put their minds to it.
Sure. I never said that it isn't possible to learn. All I'm saying is that there is a correlation when you consider the population in aggregate.
I'm sure the civil engineering industry (to continue my example) would be happy to accept somebody who started to learn later in life. However, given that it takes years to learn engineering if you start at the beginning of the knowledge tree (say, arithmetic), why should coding be any different (say, your first "Hello, world!")?
Since the current education system puts the learning of the beginning of this knowledge tree many years apart, it makes sense that the acquisition of competence at the other end is also years apart. Hence, if you follow the regular education system to learn software engineering, someone who started early will be years ahead of you. This is common in our field because the current education system is so far behind. Therefore, employers in our industry (as opposed to other industries) value those who started early more, and this is reasonable behavior.
> After pouring a lot of time & effort...
Right - and people who pour in a lot of time & effort are correlated with people who started early. This does not necessarily exclude people who pour in a lot of time & effort at other times.
I think I might not have made myself clear enough, sorry. Let me add some context to what I meant:
The problem with software engineering is that understanding [things like indirection] well is as fundamental [to software engineering] as understanding arithmetic [is fundamental to civil engineering].
I presume that this assertion, then, is uncontroversial?
Indirection may be fundamental to programming, but it is a fairly self-contained concept that in my experience can be grokked quickly. The math required for an engineering degree is based on arithmetic, algebra, trig and calculus, all of which are taught over a period of years.
I may still not be understanding you, but my own experience, when I learned programming as part of an EE degree (first language was C, with lots of pointer manipulation), was that concepts like indirection are a lot simpler to pick up de novo than Partial Differential Equations, Fourier Transforms and other college-level math concepts are, even with a 12-year grade school background.
"ideally they'll have been programming since before college"
Lots of people get into computer programming as a second career. Rich Hickey, who created the Clojure language, studied music when he was in college. Later, when he was working at a music studio, he got serious about writing code in C++, and from that experience he developed his opinions about the flaws of object-oriented programming and the possible benefits of immutable data.
Your remark says a lot about the hunger for stereotypes in the tech industry. As if everyone is supposed to follow the same path to a career in computer programming.
This is very true, and I say that as someone who has been programming since age six.
With my prior employer, I briefly had the privilege of working with one of the smartest and most driven junior devs I've ever encountered, who'd been a professional figure skater in a past life and was fresh out of a Ruby bootcamp. This was a Node shop, and I've never seen anyone else go so quickly from zero to fully productive in a new stack. Had the various dysfunctions of that organization included a prejudice toward long-term hobbyists, they would never have been hired.
I'm saying that early coding implies a higher probability of having useful experience in a junior dev (i.e. straight out of college). You seem to be suggesting that I think that the probability of someone having useful experience is lower without early coding. But I don't think that at all; I don't care whether useful experience comes before or after college.
For a fresh recruit, lack of experience is expected and will be trained away. Nobody who is looking for fresh/junior developers ought to expect anything else. But most needs aren't filled by a junior developer.
It is absolutely the case that a good developer will have perhaps a few interviews, and then pick the best offer available. Meanwhile, a marginal developer will go to interviews for months until something sticks. Thus, of the candidates I actually see, I expect the qualified-to-unqualified ratio to be perhaps 50:50, even though perhaps 95% of all working engineers are actually qualified for their jobs.
When a "senior software engineer" candidate can't read ten integers from a text file and print their sum after 45 minutes in front of the language of their choice, I have to assume that I'd have an equally bad experience working with that candidate in the future, and pass.
The article leans too heavily on "great developers are sticky," which may or may not be true, but the above dynamic is the main reason we have coding screens for programming jobs.
Nobody has shown me a better way of screening that actually works any better, and the cost of a mishire at the senior level is high, so on we go with the tools we have, primitive as they may be.
I don't think I could type that off the top of my head. Sure, the normal Java stuff: public class Foo { ... blah blah blah ... }. Integer parsing with Integer.parseInt(...); do the sum in a loop; a couple minutes maybe. But I have to look up references.
File, because one of the constructors overwrites whatever is there, and the Reader family, because I can never remember what I need in the stack. I don't think I've had to open a file this year.
I guess it's better to just type in everything you know, then explain what you're looking up and why. But really, at my desk, I'd instantly know what I needed to look up and go look it up before typing. Rather than local javadocs, I'd google for FileReader. But in an interview I'd feel bad.
Ah yes, and after actually doing it: I'm an idiot, and there's a nice convenience method that is perfect. Files.readAllLines.
Also, figuring out the package prefixes without an IDE kind of sucks.
Google has ruined me.
But yeah: stock Linux + JVM and no internet connection? Maybe 15 minutes. 5 minutes to figure out where the javadocs are, another 5 to find the docs, 5 for typing. Eh, maybe another 5 for remembering the weird path conventions for javac.
I'm kind of embarrassed it would take me that long.
Edit: Oh gosh, you wouldn't have Ctrl mapped to Caps Lock. I'd look like a complete moron.
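For what it's worth, the Files.readAllLines shortcut mentioned above can be sketched like this (class and method names are my own, and the snippet writes its own temp file so it's self-contained; in the interview you'd just be handed numbers.txt):

```java
import java.nio.file.*;
import java.util.List;

public class SumInts {
    // Sum the integers in a file, assuming one integer per line.
    static int sumFile(Path path) throws Exception {
        int sum = 0;
        for (String line : Files.readAllLines(path))
            sum += Integer.parseInt(line.trim());
        return sum;
    }

    public static void main(String[] args) throws Exception {
        // Write ten integers to a temp file, then read them back and sum.
        Path p = Files.createTempFile("nums", ".txt");
        Files.write(p, List.of("1","2","3","4","5","6","7","8","9","10"));
        System.out.println(sumFile(p)); // prints 55
    }
}
```

No Reader stack, no buffering decisions: readAllLines handles the open/decode/close dance for you.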
Let's be fair, Java I/O -blows-. The whole stream, reader, buffered, file, blah blah blah, multiple levels of abstraction on top of the basic "Look, it's a file, let me read it", is what is getting you there.
Ideally, if you chose Java, the interviewer would let you handwave the actual library calls. Or you'd pick something saner. In JavaScript, off the top of my head:
EDIT: Node, specifically, and with the spoken caveat that devoid of context of how it's being used, you're just going to let it block; were it a real application you'd use the async variant of readFile.
In C, in a handful of minutes (and with no reference):
#include <stdio.h>
int read_and_sum_ints(const char *filename, int n)
{
    FILE *fd = fopen(filename, "r");
    int c = getc(fd);
    int sum = 0;
    while (n-- > 0 && c) {
        int m = 0;
        while (c && c != '\n' && c != '\r')
            m = (m*10 + (c - '0'));
        sum += m;
    }
    fclose(fd);
    return sum;
}
int main(int argc, char **argv)
{
    print("%d\n", read_and_sum_ints("foo", 10));
    return 0;
}
Given that I'm currently just an intern, I would expect both junior and senior developers to be able to do this.
I wouldn't be able to do that off the top of my head because, in my C experience, I hardly ever read files. Everything we did was reading from or writing to memory buffers.
Furthermore, we did almost all system interactions through a platform-independent middleware interface. So, while I, for example, knew what a semaphore was and how to use it I could not remember the standard Linux interface for semaphores because we used the platform-independent API for ours.
Expecting people to remember APIs off the top of their heads is ridiculous. Some do, some don't. The people who don't aren't lesser engineers just because they don't.
I agree people shouldn't have to remember APIs, but you should at least have a vague idea of roughly how to go about reading a file and parsing numbers from it.
As long as we're posting code/golfing (and since I've been doing lots of file parsing lately and it's fresh):
use std::io::prelude::*;
use std::fs::File;

fn main() {
    let mut s = String::new();
    File::open("numbers.txt")
        .unwrap()
        .read_to_string(&mut s)
        .unwrap();
    let sum = s.lines()
        .map(|l| l.parse::<i32>().unwrap())
        .fold(0, |acc, i| acc + i);
    println!("{}", sum);
}
The value of EOF is platform-specific, but is commonly -1. I'm fairly sure it's never 0.
But I was referring more to the fact that c is set exactly once in line 4 (`int c = getc(fd)`), outside of the nested loops, and never touched again. So if it's not initially a NUL, newline or CR, it never will be, and the inner loop will never exit.
I don't know why it's so important to see a junior developer's work before college. A junior is supposed to have near 0 experience, it's more important that they can demonstrate problem solving skills and willingness to learn.
Eh? I don't think it's important to see a junior dev's work before college. But if someone has been coding from a young age, if they've been self-taught, it's a very promising sign - they're self-motivated and are likely to stay up to date on tech - and a bonus is you're effectively getting a more experienced dev for the price of a junior dev (though in reality people should recognize this and bid their price up).
I disagree; whether a young developer has projects to show before college is nearly irrelevant to whether they'll become a better developer as a professional. It doesn't automatically translate to self-motivation; the only thing it shows is that the candidate had early exposure to some technology. In my experience, most people who start programming at a young age do so because they have family members in this profession and were asked to learn programming early. I'm not saying there aren't any who do it out of genuine interest and motivation; I just haven't met any yet.
To me, whether you started programming before college is like graduating from Stanford: you may or may not be good. It will make me more likely to interview you, but not more likely to give you the job.
I've met several self-taught programmers with very bad habits that had to be unlearned, and missing fundamentals (writing an O(N^2) algorithm and not understanding why not to). Not all self-taught programmers are this way, but it's not an unambiguous green flag.
I was an inexperienced candidate when I got my first dev job. I had literally some very basic front-end skills, in that I could use jQuery to change font colors. I'm now a fairly seasoned backend developer; if nobody had taken a risk on me (for low pay at first) or mentored me, I would not be anywhere near where I am today. We treat training like a cost center, and despite the huge amount of money being made in the tech industry, there are very few companies, organizations or even groups of people that prioritize taking time to make people productive. I don't know the solution, but it seems that there is a huge untapped amount of potential skill that will never get used.
I don't think programming is much different than any other endeavour. Talent will bring you a long way, but requires practice and training to get you to the point where you are good (or better than good).
People with average talent can become quite good, but just like everything else, they won't be able to catch the more talented person who puts in as much effort and receives as much coaching.
However, if you look at things like sports, the margins are really fine. A 2 minute margin of victory in the Tour de France represents less than 1% of the total time in the race. People at the top end of sports have the best training, the best coaching and the best talent because 1% is the difference between winning the championship game and not even making the team.
The margins in programming are huge in comparison. Someone with average talent who has good coaching and works hard in a consistent way will be so much better than your average developer that I think talent barely works into the equation. Just like everything, the super talented are rare, but in our field it doesn't matter because we can potentially train people to be much more than "good enough".
You have hit the nail on the head, though, with respect to the problem. Training is a cost centre. Let's say you take a person right out of a boot camp -- they can write code, but have only a couple months of practice under their belts. While they can improve dramatically over a year or two, it's still going to take 5 years or more for them to hit the "much more than good enough" level. If you want them to be able to handle the nuances of making sure a project is successful, you need at least another 5 years on top of that.
People change jobs every 2-3 years on average. You are just starting to see some payback on your investment and they are gone. Even worse, investing in training costs you more than buying talent. If you spend time and money (and factor in the cost of lost opportunity from hiring people who are still learning their craft), when they reach the level of a more experienced/talented individual, they want the same pay!
So you can pay double for someone talented who has 5-10 years of experience, or you can try to train someone up to that level. After 5 years, they still want double (if they are even still around), even though you have invested in them.
This is why the top teams are willing to spend ridiculous money (2x the median) on top people and are unwilling to train anyone. The teams who aren't willing to pay top dollar also are unwilling to pay for training. They think they can get twice as many people if they pay half as much and will get more done. They aren't interested in paying for increased skill.
Unlike sports, 1% difference in ability has virtually no meaning in programming. This means we don't need to scour the world for the best young programmers and build a world class training program for them. At best, we pay double for people who have the talent and/or experience to be "much better than average".
I think part of it is the price of failure: why do people need not only GOOD programmers but the BEST? Because their business model doesn't work unless they succeed fully, the first time (and most still fail). I wonder if it would be better for innovation if ventures could be made without putting one's entire livelihood on the line. I've found that the more "assured" a business is of its model, the more willing it is to coach and train people (at scale, it costs less to have a training regime so you can move lower-paid programmers through your company than it does to pay the most sought-after people in every position).
I think part of that perception may come from the kinds of companies you have dealt with. There are many companies where the "IT" side of the company is considered to be mostly superfluous. In fact, most programmers work for these kinds of companies. In my experience, there is still very little training. At best you'll get a 2 day "Scrum Master" course, or a couple of days to go to a conference here or there (which is sadly often seen in the same light that sales people view a sales junket). Even in big insurance companies, etc people don't stick around as employees for long. Of course these companies are not the ones paying double for good programmers.
People make that judgment based on inconsistent opinions, though, because ultimately it means measuring developer ability against a commonly held standard, and we don't even know how to do that.
What were you working on, if you don't mind my asking? I really enjoy graphical programming and I'm curious what kinds of jobs involve that, besides the obvious ones like video game programming.
I know this is all very anecdotal, but I have certainly seen developers who left me wondering for months how they could ever have been hired, and who, given time, became quite capable. I first started to notice it with fresh-out-of-college developers, but eventually saw it beyond that too.
In my experience, the ones that were 'capable' from the get go either had a lot of experience programming on their own, or had some internship or other real world work experience. The ones less capable didn't.
Then I realized that my measure of 'capable' was probably unfair, because I expected people without having worked before to be aware of all the complex details that working with other developers entail. All the annoyances you have to deal with about writing real code that needs to evolve and change that only a stable work environment provides.
Then I took it one step further. What if the ones that weren't fresh out of college were just never expected to really improve? They never had the chance to be exposed to a stable work environment that required growth. They either were let go from previous jobs before being allowed to grow, or were put in environments that neither required nor promoted growth. I think not all jobs do a good job of demanding, and providing space for, growth. And that's a problem. It's possible to 'get good' on your own, but it shouldn't be expected; and it's far from the only way.
So maybe we are being a bit unfair when we say a candidate is weak. Maybe they just haven't been in an environment that would require and allow them to improve.
In all walks of life, people with more experience are expected to be more formed, more of a known quantity, than people with less experience. The distinction between a junior candidate and a weak programmer is mostly down to the length of their CV. If someone has been programming for a few years, it's a bad sign if they're not very good at it.
You can spend ten thousand hours repeating the first hour of the same work at a bad job. There are a lot of bullshit positions with poor management that don't exactly lead to programmer development.
Yes. And if I'm expecting to hire a senior developer, I quite naturally don't want the person with that experience, because that's not an actual senior developer.
No, but if you had a crystal ball that could distinguish between those who had been given lots of opportunities but never improved, and those who just need a little bit of training to make them a world class engineer, then you'd have an amazing hiring advantage relative to your competitors.
I know some smart, thoughtful, enthusiastic developers who have only ever worked in one team, and that team had a bunch of dev practices that don't fit the industry's typical expectations (e.g. zero automated unit tests). Many companies would pass them over because they'd bomb that part of the interview, but with a couple of months of on-the-job training, they'd fit right in.
When they go looking for a new job they're going to find it hard, but whoever can see through their inexperience and is willing to take a chance on them will be happy they did.
I have. In fact: it became sort of a norm. Want to talk more about it?
I'm sure there's a definition of "weak candidate" that perfectly captures "uncoachable". But my definition, and I think it's the common-sense definition, is that a "weak candidate" is one who gets as much negative feedback as positive (Joel Spolsky famously advocates for "if you have to ask, the answer is NO HIRE"). Most of the people we hired at Matasano for several years fell into that category, and most of them wound up outpacing the strong candidates (none of them did badly).
Are there any other companies that you know of that do what you did at Matasano (offer post-hire specialist-type training to non-specialists/generalists is how I understood your posts about it)? The moneyball/trainingball idea seems like such an obvious way to get the people a company needs that one would think it would be somewhat common by now.
For all the alleged shortages, can it really be that all these years later companies are still unwilling to gamble on training because they feel like people will leave too quickly?
Companies have too incomplete a notion of the value of training and a positive environment for retention and productivity. People really do think skill is innate and that people 'are who they are', more or less.
It's probably not true in 80% of cases, is true in 20% of cases, and the 10% of cases where people got burned by a poor candidate who couldn't improve are the ones they remember and say "never again;" likewise the 10% of cases where they are amazed by a highly skilled candidate and think all of them should be that way. They're missing the majority effect that impacts the productivity of 80% of their workforce.
This is why it's crucial to have an innate sense of human variation and psychology, a la W. Edwards Deming. So, in that sense, it's not the employees that need the training, but the managers.
I know lots of the big name consulting companies in London at least used to do it. A friend of mine with a physics degree from Oxford got job offers for all kinds of things on the assumption that if you can get a physics degree from Oxford, you can probably learn other stuff as well. He ended up doing some sort of forensic accounting.
A friend of mine in Australia became a management consultant with a PhD in physiotherapy, they essentially were saying a PhD in anything is a good base.
From my perspective, I see that several other skill-based professions assume novice hires and emphasize some sort of on-the-job training as part of career progression.
In software dev, as much as we have Stack Overflow and such, where we can figure out the trivia of the tools we work with, I suspect we'll come to the realization that some kind of formalized post-hire training will be useful, especially as development becomes more specialized or domain-specific.
I know you've talked about your case as a counterexample, but I think it's because it's not coding ability you were finding hard to get, but security and pen-testing knowledge and experience.
To my mind, that's domain knowledge, and domain knowledge can definitely be taught, and much faster than how to write good code in good time.
I think sending some pen-testing books to a bored and under-exercised VB programmer may well get good results; but put the same guy in front of a problem that is almost completely abstract, and if he's able to understand the solution you present to him and create code (in any language he likes; we don't care about specific language experience either) that implements the solution, then he'd get hired at my company too. Because we don't test for domain knowledge; we don't expect it. We test for ability to communicate, understand and turn concrete implementation ideas into code.
It's a programming job. Matasano wasn't a netpen firm; every project involved us digesting and processing some giant blob of other people's code. Everyone on the team had to write tooling and, more importantly, quickly understand other people's code well enough to outguess the authors.
I don't accept the argument that we were hiring for an easier job than "programmer"; I think we were looking for a subset of the population that ordinary software development teams look for.
The bored and under-exercised VB programmer, by the way, helped found the NCC Cryptography practice, and moved on to one of the greatest crypto software development jobs in the industry. It's not my place to say exactly which job, but if you make a mental list of the 5 coolest crypto software development jobs that might exist, it's probably one of those five.
The problem is everyone else runs hiring processes that assume the bored VB programmers are just VB programmers. They are wrong. Joel Spolsky was also wrong about this (despite the other nice things he said about VB).
> I think we were looking for a subset of the population that ordinary software development teams look for.
I think that's actually why your experience might be less representative, but I'm open to being corrected.
Security is generally considered harder than regular development, at least to the extent that it would typically be hard to get a security job without security experience.
From what I understand, your model was to take programmers with no security experience and train them on security. The effectiveness of this doesn't surprise me at all, because it's fundamentally a different problem than taking someone who's bad at programming and teaching them to program better.
I have absolutely no problem with hiring someone who's a good VB programmer and teaching them to do web programming. What people generally dispute is whether you can hire someone who's a bad programmer in general and "train" them to be a good programmer.
> I have absolutely no problem with hiring someone who's a good VB programmer and teaching them to do web programming. What people generally dispute is whether you can hire someone who's a bad programmer in general and "train" them to be a good programmer.
How do you know if a programmer who is in a field you are not familiar with is good or not?
You can ask someone to show you some code they've written and teach you about it. Concepts transcend language and field: if you understand what you're doing and why you're doing it when writing a VB program, I have high confidence that with a bit of time you'll also learn to understand how and why to write other kinds of programs effectively as well.
Why does it have to be this complicated? Explain some of their existing code? Why not just give them a representative programming problem that your team has already solved, and ask them to code up their ideal solution? That's called "work sample hiring", and it's the gold standard for candidate evaluation.
Frankly, your statements are contradictory. Pure work sample hiring is in direct contradiction with hiring trainable people.
I do not expect a VB programmer to be able to code up a single-page React app during an interview, even though building a React app would be a representative problem. Do you?
So which one is it? Hire exclusively through work samples, or hire trainable people? I mean this seriously: how do you reconcile these two? You're a strong proponent of work sample hiring but also of training—do you expect people to come in and do a security audit for their interview, even when one of your hiring goals is to hire people with no security experience?
I would personally much rather hire someone who is a strong overall programmer (but might very well not be able to code a JavaScript app in an interview) over someone who knows a little JavaScript (so they can do a simple problem) but has relatively shallow abilities.
> That's called "work sample hiring", and it's the gold standard for candidate evaluation.
Having a discussion about someone's existing work is, in fact, another gold standard of hiring and is quite similar to having someone do the work on the job. It's not quite as good as having someone execute work, but it's the next best thing—especially when you're trying to hire people who don't have direct domain experience.
I think you're neglecting to remind folks that Matasano would give people a crypto textbook to study from first.
So if a VB programmer were applying for a React job, you'd say, "here's a React book; we're going to have you code a React component during your interview. Let us know when you're ready to schedule."
Google does not hire with a work-sample test. tptacek's claimed process is therefore easy to distinguish from Google's. Maybe better, maybe worse.
Google does allocate non-trivial time on training, though, both on the candidate's time before the interview (as you note), and on paid time after hiring.
What a low effort reply. Honestly, I expected better from you. This is pure argument from authority without any attempt to confront the fact that there is in fact a very real contradiction between hiring through work samples and investing in training.
For the record, I have done work sample hiring. It's definitely my preferred strategy, but it's also far from a panacea.
But it's equally true that if you give someone a work sample to build a React app and their entire experience is in backend programming, they will not do very well. Would you really be surprised if someone fails to produce good work in an environment, language, and domain completely foreign to them?
Taken to the extreme, some companies traditionally hired engineers who had absolutely no programming experience and trained them to program on the job. Would your advice to them also be to "hire through work samples (of programming)?" How would candidates who, by definition, need training to even do the work be able to produce a work sample?
Your whole "I know best, so I don't have to provide logic or evidence" schtick is really getting old.
There's no contradiction. Set the bar where you need it to be, but make the bar about ability and aptitude, not about people talking their way through an interview.
If your bar needs to be super hard, set the bar there. If you need people to understand in detail when and how to use shouldComponentUpdate and how to write a pure functional component, do that. I'm not arguing that React shops should hire people who suck at React.
I think you think I'm saying something I'm not saying.
> I'm not arguing that React shops should hire people who suck at React.
I guess I misinterpreted that you were a proponent of training people.
Yes, if you want to exclusively hire people who are already good at React then of course work samples are an effective technique. I personally think the majority of good hires are not already familiar with my particular stack.
> Set the bar where you need it to be, but make the bar about ability and aptitude, not about people talking their way through an interview.
I think you might have misinterpreted my position as well. When I ask people to talk their way through their existing projects, it means they literally bring some code with them and we spend the interview working through it and them teaching me about the decisions they made in building it. You should try it some time.
Also, it's easy enough for a VB programmer to get started with React: they can quickly teach themselves some JS, follow a tutorial online, and at some point build a toy application with React. That can be the work sample. It won't be a production-ready app, but it will be fine for a junior React hire.
Oh, I agree about VB; once upon a time, a long time ago, I too was a VB programmer (before I discovered Turbo Pascal and then became aware that I found the distinction between Let and Set disturbing; I ended up loving Pascal so much I spent 5 years at Borland on the Delphi compiler).
While at Borland, another engineer (Allen Bauer) shared his personal theory, that some engineers were great at reading code, more were great at writing code, and not so many were great at both. Sounds like your job required people that were very good at reading code. It intuitively feels very testable.
My latest hire (I hired them almost 1 year ago) was very green (no job experience, didn't study CS, autodidact programmer). But I saw enough drive, motivation & willingness to learn that I gave them a shot. It has proven a very advantageous hire for both parties. I invested a lot in them and the investment quickly bore fruit.
A previous person I hired (in a previous job, for a different company) had a lot more industry experience as well as appropriate education. They were quite capable but I had a slightly bad gut feeling, ended up hiring anyway cause I didn't want to discriminate against a seemingly capable candidate. Ended up being a poor fit: argumentative and not willing to accept that they do not always get to call the shots about how stuff is done. I think this was disadvantageous to both parties.
In the future I will definitely prefer hiring someone who shows a lot of potential over someone skilled who I suspect may not be a "team player" (for lack of a better term).
I always took Joel's writing with a large pinch of (kosher) salt. He's a great writer in that he knows how to flatter his audience (us). But two things always niggled me: firstly that I didn't know a single person who actually used his products and secondly that it simply wasn't plausible that the top talent in the industry would be jumping through hoops for a chance to work on software for project managers.
I stopped seeing him as an unfailing source of truth after he okayed developing fogbugz with a private, in-house language, that (unsurprisingly) was a huge waste of resources.
It's not about being afraid of computer science, it's about being pragmatic and efficient about the problems that you choose to solve. By choosing to develop your own closed-source language, you now have two projects that need to be developed, the product itself and the language that it depends on. They retired the language (Wasabi) some years later specifically because it was so demanding to maintain, there was significant on-boarding for any new hires, and it constrained their developers by frequently being behind the current state of C# (which it compiled to). When you write your own language, you don't have an entire open source community updating it, fixing bugs, and maturing it with new features and libraries. That is now work that your team needs to take on, and in the vast majority of cases, the negatives far outweigh the positives.
I think the main advantage is that if you emit JSON then you know that someone else isn't going to write a bad parser for it. Given that we can't depend on people to do simple things like handling escape sequences correctly, that seems pretty important. (Also, the 'jq' command is nice.)
Fogbugz is (was?) writing a language for internal use only, so it's a better fit there.
> Given that we can't depend on people to do simple things like handling escape sequences correctly, that seems pretty important. (Also, the 'jq' command is nice.)
jq stores all numbers as doubles, which means it can't handle 64 bit ints (or larger integer numeric types).
It's otherwise a great tool and I wouldn't point out a negative such as this, but you did explicitly mention that you know JSON won't be fed to a bad parser.
That is what almost all JSON parsers do; even the spec says that you shouldn't rely on more precision than a double:
This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754-2008 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.
Good to know about the JSON spec. The only JSON parser I used before jq was the Python stdlib parser. Python's JSON correctly handles big integers, which led to my (out-of-spec) expectation that jq would do the right thing.
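For anyone curious, the 2^53 boundary is easy to demonstrate. Here's a minimal Python sketch (assuming only the stdlib json module) contrasting Python's arbitrary-precision integer parsing with what any consumer that stores numbers as IEEE 754 doubles, as jq does, will see:

```python
import json

# 2**53 + 1 is the first integer a double cannot represent exactly.
big = 2**53 + 1          # 9007199254740993
doc = json.dumps({"id": big})

# Python's stdlib parser keeps JSON integers as arbitrary-precision ints:
assert json.loads(doc)["id"] == big

# ...but a consumer that coerces numbers to doubles silently rounds:
assert float(big) == 2**53        # rounds down to 9007199254740992.0
assert int(float(big)) != big     # the original id is unrecoverable
```

So a 64-bit database key round-tripped through a doubles-only tool can come back off by one, with no error raised anywhere.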
On the other hand, don't we always hear people say that it's a really bad idea to write your own crypto algorithm, and you should use something that has been extensively tested and vetted instead?
I would posit the same applies to writing your own programming language: it may be fun to do as a hobby, or when you are learning, but may not be a good idea to use it in production to rewrite your company's flagship software.
The most obvious reason is that it's much, much easier to design a new programming language --- especially the extremely simple kind of language that Wasabi was --- than it is to build a new cryptosystem even with vetted primitives.
The second is that the impact of any crypto error you make is very likely to be game-over, while most programming languages aren't (you can observe this empirically from MRI Ruby).
A third is that designing new programming languages, especially the simple kind that Wasabi was, is a straightforward, well-understood engineering task. Nobody with a CS degree should be incapable of doing it competently. You cannot say that about cryptography. In fact, many of the problems people attempt to solve in cryptography (take, for instance, multi-party secure messaging) don't even have well-understood best-practice solutions; they're still active topics of research. Not research to do better, the way Haskell was an attempt at innovating a better functional language, but research to reliably do it at all.
I could generate more reasons; this is just off the top of my head. There is simply no comparison between the two problems. I implemented a new scripting language for building TCP/IP stacks in my first professional programming job ever --- in fact, it was the first programming assignment I got in that first dev job, and I had to do it in C. I handled it comfortably. 20 years later, I still wouldn't be comfortable implementing most cryptosystems myself.
I'm not afraid of parsers - I like them so well that I have to be careful not to use them on problems a regex or two will much more maintainably solve.
But I would hesitate somewhat to join a team which used a language developed entirely in house. As enjoyable as such work would undoubtedly be, it also reeks of overengineering, unless the problem space is so unique that no extant language can effectively address it. Experience suggests that project management software is not such a space, and the maintainability risk of a bespoke language seems therefore very hard to justify.
On the other hand, I don't pretend to be a "top dev", by anyone's definition - top quartile, maybe, on a very good day. So perhaps I'm just not favorably enough placed on the capability curve to understand why it really is the great idea that it doesn't seem to be.
If you read some of their postmortem stuff on that whole situation (which is pretty much an engineering meme at this point), they are in no way willing to admit it was a mistake. Trump would have a run for his money on avoiding admitting mistakes.
I don't think that was the OP's point, but along those lines, Stack Overflow doesn't use Fogbugz. I'd love for them to tell us why Fogbugz is not a fit for Stack Overflow.
I don't know if you're being sarcastic, but I will choose to take your words literally because Excel has had an enormous productivity impact on the people who used it, and in many areas changed how business was done by introducing programming and automation (formulas, pivot tables, graphs etc.) to technical business users.
Edit: glad you were not being sarcastic :-) thanks for clarifying
I was not being sarcastic. As a person who builds statistical models of credit risk for a living, I truly consider Excel to be one of the finest software ever made. I generally consider it true for a lot of Microsoft products, though most people in this forum tend to disagree.
From what I've seen, the spreadsheets themselves are great, but the lack of good version control, revision tracking, and regression testing makes for poor operational processes.
People e-mail one another Excel workbooks, there are 5 different versions in flight at once, and it's impossible to find out when an error or mistake got introduced or who was responsible.
Computerised spreadsheets were inevitable; we've been using spreadsheets in some form for thousands of years. VisiCalc is recognised as being the first, but Excel has been the dominant product for decades now, and for all its warts it has had a massive impact on humanity.
Not the OP, but I think you misunderstood.
Excel was always a Windows application with the same functionality as Lotus 1-2-3, which was a DOS application. So much so that the keyboard interface is still compatible.
Actually, what I remember is that Lotus killed itself by not having a Windows version out on time.
Or a profession that so strongly assumes ability is innate and cannot be trained.
And that, by virtue of being plunked down into the interviewer's chair through whatever chain of random circumstance... you've suddenly become innately endowed with the ability to evaluate it in others.
Is this including the Joel Test? While that makes more sense for native applications than for the web, working with a company that scores 0/12 is a vastly more terrible experience than one at 12/12. Granted, a number score doesn't matter much when certain answers have more weight, and you could argue someone would've come up with it eventually. When I first heard about it, though, I immediately started working on getting better answers for the company I work for, and I continually do so. I can't speak to what you're talking about, as I somehow glossed over that, but maybe that's its own problem. It's easy to fanboy when something great comes along without understanding the years of opinion that went into it.
I think you're being a bit too literal in your interpretation of the phrase. 'The best money can buy' is figurative; it means 'the best available, regardless of circumstances'. The 'money' part is figurative; it represents resources.
I think it's a little unfair on Joel Spolsky's team to suggest they might not be good at what they do based on a semantic argument around whether 'the best money can buy' can be interpreted as using resources or just cold hard cash. Considering the things they've built they're very evidently a highly accomplished group of developers. I don't see any need to argue about this; they're obviously using what they believe are the best tools for the job. You can tell that from looking at their results. Whether they're using free, open source things or they're buying tools isn't relevant, and certainly isn't a sign that "your development team is probably really bad". They're clearly not bad. They built Trello and StackOverflow.
I totally agree with your assessment of the wording. Money, to me at least, never equates to closed or open source. Fortunately, at least with stack overflow jobs, you can kind of self-assess this if they list tools like JIRA or Trello. You can also ask questions to see if they're actively using what they claim. If they aren't singing the praises of their toolsets and checked that answer off, you can generally assume they're lying.
If the test was updated for the web and native in a single set of data points, some companies would incorrectly score low because they focus on one or the other. I think it could use some kind of polish after 16 years but only to remove very narrow interpretations that derail discussions like this.
I'd like to back this up. I went from a ~6/12 company to a 0/12 company, and man is it painful. I knew going in it was going to get low marks, and saw it as a challenge to try to push it to 6/12 level if possible (a nice career highlight imho), but I drastically underestimated how taxing it would be.
The biggest factor I didn't consider: People who exist in a 0/12 system are at best willing to tolerate it and at worst have engineered it. Change within that population will be challenging, even if they all publicly claim things need to change. You end up dancing on a razor's edge of being a positive change agent and being made a pariah. All while trying to do actual work.
Well, even on hardware: on a daily basis I use a MacBook Pro (late 2013), but we also built a workstation from eBay components. The workstation is as fast as or faster than any computer you can buy at the moment, but money wasn't the problem so much as the resources to get everything. Resources is the correct word, not money: not every resource is buyable.
And some items on the list also only apply to bigger companies.
The MacBook Pro 2013 is probably 10 times more powerful than the thing I use, and your workstation probably cost 10 times more money than what my company allows as a budget.
I am not saying everything on the list should be there, but yes, there is totally a point to the test. Scoring really low means you are in a shitty place; scoring high means nothing.
Well, our standard developer workstation is around 1000€-1600€, kept for 4-5 years, so my laptop will still need to be used for one or two more years; it's actually over the budget too, but basically if there is a need for something we buy it. Some people are just way more productive with a Mac since they've known it from day one, but most new hires actually used Linux/Windows, so they never want one, or only want one because others have it. It really depends on the needs. We are a Java shop and I even encourage everybody to use IntelliJ Ultimate, but some still prefer Eclipse and I let them use it; even if I dislike it, it's their preference after all.
Oh totally, I think we are talking about different things. I have to dev on a shitty netbook with a 17" 4:3, 15-year-old external monitor, and I can't use any external dependency repositories...
> Or a profession that so strongly assumes ability is innate and cannot be trained.
Like most things, it's a spectrum.
But if you have a knack, you'll enjoy it more.
Enjoy it more, you'll do it more.
Do it more, you'll get better at it.
This feedback loop has been suggested as one root of the (contentious) "xy,000 hours equals mastery" findings.
But it's only true up to a point.
We know that in fields subject to large-grained biological differences -- sport, in particular -- genetics swamps training effects. Most of what matters is throwing people at a sport until you find the person who is freakishly good at that sport. Which is why Australia is the best at Australian Rules Football, why the USA is the best at Gridiron, why China has come to dominate the lighter weight classes in Weightlifting and so on.
Actually, the established science in weightlifting and other sports is that only the top 1% have to play based on their genetics. For most people it's more a matter of having a decent training program and sticking to it come hell or high water.
I think if you look at software work you'll find the same kind of dependency on training rather than some innate gift.
I would add Google, Microsoft, et al. to that list. I've seen lots of candidates come in for interviews who have read books like "Cracking the Coding Interview" and obviously memorized the whiteboard solutions to complex problems, but who then can't implement 'boolean equals(Object)' correctly when asked to do so.
At least Google can say "we get so many applicants, this is what we have to do!". Why random web dev companies attempt to grill you with the same level of rigor as Google is beyond me.
Because they like to flatter themselves that the problems they solve require top talent, and they, the current employees are top talent. Usually neither is true, and they don't pay like Google either.
Then why do their recruiters reach out to candidates with extensive work history and slam them into the standard hiring pipeline? (I'm not talking Google specifically, but the BigCos in general.) It's not because they have too many people applying; it's because they have too many people complying.
While I can't speak for all Microsoft interviews, I will say that I was pleasantly surprised to find that my (successful) Redmond interview had zero focus on traditional "coding interview" questions and instead was 50% about my prior work and the rest about some interesting software dev questions focused on design (not whiteboarding, not algorithmy), culminating in this lovely networkingy question. This is true for the MS interviews of some acquaintances of mine too.
OTOH I have yet to find someone who has had a Google interview that did not have major focus on algorithmy questions which you can just memorize for. To be fair, most of the folks I know are students; Google may have different tactics for more experienced candidates.
I'm sure Microsoft does this too (and I know that Microsoft is where this attitude came from in the first place), but it seems like it does it much less (from my limited anecdotal view) these days.
Google does not seem to have a different tactic for experienced candidates. My onsite was only algorithm questions on the whiteboard with about 5 minutes of 'do you have any questions for me?' for each of the 5 one-on-one interviews I had (not even asking me about my prior experience).
Programming ability is of course not innate, but the ability to learn to be great at it does seem to be mostly innate. You can overcome a lack of innate ability with hundreds of hours of hard work - but you'll never be able to catch up to someone with that ability who has also put in hundreds of hours of work.
That aside, nothing you said actually provides any support for your criticisms. Your comment can be condensed down to the phrase "that's wrong" without losing any meaningful content.
By nature, if you post an article stating the truth of the matter - that truly good people are extremely rare and that most people tend to be mediocre and some serviceable - most people will be outraged. Because they're the "most people" being called mediocre to serviceable.
As the article points out, most jobs are mediocre. Good work is not appreciated, a company can't form a team that works normal hours and uses tests and version control, autonomy and learning is low, the work is some dull obscure business problem, IT and the company is dysfunctional and pay for the skillset is below market average. Yet all these companies want great developers as well.
Also, as he mentions in the article, sometimes a great developer does roll in but they get filtered out for some reason. In my experience it is usually due to age - a great developer who is 52 comes in, but management almost openly says they want someone in their 20s who they can push around more when they nix the person.
So the mediocrity goes both ways. When Oculus got going they seemed to have little trouble getting some of the best graphics programmers (Abrash, Carmack) to join them. If you want a great programmer you need to be a great company with a great position.
Exactly. 95+% of all developer jobs are not particularly novel nor do they require the top 1% of developer talent. However it seems like 95+% of all developer jobs believe themselves to be novel and requiring the top developer talent.
People keep saying this, but I don't think it's as easy as you make it sound. What about the candidates that talk so much that it seems like they are stalling and therefore don't get a reasonable and reasonably complete solution on the whiteboard in the allotted time, despite repeated prompting? What about the guys who don't explain much and take way longer than expected on simple problems, and when asked about it reveal that they are just blowing all this time internally debating the perfect solution?
I'm not in charge of setting our hiring bar so it doesn't really matter. Given my druthers I'd give a candidate a not particularly difficult multi-hour coding project just to show that they can build something and would like to have a conversation with them that shows both breadth and depth of knowledge.
Ok, but none of what you said addresses my concerns. And sadly, this is what always happens when I press people for better interviewing techniques.
If they can't finish squat in 60 minutes are they likely to finish something in double that time? If they are, I could just ask them something that should take 30 minutes to finish and give them double that time to finish it, and just pretend like everything's fine...
> and would like to have a conversation with them that shows both breadth and depth of knowledge.
Conversation is nice and all, but there are people who can do that but fall flat with the coding. And that is a significant part of the job.
It would help having some above average developers (or maybe architects is a better word) on the past two projects I have inherited. They clearly knew a lot of stuff, but elegant database and application design wasn't one of those things.
> Programming ability is of course not innate, but the ability to learn to be great at it does seem to be mostly innate.
It's no more innate than any skill. The only innate aspect to it is the desire to do well at it.
> that truly good people are extremely rare and that most people tend to be mediocre and some serviceable
The problem with this sentence is the word mediocre. The vast majority of developers are average, which is to be expected. The problem is that Silicon Valley is seen as the norm for development and not the exception. SV plays right into the personality culture we seem to have built (thanks to social media), so people start thinking all development is like that, whereas most people just want to do their job and go home to their family at night.
To play a devil's advocate, Spolsky's advice was provided at a different time. Perhaps then the requirements and thus necessary skills were different from today, and it might have made sense to rely on his quick and polarizing criterion. I do agree that his tone is brash, noninclusive and overall damaging to the industry as a whole. I would imagine some people would be discouraged from even trying to get into the industry after reading some of his articles.
You hit the nail right on the head: it was a different time. I remember back then how many people had completely fake and fraudulent resumes, and a very large part of hiring was just trying to filter out the fraud. It was massive at some point. It seems to be better now, but back then it was just incredible how much resume fraud there was :(
No doubt Joel Spolsky is a great software engineer. But sometimes when reading articles like the 'great developers' one, I feel like he's standing on the high ground looking down on us mere mortals.
Sure it's near impossible to find 'great' developers, but most companies don't need great developers, they need good competent developers and there are plenty of them on the markets at any time. I would say the positions that require a 'great' developer to work on are as rare as great developers themselves.
I've been in the industry for 10 years now and I've never seen this at all. I know a couple of people that applied to Google, and they reported that it was a fairly gruelling process, but apart that, I haven't even heard from friends about people going through anything approaching 8 hour interviews.
> I haven't even heard from friends about people going through anything approaching 8 hour interviews.
Is this a joke? "All-day" interviews are so common as to be the usual expectation.
And while they aren't always a full 8 hours, it's not unusual for them to be 5-7+ (including giving candidates a lunch, and interviewing them during it while they eat).
I wonder at what point this becomes self-reinforcing. They need the 8-hour interviews because only the most incompetent will tolerate them, while the ones who already have full-time jobs can't make the time for the interview.
I recently, in the past two years or so, interviewed for a (contract-to-hire!) position at a major airline. First I had a 30-minute phone screen. Not bad, did great on that. Then they had me out for a two-hour face-to-face onsite, which I had to take time off of my current job for. The feedback I got was positive, so they wanted me to come back AGAIN for another set of two two-hour face-to-face onsite interviews (which, again, I had to take time off for). My first face-to-face interview showed up 30 minutes late and kept me for two and a half hours, so I was late to my next interview, who kept me for another two hours.
I've done tons of interviewing and I'd say that 1 hour on the phone + 6 hours in person is the MINIMUM interview length I've ever seen. It goes up from there.
This does not seem to vary across company sizes, locations, job level, etc. Everybody copies this horrible process.
I still can't wrap my head around how Joel Spolsky, of all people, became such an influential voice in software, especially considering that every single thing he's ever written has been an opinion piece (and you know what they say about opinions...). He went to Yale and worked for Microsoft, both of which, in fairness, eclipse any of my own accomplishments, but there are a LOT of people who've done that and then some.
I agree. I think a motivated engineer is far better than a talented one. Some really talented/experienced engineers might find day-to-day programming tasks trivial/mundane and their biggest challenge is staying focused - It's good to have a few such people on your team (to help guide big technical decisions and encourage good practices - They are great supporting roles) but in terms of raw productivity, you'd get more out of a less talented/experienced engineer who feels more challenged by the work.
In my opinion, engineering talent is based on three things:
- The range of projects they have worked on during their career.
- The total number of hours spent coding in their career.
- Their attitude towards learning.
I think this 'innate' 10x developer idea is just bullcrap.
The best way to identify a great engineer is to check how many years of experience they have, how many companies they have worked for (more variety is better) and their attitude to learning. If a developer says "I like to be challenged by my work" - That's generally a good sign that they enjoy learning; combine this with their total years and range of experiences and you can infer how much they know today.
> I don't know of many other professions that have 8 hour oral exams that test the sum of all knowledge in the field with such a strong negative bias towards hiring.
Investment banking, management consulting, doctors, lawyers (probably) ... - seems like the traditional 'power jobs'
I asked my law school friends what their internship interviews were like and they aren't asked any technical questions about law. They just walk through your resume and ask you behavioral-type questions.
The interesting thing is that Google found that behavioral interviews better predict employee performance. I'm not sure why they didn't change their interviews after that internal study.
Because the people giving the interviews were hired under the old system. Which they can't just turn on its head all of a sudden, because that would... invalidate the exacting, methodical "high bar" which they had to pass in order to be hired (thereby proving their Great Developer status).
That is how they hire business majors, not engineers. High end business major job interviews usually have a ton of questions like "why are manhole covers round?", and Google realized that such questions were worthless.
The technical interviews are totally different and are used because they actually work well, but you can't do something similar with business skills.
No, but there are many other places where they talk specifically about non-tech roles, and I have never seen him comment on technical interviews. Here's an example of an article talking about non-tech roles:
Investment banking interviews for entry level positions usually have very few technical questions (if any). There's sometimes a standardized test (e.g. logical reasoning) before the main round of interviews. The interviews themselves are just behavioral.
This is factually false, unless there's some 'entry level' below analyst, or unless you're excluding the bulge brackets (and oh man, the boutiques) from consideration.
From my experience that is factually true (London). I know this is the case for entry level positions and internships (which can often lead to an offer), i.e. candidates with no work experience, at at least 3 large investment banks.
The situation is of course different for boutique firms; I should've said that. Small firms cannot afford to train candidates internally, so they want candidates who already have technical knowledge.
"I don't know of many other professions that have 8 hour oral exams that test the sum of all knowledge in the field with such a strong negative bias towards hiring. Or a profession that so strongly assumes ability is innate and cannot be trained."
I'm not following you here. I guess you are referring to a high selection bias for CS knowledge? How does that relate to Joel Spolsky?
> But a lot of it is anecdote and faulty assumptions. I don't know of many other professions that have 8 hour oral exams that test the sum of all knowledge in the field with such a strong negative bias towards hiring. Or a profession that so strongly assumes ability is innate and cannot be trained.
It's outrageous.