This is a key point. I see a lot of criticism of the coding interview. But what's the viable alternative?
My goal is to figure out whether the candidate can do their job - write software - and whether the rest of my team and I would enjoy working closely with them on their daily tasks.
How else do I figure that out, if not by a coding exercise?
I’m not saying that you shouldn’t do a coding exercise. I’m saying that coding exercises are overrated and far too difficult.
Programmers today think that TopCoder questions are “FizzBuzz” tests. That’s idiotic. Joel Spolsky wrote that blog post to say that you should do an absolutely minimal check that someone can write code, then move on to more important things. It’s just bizarre how people have twisted that over the years.
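To be concrete about the bar he was describing: it's roughly FizzBuzz level. A minimal sketch of what that check amounts to (in Python, just as an illustration, not a claim about what any particular company asks):

```python
# FizzBuzz: print 1..100, but "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
for i in range(1, 101):
    if i % 15 == 0:
        print("FizzBuzz")
    elif i % 3 == 0:
        print("Fizz")
    elif i % 5 == 0:
        print("Buzz")
    else:
        print(i)
```

If a candidate can produce something like that without struggling, the check has told you what it was ever going to tell you, and you can spend the rest of the time on more important things.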
But to answer your question: focus on what matters. Communication, writing skill, the ability to read code (this is easily 10x more important than algorithmic wizardry IRL), good habits (e.g. “does this person insist on putting all of their code in a single file?” - an actual thing I have seen!!), opinions that match your team, etc.
All of these things can be answered in interviews, but nobody cares to try.
Communication is obviously being tested in both phone screens and onsites. It's not like a candidate in a coding interview can simply start coding silently, finish, and then pass to the next stage because their code works. A big part of a coding interview is working with the candidate.
> the ability to read code
How do you test for that in isolation?
There are exercises that do that: you show the candidate a piece of code, and ask them to find the bug. They have all the worst issues that folks here are complaining about, amplified: stressful, awkward, etc. In my experience these types of exercises are also more random: i.e. depend more on luck. Sometimes great candidates fail them, and poor candidates get them right on a fluke. So they are overall less indicative.
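For what it's worth, these exercises tend to look something like the following (a made-up Python example, not one I'd actually use): the candidate is told what the function is supposed to do and asked to spot what's wrong.

```python
# Hypothetical "find the bug" prompt: the candidate is told this should
# return the average of a non-empty list of numbers.
def average(numbers):
    total = 0
    for i in range(len(numbers) - 1):  # bug: the last element is never added
        total += numbers[i]
    return total / len(numbers)
```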
Ultimately, though, it's back to the first point: how do I test if the candidate can do their job, i.e. code?
A candidate may communicate really well, say all the right things in best-practice questions like the one you mentioned, write well in English, but still be unable to code their way out of a matchbox.
Again, nobody said “don’t test if a candidate can code”. You should do a minimal test, then move on to more important things, rather than beating the same horse.
> In my experience these types of exercises are also more random
Well, “your experience” also says that you regularly encounter senior candidates who can’t code at all. I doubt the statistical validity of your experience.
> Joel Spolsky wrote that blog post to say that you should do an absolutely minimal check that someone can write code, then move on to more important things.
Interesting, and I won't look up the article again before writing this comment.
I remember it as saying he was having a hard time finding people who could code at all, and that FizzBuzz was weeding out a large percentage of applicants. I don't recall it being framed as a quick gate before moving on to other things.