I've found that the best predictors of a good hire are indications of the following characteristics during the interview/vetting process:
- adaptability -- can this person deal with situations that are spontaneous and unplanned without losing his/her cool.
- openness -- is this person instinctively scornful of new concepts/ideas or energized by the chance to be exposed to something potentially interesting.
- resourcefulness -- has this person found ways to keep learning after college and has he/she continued to up his/her game.
- desire to ship -- does this person have a deep desire to produce work that will be used by others.
- desire to grow -- does this person have clear career growth goals. Ideally these are not tied to titles or bossing others around, which are negatives.
- rationality -- can this person admit to being wrong and speak about it authentically... can I imagine this person admitting to being slightly wrong and even very wrong about something and handling it gracefully.
- territoriality -- does this person seem territorial or speak about issues of proudly defending turf in previous positions? Doing so is a negative.
- poker face -- does this person seem to be playing it close to the vest in the interview or otherwise have very low levels of openness? This is a big warning sign and an instant do not hire.
> - rationality -- can this person admit to being wrong and speak about it authentically... can I imagine this person admitting to being slightly wrong and even very wrong about something and handling it gracefully.
I think this is a super important characteristic for a great employee to have, but I don't know how reasonable it is to expect to see much of it in an interview. In an interview setting, the candidate is under a lot of pressure to come across as "perfect" in the sense that they want to appear to meet all of the expectations the interviewer has for them.
Asking about mistakes they have made, or putting them in a position to make a mistake during the interview itself can just be really stressful. Candidates that think you want them to make no mistakes will feel they are doing poorly. Candidates that know you want them to be comfortable being wrong—whether or not they actually are—can easily game this.
> - poker face -- does this person seem to be playing it close to the vest in the interview or otherwise have very low levels of openness? This is a big warning sign and an instant do not hire.
One of the things I've heard from HR/recruiting folks is that it's not a good idea to use body language as a signal. There's a lot of cultural and personality variation around it. You can end up with an unfairly biased opinion simply because they come from a culture that is not very demonstrative, or they are introverted or otherwise have personality traits that lead them to not display a lot of emotion or energy.
I agree with all of the characteristics you list here. It's a great snapshot of what I like in a good coworker. But, man, accurately detecting them in an interview seems super hard to me. Then again, maybe I just don't have much confidence in my own interviewing skills.
On the rationality point, I've definitely been one to talk openly when asked about times I've failed / would have done something differently. I've looked at it as a no-loss scenario: if I'm talking with a company and they can't accept that no one is perfect, then I'd rather not work for that company anyways.
Interviewing is kinda like dating: one way to really screw things up is if you're both looking for different things. If you're both just after a fling, then yeah, I guess lie your pants off and have unreasonable expectations of each other. If you're looking for a long-term match, just lay your cards on the table, and if you match, you match.
There is one big difference between dating and interviews: you only get one chance to find out _everything_ about a company during an interview, whereas dating has many "meetings". This is the real problem with interviews. They're all or nothing.
Many selective companies have multiple phone screens and coding projects, plus hours of on-site interviewing. Google's process notoriously takes months. Some companies' interviews likely take 10+ hours all in all. That's more like 3rd-date territory in terms of hours spent together, and plenty of time to evaluate a hire, not some one-shot thing.
Google's process consists essentially of 2 meetings - phone screen, and onsites. You don't get engaged to someone after the second or third date. It's not just about hours spent together; it's also about hours spent reflecting on the relationship.
So right here is the mysterious reason why perfectly qualified people don't get hired at Google - they didn't spend hours reflecting on the relationship!!!
I realize you're trying to mock me, but you've totally failed because that's exactly my point. If your interview process consisted of a 6 month work trial period for every single candidate (and assuming no candidates dropped out), you would pretty much be able to detect nearly every single qualified person. Of course, this is not feasible in practice, so we settle for interviews.
First dates are probably most analogous to phone interviews. It's a testing of the water. If you've cleared the barrier to entry, you'll probably make it to the next one unless there is no chemistry whatsoever.
I worked in an organization where we used behavioral interviewing techniques to get to the core of how someone operates.
"Tell me about a time in which you..." works very well. If they resist answering the made mistakes questions we prompted them to understand that an honest answer is important. If they could not do that then they were not considered.
These were not developer roles, but most were very technical/analytical roles for an organization that paid well, and candidates knew up front they would be interviewed this way. In addition, some jobs were "cased", where you had to solve a problem and talk the interviewer through how you solved it. Great fun when I went through the two days of interviews myself. I ended up doing maybe 200+ behavioral interviews over 6+ years for the company, and I saw this approach work time and time again.
Some people could not answer for themselves: "the team did...", "they did...". When pressed, we usually discovered that the candidate did not have the ability to get the job done themselves.
> "Tell me about a time in which you..." works very well.
These types of questions probably cause me more stress than any other. Being asked to think of a situation where you felt a certain emotion or had a certain mental experience is not actually that easy for everyone. My memory is terrible at that kind of thing.
It's like my memory lacks a necessary index for what you are asking. I can start going through every major work experience I've had in sequence and filter out the ones that fit, but we might be sitting there in silence for 2-3 minutes which gets awkward enough in itself.
And it's not an inability to remember mistakes specifically, I mean the old "tell me about a time where you solved a problem with no prior knowledge of the domain..." gets me. I solve problems like this ALL THE TIME. That's a skill I have. Fetching those times from memory is not a skill I have.
Luckily I've had these questions in enough interviews that I generally try to have some examples in mind.
Incidentally, behavioral questions are common in nontechnical interviews, since the entire interview is spent analyzing a candidate's skills in the abstract.
How will I know if this person is good at, say, sales or biz dev in a vertical with long lead times? I can't vet that skill set directly; I can only ask the candidate about it. Behavioral questions give me something to cross-reference in the absence of on-the-spot performance data. So do case questions, which (at least ostensibly) tell me how a candidate thinks, and how she organizes her thoughts. (And of course, domain-knowledge questions help me assess the depth and quality of a candidate's experience in the field.)
Behaviorals aren't perfect, though. They can be gamed. It's not too much of a stretch for a candidate to memorize a half-dozen anecdotes to trot out in response to any question that begins with "Tell me about a time when..." (Nevertheless, you'd be surprised by how few people do that sort of prep work.) Canned responses are especially common with candidates straight out of school or grad school, where they are often trained in how to answer behavioral and case questions.
I have the same problem. I couldn't think of these things on the spot for the life of me. So a while back I googled "Top Personality Questions on an Interview" and filled out prompts for each question way in advance, and now I review them right before I go into an interview. Made the rest of the process much simpler.
Hah, I'm exactly the same, and not just with this, with everything. I just have a hard time retrieving things by index when prompted, and I envy people who can just recall interesting/funny things that happen to them. Things happen to me too, I just have to be reminded of the situation to recall it, I can't just go "funny thing? Ah yes, I remember X".
I've got one particular project that usually fits these questions. And if it doesn't, I can tell enough interesting stuff about it to satisfy the interviewer anyway.
From my experience with employers in many parts of Europe, they value partial contributions to teams more than complete individual achievements. Talking too much in "I did..." sentences instead of "We did... (...and I helped with...)" can come across as self-centered, which is often considered a negative trait here.
So noted. We asked follow-up questions to get to a person's contribution. It is valid to move the team forward. It is not valid to ride on the contributions of the team without putting in your own.
This is exactly where biases creep into evaluation. It's really, really hard to assess these signals objectively and not use them to justify your snap judgements or prejudices. So maybe when the guy with a college sports background and a military-style buzz cut talks about how 'the team' did stuff, you mark it down as a sign of being a great team player who shares credit. But when a guy with long hair does it, you assume he didn't make any individual contribution. And if a young woman talks about what 'I' did, you interpret it as confidence and evidence of personal contribution, but when an older lady with kids does it, you dig in hard for evidence she's taking credit for others' work. I'm not personally accusing you of these specific biases here - just trying to show how these vague 'personality' judgement signals can be unconsciously misjudged by an interviewer, without them even knowing that's what they did.
At the company I work at when we interview we tend to ask tougher and tougher questions until the candidate either starts spouting BS or replies that they don't know. BS is a no-hire.
If you're going to do this level of hazing, save yourself some time and just have the candidates compete in a big Jeopardy match. Those that don't know will stay silent, those that know will answer correctly, those that don't know but answer anyway will be punished more than if they stayed silent.
Personally I'm pretty tolerant of a lot of BS, depending on the person. Sometimes it's just a cognitive quirk: someone uses BS to search around their brain and see if they can trigger some association that lets them remember, or discover that they really don't know. Saying "I don't know" immediately is often the wrong thing to say (especially in interview settings) when the real answer is "I don't remember, let me rediscover / look that up". But not many interviewers let you look things up, and some will mark you down if you say you'd need a reference for X, even though you studied X some years ago in college like everyone else and might work it out (after wading through a bunch of BS and false starts) if you had unlimited time and no pressure...
An ability to state "I don't know that" without shame is a strong positive signal of intellectual maturity, but I'm not sure that I'd want to press it out of a candidate by asking an impossible question, if it had to get to that point. The process that you describe isn't one that I would feel comfortable using, but I like the objective.
The thing is, at the edges of people's knowledge there are two issues.
One is, people want to demonstrate what they know about the topic, even if they can't answer the specific question. If you ask me to explain elliptic curve cryptography, I can tell you that it's useful because it uses smaller keys than prime-based RSA crypto, and that's something I'd want to mention to indicate that, despite the fact that I'd need to look at a book or writeup to actually implement elliptic curve cryptography and I'm not sure I can verbally explain the math behind it, I at least know something about it.
The second is, at the limits of my domain knowledge, I may not know what I do and don't know. Limits are fuzzy, and I may feel like I have a pretty good understanding of a Linux file system from a user perspective (permissions, inodes, etc.) but I couldn't implement one off the top of my head, and I'm probably missing some important components. As an interviewee, I may have incorrect or incomplete knowledge, but not know that. That doesn't mean I'm bullshitting intentionally; I'm just trying my best to answer the question I think you are asking.
Also I have to think that other developer cultures may have different values around these ideas than my own. Maybe the cultural expectation is to show confidence in my ability; I know that confident individuals are more likely to get hired or even apply for jobs, and that often job requirements are a screen for confidence.
Sure, I'd love if I knew every gap in my knowledge and every mistaken understanding I have, but I don't know how realistic it is to suggest a deliberate deception on the edge of a person's knowledge.
And that's setting aside whether the interview process is a good predictor of workplace behavior... I think we all know people who are excellent devs who can't write on a whiteboard for shit, or who sit down for a good think for an hour and then come up with a perfect solution, but aren't going to be able to write and simultaneously talk you through a solution to a 15-minute data-structure-and-algorithm question.
Knowing what you don't know is pretty important, because then you know when you have to go look something up rather than rely on your memory. When I have to call mmap(2), I know that I have to read "man mmap" first, because I know that I just can't remember the order of the five arguments, or if that's even the right number. I've been working with UNIX type systems for 30 years or more, and have worked on the guts of three different kernels on many different architectures, but when I say that I know UNIX, I think that I mean that I really know where the stuff I need to know is described in the man pages.
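To make that concrete, here's a minimal sketch of the kind of call in question (the file path is arbitrary and the error handling is bare-bones, so treat it as illustration, not production code). The prototype in the comment is the one "man mmap" documents, and it turns out there are six arguments, not five:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("/etc/hostname", O_RDONLY);  /* arbitrary readable file */
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0 || st.st_size == 0) { close(fd); return 1; }

        /* The detail worth looking up every time -- six parameters:
           void *mmap(void *addr, size_t length, int prot, int flags,
                      int fd, off_t offset); */
        char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

        fwrite(p, 1, (size_t)st.st_size, stdout);  /* dump the mapped bytes */
        munmap(p, st.st_size);
        close(fd);
        return 0;
    }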
It's important to have approximate knowledge of many things, including how to look up and figure out the shit you don't know.
That said, I think you balance assumptions vs knowledge in order to be effective. Like, you assume that it's in the man pages, and that the man pages describe the code being used on your system, but it's entirely possible that your version of vim or whatever is different from the one described in the system-wide man page.
No one said "impossible". Just harder and harder. I think this is okay, especially if the interviewee knows that's the plan. Make sure they shed the notion that they're expected to know everything that is asked of them, let them know that your plan is to challenge them.
A developer who can't accept a challenge that stretches his/her boundaries a little bit is probably not a developer you want.
I completely agree that I want to hire programmers who want to constantly learn and improve, but I'm not sure that I should expect them to do so during a stressful high-stakes interview.
I've had good success when we introduce the conversation with "We're going to ask a bunch of questions across a variety of problem domains. We don't expect anyone to be strong in all these areas, but we want to get a feel for where your strengths are, and then we'll drill down into those areas."
But, once you do that setup, everyone is eventually going to say that they don't know what they don't know. It's exceptionally rare for someone to be able to delve into the implementation details of a couple of different programming languages (of our choosing), cryptography, distributed systems, relational modelling, CS theory, and a variety of algorithms and data structures.
So "I don't know" ceases to be much of a signal, since everyone says it a few times.
In fact we prefer to see the opposite, where people are willing to tell us the bits of info that they do know. Even if you don't know how two phase commit is implemented, knowing what problem it's trying to solve will allow you to research it when the need arises.
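As an aside, here's a toy sketch of exactly that level of partial knowledge: the problem two-phase commit solves, with all names invented and none of the real durability or failure handling. It's not any particular implementation, just the shape of the idea:

    #include <stdbool.h>
    #include <stdio.h>

    #define N_PARTICIPANTS 3

    /* Stand-in for asking one participant to prepare; a real system would
       write a durable log entry and answer over the network. */
    static bool prepare(int id) {
        return id != 2;  /* pretend participant 2 votes "no" */
    }

    int main(void) {
        /* Phase 1: the coordinator collects votes from everyone. */
        bool all_yes = true;
        for (int i = 0; i < N_PARTICIPANTS; i++)
            if (!prepare(i))
                all_yes = false;

        /* Phase 2: everyone gets the same unanimous decision, so the
           system never ends up half-committed. */
        for (int i = 0; i < N_PARTICIPANTS; i++)
            printf("participant %d: %s\n", i, all_yes ? "COMMIT" : "ABORT");
        return 0;
    }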
Because an interview situation is so unlike a normal workplace environment, I don't know any reliable way to translate their willingness (or otherwise) to say "I don't know" in the interview into on-the-job behaviours. Either you'll coach everyone into it, or you'll be letting the interview pressure distort their behaviour so much that it's too hard to derive a signal from it.
This is basically how PhD qualifying exams work, as well as the defense to an extent.
You're welcome to guess or describe how you'd think about something, but you should be willing to clearly state when you're really out of your depth. It's important to note, though, that this isn't done adversarially; it's usually just a natural conversation that goes down a particular rabbit hole.
As you said, interviewing itself is a very stressful environment. I was asked about a time I'd failed by one of the HR people, and I talked about it. It was basically sending emails to real people from my test environment...
I talked about it openly, but he kept pushing. In the end, my feedback was that my "time I've made an error" was not very impressive.
Like... wtf? I find these questions very weird. I never know what they're expecting as an answer.
Imagine you start a job painting houses. You would not paint them the same way on your first day as you would after a year or 5 years of experience. What have you learned? How do you work smarter/better now than in the past?
What experiences triggered that learning process, or did you just take orders from someone else with a more problem-solving-oriented worldview?
The same applies to technical work. We're not perfect robots, we are given the freedom to figure things out, so the goal is a refinement of the "program" in our minds. The more effective/sophisticated that program is, the more rational we are.
Why don't you just ask for their self-performance reviews / retrospectives for the last n years of their current/previous job after you've filtered by resume? I suspect that unless they were expecting it (which isn't unreasonable since a lot of interviewers have variants) "remarkably unsentimental" people like John Carmack would blow this part of the interview. Why do you care about the step-by-step process of my programming journey, which is irrelevant to whether the current me can help you out right now? If you're trying to test for passion ("why'd you start house painting anyway and what drove you to keep doing it for so many years?") or self-awareness there's probably a better way.
I would consider asking for that too invasive. I think the important thing to get a sense of is how they look at problem solving as a function of expertise.
In other words, it's possible to be smart but not act smart, and it's possible to act smart while not being exceptionally intelligent.
I'm basically looking for self-awareness of one's own cognitive limits and evidence of a meta-process that seems likely to result in better decision making than whatever the raw intellect would offer.
I don't get how that relates to the question I was asked. I mean, it was a time I had a problem / failed in a way that impacted the production environment.
That was the only one I had in mind; the guy looked like he was expecting a time I'd set the servers on fire. It simply didn't happen.
Another thing is that my current environment IS very much the CTO pushing his own view and me taking orders. Why does stating this count against me? It's not like everyone can take leadership of the tech stack / implementation.
I am 100% sure that I would get into his (big) company and just follow orders as well. Of course I'd have my own implementation and personal touch every now and then, but I still wouldn't get to make architectural decisions or choose the tech stack.
Unless you're working in a 100% microservice architecture with hundreds of small teams deploying separate things, I don't see how that can happen.
>I think this is a super important characteristic for a great employee to have, but I don't know how reasonable it is to expect to see much of it in an interview. In an interview setting, the candidate is under a lot of pressure to come across as "perfect" in the sense that they want to appear to meet all of the expectations the interviewer has for them.
I would never hire someone who couldn't honestly speak to making mistakes and being wrong. We're human, and your ability to be self critical and learn from your mistakes is one of the single most important aspects in continued professional (and personal!) growth.
It should never be difficult to speak to a time you've screwed up. No one rational thinks we go through life with a perfect record, and no one has done it. Be honest.
Just to clarify, I do not suggest asking about mistakes a person has made. That would both hint that I'm looking for a flattering narrative and miss what I'm actually after. I'm looking for someone who can think back and identify the things he/she was wrong about. What were the false assumptions, how did those come to light, how would you solve a similar problem differently after that experience, etc.
In other words, is the candidate someone who has a rational meta-process going on about his/her decisions and is able to refine his/her approach based on learnings. I'm ideally looking for small things that show clear evidence of that kind of thought process.
As for body language, I agree, and I think my choice of the term "poker face" was a bad way of describing a certain kind of evasiveness which can indicate bad motives or bad people skills, either of which can make the person a bad hire.
> I think my choice of the term "poker face" was a bad way of describing a certain kind of evasiveness which can indicate bad motives or bad people skills, either of which can make the person a bad hire.
As a practical matter, how do you detect this kind of "evasiveness", and differentiate it from (say) plain old introversion or nervousness?
Good point. I certainly require more than a few moments where I sense it before I feel confident in considering someone's behavior evasive.
Suppose you're having a conversation with someone and in 15 minutes one of you chosen at random is going to have to trust the other with his/her life. What kinds of signals indicate that the person is trustworthy, understands the potential future, and is accurately representing him/herself?
There are some people where I never feel the kind of connection I'd want to feel in that scenario, and others where I feel it fairly early on in the interaction. Just a kind of straightforwardness, not extraversion or social comfort.
That is certainly a risk with any notion of "culture fit", but any time someone who cares about culture is interviewing someone, that fit is being evaluated to some extent.
A big long list of relevant but subjective traits is a perfect vector for bias to creep in. Take what you call "poker face", for example - how do you measure openness? By how well they make small talk? What if they're just nervous because of the interview; what if they experience social anxiety; or what if they're just an introvert, and don't tend to be open around new people?
For a less trivial example, take "resourcefulness". To a large extent, resourcefulness is determined by access to resources. Continued learning after college takes more than curiosity: it takes a bunch of free time to burn on reading, hacking, debugging, etc. What if the interviewee has been spending their time outside of work raising a family instead?
I agree about the possibility of bias creeping in. On the other hand, humans are biased, so understanding how to hire for success within your own (or your organization's) biases is useful. Some people would want the opposite candidate to the one I described.
Resourcefulness is more of a mindset. New parent? OK, what have you discovered about childrearing that you didn't anticipate? What will you do differently next time? What is your ideal work/life balance? These are all relevant things, and sometimes the young parent is a superb hire and may read very interesting books in anticipation of having more time to hack in a couple of years.
I would not ask those things; my point was that whatever a person is focused on can indicate the characteristics I mentioned. Obviously some things are personal and cannot be asked about, but they may be volunteered by an applicant.
FYI, asking people what they do outside work is often a big HR no-no, because it can easily drift into accidentally discriminatory questions that are unrelated to the job, which creates unnecessary PR and legal risk for your company.
I like the attributes you're probing, but I'd keep it within the context of engineering and programming if that's what you're hiring for. That's part of the reason open source and GitHub projects are such strong signals in hiring.
>- desire to grow -- does this person have clear career growth goals. Ideally these are not tied to titles or bossing others around, which are negatives.
Can you give an example of what you WOULD like to see here? There's some selection bias among the kinds of developers who tend to hang out on online technical communities and learn new things, but what IS the right long term goal, and how is this different from "desire to ship"? Sure, I'd like to work on meaningful projects and have enough breadth and depth of knowledge to design and contribute in a meaningful way in them, but if we're being totally honest my career goals are absolutely "make enough money to pay the mortgage, go on the occasional vacation, and save for retirement". I'm not in this to be a rock star who works 100 hours a week for the chance to be a multimillionaire, and I'm not in this to achieve fame in the developer community. I'm in it to get a paycheck doing work that is intellectually stimulating while still having enough free time to do things OTHER than work, but it's long been obvious that you're not allowed to say that in a job interview, so what ARE you looking for people to say?
To put it another way, in the title terms you don't want to discuss, I have no desire to be a principal architect or C*O. I'm perfectly content being a good-to-great senior-level engineer who's a solid team contributor if not in control of everything. I'm on board with your other qualifications - continual learning, adapting, and gaining exposure to new technologies is great - but is there a limit to career growth? Based simply on the number of positions in existence, it's impossible for everyone in the field to keep growing and advancing indefinitely.
Well, I ideally like to see some interests that are on the edge of the candidate's current skill-set which indicate that he/she is aware that technology is changing and that we must all plan ahead a bit to remain relevant and employable in 5 years. Not trend-tracking, but deepening and broadening the skill-set.
Because I'm going to be in the game for at least another 30 years and who knows how many opportunities to work together again might exist.
Also, the idea that an employee is a specific cog that will indefinitely fit into the mechanism in the same way is an outdated relic of using a machine as a metaphor for a firm.
While some employees reach limits, others will continue to evolve, particularly those at the beginning of their career. I have not ever been in a position where I prefer a less ambitious/promising candidate out of fear they will leave after a few years, though I know some people in HR look at the world that way.
I think I can totally make that impression now. I totally couldn't when I was younger - and when I would have provided you much more value for the money (meaning, I'd work my ass off for peanuts.) I would have failed on numerous grounds:
* desire to ship: more like "I don't care who uses my work, I just want a job"
* desire to grow: more like "I don't know what people grow into, I just want a job"
* rationality: I dunno, I might have tried to fake understanding something when I didn't out of fear
* poker face: I dunno, probably I'd have "very low levels of openness" because fuck, I really really need a job and I'm super nervous
In general, usually the guy who appears nice, sensible and calm is indeed a great hire, but they cost a fortune because they're actually quite rare. A very simple way to hire pretty great people cheaply is opening the doors for the hordes of somewhat-mentally-imbalanced people.
Those are good points. You're right I'm describing a fairly rare kind of hire, and a solid process must take into consideration the reality that not everyone is that strong or interviews that well, yet may have other highly valuable traits.
Perhaps an appropriate measure for interviewing is to do some psychological profiling to spot where a candidate is imbalanced, and then incorporate into the remainder of questioning. As an interviewer, you're always going to be biasing towards your own strengths and downplaying weaknesses, and although there are personalities that pose more or less danger to others, there isn't a perfect personality overall.
>> - poker face -- does this person seem to be playing it close to the vest in the interview or otherwise have very low levels of openness?
By nature I am pretty open, and not especially competitive. While my credentials are not great, I have been praised by certain mentors for other desirable attributes in this list: open mindedness, humility, pragmatism, adaptability, willingness to collaborate, and “growth mindset.”
The problem with these attitudes is that they can work to your disadvantage around non-technical authority figures.
For example, a little while ago I was invited to a meeting with a possible collaborator in another department. After being introduced, I asked a few general questions about his project. He probed for clarification until I finally said something he found technically wanting, and then made a great show of grandstanding at my expense in front of the group, without answering the question. My (non-developer) boss was very impressed with him, but other employees who had to work with this guy were not. I wasn't invited back, although maybe that was a good thing.
Obviously it's not good to seem evasive in interviews, but in general I have had to learn the hard way that not everyone at work wants to be my friend.
How do others here deal with this sort of problem?
Edit:
I guess I'm not sure how far I can go in the direction of measured answers before it seems evasive. Because of these experiences, I'm inclined to give more measured, carefully considered answers in general to people I don't know well, but I'm not sure what degree of that will come across as evasive in an interview. My last boss once told me that I was generally "very careful." Maybe that's a hint?
So what I mean to ask is: how do others calibrate their degree of openness to a given situation, in an interview or otherwise?
Not everyone wants to be your friend, but everyone SHOULD want to work together towards a common goal. The trick is in getting the two parties involved to see that common goal and why it's in their own best interests.
It sounds like the person you're talking about was using the conversation not as a way to acquire help, but rather to demonstrate that he didn't need it, probably in an effort to boost his own status in his social group. You're better off not working with that type of person.
That's definitely true. The problem is figuring out how to avoid situations like that without hesitating to speak up around people you don't know well.
I won't pretend to know the answer to that, but my personal approach would be to gather information about the people involved beforehand. If I can avoid as many surprises as possible going in, that puts me on (more) equal footing with the other players.
The scenario you describe has happened to me as well, and it's one reason I've come to value the attributes I listed. It is very common for non-technical people to be impressed by that kind of bravado.
It's usually a sign of a corporate culture where some of the wrong characteristics are valued across the board. Do you really want to "win" in that culture? I know I don't.
I feel like the whole "poker face" thing has been my Achilles' heel. I AM guarded with a lot of my personal information; in my experience, for good reason.
So what do you do in your free time? FUCK YOU. If your answer isn't play softball and train for marathons, it's a way-ahead/way-behind scenario. Maybe you get lucky and interview with someone who has shared interests.
> I really like movies. I watch a lot of modern movies just for entertainment, but I also watch a lot of classics, even boring ones, to try to fill out my knowledge. I would love to make a short film and get it accepted to a film festival.
< Huh... which means: so you're, like, not very social? And you don't seem very serious about your career if you are trying to make movies or whatever. Good luck getting into Sundance, whatever the fuck that is.
I once, only once, made the horrendous mistake of telling two interviewers (they were doing the old gang-fuck) that I was passionate about Starcraft II (it was 2011 or something). I thought, what the hey, the truth will out. If I'm just open and genuine at least that will come across as a positive. They actually openly ridiculed me and subsequently told me that they weren't the kind of company that extended much free time or control over schedule to their junior employees, which I thought was especially interesting (frustrating) for two reasons: 1) Why the fuck are you asking me about my free time if I won't have any? 2) They just got through explaining how their company didn't have any hierarchy. So what the fuck is a "junior employee" in a company with no hierarchy?
On non-personal subjects, saying "I don't know" has never worked as well as everyone claims. Oh, just say, "I don't know". That shows maturity and confidence. WTF? Whenever I said, "I don't know" in an interview, the interviewer kept pressing me for an answer until I produced some garbage that I'm sure sounded rather like BS, because it was.
Ultimately, I agree. It's complex though; every job I've ever had has left a sour taste in my mouth, yet I still need to eat and pay bills. And despite the difficulties I've encountered actually getting jobs (interviewing), my employers have always said I was doing great work.
I just had this discussion at work... I think that it is implicit in this context that a minimum amount of technical ability be met. Otherwise we'd just be hiring "nice people". Obviously we don't do that.
Regarding your last point... sometimes, it's a matter of being professional. You don't want to go on a diatribe about all that was wrong in a former position or company. It depends on context of course, but in the end, sometimes there are reasons not to be completely open.
In my personal life, I try to live unfiltered. Not mean, and I work harder to filter on sensitive subjects, but I mean that I let thoughts, even incomplete ones, flow. Otherwise, I personally tend to stay in my head and not let much of anything out. It's a balance.
I can't speak for others, but would say that your interpretation on the last point may not always be the correct one.
I think this is a valid point. My characterization of openness in my initial comment was a bit narrow. I think the most important aspect is openness to new ideas and experiences, and willingness to put one's self out there a bit.
Great points, though out of curiosity, could you expand on the last point?
As someone who has always topped performance ratings, gotten along well with colleagues (many of my past co-workers are close friends now), and made meaningful-to-significant contributions to all the projects I've worked on, AND who has a reputation for having a poker face, I'm a bit concerned that it would be treated as an "instant do not hire."
Exactly. I realize now that the term "poker face" came across as too literal, and what I was really trying to describe was exactly the sense of caginess or unwillingness to put one's self out there.
> - poker face -- does this person seem to be playing it close to the vest in the interview or otherwise have very low levels of openness? This is a big warning sign and an instant do not hire.
You do realize many people are just naturally hard to read, right?
I'm so hard to read people have made comments about how, if I was going to a wedding, they couldn't be sure if it was mine based on how neutral my delivery is.
The only time I'm easy to read is when I'm visibly irritated and/or angry.
This is also part of why people think I have a temper, they miss literally all the warning signs for weeks/months/years so to them it comes out of "nowhere" despite the fact I've verbally expressed unhappiness to varying degrees. However, the neutrality of the delivery causes them to assume I don't care that much [so they continue, repeatedly].
I usually ask "What are you interested in?" and just let them speak, then ask them to elaborate a few times. People who find this frustrating are usually low-openness, as are people who become noticeably indignant after being asked to share more.
Also, I ask what are the most important things they are looking for in the team they will be joining. Some people are interviewing because they dislike their current job so this is a good clue about their morale and what kinds of situations can lead them to give up and start interviewing.
By this point, after talking to someone for a few minutes I usually have a sense of whether they are easy to talk to or whether the conversation was draining for me and a notion of how smart I think they are. If both of those feel good I focus on selling the team and what it offers so they bring their A game to the in person interview.
> I usually ask "What are you interested in?" and just let them speak
I've tried that a few times but never really got any decent results out of it. (Disclosure: I conduct ~100 interviews a year.)
The question is too open-ended, so IMO it's akin to asking a child "what did you do at school today?". The answers are just non-specific, and often the candidate falls back to describing what they currently do at work. Hell, if they are interviewing for a new job, they are looking for a CHANGE, so all I've learned is one thing they probably don't like. Instead, in preparation I go over the candidate's CV and try to gauge what kinds of technologies, environments, and problems they have been exposed to. Then I pick the most interesting and/or longest-running ones as items of discussion.
Most of the time I find that candidates are happy to talk about their experiences and voice their opinions. This has the dual benefit of allowing the candidate to relax, and it gives me a glimpse into what kinds of details catch their eye. The very best experiences are those where the candidate forgets they are in an interview and ends up treating me to a wonderful lecture on a topic I have not known enough about. When that happens, I always feel cherished.
The reason why it's a problem is because when a coworker is always in his or her own little bubble and you know nothing about them it is harder to get along with them and thus harder to work with them. And by the same token, if you like and get along with the people in your environment, then you'll be happier and thus more productive.
I don't believe in this at all. Some of the most effective people I've worked with were people who quietly worked in their own space for 4 hrs before lunch and 4 hrs after lunch. Some of them socialized with coworkers after hours, some didn't. At least in the engineering field, I don't believe it is important to be that open with people, unless maybe you're in sales and need that social vibe all day to stay in state. Engineers can be highly effective staying in their own space except when communication is required, omitting the usual everyday small-talk BS that wastes 1-3 hours of your day.
Sure, those people who keep entirely to themselves may be highly effective programmers. But they will likely not be effective communicators, and unless they are their own manager and work on teams of one, they are going to be less effective overall than someone who can communicate.
This attitude is why so many people bemoan the "frat house / in-group" culture of companies. Why can't you have a pleasant conversation about the problems you are trying to solve at work?
I'll be more specific about openness. It's not bad to be socially introverted. Many of us are somewhat, especially in comparison to the average member of the population.
But imagine you're at a restaurant with your team and someone orders a dish you've never tried. Do you simply decline to try a bite? Or do you try a small bite in case you might like it? I find that openness to new experience correlates highly with good problem solving skills and team cohesion.
Many group/team interactions are essentially minor culture clashes, and being able to navigate them requires being open to different points of view, etc.
Replace dish with drink (or worse, joint). Do you still think people who "decline to try" have a problem? That they should try in case they might like it?
I'm really NOT keen on trying chocolate-covered ants, thank you.
Some people are vegetarians. Others are Muslim or Jewish and only eat halal/kosher foods.
Many groups are discriminated against (historically or presently), and members of those groups may not be too keen to open themselves up to unfair stereotyping. I've dated a girl who would never tell a prospective employer she was Jewish; she just wouldn't eat a pork dish and would probably seem evasive to you.
That doesn't even account for those who may just be squeamish or petrified of insects. Still others have embarrassing dietary restrictions.
It's hard to fathom when you have mainstream traits and limitations (it certainly was for me), but many people who consider themselves outsiders are reluctant to give away information that could later be used to hurt them. Sadly, this reluctance is often justified.
Food is so, so, so personal and cultural. It's insane to believe that someone eating or not eating a particular food has anything to do with workplace performance.
As metaphors go it's actually pretty good: people often have very specific, very individual, and entirely valid reasons to not eat something. Just as they can have the same for experiences under different conditions. Change the conditions, and you might get entirely different outcomes.
What I'm saying is, the one-shot impressions from interviews are horribly inaccurate. There is simply not enough data, plus the measuring instrument (the interviewer) is also flawed, in nonreproducible ways.
A simple thought experiment: let's say I build a machine that measures confidence and ability to read the room. Would it select different candidates than you?
I wouldn't. I don't really like food that much. It's easy to ignore when you've had something a million times, but it's harder to ignore when it's a completely new, unpleasant sensation.
But, I understand that other people do like food. And that's ok.
The poker face is a super cultural thing. I work with several Asians, and most of them (not all!) sit _with_ a poker face the whole day. It is kind of hard to read them and understand whether they got it when I'm explaining something. They don't even nod or smile. I'm European, and sometimes it is super frustrating not to receive any body-language feedback. But when we close the office doors, they start to chat, nod, laugh, and whatnot.
On the other hand, there are two Japanese colleagues who act like Europeans all the time - nodding, talking, etc. - even though they left Japan only ~2-3 years ago. I initially thought they had already adapted to the Western style.
I've talked with them all about this, and they said they just do what they did back in their own countries.
Fun fact: There is a word in Japanese -- aizuchi -- for "meaningless nothings which you say during a conversation to let the other party know that you're actively engaged in the conversation."
Sitting with a stone face during a meeting in Japan would be very not normative. Your counterparty would assume that either you're totally checked out, repulsed by what is on offer, or so ridiculously above their social standing that you didn't even have to go through the motions of pretending to care about what they had to say.
Your periodic dispatch from Asia Is A Big Place; Consider The Possibility It Is Populated Mostly By Humans.
My experience with Japanese people (I used to work in a Japanese company and attended a lot of meetings in Japanese) listening to someone else in meetings is a lot of "Aaaa, sou sou sou sou. Sou desu. Wakarimashita. Hai. Wakarimashita. Mmmm. Mmm Mmm. Sou desu." I eventually did some of it myself. Seems like a lot of aizuchi to me :P
And how open to be is also cultural. Americans appear excessively cheery and excited all the time to Irish/British people. It sounds so extreme, that it sounds forced and fake.
Everything in America is awesome, people are super excited to work with you, etc. British people might say "cheers" to say good bye in a shop, but Americans wish each other "have a great day!".
It might be because the US is such a spread out country, we're sometimes deprived of interaction with people and not just cars on the highway. When we meet other humans, especially those with similar interests we're genuinely excited.
I think mentioning these points upfront is a great way to give potential employees some idea about what kind of environment they should expect and whether they will be a good fit or not.
To me, at least, having this kind of information would immensely help me narrow down my list of possible employers.
In addition to these being great things to look for in a candidate, these are also great things to strive for as a coder.
Adaptability - being able to look at a change in direction/target market/etc as an exciting challenge keeps you less stressed
Openness - always learning keeps your brain's plasticity up, keeps you happier, and helps you be ready for whatever new concept/language/whatnot will be useful next
Resourcefulness - being able to be a combination of self-sufficient and knowing where/how to acquire the resources/skills you need when they're needed helps keep the stress down
Desire to Ship - if you haven't gotten the chance to deliver something to real users, do it. Somehow, anyhow you can. Whether at a 9-to-5 or doing open source stuff in your spare time. Seeing how the work you do can affect others, make their jobs easier (or even possible in the first place), make their lives better, etc is a game-changer if you haven't done it yet. And if you're working on something you don't care about shipping, you're probably working on the wrong thing/for the wrong company.
Desire to Grow - Several of the above help with this and the whole "keep moving forward" thing. The more you learn, the more experience you have, the more you can build and ship useful/interesting/fun/cool things.
Rationality - You can't grow, adapt, and gain new resources without screwing up. We all do it. Follow that with admitting it, analyzing it, and learning from it. Sometimes you learn the most about a situation like this, often about yourself and your own inner-workings. That gives you the knowledge to be able to start working on improving those inner-workings.
Territoriality - Turf wars are always a mess. Whether at a company or in an open source project. So, figuring out ways to let things overlap and allowing different groups to help make each other better helps you in the long run. And it's always gratifying to take a win-lose and turn it into a straight win.
Poker Face - Be honest with your interviewer and expect the same from them. Why would you want to keep things close to the vest and wind up in a company/team/situation that isn't the best fit for you and doesn't fully allow you to adapt, be open, grow, ship, etc?
All the above will help with growing and enjoying tech. You'll still have to deal with interviews/companies like the one posted here, unfortunately. I'm learning that now as I'm exploring getting back into working for a company full time.
These are all great, but I think you're missing something much more basic: commute time. If someone's commute to work (and home again) is intolerable, they will not last.
These are all great traits, but how do you determine them in an interview? Take openness, for example: everyone will say they are open to and interested in new ideas, except maybe a few odd cases.
And how do you test for rationality? Do you wait to catch them on something and keep drilling until they admit defeat?
I've found that people who are open usually think of the interview as a way to learn new things and to determine whether it's a good fit. They will ask questions and seem to enjoy gathering knowledge.
Rationality is more of a value than a litmus test. Those who value rationality will understand systems thinking and will tend not to focus highly on authority (either being authoritative or deferring to someone else's authority).
Thanks - sounds like you've done your share of interviewing. It would be great to hear more about your experience if you ever want to write a lengthier blog post on this subject. I still find it very hard to predict who will make a great employee. Provided they do OK on the technical side, it's still more of an intuition thing than a process.
Daniel Kahneman talks about an idea called substitution in his book Thinking Fast and Slow that I think really applies here. Here's the gist: When your brain is faced with doing a task that's going to require a lot of glucose, it will look for shortcuts to save you energy. One of those shortcuts is that your mind will look for an available heuristic, swap out the energy-hungry analysis for the heuristic, and then signal your conscious mind that you did all the analytical hard work.
I think the truth of that matter is, most of us (myself included), don't know how to interview people well. Interviewing is really hard. Rather than doing the hard work by researching the subject and testing ideas, most of us try to imitate successful companies much in the same way the Melanesian cargo cults imitated the construction of airfields and air traffic control towers to lure back that wonderful cargo.
I suspect we like to tell ourselves that we're more analytical because our work can demand rigorous precision. More often than not, I find we developers tend to select heuristics that indirectly test a person on how similar they are to ourselves or people we aspire to be like. Then again, I'm probably making a broad generalization.
I think you're right about the mental shortcuts, but the main effect I see at play in the article is about misaligned incentives.
It's in the organization's interest to hire the underrated, but it's not in the HR recruiter's interest to do that. If the recruiter refers a candidate who looks good on paper but turns out to be a bad hire, no one's going to blame the recruiter. But every time recruiters have the engineers interview a candidate that's not from a top school or a top company, recruiters risk losing credibility with the engineers if the candidate is rejected or ends up being a bad hire. So recruiters focus on the candidates who are the most defensible, not necessarily those who would make the company the most successful.
I liken this to a government contract seller - nobody will ever get fired for choosing IBM/Microsoft/BigCo as a vendor if it fails, but you certainly can for choosing that small shop that can usually get stuff done faster.
Cover Your Ass isn't a good incentive pattern when you're trying to hire engineers, which is hard enough already. At the same time, it's really difficult to balance the time. My trick (and I think this is common): interview anyone who comes in on a referral, within reason.
Exactly. Recruiting is rife with principal-agent problems.
Recruiters don't get blamed for the undervalued candidates they failed to hire. Nor do they get fired for hiring Stanford grads. (The engineering equivalent of IBM.)
The frustrating part is that even when this is brought up and openly described and discussed explicitly in the context of working towards a better and healthier hiring practice, manager and HR types still insist on the cargo cult nonsense.
It goes further than just an inefficient heuristic or biases. It becomes codified, even venerated, standard practice, and gains an air of certified approval up the corporate ladder, and eventually becomes something that, politically, you're not allowed to disagree with and typically you must even display boisterous and enthusiastic approval of it.
It exacerbates the crab mentality that affects the programming world -- engineers basically start to feel that if they had to put up with stupid, inefficient interview hazing, and if it's all just going to be a political status game anyway, then they're going to drag the candidates down into the muck with them and give themselves over to designing their own hazing interview questions.
The spirit becomes focused on cutting people down, and only hiring people if some gauntlet was unable to cut them down. Without ever stopping to think: what kind of human beings are you actually helping to succeed? And what kind of human beings is this process systematically harming? Nobody cares.
"Good hiring" is fundamentally an altruistic ideal with all the attendant difficulties of idealistic practices in the modern workplace.
This may seem ridiculous given competition for talent and cutthroat markets involved in startups and tech companies...
But from the perspective of low-level hiring, where someone you hire will possibly replace or eclipse you, or where you may empower a team with more resources, or there is budgetary envy (or kill-or-be-killed cuts to funding), or easily masked discriminations aplenty (dotNET vs Java, Stanford vs Berkeley, Indian vs Pakistani, Georgia vs Alabama), conflicting management directives (we will hire the best but only pay for mediocre) or just sheer resistance to change... there are many reasons to resist transparency, hire mediocre, submarine good candidates, backstab rivals, all from obscure chaos of middle management and its wannabe Machiavellis.
Management can put on their smiley faces all they want as they impose conflicting priorities and try to parse data that is fundamentally enshrouded by the multiple agendas of actors with highly variant priorities in a complex game, but idealism is hard in cutthroat capitalism.
Oh, did I mention that all candidates are lying to some degree of mendaciousness? Oh yeah, that.
I can see HR's desire to follow some sort of procedure, though. If you let any manager hire any way they desire, it could open the company up to discriminatory hiring practices. By having a codified (if ineffective) "procedure", they can use it as a defense in a lawsuit and say "look, see? Everyone gets hired the same way at this company, and the plaintiff was subjected to the exact same scrutiny as everyone else."
I agree that some standards are helpful. However, for the sake of the discussion on this thread, we are talking about all sorts of unnecessary, buzzwordy practices that are obviously not necessary, that are in fact harmful, and that in some cases may even increase the chances of discrimination lawsuits. This is openly understood even by the HR managers who set such policies, and yet the policies are still not changed or even re-evaluated under any framework that gives even a tiny consideration to their human impact.
One of the modern classics is ageism in hiring, which is baked right into the whole process in ways that dangerously straddle the boundary of legality. HR types place a high emphasis on this because hiring younger engineers means they can pay lower wages, and those younger engineers have less experience of how employers treat people, so they are less likely to expect basic, dignity-preserving job features like private working conditions, respect for work/life balance, etc.
Of course, they can't come right out and say they are trying to hire cheap dummies who don't know they are being swindled. So instead they invent code words like "thrives in a dynamic environment" and "handles vague and conflicting business needs well", which are just shorthand for "this worker will not respond with the obstinate, incredulous frustration they rightfully should upon learning how we actually plan to treat them" -- which often screens out more experienced candidates who know what shit companies try to pull.
This is how a lot of the nonsense bullet points in a job ad get there. It's also how a lot of nonsense company-handbook policies get there. The bits comprising those characters didn't just get flipped by cosmic rays and randomly appear in the job ad or the company handbook. HR and legal staff placed them there, with intention and forethought -- which, if you're really thinking clearly, means that most job ads are frightening windows into how the company conceives of its workers.
That defense doesn't actually work if your hiring procedure is discriminatory.
There is a concept called disparate impact in US employment law. That means your employment process can't have a disproportionate adverse impact on a protected class, unless there is an actual business requirement that causes the disparate impact.
An example would be: if your job requirements include "must be able to lift 70 lbs", that will probably have a disparate impact on women and the disabled. This is fine as long as the job actually requires heavy lifting (such as a mover's). But if you require candidates to be able to lift 70 lbs for an office job, then it's illegal discrimination even if everyone who is hired meets that criterion.
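To make this concrete: the standard screening heuristic US enforcement agencies use here is the EEOC's "four-fifths rule". Here is a minimal Python sketch with made-up numbers (my illustration; real disparate-impact analyses also involve statistical significance tests and legal judgment):

    # Four-fifths rule: a selection rate below 80% of the highest
    # group's rate is generally treated as evidence of adverse impact.

    def selection_rate(hired, applied):
        return hired / applied

    def four_fifths_ok(rates):
        """rates: dict of group name -> selection rate."""
        top = max(rates.values())
        return {group: rate / top >= 0.8 for group, rate in rates.items()}

    rates = {
        "men": selection_rate(hired=30, applied=100),    # 0.30
        "women": selection_rate(hired=10, applied=100),  # 0.10
    }
    print(four_fifths_ok(rates))
    # {'men': True, 'women': False} -- 0.10 / 0.30 is about 0.33 < 0.8,
    # so the 70 lb requirement would draw scrutiny unless the job
    # genuinely demands heavy lifting.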
I think most people in middle management, especially HR, are looking for "plausible deniability". They want their actions to appear defensible enough to keep their jobs. So don't hire the self-taught, always demand a degree, follow "industry standard" hiring practices, etc.
They don't understand tech and they don't need to. If something goes wrong they need to be able to demonstrate it's not their fault.
Kahneman also writes in the same book about his experience with interviews as a young psychologist with the Israeli army. His recommendation? A simple (but well thought out) rubric.
Identify 5 or 6 qualities that are essential to the success of your team. Tailor your questions around evaluating for those. Interview a bunch of people, score them on a scale of 1 to 5 for each of those categories. Resist your gut feeling. Then hire the ones with the highest score.
He acknowledges that it's crude but it was more successful than their previous process and makes as much sense as anything else I've read on the subject of hiring.
Amusingly, when he got pushback from the interviewing team for not allowing them to follow their well-honed instincts, he agreed to add "Gut Feeling" as one of the 5 or 6 parameters on which candidates would be scored. That detail, for me, sums up the man's brilliance.
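A minimal sketch of that kind of rubric (my illustration; the trait names and the equal weighting are invented, not Kahneman's actual instrument):

    # Each interviewer scores every trait from 1 to 5; "gut feeling" is
    # allowed in, but only as one bounded input among several.

    TRAITS = ["adaptability", "resourcefulness", "communication",
              "technical depth", "work ethic", "gut feeling"]

    def total_score(ratings):
        """ratings: dict of trait -> list of 1-5 scores, one per interviewer."""
        return sum(sum(scores) / len(scores) for scores in ratings.values())

    candidates = {
        "candidate_a": {t: [4, 3, 4] for t in TRAITS},
        "candidate_b": {t: [5, 2, 3] for t in TRAITS},
    }
    ranked = sorted(candidates, key=lambda c: total_score(candidates[c]),
                    reverse=True)
    print(ranked)  # hire from the top of the list, resisting the urge
                   # to override the numbers with an overall impression

The point isn't the arithmetic; it's that the structure forces every candidate to be judged on the same dimensions.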
This sounds quite similar to the competency model, which basically entails listing required competencies for a position and then developing a set of specific questions to evaluate each competency.
Probably the only good thing I did in my first management job was to create a specific competency matrix for each position and to evaluate candidates against it. We only hired a handful of people, but I was very happy with every person I hired.
The amount of self-delusion among my fellow developers is truly breathtaking at times. Just the stuff I catch myself doing - even while this pattern is a huge pet peeve that I obsess over - is pretty bad.
We need to feel smart, even though half the stuff we do is pretty damned stupid, and we keep doing it over and over again for decades before catching on.
We still have watercooler conversations about issues that were identified in books that are now 30 years old. How depressing is that? Just how clever are we?
Unless one is interviewing for a really specific set of capabilities for a particular job, it's best to gauge general programming aptitude instead. I believe that it's possible, with the right questions, to check for basic competence in a few areas and to avoid some bad hires. I have a few small problems in my repertoire that can be solved by any good programmer in 5-10 minutes, but can't be solved in even 45 by a candidate who lacks a basic understanding of how to analyze a problem, deal with basic abstractions like indirection, etc.
I.e., I've given up on trying to distinguish the great candidates from the good ones, and now I just try to identify the people who just have no idea what they're doing. The error bars around the results of a 45-minute technical interview are just too damn wide, even with research and training, to do anything except try to prevent obviously bad hires.
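As a hypothetical illustration (my example, not the commenter's actual repertoire) of a short problem that probes indirection rather than trivia:

    # Problem: nums[i] is the index of the next element to visit.
    # Starting at index 0, return the indices visited, in order,
    # stopping just before the first repeat.

    def walk(nums):
        seen, path, i = set(), [], 0
        while i not in seen:
            seen.add(i)
            path.append(i)
            i = nums[i]
        return path

    print(walk([2, 0, 3, 1]))  # [0, 2, 3, 1] -- next is nums[1] = 0, a repeat

Any good programmer finishes this in minutes; a candidate who can't hold one level of indirection in their head tends to stall indefinitely, which is exactly the signal a filter question needs.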
The hiring manager has a principal-agent problem going on. The manager's incentives are more around avoiding blame for bad hires than getting the best expected value for a hire at the best price.
I would also say that interviewing is hard because we have limited time to judge each person, and what we judge (apart from some technical qualities) is not what kind of person they really are but what we believe them to be, based on our limited human perception. And with a handful of techniques a person can seem really cool at first sight (trendy, rock star, etc.) but still be a bad choice in the long run.
Limited time makes it immensely difficult. But I've also found my prejudices against people not like myself persist well past the interview. I had one intern for 2-3 months before I realized I was really lucky to have him, and only because someone else pointed it out.
This person didn't have the right education (classes, not pedigree), nor did he wear a passion for coding on his sleeve, but it turned out he was disciplined, a fast autodidact, and had a very good sense of priorities. I couldn't have been more wrong about his potential.
It's not just that. I think we too often discount people's ability to grow. My suspicion is we probably place too much emphasis on keeping the wrong people out and not enough on developing the people we have.
>When your brain is faced with doing a task that's going to require a lot of glucose it will look for shortcuts to save you energy. One of those shortcuts is your mind will look for an available heuristic, swap out the energy hungry analysis for the heuristic, and then signal your conscious mind that you did all the analytical hard work.
Sorry, I haven't read the book, but I am curious whether there is any scientific evidence to back this up.
Thinking, Fast and Slow isn't a textbook, but Kahneman is a Nobel laureate and most of the book reports experimental findings, with 30+ pages of citations in the endnotes. It's a good read.
I have heard there are MRI studies that back this up. Basically people make instinctual decisions and the rational parts of their brains light up afterwards to rationalize the decision they have already made. I am on mobile but that should be searchable.
Humans are very good at pattern recognition and we're optimized for it. Real calculation takes time and energy and might make us less able to survive.
> Real calculation takes time and energy and might make us less able to survive.
Sometimes that grass moving strangely is just the wind... sometimes it's the tiger. The ancestors who sat down to have a good think about it got eaten.
When I was younger and played chess, my teacher drilled into me that finding a good move is exactly the time to look for a better one; we instinctively play the first 'good' move we see. In fact, manoeuvring your opponent into a trap by giving them an obvious 'good' move is effective against people who don't play a lot, and very ineffective against people who do (and even then, vastly stronger players than I ever was still fall for it occasionally).
Just to be clear, this article is talking about ego depletion, the theory that willpower is linked to glucose consumption, implying that exercising willpower in one area will deplete it in another.
What Kahneman is talking about is the brain's tendency to swap out expensive, slower analytical thinking for cheaper, faster heuristic thinking. The two ideas are only tenuously related, in that both touch on the brain's glucose metabolism.
I sometimes think there's a built in pomposity in the whole attitude of hiring, looking for someone "good enough to qualify." Instead, I like to think of it as, we are looking for someone to help us, someone who has different experience and knowledge, someone who could tell us how we can improve. Then instead of this adversarial situation of selecting a new ingroup member, it's a solicitation for assistance with a built in humility.
This is exactly what I look for when I interview for a new job. It's inclusive, shows humility and a willingness to learn through others' experience. Any other attitude belies an employer that's arrogant at best, and at worst nurtures a narcissistic blame culture.
Appreciate the humanization of both job candidates and employers inherent in the shift you advocate.
I have done a bit of hiring and it was very hard to walk the line between effective use of my time and being open to people with all kinds of experience. (It was a junior position and I talked to some very junior candidates.)
But it is always important to remember that there is a human being on both sides of the table.
Could you expand on how you ended up walking that line? Did you feel you wasted your time on under-qualified people? Did you end up hiring anyone who initially didn't seem like a textbook candidate? If so, how did they turn out?
The reason I ask is that I've always been nervous about applying for jobs when I don't have every skill listed on the ad.
The advice I've been given is usually something like "Apply anyway. The job advert is a wish list, not a minimum."
But I'm still terrified of an interviewer opening my CV and asking how the hell I slipped through the screening process. Or even worse, being too polite to be so blunt and awkwardly going through the motions of an interview.
It would be awesome to get the perspective of an interviewer.
I definitely wasted time interviewing underqualified people. But I wasted time interviewing some overqualified people too (because of salary/location expectations that should have been communicated up front). Actually, all of interviewing feels like wasted time (you have this burning need, and you want to fill it yesterday, or better yet a month ago, and yet you have to go through this process and learn about all these strangers, while the building feels like it is on fire).
Note that I said feels like wasted time, not is wasted time. It's a bit like learning a new programming language--you end up going down lots of blind alleys before you find the way you really want to go.
(All we could afford was non textbook candidates, and out of 3 hires, 2 worked out.)
As far as walking the line, as I got more comfortable interviewing, I always worked off a rubric, and set things up so that I could ease out of a phone interview early if it was clear that they didn't live up to their resume. Ended up doing the same thing with the in person interviews as well. Definitely screened by resume.
> I've always been nervous about applying for jobs when I don't have every skill listed on the ad.
This is a hard problem because some institutions write wish lists in their job reqs, and some write hard and fast requirements. And you don't know before applying. (And sometimes the goal posts in the organization move when their "requirements" meet the labor market.)
Personally, I'd apply if you have half of the requirements and feel like you can speak to the way you learn.
If you're interested in the company, I'd also take the extra step and do some work around it, whether that's writing a pain letter ( http://www.forbes.com/sites/lizryan/2015/03/01/how-to-write-... ) or writing a simple client against someone's external API or creeping^Hscanning LinkedIn, Twitter and Github profiles and finding out about the team and company.
>But I wasted time interviewing some overqualified people too (because of salary/location expectations that should have been communicated up front).
Well, that's your own fault as the job advertiser -- and when I say "you", I mean almost all companies that advertise jobs. You almost never state the salaries to be paid, so tons of people's time gets wasted on pointless interviews followed up with insultingly low salary offers.
If you're a cheap-ass and want to pay a pathetic salary, you should state this in your job ad, so that non-deadwood people don't bother to apply to your job.
>This is a hard problem because some institutions write wish lists in their job reqs, and some write hard and fast requirements.
It's not that hard to tell the difference. When the word "required" is used, that sounds like a requirement to me. When a separate list is preceded by "nice to have", "plusses", etc., those are obviously skills that the company would like to have in a candidate, but are not hard-and-fast requirements. If the company is so stupid they can't write a simple job advertisement this way, and they use the word "requirements" or "required" when they really meant "nice to have", then they don't deserve any employees at all.
Why should the company lead with what they are willing to pay? Why is it on the company and not the possible employee? Why shouldn't every phone screen begin with the possible employee saying "this is my salary range" and politely exiting the call if the screener won't validate that the salary offered is within that range?
When I buy a car, the person selling the car sets the price. I can take it or leave it.
When I rent a house, the landlord sets the price. I can take it or leave it.
When I'm selling my labor as an employee, why am I not the person setting the price that the company takes or leaves?
I'll tell you why: because the first party to state a number in any negotiation is at a disadvantage, since the counterparty suddenly has more information.
Now, there's a valid case that the employer/employee relationship is asymmetrical enough as it is (one employer -> many employees) that the company should give up that negotiating point, but if I ran a company, I'd want to justify that. (There's also a case to be made that, especially with knowledge work, the employee has an asymmetrical advantage because they know how hard they are working, and it's hard for the employer to know.)
That said, when I enter into a new engagement to sell my labor, aka an interview, I do my best to make sure they want to buy my time before I set a price. It's negotiation.
Edit: I love the parent comment even though we disagree, upvoted.
>Now, there's a valid case that the employer/employee relationship is asymmetrical enough as it is (one employer -> many employees) that the company should give up that negotiating point
That's exactly why I think the employer should give up that negotiating point.
The other reason is that employers are constantly whining about how they don't have enough engineers, can't find qualified people, etc., and then lobbying Congress to do something about it. Employees don't have this kind of political power.
Finally, I wouldn't mind if negotiation were simply eliminated with job salaries. You don't negotiate with the cashier at Walmart about how much you're going to pay for some vegetables or a TV. The price is the price, take it or leave it. It'd be better if everything were that way, so that consumers could compare things more accurately. There are many nations where the posted price is not the actual price, and haggling is expected and normal, even on something as mundane as groceries. Without exception, these nations are backwards and economic disasters. There's a reason for that.
> There are many nations where the posted price is not the actual price, and haggling is expected and normal, even on something as mundane as groceries. Without exception, these nations are backwards and economic disasters. There's a reason for that.
That's a big claim that you make very authoritatively. You should back it up, or change your wording to better express that you're making a hypothesis without much evidence.
Do you have any counterexamples? Haggling is very common in countries like India and various Middle Eastern countries. To say any of these countries have world-leading economies would be quite simply false. India's getting better, but it's basically adopting western culture.
Well, as some prominent examples, haggling is considered bad form for small transactions in the Nordic countries, and they do rather well. All the places I visit that have a culture of haggling seem to be considerably worse off.
I guess others can provide more data points in this direction, but I'd also appreciate counterexamples.
What solipsism is objecting to ("big claim"/"hypothesis") is the statement "There's a reason for that", which implies that there's a causal relationship between the prevalence of haggling and countries being "backwards and economic disasters" (for which there's been no evidence provided).
Only the company knows how well they will be able to turn work into the money that they can use to pay the worker.
The candidate can show the ability to do whatever work the company may require, but if that work does not increase revenues in some way, it will not be able to keep the worker employed indefinitely. Obviously, there's a lot of room for speculation here.
The worker has a general idea of the average amount that many other companies might expect to value the work of similar workers. So the prospective employer has to signal that it can monetize the work more effectively than the median company to attract better than the median quality of candidate.
If your company is building yet another CRUD business app, you do not need above-median skills, nor could you afford them. If your company is building a new, Wall-Street-killing trading platform, you need the 99th percentile of skilled workers, and should therefore be offering 99th percentile pay, because the work will eventually be worth billions of dollars.
The candidate knows how much their labor is worth on the open market. If the prospective employer does not know how much the open position's work will be worth to the company, it really shouldn't be trying to fill it until it does know. If you want to reach the higher-quality candidates, you have to send a clear signal that they will not be rejected for wanting too much money, which happens all too often with companies that need to pinch their pennies or extend their runway.
There are companies out there that will hang up the phone if you say $100k. And there are also companies out there that will struggle to hold their poker face at being offered such a great discount on an employee. You won't necessarily be able to determine which is which before you apply.
When the company does not say up front, it is implicitly saying "we will pay you exactly what you are worth, as determined by negotiation, with no predetermined limits." If they wait until halfway through the second phone screen to bring it up, and then say, "that's more than we can pay", they are wasting the candidates' time.
That is why the company should lead with their salary maximum.
The analogy is not a fixed price on fungible goods in commerce. The candidate has a unique artwork, to be sold at auction. The auction house would very much like to establish that potential bidders have at least enough money on hand to meet the prospective employee's reserve price before giving any of them paddles, especially when the bidding procedure can last several weeks per bid. The candidate does not want the reserve to be known, as they would prefer to get a higher price. Likewise, the prospective employers do not want their maximum bids to be known to their competition. But as they can only complete the purchase by making the highest bid anyway, their wishes do not matter one little bit. You have no business bidding on a Van Gogh painting with only $5k in your pocket, looking for something nice to hang up in a hotel room.
The employer is the one that makes the offer. They are selling the pile of cash, and the employee either buys it with their labor, or leaves the offer on the table.
Actually, this is exactly the type of person I like to interview: one who has already thought ahead, read the posting, and decided whether it was even close to a fit, both lifestyle-wise and technically.
Thank you. When I look for a job, I'm not looking for the very top salary (usually those go to the very top performers, which I'm not -- I'm good but not top 1% -- or to companies which expect too much of your time), but there are a good number of companies out there trying to get good people for bottom-of-the-barrel salaries. I don't want to waste my time on those places. They usually have other big problems in addition to poor pay.
Honestly, I wish every job posting included the following:
- salary range (and an honest one too, not one where they post a mediocre low and a great high, but then never actually offer the high number to anyone and just offer the low number by default)
- work location - sometimes it's not that easy to figure out where a company's office is located, or they have multiple locations. The address is important, because it determines my commute time.
- office environment - is it open-plan, cubicles, offices, shared offices, etc. Some photos would be good.
- computing environment - do you use Windows (7, 8, 10), MacOSX, Linux (Debian, Ubuntu, RHEL, etc.)? A combination? (RH in a VM for development, Windows for email/Office). What version control do you use? (git, SVN, or (ugh) ClearCase)
- standard benefits package: insurance company and regular single-guy premium, number of days off/year, etc.
- a fairly detailed explanation of the actual work involved in this position: what technologies you'll likely use, what the project is, etc.
- number of people in team, how team works together (Scrum/Agile, waterfall, etc.)
If companies would just post all this info with their job requisitions, it'd save everyone a lot of time. I see postings filled with paragraphs and paragraphs of flowery crap about their "collaborative team environment" or their corporate philosophy or whatever, but the things I listed above are what matter to me in a job and will determine whether I'm happy in it. Spare me the flowery prose about how wonderful your company is; I'll make that determination on my own.
I interview tons of candidates. The resume tells me nearly nothing, just what you've worked on before -- not how much, or how well that went. So in terms of fitting some "textbook" model, the question is a little silly.
Generally, few people are really good. And they're spread out unpredictably across the landscape, so you have to interview lots of people to find them.
That approach makes a lot of sense to me. If you can actually manage people, filtering for "not dumb" is usually sufficient to get the desired result.
Most companies demonstrate this with H-1B contractor hiring. They usually don't bother with the cool-kids-club screening in those situations. One of the smartest people I ever worked with had a degree in Chemical Engineering from some mid-tier university in India that I had never heard of. That guy would never get hired as an FTE at that company, because he didn't drink and didn't fit the cultural "fit" nonsense.
I think cubano is referring to the mindset that "it's better to skip over 100 good people than to hire one bad apple" which is very silly but a common attitude in certain kinds of organisation.
It really depends on what counts as a bad apple. If it means someone who isn't the most productive but is still productive and doesn't actively harm whatever they work on, that seems too strict a bar, but often that is the meaning in use.
I've had this problem when it's time to adjust headcount (smaller, or trying to swap people out to get more done).
I'll keep the Eeyore person who self-selects tasks and issues of low complexity (say, 2 on a 5 scale) over the self-confident idiot who keeps asking for 4/5 stories when they're really only good for 3/5 on a good day.
That jerk is creating 5/5 stories that I have to burn myself out on (they are either notably quiet or loudly in denial when this happens). I don't care how much project management or the marketing guys like him, even an empty desk would be better. At least an empty desk is predictable.
There is not a hiring team out there that even bats 90%, let alone 99%. The problem is that teams significantly hinder their ability to get stuff done when they spend too much of their time in vetting/hiring mode. I've been there and it's no fun at all.
But this is required for the hiring staff to give the appearance of being important and needed within the company. If managers really understood how effectively non-management engineers can find and hire acceptable candidates when you remove the bullshit from the process, they would be confronted with the cognitive dissonance of their choice to staff up an army of HR and recruiters and talk all day about "ZOMG how hard it is to hire a good engineer!" and "look out for the toxic worker" and other such drivel.
When I interview candidates, the make-or-break question for me is: does this person possess critical thinking skills?
Various things may point one way or the other, but it is a mistake to assume that [insert working at a particular company] indicates someone must have these skills, just as it is a mistake to assume that someone lacking [insert pet qualification] does not have them.
Sadly, there is a severe lack of this skill applied in our profession, and it is probably the costliest common mistake I see.
instead of asking them stupid, verbatim-recall questions, i give them a problem and ask them how they would solve it.
e.g., "a customer reports his website is running too slow. describe how you would identify and solve the problem."
it's a good sign when they start asking you follow-up questions, like "is it load balanced? is there a database?". it's a bad sign when they say, "just restart the server".
I got asked "what happens when you hit return in the browser?" After I had traced from the keyboard driver through libraries and runtimes to the browser event system, then back down thru the network layers to sockets, then thru IP events to land a packet on the remote router, they called a halt. Apparently nobody had actually answered the question before.
IMO it's a good exercise for problem-solving, and it plumbs experience and terminology. I'm just not sure you learn anything about the subject's actual programming skills.
I think that both this question, and beachstartup's question about speeding up a website, are totally decent interview questions for an intermediate-to-senior web developer.
But if anyone thinks that either of these questions is testing "critical thinking skills", or "problem solving", then I would like to hear in what way. Both of those seem to me to be pretty much archetypal "verbatim-recall questions".
Experience and exposure and education and ability to brute-force recall all of the above is valuable. But it's pretty much the exact opposite of what beachstartup said (s)he was testing, and there is zero "problem solving" in the "what happens when you hit return in a browser" question.
I'm a little bewildered how anyone could confuse these diametrically opposed aptitudes.
True. It takes all kinds of questions to get a good impression.
But I'd just like to protest, my answer was not 'brute-force recall'. It was simple experience. See, I've written code at all of those levels. None of it was booklearning.
I'll start by noting that this sequence of comments is just nit-picking, and if that bores anyone, stop reading. That disclaimer disclaimed...
I don't honestly care if it was booklearning or not, and I don't see what it has to do with my kvetch. I'm happy to believe you that it was learned from experience.
What does "brute-force recall" mean to you? To me it means that you are only repeating things that you knew before the question was asked, as opposed to dynamically generating new knowledge during the time that you answer. Whether you originally got that knowledge from books, or from experience, or from Mr Spock doing a mind meld, it's still memory, as distinct from problem-solving or critical thinking or perhaps more generically we might name "wit" as the counterpart of memory.
Again, I'm absolutely not knocking this form of knowledge; memory without wit is perhaps inflexible, but wit without memory is impotent. Memory is a good thing. Memory makes up much the greater part of expertise; this is why seniors get paid more than juniors (would you rather hire an IQ-180 noob who knows nothing about the problem, or an IQ-120 worker with 20 years of relevant experience?).
But I'm getting pedantically wound up about this minor nit: both you and beachstartup gave examples of interview questions that are tests of memory, and framed them as tests of problem-solving. If problem-solving is the thing you want to test, those are terrible interview questions for that particular purpose.
well yeah, it's just one question. programming and devops proof is in the pudding. we just ask for code samples and a walkthrough, and go with our gut. maybe a few technical procedure questions.
in my experience it's the other things that will make or break an employee, like whether or not they have an actual work ethic, are a closet drug addict, or get too drunk and touch women inappropriately at company events.
these are things you can't test for in an interview or background check, and i find it strange that nobody ever mentions this kind of stuff because personal problems are the most common kind we run into. it's rare to hire someone totally incompetent if you yourself have extensive experience in the work you're hiring them for.
I think that's actually a really good question. It shows deep knowledge and understanding of everything that is happening during a user interaction which is essential knowledge for a webdev. Plenty of "front end developers" have no clue what an HTTP request is.
Nobody was discussing how a keyboard works, but how the web works after you press return on the keyboard. And it's sad that too many web developers in general[1] seem to have the weakest understanding of such basic principles.
[1] super-anecdotal self-selected data points from hiring and conversations
If any front-end developers do want to learn more about Internet/browser networking, I'd highly recommend the book High Performance Browser Networking by Ilya Grigorik. It looks like there's even a free version now available online.
Yeah, definitely. I never felt more like the Mike of the story than I did interviewing at Airbnb. Every interviewer was fresh out of school, very smart, fashionable and attractive and definitely wanted to prove something. Despite doing rather well and scoring exceptionally well on personality, I was still passed on, and it's hard to shake the feeling that maybe it was because I wasn't a hipster. They just had this air of superiority the whole time.
Wow I had a similarly perplexing and bad experience interviewing there.
It was for a UI Engineer position. I did very well on all of my technical coding challenges, with the exception of an algorithm question, which I still managed to complete once the interviewer pointed me in the right direction.
I was later told that while everyone really liked my culture fit for the company, one of the interviewers with whom I had sailed through the challenge thought I did mediocre work, and the one who asked the algo question said I "really struggled". It was pretty devastating, but I just chalked it up to me needing to dig in and study harder.
When I interview, I start with a description of the couple of team members they'll be working most closely with, for, or leading. After that, I like to ask "With your limited understanding of where we're strong right now, what do you think your biggest contribution to our team will be? what will we learn from you?"
I've not done any validation studies on the answer, but it's certainly the question that has led to the best conversation.
There have been a number of posts about hiring practices lately, and a lot of them contradict each other. My conclusion is that people hire people who are similar to themselves, or similar to how they would like to see themselves, and the whole hiring process -- the style of interviews, the coding tasks, and the sources from which they hire -- is built on that model.
A company founded by Stanford CS students will focus on ivy league CS students; friends of mine have a pretty successful consulting company and none of them has a formal CS background, but years of experience delivering complex software; they focus on guys that can deliver, regardless of background. Some people never went to MIT, but have a deeply ingrained wish that they would have - by surrounding themselves with MIT grads, they suddenly generate the wanted association.
Most hiring processes spend gigantic amounts of effort to see how a candidate works as a member of the team, without actually having the candidate... work as a member of the team.
I suspect the reason why is that so few engineering teams do pair programming full-time, complete with daily-or-more rotations.
Pairing gives you the ability to spin somebody up rapidly enough to see how well they do on real code, and at the same time get a good read on personal fit from multiple team members.
Pivotal is shockingly good at this. You start with an hour-long pairing exercise over the phone. If that goes well, you come in, sign an NDA, and pair on two different projects, with a block for lunch in the middle.
That's a grand total of ten hours of screening time.
This process isn't fool-proof, mind you. But it does do a good job of answering the most important question: Do I want to show up for work, tomorrow, and start working with this person?
Perhaps just as importantly, it gives the candidate enough information to answer that same question for themselves.
Nothing tells you whether or not you want to work somewhere, like actually working there.
In my experience, pairing interviews usually fall into two categories: one where you're tapping away at a problem while your interviewer does their own work assignment ("but please, ask me questions"), and another where the interviewer grades you (and interjects) line-by-line rather than actually participating. It's rarely the case that pairing goes as it should, i.e. as a collaborative work process.
I agree. I interviewed with Pivotal, and it was fairly distracting for me to be asked to work on a problem where the interviewer clearly knew the solution, and did all the typing. It wasn't fun.
It was a gigantic waste of time. Some whiteboarding would have been adequate; it would have been better, even, without all the weird keystroke errors and unwritten expectations on the codebase.
While I have great appreciation for problem solving and human interaction, pairing on a non-customized computer, on a random code base, with someone you met 5 minutes ago is absolutely the wrong way to go about interviewing.
I've done work sample tests: they take a lot of time (time is money), and I don't really have enough invested in this company to want to work for free. I'd much rather do a work-sample after the onsite - let's at least determine if we are comfortable around each other before I start investing hours of my after work life into this thing.
> I'd much rather do a work-sample after the onsite
Pivotal Labs and Pivotal Cloud Foundry teams default to 100% pairing.
What you experienced is basically the actual job. It's perfectly OK that you didn't like it, lots of people don't want to work that way once they've tried it.
But it wouldn't make sense to find potential pair programmers by not pair programming.
Almost all Silicon Valley interviews are a 1-hour phone screen and a full day of interviewing with lunch in the middle. What they're asking for here is about the same amount of time. And TBH it sounds less intense, since it should be a mostly normal work day.
> Arrogant absurd process with no basis or evidence in reality.
Do you sell t-shirts? I would like to buy one.
In fairness, though, most people who decide to take an offer have self-selected. Malleability might be one reason. Another could be openness to new experiences.
Great answer. When I see these ideas floated around, I always think that these companies will never hire people who value their time and have at least a decent job. When a company comes to me with these great "deals" the first thing I say is thank you very much and move their emails to my spam box.
I get the sentiment behind this, but bear in mind that interviewing is a two way street - and the company is often spending a lot more on it than you are (of course, they can better afford it).
That said, it's possibly worth negotiating the amount of time, or staging the earlier parts so they demand less.
I wouldn't say no to an interview that is going to burn a day, but I also wouldn't go to one without a pretty good feeling that I was likely to accept an offer if it came.
I mostly think of interviews as at least as much of a chance for me to evaluate them as the other way around (even the way they choose to interview tells me a lot).
you're asking me to feel bad for a company that probably has millions in seed funding for investing a day in a new hire? hah, no, not going to happen. I (and probably most people here) already have a job that pays me for the work I do, this company can either do the same or they can walk. The one thing that's not going to happen is for me to pity them for the money they are investing.
No, I didn't ask you to feel bad for them. I only pointed out that there is at least some symmetry in the situation. I don't know where you got the very odd idea that pity is/should be involved.
There is no reason you can't have a mature interaction with a company where you both agree to invest some time and effort in a process that could benefit both of you. If it looks like a bad risk to you, don't go. If you are doing a competent job of this, you should know what an interview process looks like before you agree to it. If there are aspects you aren't sure about, ask about them. If their reasoning doesn't convince you, respectfully decline the interview.
The most valuable thing about an interview process is an exchange of information. Thinking about it too much as a time-for-money trade off can miss this point. Of course, if you can find a more time efficient way to exchange the same information, that's good all around.
However, candidates for Labs will see stuff our clients consider commercially sensitive (sometimes their mere existence is commercially sensitive), and candidates for Cloud Foundry will sometimes see stuff that is commercially sensitive for Pivotal.
Mind you, I did a round of interviews a few years ago and everyone had an NDA at the door. I read them and none of them were silly, so I signed them.
If pairing and rotations is truly a critical part of how your company builds stuff then this isn't a bad approach. But there are plenty of productive coders who would struggle with this.
I think part of the reason people hire those like themselves is that that's the only area where they have any confidence in their ability to assess skill. It's easy to say that this person is just like me, only a few years behind (or ahead!). But someone with a vastly different style and background is much harder to assess. Even with the same questions -- which are often kinda bullshit -- it's hard to interpret the responses.
For a generic/flexible developer position, I'd purposefully avoid hiring people from "brand" schools. You'll definitely need to overpay, and they will likely be poached.
Unless you need, say, a PhD who is the top expert in X (and you're willing to pay any price)... it just seems like a bad bargain.
The Ivy League is actually an athletic conference, believe it or not, and consists solely of schools in the Northeast.
The schools are Brown University, Columbia University, Cornell University, Dartmouth College, Harvard University, the University of Pennsylvania, Princeton University, and Yale University.
As a Midwesterner from the heart of Big Ten territory, I am often disappointed that Carnegie-Mellon, Northwestern, Purdue, Illinois at Urbana-Champaign, Michigan, Wisconsin, Minnesota, Ohio State, and Indiana are seldom mentioned in such lists, despite all of them being world-class schools for computer science. Sure, the Ivy Leaguers and Californians, and the closer-to-the-coast schools are remembered (except maybe Georgia Tech, Maryland, Pennsylvania, Penn State, or Johns Hopkins), but it's like the space between the Appalachians and the Rockies is one vast flyover wasteland not worthy of notice (or venture capital).
Rice, Texas A&M, and Texas at Austin are also ignored pretty often, but Texans can be offended on their own behalf without my help.
As another person from Big Ten territory, you might as well add University of Washington to that list. I often find interesting work (research, course assignments, etc.) from there, and it's typically ranked highly in CS, but it also seldom is remembered.
I assumed he was using it as a metaphor, because the actual Ivy League schools aren't especially known for computer science. So "ivy league [of] CS" as in the top CS schools, not literally the Ivy League schools.
Would love to put all these "no, THIS is how you hire properly" people in a (virtual) room together to hash things out. Someone is the most correct, and the others are all more wrong than that person. We could go really slowly, break down all the arguments, and see where things fall apart or contradict each other. And actually get somewhere.
>by surrounding themselves with MIT grads, they suddenly generate the wanted association.
Could you clarify what you meant here? Was it that 1. those who surround themselves with MIT grads discover an ingrained wish that they would have gone to MIT, or 2. that they satisfy a pre-existing wish to have gone to MIT by instead surrounding themselves with MIT grads? In the case of (2), are these effects tangible as they might expect?
The latter. Having been to quite a few interviews with hiring managers, most of them interviewed in a way that would have hired themselves -- so case number 1. Case number 2, hiring people they subconsciously wish themselves to be, comes more from observing all these hiring-practices posts here on HN. The latter would make quite a nice empirical/psychological study, though I fully expect such a study already exists.
Disclaimer: I'm living in Germany and nearly exclusively interviewed in Germany so far. There's nothing comparable to the top US schools here, so school reputation is probably much less important than in the US.
> Disclaimer: I'm living in Germany and nearly exclusively interviewed in Germany so far. There's nothing comparable to the top US schools here, so school reputation is probably much less important than in the US.
I once had an internal recruiter tell me during a phone screen that my resume was weak since I went to a no-name school in the south called Georgia Tech.
During my last interview round, another interviewer found it hard to believe that I did not use Twitter.
I think these problems exist more in the "trendy" areas such as the Bay Area. Once you leave that culture, ignorant recruiters are still around, but the exclusivity is less.
I think this is spot on. I've seen people on HN say that they prefer the Bay Area to Boston/NY because the Bay Area is so much less exclusive. I think in reality Silicon Valley has its own flavor of exclusivity and its own boxes it looks to check off in potential employees.
"I once had an internal recruiter tell me during a phone screen that my resume was weak since I went to a no-name school in the south called Georgia Tech."
Wow. Just wow. What a (cognitive) bubble to be in.
Many engineering/science-oriented schools have the same issue. Urbana-Champaign, Ann Arbor, even CMU. If you are not from a STEM background, you tend to only "recognize" the Ivy League schools and MIT/Caltech/Berkeley/....
You have to really not care about your industry to not recognize Georgia Tech or CMU, birthplaces of some of the greatest technical innovations in the industry. Let's talk about how you hire HR people: I've never interviewed for another company (I've always run my own), but I'd never hire an HR person who didn't have a basic understanding of tech.
Yet that seems to be standard for most recruiters. Do you have 6 years of experience using Server 2012? My favourite question ever (asked by a recruiter in 2010) was: do you have 8 years of experience programming Android? Or: what's the difference between Java and JavaScript? Gah...
The computing building is named after a student who went to Georgia Tech in the mid 1990s. He went on to found a company which was sold to IBM for over $1 billion.
For those interested but did not want to do the research, the student in question is Chris Klaus who donated $15 million in 1999 in order to get Georgia Tech's new (at the time) computing building named after him[1]. He founded Internet Security Systems right after graduating which IBM acquired for approximately $1.3 billion in 2006[2]. It is also interesting to note that the gift was made after his company went public in 1998[3].
I see wonderful symmetry here. There are TrendyCos (hot startup unicorns), BigCos (established companies like google or amazon or microsoft) and UnknownCos (not in the limelight so there is not much information about them). Likewise there are TrendyDevs (hotshots who produce one heavily github-starred framework after another), BigCoDevs (multiple years of experience at one of the BigCos, probably were responsible for some important part of one of their numerous services) and UnknownDevs (been there done that, hard to say).
And here is the rub. There are certainly many undervalued gems among UnknownCos and UnknownDevs but also many abysmally bad workplaces and programmers. So you either have some inside knowledge about them (a referral, an acquaintance working at UnknownCo), have some magic method for separating the gems (like tptacek claims to possess) or it is just too risky to consider them.
There's no magic to it at all. Have every candidate work on programming problems related to the work you do at your company. Have every candidate work on the same problems, and let them do it from home. Build and iterate on a rubric for grading those challenges.
It is amazing to me that almost nobody does this, but: almost nobody does this. They have programmers write code on a whiteboard, or on some whiteboard-coding site; they have them do programming puzzles ("solve Towers of Hanoi non-recursively"), they have them talk about code, or, more likely, computer science trivia. They'll have them "pair off" with one of their own programmers and "fix a bug". They'll have them work as a 1099 for a month to see if they're any good.
In reality, most companies are trying to hire on trendiness; they want people from the right schools and/or the right cohort of companies. They're aggressively courting friends and colleagues of their existing team, and the special people get very different interviews than everyone else. The actual technical evaluation is mostly a facade.
My last job search wasn't typical (I was thinking of starting a consultancy, so I didn't search for a job per se), but the one before that showed me that this is exactly what people do. A company looking for someone to write mobile SDKs asked me to build an Android app with an embedded webview and animate HTML with JS based on gyro movement. A slot-game company asked me to build a minimalistic slot game from scratch (thanks to them, I now have a clean, minimalistic, complete game on GitHub). Another game company asked me to build a small endless shooter -- and I ended up being the only candidate who moved texture offsets instead of actually moving game objects toward infinity.
Each of those tasks took from 4 to 8 hours, and 2 of them got me an offer. And of course, I used the same method when I consulted others on hiring decisions or hired people myself. So, based on personal experience, companies do this, and it works.
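For anyone curious, here is a minimal sketch of the texture-offset trick mentioned above (my illustration, with generic names and constants, not the commenter's actual submission). Instead of translating objects ever farther from the origin, which eventually degrades floating-point precision, the world stays near the origin and the background's scroll offset wraps with a modulo:

    SCROLL_SPEED = 120.0   # pixels per second
    TEXTURE_WIDTH = 512    # background tile width in pixels

    offset = 0.0

    def update(dt):
        """Advance the scroll; the player stays near the origin forever."""
        global offset
        offset = (offset + SCROLL_SPEED * dt) % TEXTURE_WIDTH
        # the renderer would then sample the background at u = offset / TEXTURE_WIDTH

    # simulate a few frames
    for _ in range(3):
        update(1 / 60)
    print(offset)  # stays bounded in [0, TEXTURE_WIDTH) no matter how long we run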
I'm glad that I'm starting to get feedback like this. I believe you that more companies are doing work sample tests.
For the gigs that asked you to do these problems: did you also get a standard programming interview? If you had to guess, between 1% and 100%, how much of the weight of your technical evaluation do you feel was on the sample challenges you did at home, versus Q&A and whiteboard coding on-site (or on the phone)?
Don't remember details about the interviews, but I'd say the homework was about 75%, because in those two cases they told me that I had exceeded their expectations and that they had learned something new from my projects.
I agree that this is "better" than the alternative, but it can be absolutely exhausting for candidates actively searching for a job. I feel like it's recently become much, much more common (from my small-ish sample of me and some friends).
My issue with this approach is fourfold:
1. Most companies have no idea how to structure a problem that is both informative to them and not abusive of the candidate's time.
2. Companies generally do this right after the recruiter phone screen, which most likely doesn't give the candidate enough information to decide if the next steps are worth their time.
3. Most companies still do a whole suite of normal tech screens after you work on a take home problem.
4. If you're actively looking, getting a bunch of these over a short period is likely. During my full-time search, more than 50% of companies had a take-home test right after the recruiter screen. Most were 4-8 hours of work each, due within the week.
A lot of startups structure it more like hazing, or a barrier to entry, than an evaluation criterion. I have some fun (read: horrifying) anecdotes from my recent search that illustrate the problems above, but I don't think any of my points are surprising.
A nice alternative would have been to simply have one or two projects completed that are straightforward to evaluate and walk companies through them, letting them ask me questions.
Here's a really simple test for whether a work-sample scheme is effective, or just a bullshit trend-chasing afterthought:
Does the work sample test offset all or most of the unstructured tech evaluation the company would otherwise do?
If it does, that means they have a work-sample test rubric they believe in, and they're hiring confidently. If it doesn't, they don't believe in what they're doing, and the programming challenges they're assigning can thus reasonably be seen as more hoop-jumping.
In the latter case, it's up to you to decide whether a prospective job is worth extra hoop-jumping. Some teams are worth it!
I think that's fair. I've had both the former and the latter, but unfortunately most of my experiences fall into the latter case, where it's simply been hoop-jumping. Most of my friends (all about to graduate, so a good number of examples) are experiencing the same.
For example, one company gave a problem with five parts, the final part being longest path on a weighted bipartite graph (which is quite a hard and time-consuming problem). After that, the next step was a technical phone screen, then an on-site with 4-5 more interviews, most of them whiteboarding. It was basically hazing rather than evaluation.
An alternative is my last job, which had a take-home test that took about 6 hours, but that was the whole technical part of the process. Having been on the other side reviewing submissions, the problem absolutely gave enough information.
I totally get there's a right way to do it, but like most interviewing trends, companies seem to just be adding this as a step instead of revamping their process.
Does the job they're interviewing involve finding the longest paths on weighted bipartite graphs? Or is this just non-recursive Towers of Hanoi pretending to be a realistic work sample?
No, the position most definitely had absolutely nothing to do with longest path or combinatorial optimization.
Anyway, my larger point is that what I've been seeing interviewing is that these tests are becoming much more common at US startups without companies removing/reducing the rest of their technical evaluation process, nor really structuring the problems to be a good signal.
In an ideal world where companies do take-home tests right, I think it's a great solution. But what I've been seeing, more often than not, doesn't support that, which makes it hard to get behind.
I'm really curious what you've been seeing at Starfighter. Are partnering companies still going on to do a full technical interview? Or does Starfighter largely replace their normal technical evaluation?
Ignoring the fun of the challenges themselves (which probably isn't entirely fair), the latter makes it very compelling for a candidate. The former does not.
Most of our partners have a somewhat abbreviated interview for our candidates, but everyone (as far as I know) still techs our candidates out.
I'm actually fine with that! We make no pretense of having designed a screening process that is appropriate for every job. What I'm less fine with is the fact that the norm, even for abbreviated tech-outs, is 7-8 hours of on-site whiteboard interview.
Maybe this is a US thing. I changed jobs last year, and nearly every single recruitment process involved exactly that: after a remote interview, they'd give me a small project worth 2-5 hours of work and then go over what I produced one week later. This was for companies in central and northern Europe.
No, there's no principle, there's a ~20 minutes old idea that we engineers are so obviously awesome that companies should actually be paying us for the privilege of interviewing us.
There may exist engineers obviously awesome enough for that to be feasible, and great for them (but they are probably also obviously awesome enough not to have to go through coding tests, so it's a moot point), and even if I could perhaps (judging by certain recruiter emails) scrape by as one of them, I certainly couldn't have five years ago.
Back then, if I'd had to be good enough on paper alone to warrant not just being put through the recruitment process but being paid for it as well, I am not convinced I'd have been considered (and yes, I did a take-home test, and I aced it, and it made up for my near-total lack of on-paper qualities).
Of course companies shouldn't waste their applicants' time with needlessly extensive tests, but there certainly exists no 'principle' by which you have a claim to be reimbursed for spending a few hours on an application.
Actually, for actors it does. If you go to a casting session, you should get paid for it. That's because they had a first chance to filter you out, and they are not entitled to waste people's time. For the first filter, they use your CV and your reel (a video showcasing the actor's work).
I think the same rules should apply for programming jobs. Just look at my resume and my publicly available code (linked from my resume). If you decide I am good enough to interview, pay a reasonable hourly rate. This would stop the abusive practice of assigning big problems to solve in our free time.
If this is true of programmers as well (and I have no opinion on whether it should be), it should be even more true of standard on-site job interviews, which are more disruptive to your work schedule and more demanding of your time and attention. But, of course, people do not generally get paid to go on job interviews.
It's at least reasonable to cover travel costs for interviewees. I was given a $40/day stipend for food and reimbursed for gas and hotel when I interviewed at my current job.
This is a sound-bite, not something that holds up to scrutiny.
If you decide not to interview with possibly great places based on this, you're essentially saying that all future expected gain isn't worth some set hourly rate for 2-8 hours of your time.
Also, what's your free time worth? For me, sometimes it's worthless, and sometimes I feel like I'd sell my soul for 10 minutes to myself.
I have had a few of these the last year or two. Never been paid. More than half of them are a waste of time. I have refused quite a few of them.
The last "shouldn't take you more than two hours" exercise that I was given involved jumping through hoops to get set up on Instagrams API, only to discover that I was in a sandboxed mode, and needed my account approved before i could get anything more than metadata out). I didn't even get started coding, so I am not sure what exactly the test was meant to achieve.
I was not, at any of them. Yes, I was effectively giving them my time, but it doesn't sound unreasonable when you compare it to the alternative: I'd spend that same time answering whatever obscure coding questions or far-fetched exercises they had, but in a far less comfortable setting (and one arguably less representative of my skills). Not to mention that I'd probably have to do the interview during their working hours, instead of doing it on a snowy Sunday evening, at my discretion.
I'd so much rather do a project with my own tools and from the comfort of my own home than commute several hours across the city for a several-hour interview.
If all it required was an hour phone call and a small project, I'd totally do it for the right company.
> Have every candidate work on the same problems, and
> let them do it from home
To be fair, I've seen a lot of good arguments against programming assignments. I think at the end of the day, the employer needs some method of determining whether the employee has the technical capabilities needed for the specific job they're being hired for. However, there are MANY other factors too, like "how well they get along with the team," that need to be considered. That's what they're trying to determine with the pairings and other stuff you mentioned, I think. Whether that works is another story.
Every team I've talked to starts out with some X-factor they think they need to assess for. It's my belief that if you can have the discipline to stop filtering for X-factors, you'll build better, more effective teams, because those factors are really just vectors for personal biases.
Regardless of whether you agree with me about that, I think we can all stipulate that if work-sample technical evaluations work (and: they do), most of what companies try to evaluate in on-site interviews is stupid. No part of working effectively with a team requires timed recall of how to implement a stable quicksort, or reversing a doubly linked list at a whiteboard.
At the very least, using work sample tests allows you to build an on-site interview process that honestly engages with "team fit" (or whatever your X-factor is).
My guess, though, is that when more teams adopt work samples and then go through the motions of trying to design a pure team-fit interview, they're going to realize --- once they don't have "implement Bellman-Ford on this whiteboard" to fall back on --- how unequipped they always were to evaluate team-fit in the first place.
The problem could also be that many teams simply don't have the time or possess the knowledge to do an assessment like you're suggesting (which I agree is a good way to assess candidates). It requires someone to actually design an assignment (which is a task that many coders might not be good at), and requires one or more people to evaluate it.
The ridiculous whiteboard coding of puzzles probably stems from laziness or an inability to implement what you're suggesting. I know that personally, if someone asked me to develop a work-sample evaluation for my job, it would take me many hours to come up with something, and frankly, even once complete, we would have no way of knowing if it is actually a good predictor of whether or not someone is the right employee.
Hours? It could take a week, and if you're going to hire more than one person this quarter, it will still be worth it just in the time savings from not having developers deliver bad interviews.
Have you done this? My worry would be that sooner or later you'll have a candidate who posts the assignment online, ruining your weeks of effort. It also seems like making it a take-home makes it much easier to cheat.
Yes, I ran a process like this for several years. I ran recruiting for the largest software security firm in the US; before that firm bought us, I used this process to more than double the size of my company. When I left, to work on a recruiting startup, not one of the people I'd overseen hiring had quit or been fired.
We paid the market median to new hires; we definitely didn't buy our way to that retention (NCC pays better than a lot of early-stage startups, but not better than late-stage ones).
No, I attribute that retention to the recruiting mechanism. We found great people who were sorely mispriced by the market, and we took advantage of that to create a win-win scenario: people without the resume to get a similar job at a competing firm got an extremely impactful resume bullet and a good-paying job, and we got people who genuinely wanted to be on our team and weren't applying as a once-a-year lateral job-hopping gamble.
(I have no problem with people job-hopping, by the way, but every employer is trying to minimize their exposure to that.)
You were also recruiting for a "cool" sector where people would be learning marketable skills. Do you have any evidence that the same technique would work for more "boring" companies? (I ask because I'd like to convince my manager that creating a work-sample test wouldn't be a waste of time)
I agree completely. But in many organizations recruiting and hiring would be considered overhead instead of strategic, and thus a target for outsourcing to someone in HR or some outside firm. The problem there of course is that those people have neither the ability nor the inclination to create or evaluate a work sample challenge.
Really my comment was just to help demystify why this sort of thing doesn't happen in many companies based on my own observations.
If you substitute the candidate's time in an in-person technical interview with the estimated time for the take-home assignment, I can't see the issue. I would gladly trade 3 hours of bullshit algorithm whiteboarding exercises for 3 hours of homework.
I love this approach. I'm also wondering what you think about giving well-designed work-sample tests over Google Hangout instead.
The candidate shares their screen with you, so you can watch as they solve the problem in their own dev environment. You can understand how they approach problems (quick and dirty, slow and methodical, lots of rewriting, etc), and you can ask questions at the end. You get a good sense for how they work as an engineer even before having to bring them onsite.
This seems to resolve the time asymmetry of take-home tests as well -- the interviewer spends as much time watching as the candidate spends working.
The only downside I can see is that you have to design your problems to take about an hour instead of the 2-5 hours you could imagine for a take-home test. But, you can break them up into multiple rounds, and give additional exercises to the candidates who do well on the first one, for no more total time cost than a collection of onsite interviews.
For what it's worth, I've done this at my startup and hired a great developer, and got very positive feedback about the process from the candidates I didn't end up hiring.
As someone who's had to do a few take home tests as well as coding with someone looking over my shoulder, I am definitely not a fan of the latter. I prefer to sit down and think about a problem, maybe even read it and let it sit in my head for a day or two before I actually start the coding. I also feel like shortening the time and making it synchronous would remove the opportunity to revise and polish the code I've written, which is one of the advantages of the take-home test format.
My ideal interview format would involve a work sample test, followed by a code review of said work sample, and also a reverse code review (candidate is given some code to review). As an added bonus, all of this can easily be done over email without wasting too much of anyone's time, especially compared to an all day in-person interview with lots of whiteboard time.
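To make the "reverse code review" half concrete: here's the kind of short, deliberately flawed snippet you could hand a candidate. This is entirely hypothetical (not from anyone's real process), just a sketch of the format:

```python
# Hypothetical review exercise: this function "works" on the happy path
# but contains planted flaws a candidate should catch.
def collect_emails(users, seen=[]):          # flaw: mutable default arg shared across calls
    for u in users:
        try:
            email = u["email"].strip().lower()
            if email not in seen:            # flaw: O(n) membership test on a list
                seen.append(email)
        except Exception:                    # flaw: bare broad except
            pass                             # flaw: errors vanish silently
    return seen
```

A strong candidate arguably flags all four issues in a few minutes, and what they prioritize (and how they phrase it) can tell you something about how they'd behave in real reviews.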
FWIW I presented this option to my senior management and HR at a "household name" software company and was essentially told we can't do this because it looks like a test, and we'd have to give the same test to every candidate or else open ourselves up to discrimination issues, no matter which team the position is for, and that was a no-go.
1) HR (presumably taking direction from in house counsel) takes the position that it will be difficult to distinguish between a work sample test and an IQ test
2) We are a large organization with very broad roles. The Performance Team hiring for a Senior Software Engineer will probably want a different work sample than the Analytics Team...
It is simply not true that the Perf team must deliver the same work-sample test as the Analytics team.
Let's not dignify that argument. I absolutely believe you that your company has allowed HR to sabotage your hiring process, and that sucks. But let's not pretend HR is right to do it.
Ironically, a work-sample test which WAS the same across every developer would be closer to an IQ test than if that work sample test were tailored to the role.
There's probably a solid argument for giving every candidate for the same role the same work sample test; otherwise you might naturally want to randomize some parameters or choose from a bank (if you think "cheating" is likely).
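If you did go the randomization route, one cheap way that keeps grading comparable is deriving each candidate's parameters from a stable seed. A minimal sketch, where all names, values, and the process itself are invented for illustration:

```python
# Hypothetical sketch: derive per-candidate variants of one work-sample
# test so every candidate gets a structurally identical problem graded
# the same way, while verbatim answer-sharing is useless.
import hashlib
import random

BANK = {
    "dataset_size": [10_000, 25_000, 50_000],
    "port": [4000, 4100, 4200, 4300],
}

def variant_for(candidate_email: str) -> dict:
    # Stable seed per candidate: re-sending the test regenerates
    # the exact same variant.
    seed = int.from_bytes(
        hashlib.sha256(candidate_email.encode()).digest()[:8], "big"
    )
    rng = random.Random(seed)
    return {
        "dataset_size": rng.choice(BANK["dataset_size"]),
        "port": rng.choice(BANK["port"]),
        "token": rng.getrandbits(128),  # per-candidate secret to embed in the test
    }

print(variant_for("candidate@example.com"))
```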
The HR involved must be applying cargo-cult rules of thumb without understanding the actual rules or the work the company does (which is, unfortunately, distressingly common for HR organizations). If the actual work were such that a work sample would be indistinguishable from an IQ test (it isn't, for almost any real work anywhere; you'd have to be ignorant of either what the work is or what IQ tests are to make that mistake for most jobs), then IQ tests wouldn't be problematic in any case. IQ tests aren't specially prohibited in employment; they were just the immediate subject of one notable case, which held that any filter with a greater impact on a protected class is illegal if that impact is not warranted by the filter's validity as a measure of performance in the job for which it is used.
Not only that, but there are several very large companies that do in fact use IQ tests during screening (that's a stupid policy, for what it's worth), so I'm pretty dubious about the claim that IQ tests are unlawful.
They are lawful when you can demonstrate a link between on-the-job performance and the IQ test. Given how g-loaded software development is, this should not be hard to show.
I'm looking for a leadership role. Whether that happens at my current company (I'm an IC with a lot of technical leadership and mentorship responsibilities) or elsewhere only time will tell.
If you or anyone you know is interested: I'm a software developer and spend a lot of my time on cloud architectures/containers/multi-tenancy optimizations, but I just generally enjoy solving business problems with technology. In a past life I cofounded a startup that didn't turn out to be viable after going through an incubator (not YC). I have a resume, LinkedIn, Github, etc.
> "we can't do this because it looks like a test and we'd have to do the same test for every candidate"
Oh, what a terrible tragedy, to give candidates consistent evaluations that are actually comparable between each other... /s
I get the point you're presenting, but at the end of the day it still boils down to "we can't use tests because then we'd have to be held accountable for objectivity, we prefer instead a system that lacks any kind of rigor such that any result may not be effectively challenged, officially or otherwise".
How much time do you give candidates to complete their work samples? I'm curious whether work-sample tests filter out, say, parents with young children, in which case a company with a 4-6 hour technical interview might be preferred over an unbounded take-home assignment (in addition to the possibility of more interviews).
> It is amazing to me that almost nobody does this, but: almost nobody does this.
There isn't enough feedback when someone has a bad idea on hiring. People will very quickly tell you if your code sucks though (this is a good thing), which might be the kind of feedback needed to get hiring changed.
I am fine with asking for candidates to work as a 1099 for a month if you pay them the standard 1099 contractor rate, which is almost invariably more than 2x their FTE rate. I am not fine with the industry norm of paying people their eventual FTE salary on a 1099 basis; that's a scam.
I think it's pretty dumb to do that, though, because most software developers can't take a month off of work from their current job to see if they're a fit for your company, and none of them will put their health insurance in jeopardy to do that.
I also think it's pretty silly to design a process that deliberately demands a month of time to make a decision that could be made effectively in a week.
That strikes me as a very costly way to build a team. Independent contractors are expensive; they know, better than most other developers, what their time is worth.
It shouldn't be on top of that! It wasn't for our process. Of course: we did in-person interviews. And in-person interviews are disruptive no matter what you do. But:
* Our in-person interviews were shorter than typical in-person interviews because they weren't tasked with fully teching candidates out.
* Because they didn't try to tech candidates out, they weren't as stressful, and so were less draining and unpleasant.
* Because our in-person interviews were entirely scripted (the interviewers had very little latitude with what they could ask and how they could ask it --- which they hated, by the way), they were easy for everyone to deliver.
* Before candidates arrived to their interview, they already knew (because we told them!) that they were likely to receive an offer from us based on their performance on the work-sample tests.
I understand why people resist the idea of coding challenges, given:
* They're not going to get feedback for weeks, months, or maybe ever
* The challenge is going to be graded pass-fail, or whimsically, without rigor or consistency
* They're going to have to do the exact same grueling bullshit draining dehumanizing programmer interview anyways
* They're not going to see the challenges coming, or, for that matter, know exactly what happens next after the challenges are done
That process is super common, even at companies that ostensibly do challenges. It sucks. I agree, no company can really get away with this in the long term.
But those are also chickenshit problems. Switching from unstructured interviews to work-samples is hard: you have to change your mindset on how to qualify candidates, and there's a leap of faith involved. But getting candidates feedback, telling them what to expect with your process, keeping a schedule, and having processes in place to be consistent aren't hard problems. They are table-stakes common sense business execution, and if your team can't handle that right now, your team is mismanaged.
Can you share an example of a challenge given to candidates? In the past, I've used "script a file-sharing tool that handles encryption", which worked well and took candidates ~4-8 hours to complete, but isn't something we used directly in our projects.
Sure: we wrote a very simple retail trading system as a one-file Ruby app. It used a custom binary protocol. We had candidates reverse the binary protocol, implement enough of it to test the system, and find vulnerabilities in it.
We had a grading rubric for the challenge that was based on the kinds of things people found in the application. We evaluated performance on the challenge on a 1-5 score; depending on how you did on other challenges, a 3 would keep you in the game, and a 4-5 probably assured an in-person interview.
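For readers who haven't seen this style of challenge: "reversing the binary protocol" means recovering a message layout from captured traffic. The real challenge app isn't public, so the toy format below is invented purely for illustration (and sketched in Python rather than Ruby, for brevity):

```python
# Illustrative only: a toy message format in the spirit of the challenge
# described above. A candidate "reversing" the protocol would arrive at
# unpacking code like this.
import struct

# Hypothetical wire format: magic (2 bytes), version (1), opcode (1),
# payload length (4, big-endian), then the payload itself.
HEADER = struct.Struct(">HBBI")

def parse_message(buf: bytes):
    magic, version, opcode, length = HEADER.unpack_from(buf, 0)
    if magic != 0x5354:                        # invented magic value
        raise ValueError("bad magic")
    payload = buf[HEADER.size:HEADER.size + length]
    if len(payload) != length:                 # the classic bug class to probe for:
        raise ValueError("truncated message")  # missing/wrong length checks
    return version, opcode, payload

msg = HEADER.pack(0x5354, 1, 7, 5) + b"hello"
print(parse_message(msg))  # (1, 7, b'hello')
```

Vulnerability-hunting in a format like this tends to center on exactly the length and bounds handling the comments point at, which is presumably why the format makes a good security work sample.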
It saved us enormous amounts of time at Matasano. When we designed our work-sample rubric, our #1 concern was filtering out people who interviewed well and/or had great resumes but weren't worth a billable hour on a real project, but our #2 concern was interviewing faster and getting results back to candidates quickly. (We didn't discover the best reason to do work-samples --- discovering hidden talent --- until after we'd started doing them).
With standardized interviews and challenges, our whole hiring pipeline became entirely mechanical. We could send out a challenge and its instructions in 2 minutes, and evaluate the response the next day (or week, or whenever) in less than 5 minutes. 7 minutes, to produce a technical evaluation that crushed a competing battery of tech interviews by several senior staff members that took 4-5 hours to do.
Aha! I sort of like that, then. Assuming of course that it actually does work.
Back when things were simpler, we had a multiple-guess questionnaire that did something similar. For onsites, we then looked for evidence of rigor and thoroughness, the ability to estimate how much communication was required, and whether they had deployed real systems.
You understand that it sounds like you "fixed" hiring, and so we're all sorta skeptical, right? If you could bottle that, there's your billion.
I am talking my book in the sense that my business works a lot better if companies get saner about how they hire people, but Starfighter is not the commercialization of the idea I'm talking about here.
Yes: like I said, the goal was to get results back to candidates faster.
The biggest complaint we'd been getting from candidates before work samples was that our process took too long. Our goal became to get the entire process done in under two weeks, down from its worst point of about 1.5 months a month or two earlier, when someone finally posted an anonymous complaint about us on Glassdoor.
After iterating a couple times, we could reliably go from first contact to hire/no-hire, with a motivated candidate (i.e., one who would work with us to schedule the interview process aggressively), inside of a single week.
As a candidate, I would be a little wary of this process. You're asking me to put in multiple hours of work, but you only put in 7 minutes. So the incentives are misaligned: you are incentivized to give these challenges to many people, even if they have a low chance of passing through. But as a candidate, I don't want to spend multiple hours if I have a low chance of passing through. This is one reason why full-day on-site interviews aren't so bad: if I've gotten to that stage, I'm probably pretty likely to get an offer.
First, I don't concede the idea that this is a concern I need to ameliorate, because the work-sample process consumes fewer hours of candidate time than the conventional interview does, and, better still, consumes that time as, when, and where the candidate chooses to make it available: an hour a night during the week, say. Do the work from your favorite quiet bar. Do it during your coffee break.
I am spending a lot of time these days talking to people interviewing in the valley, and what I'm seeing is that the norm candidates are subjected to is 7+ hours across 6-7 on-site interviews. Candidates have to go through all the interviews, even if the first interview has effectively ejected them from the process.
Compared to that horrid process, I don't believe I have to justify anything about my process.
But, if you read downthread, you'll see that we in fact did a lot to ameliorate the (bogus, I think) concern that we were incentivized to soak up hopeless effort from lots of people.
I was thinking something similar. When one party can waste the other party's time at little cost to their own, the situation can be abused. If a job ad wants you to submit to a test before you even talk to a hiring manager, this is a signal that the employer doesn't care about wasting your time. I can see the advantages to automating the hiring process but as a candidate I am less inclined to engage with a party that has no "skin in the game".
Think of the design world:
Potential client "A" asks multiple desperate artists to work on spec in hopes that they will get the commission.
Potential client "B" call you up and talks to you, sends you some napkin sketches and generally engages with you for an hour before asking you to do a design.
Both clients want you to do a design (test) but one clearly doesn't have any skin in the game.
Assuming you're a decent designer, which client do you respond to?
It's a crap waste of time. Take-home exercises have little more in common with the job than whiteboard exercises do. And not all devs are going to do half a day's work for free.
People say this in every thread about hiring, and it never makes any sense to me at all.
Companies that don't hire using objective at-home tests hire instead with grueling on-site interviews that knock out an entire business day (most of them last a whole business day, some recent interviews peers have gone on have taken multiple business days, and all of them at the very least kill the day for anything else). The latter is obviously worse than the former; I don't even see how any other argument could be colorable.
What I'm beginning to think is that this kind of pushback comes mostly from developers who are well-connected, and so they never experience the grind that less well-connected but equally-capable developers do with interviews.
Yes: if you're at a point in your career where you can get a job in any of 6-7 different companies just by raising your hand, saying "I'm available", and having a 30 minute conversation with the VP/Engineering who you worked with 2 jobs ago, everything is a waste of time. I am not here to tell you that you should make things harder for yourself.
If used wisely, at-home coding tests should be far better than the average mess of a technical interview.
The problem is in the hands of TrendCo, they can be used crudely to find the trendy hire who has the exact same ideas about engineering as the hiring person. Because giving the test costs TrendCo nothing, they are happy to throw people at it until they randomly find someone who is exactly what they are looking for.
Anecdotally, I and 2 people I later befriended applied for the same role. We each spent 4 hours on the task, all 3 of us strong coders who made a solution that would be suitable for any startup, all 3 taking different approaches based on our styles. We were all rejected, because none of the 3 of us hit upon the exact approach the company was hoping for, but didn't ask for. There was no opportunity to ask for this feedback either. And it took them nearly a month to bother looking at my code (even after I called).
Since then, I won't take these kind of tests unless I have reason to believe they are being given in good faith as a way to determine if a programmer is capable, not as a way to find their perfect ideal of a programmer.
If you had asked at Matasano how we evaluated our work-sample tests, we'd have told you. In fact: the first contact any new candidate had at Matasano was a 45-minute-long phone call with a director+ (for a year and a half: me), during which we explained our process in excruciating detail and answered any questions that came up. If we got even a hint from a candidate that they might not know exactly what we were looking for, we'd get their mailing address and fedex them a stack of free books; we gave them cheat-sheets on what parts of those books to read, as well.
I have the strong impression that a typical SFBA tech hiring manager thinks it's crazy to give candidates cheat sheets for interviews, and that helping candidates with technical evaluations decreases rigor. THE EXACT OPPOSITE THING IS TRUE. Evaluating technical ability under unrealistic pressure and with unrealistic restrictions confounds the results of the evaluation.
Of course, you have to design tests that are valid predictors for candidates who have received tips and assistance and practice and resources. That sounds hard! But it isn't. The best predictor for a candidate's ability to do the kind of work you really do is, simply, the kind of work you really do. Not problems from a book, not trivia questions, not one algorithm poorly implemented on a shared coding site and a Skype call, not whatever bug is at the top of the bug tracker today, but a sample assignment given to a real member of your team, the same one, for every candidate, graded the same way.
I completely agree with all of that! If only more hiring managers were so thoughtful. Now, I'd only do this kind of at-home work if I can talk to someone who will explain how their hiring processes works, and I get the sort of good vibes that your post gives.
The problem is that a lot of companies put out the take-home test and then expect a long on-site as well. One or the other would be fine, but take-home tests are so easy for the company to send out that a lot of them don't take the tests that seriously.
"Contractors are generally not the strongest technically"
Where does this come from? I actually see a contractor as someone who has experienced way more "war stories" than the guy who has been sitting in the same chair for 10-20 years. As a contractor, you are exposed to more technologies, more business sectors, and different working cultures.
"Consulting" or "contracting" can refer to different kinds of work arrangements. One of these is what's called "staff augmentation."
Large companies with huge teams of full-time employed software programmers will often hire contractors to fill gaps. These contractors are typically contractors-in-name-only: they work under exactly the same conditions as full-time employees but with a worse tax situation, without access to benefits, and without eligibility to receive equity or cash bonuses. They are often contracted through intermediaries who take a large cut. These contractors can be fired more easily than their full-time employed colleagues, and they may often work on time-limited contracts that cannot be renewed. (They often do not count against a manager's allotted "head count.")
These contractors are typically in this position because they cannot get hired as full-time employees. They often jump between contracting jobs until someone gives them a chance or until they manage to get a client to "convert" them to full-time.
Staffing firms are in on this. If a recruiter sends you to interview for a contract role at some large company, it means they don't think you could get a job as a full-time employee. Hiring managers also know this. If they see you've only ever worked as a contractor, then they will assume you don't have what it takes to get a job as a full-time employee.
I've worked for Fortune 500 companies around here. The senior full time employees have been working at the same place forever, so their skills are often outdated. They get paid a lot less, but they are often fine with that, because the place gives them the one thing they want: status. Changing jobs means you might not get that status. In comparison, the top contractors get far better pay, have worked in more than one or two places in their career, and are the guys you want to hire.
It's probably different in the Bay Area, where big tech companies have little trouble paying 200K+ compensation packages, but around here, only contractors see that kind of money.
Why do the contractors get paid more? Because they can get paid without talking to HR or fitting into any pay bands. Last year, I worked with three other contractors whose names anyone in the Scala community would recognize. 200-300K a year range. The best full-timer they had (who just quit, after being told how silly he had been by staying full-time) made 110K.
Being well known means that switching to work for TrendyCo isn't difficult though.
You absolutely nailed it on the head. I am one of these "staff augmentation" devs and it has been bothering me for a while now. I would love to try my hand at one of these "trendy" startups as a full-time employee, but I am pretty much the opposite of what they seem to be looking for:
- non-technical degree from a local state school
- 35+ years old
- only have a couple of years of professional software development experience and most of it as a contractor/consultant
- not white
- most of my work is back end bug-fixing, maintenance, being in an on-call rotation, and occasionally integrating some system to some enterprise database
- buzzwords in my resume (Java, Borg, RPC, Hibernate, SQL, XML, Eclipse IDE, DAO, DTO, etc) probably scare off the young devs because apparently anything remotely related to J2EE or enterprise is evil
- my real employer is one of those Indian staffing firms
I tried one of those "dev auction" sites (Hired.com) one time. The "talent advocate" assigned to me got really excited upon hearing I am currently a contractor at the BigCo. in Mountain View, CA. She immediately put me up on auction and I guess her enthusiasm got to me since I got excited as well when I started seeing all these cool San Francisco startups viewing my profile.
Then both of us were disappointed when none of the startups were interested. The auction period came and went. I had zero offers to chat on the phone despite dozens of views. She was nice enough to give me another auction round and I went for it. Only a half dozen new views on my profile that time but at least I got one phone interview out of it. The talent advocate was a bit confused by the whole thing and couldn't understand why most of the startups didn't even want to speak to me. She did offer to personally reach out to some of the startups on my behalf but I knew what happened so I politely bowed out of the auction.
So I guess I have this stigma now and I know it's only going to get tougher the older I get. I want to fix this and hopefully move to a full-time product development role before it is too late. Should I quit my job and go back to school? Join a coding bootcamp or "hacker community"? Continue working as a contractor and keep applying to startups on the side?
There are two types of contractors and they lie on either side of the skill bell curve. What you describe are the lower end contractors. Top end contractors refuse to take full time. They run their taxes through S-Corps, come in to fix the most challenging problems, and big-co's get in bidding wars over their next contract.
I think in general, you are correct about the skill difference between contractors. But there is more to it than just that. Low-end contractors don't choose that path; they end up there either because, like you said, they are low-skill, or because they made career mistakes and need to get back on track. I would definitely fall into the latter group.
Agree that this is the exact opposite of my experience as a contractor in the UK. I work as a contractor because the money is much better and because I have access to a wide variety of interesting projects. In the UK the contractors have the power - the companies I work for would hire me full time if they could.
Similar situation here. The company I'm currently contracting for mentioned they struggle to find experienced Scala devs who will interview for permanent roles.
I think this is a U.S. thing where this kind of division has sprung up - so you have
contractor ("temp", "paid less")
full-time employee
consultant ("paid more")
There is a pretty important distinction here -- We have seen in the US consulting market (at large corporations) an influx of 90s style contractor arrangements for staff augmentation. This is basically the counter to failed outsourcing efforts. These contractors almost entirely work for large groups like Robert Half, Tata, Infosys, Tek Systems and others.
We also still have a very very strong consultant labor force making 2-3x what W2 full-time employees can pull in. These consultants generally work through smaller consulting firms that take smaller cuts for the placement/handling billing and invoicing.
Perm - Mid level benefits + Pay (maybe options in a startup)
Contractor - An individual with their own Ltd company; pays less tax and has much higher take-home pay, but without any benefits.
Consultant - On-site via a supplier on a framework of some kind, however, often a permanent employee of the supplier with the supplier cashing in the margin.
At least 50% of the tech workforce in companies I work with or where peers are at in London are made up of contractors, especially in Banking.
Totally agree with you here. The contracting market in the UK (particularly London) is much more lucrative than relative permanent roles, even when their benefits are taken into account. It's actually more lucrative on tax here, though the Gov't are trying their best to change that.
It's also my experience that you get what you pay for; contractors on the whole tend to have a much wider breadth of knowledge from various sectors and hit the ground running, whereas I don't tend to see the same appetite from permanent employees (in most cases!).
A good way to know what kind of contractor they are is to look at the vendor. Robert Half, Tek Systems, etc.: probably staff augmentation. A smaller vendor, or an independent: better chance of a "specialist".
I'm a contractor for a large research facility. The arrangement is almost exactly as you describe, but I am a W2 employee. They hire researchers with phds and such, while offloading all software work to contractors.
I took the job a year ago with the hope that its name recognition would be helpful in the future, but I had no idea there was such a stigma around contracting. I'm starting to look around again, so I guess I'll see how it affects me.
I'm biased (being a long-term contractor), but I would agree that this statement is exactly wrong.
Breadth of experience counts and you can't get that without having 'been around'.
On the other hand, if you're hiring a contractor who has had lots of very short contracts, that should raise a warning flag.
Quite simply though, why would any good contractor want to go permie? The attraction for me is simple: I get paid twice what I would otherwise get in a permie position.
Contractors go back to permanent jobs for lots of reasons.
Relocations, a change in family status, social reasons (it sure does get lonely working from home), aiming for managerial roles, a more stable income (even though it's less per hour), and being able to focus on programming (without having to hunt for new contracts after each project is finished).
Yeah, was being slightly tongue-in-cheek when I said that. I know many fellow-contractors who have turned permie, but for me personally I've gained so much professionally, experience-wise and of course monetarily that I would never go back to permie-land.
I've also been lucky in that I've never been forced to work away from home and have always gone from one contract to the next with no gap.
Why would it raise a warning flag? I'd move to Australia for a one-week job, but I would hate moving to another town for six months. [I'm in Europe.] I just can't see any drawback to short-term contracts.
Certainly in my experience, all my contracts have been a minimum of 3 months, but averaging around a year and going up to a couple years or even more if you include return contracts.
It's not necessarily the length, but the lack of extensions that is a warning sign. Most projects do not last a short time, so a lack of extensions indicates a problem.
I worked for a company which extended my contract to two years, but I've been in five or six different projects during that time. All but one (Avon AU which is ongoing) have been completed successfully.
I am really biased towards short, "solvable" projects :)
In the case of Microsoft, which is the context, this is generally true because the hiring for contractors is less rigorous (and pays less) than for SDEs. Most v-dashes I know jump at the opportunity to re-interview and trade up to FTE, for benefits/comp and longevity of course.
As the article suggests, I think it's just code. The recruiter is being subtle in case the employee overreacts and passes on the message to the candidate. Contractors are expensive, more experienced, less malleable and can give other employees ideas that there's an alternative.
Graduates and junior developers will burn themselves to hell and back trying to impress their employer because they don't know any better. That includes people who have never worked for a proper business (i.e. not a VC backed startup).
It comes from the incentive structure of the recruiter and says less about the potential hire.
Recruiters are often paid a commission for every one of their successful hires and that commission is often split between an immediate payment and a payment (or payments) when the hire hits specific milestones (6mo, 1 yr, etc). Recruiters don't like trying to fit contractors into non-contract roles because they're worried about the contractor leaving before the recruiter's remaining commission is paid. If that happens, the recruiter puts in just as much work and gets paid less. They'd rather discard those resumes that they consider to be riskier and focus all their time on people with a history of staying at least 2 years in every one of their jobs.
Uhm, I remember hiring one contractor twice. The first time for $50/hour. The second time for $400/hour. He wanted more, but I was able to talk him down to the same rate as the other contractor. They both worked in crypto.
As for me, after the co was acquired, I was a FTE for one day, with great benefits. But the lawyers wouldn't clarify a clause in the FTE contract that would exempt my IP. Was I assigning it or excluding it? Their answer was "either". Before that, I was working 4 days per week for the startup and 1 day per week on my own projects (at 4/5ths the pay).
I know plenty of indy developers that are absolutely brilliant. Sometimes they take on side projects as contractors.
Depends on the area of expertise and how you buy that expertise.
Many contractors are really installers, and are great at bootstrapping a product or service. But... they know little or nothing about the actual running of the thing they setup. They don't get how to integrate their product's magic with your ITSM system, or what metrics to monitor, etc.
On the developer side, I've seen more than my share of consultants who are high-priests of whatever methodology/language/framework religion they preach, but have a hard time delivering value.
"Contractor" can mean, entrepreneurial person who's made a career helping a range of companies who's internal developers couldn't get the job done.
Or it could mean someone a company took on at a day rate because they needed a warm body, but didn't want to actually hire them, in case someone better came along.
"When I started programming, I heard a lot about how programmers are down to earth, not like those elitist folks who have uniforms involving suits and ties. You can even wear t-shirts to work! But if you think programmers aren’t elitist, try wearing a suit and tie to an interview sometime. You’ll have to go above and beyond to prove that you’re not a bad cultural fit. We like to think that we’re different from all those industries that judge people based on appearance, but we do the same thing, only instead of saying that people are a bad fit because they don’t wear ties, we say they’re a bad fit because they do, and instead of saying people aren’t smart enough because they don’t have the right pedigree… wait, that’s exactly the same." --- that's his conclusion and that pretty much sums it up for me as a black junior developer from MidWest University who's looking for work now.
There are a lot of down-to-Earth devs, but there are also some... how do I say this delicately? ... assholes. These people grew up knowing they were smarter than average, but never learned how to talk to an average person. They always have to be the smartest person in the room. This person's sentences start with "Well, technically..." more often than not, because you're dumber than they are.
A lot of us go through a period, when we're young, where we start correcting people because, well, we're smart and they're wrong. Most of us get over it and actually learn how to talk to an average person (being able to accept when they are right, and being able to STFU when they are wrong, or explaining it delicately if it's actually something important). Mostly because we learn that average people hate to be corrected all the time. But also because conversations can be fun if you just don't worry about whether what people are saying is "technically" correct or not.
I agree that there are definitely a lot of down-to-Earth devs, and they are probably some of the reason why I didn't give up on this career (even though I didn't fit in anybody's box and felt alienated from everyone since freshman year of college), but booooy are there so many holier-than-thou "I went to CalFord or StanTech so I am a god!!" types whose last iota of humility was probably in middle school. It almost feels adversarial sometimes going through tech interviews (as if the process of preparing for them weren't draining enough). Fortunately, there's a glimmer of hope: most smarter-than-average devs are actually good people and are willing and nice enough to help when asked. It just doesn't seem like it when I'm doing phone screens or talking to some people at insert-programming-language meetup.
(Vanity compels me to share: I was probably going to ask for more money than they would have been willing to pay. But before a former CEO pops in this thread to reveal that I was actually incompetent, I must say that I think I put on a good show. The weekend prior to being flown out to SF, I spent a day analysing their server code and was able to not only find a fairly subtle bug but propose a fix for it.)
I'm one of those underrated engineers who has had trouble getting past nearly every phase of the hiring process at various times. It's frustrating because I'm pretty smart and very passionate about building software, especially tools.
This bias against having the wrong keywords on your resume is incredibly unproductive, I agree, but I would guess it's a result of how little you can actually tell about someone from just a page or two of their work history, which is all most companies have to go on from a resume alone.
I am dying to get some company to take a chance on me, to someday have the benefit of a real mentor, and to be given a real opportunity to feed all this ambition that just won't go away.
I'm pretty sure I'm not alone in that, either. We're here, we're waiting, and we are legion. It's time for a solution to the problem of how to put us in touch with all those hiring managers that complain about how hard it is to find talent.
I co-organize the EmpireJS/EmpireNode conferences in New York City - besides just those sponsors for the conferences, there are plenty of great places here looking for your attitude and outlook.
OK sure, there's the appeal of "Smart? Any human is as a lowly slug to the cosmos!" but in the context of a job application it would be dancing around the common use of the word. If the meaning is "I'm probably in the minority of humans who could solve this problem" then it is definitely a good attitude to have in the context of applying for a job.
Now if you say "I'm hardly ever wrong" that won't go over as well.
A lot of businesses which are basically web sites don't need high-powered technical talent. They just need competent people who can make the thing go. This isn't new technology any more. Don't overcomplicate things.
Soylent is looking for a software engineer.[1] Soylent, as someone figured out a few months back, does about two sales transactions a minute. They now have all of two products. They should be using some off the shelf shopping cart program. Yet they're apparently using NoSQL and other fancy technologies.
Is the anti-Windows/.NET bias really that common? I have been a .NET dev for 6 years because that was my job, but I don't believe that makes me useless on any other stack. In fact, lately I desire to work on something that feels exciting and fresh again, but sometimes I feel like .NET devs are looked down upon and it's not so easy to land a job or contract outside .NET land.
Bias towards Windows is an interesting one. As someone who programmed on Windows for 9 years, I got away with writing code in IDEs, and since most apps had a GUI and the command line was less than worthless, I was never exposed to simple tools that could be chained together in under a minute -- tools one would otherwise have to spend hours writing, or hunting for Windows software that did the same.
It wasn't until I started college, where I had to use Linux to get my assignments done, etc. that my mind got stretched open. In three months I couldn't believe I survived in Windows for that long.
As an employer, the fact that someone has only ever developed in Windows and also uses Windows at home, tells me that at best they can appreciate unix and unix philosophy, but that it's unlikely they could build tools that do the right thing and play well in a unix environment, no matter how many years of experience they've had building tools.
But not only that, unix goes deeper and teaches you about how to do things inside your program, even if that part never talks to other programs. Things like do the dumb thing first; or when you have nothing to say, say nothing (e.g. `ls` prints nothing if no files found), etc. It's about mindset and values and those are almost always dictated by the environment. Also values cannot be looked up on stackoverflow and copy/pasted in 5 minutes. They take a long time to sink in and I can understand why an employer would want to skip that cost when hiring.
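To make that "say nothing" convention concrete, here's a minimal, hypothetical sketch (not from the thread) of a filter that composes cleanly in pipelines by staying silent and signaling "no matches" via exit status, the same convention `grep` follows:

```python
# usage: python filter.py PATTERN < input.txt
import sys

def main() -> int:
    needle = sys.argv[1]
    found = False
    for line in sys.stdin:
        if needle in line:
            sys.stdout.write(line)   # matching lines go to stdout, nothing else
            found = True
    return 0 if found else 1         # silence plus exit code, not "No results!"

if __name__ == "__main__":
    sys.exit(main())
```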
> But not only that, unix goes deeper and teaches you about how to do things inside your program, even if that part never talks to other programs. Things like do the dumb thing first; or when you have nothing to say, say nothing (e.g. `ls` prints nothing if no files found), etc. It's about mindset and values and those are almost always dictated by the environment. Also values cannot be looked up on stackoverflow and copy/pasted in 5 minutes. They take a long time to sink in and I can understand why an employer would want to skip that cost when hiring.
"Unix culture values code which is useful to other programmers, while Windows culture values code which is useful to non-programmers." (really: read the whole article!)
As with much of Spolsky's writing, it's a really fun piece to read, and thought-provoking too. But I think the bigger distinction between Unix and Windows is not differential valuation of programmers and non-programmers: it's believing that non-programmers exist or not. From the Unix perspective, everyone is a programmer, and that is appropriate: anyone is capable of directing a computer to achieve tasks. From the Windows perspective, programmers are wizards who deliver things to end-users (and there are higher programmers who deliver things to lesser programmers as well, which is why proprietary software is okay).
I'm pretty firmly in the Unix camp on this. 'I can't program' should be as rare as 'I can't read.'
Programming exists so most people don't need to program - in the same way that other professions exist so most people don't need to become experts in medicine, law, engineering, building trades, or car maintenance.
What makes programming so special that it needs to be different here?
> Programming exists so most people don't need to program - in the same way that other professions exist so most people don't need to become experts in medicine, law, engineering, building trades, or car maintenance.
You might as well write, 'reading exists so most people don't need to read,' which would be equivalently true.
> What makes programming so special that it needs to be different here?
Because directing computers to perform work is as fundamental a skill in the modern world as is reading. Yes, there are people nowadays who cannot read, and there are some few jobs for them. But there aren't many, and it's not a good place to be.
Programming is not a specialised skill: it is a general skill with applicability in almost every field of human endeavour.
It's not that unix folks don't appreciate it, but that it's not clear what to appreciate. From solely a personal standpoint, years of both using and programming on Windows taught me nothing about its values. However, I can talk about Apple and its values for hours and I've never owned anything Apple except its keyboard.
Joel, the author of the article you linked to, states the core value of Windows as being "useful to non-programmers", basing it on "By contrast, Windows was created with one goal only: to sell as many copies as conceivable at a profit". The former does not follow from the latter.
Microsoft didn't need to rely on usefulness to anyone to sell many copies, and they didn't. As for why Linux isn't non-programmer friendly, I beg to differ: Android is a prime example, and the reason it's not doing well on the desktop, as Linus Torvalds puts it, is that no one ships Linux on their desktops. Microsoft has a lot to do with that. When and if they ever do, Aunt Marge will be as happy using Linux without even knowing it's Linux (does the average non-programmer know that Android is Linux?).
I'd say on the contrary, Android shows what it takes to make Linux do well with the general public - a huge company to ignore like 80% of the standard Linux install and build a whole new UI layer from scratch that is designed to appeal to end users.
"1. Make sure the printer is turned on.
2. Connect the printer to your system...
3. A message will appear when the system is finished installing the printer..."
I second that. When I was a student, I used the MS stack a lot and swore by it, but never really got to use the command line, since everything could be done faster using the GUI, and the command line had no benefits I could possibly see at the time.
I only really had to use it when I started doing automated builds, and since then it has opened my mind to the value of the command line and, above all, the unix philosophy, which I now consider super important.
If I had to hire, I'd be reluctant to hire someone with Windows-only experience, not because I think the stack is bad, but because it's unlikely this person adheres to the unix philosophy and is able to write scripts for small tasks and automate important tasks with scripts.
If anything, using a non-windows OS teaches you about being able to even "navigate" a command-line, or understand basic concepts such as file-paths. Such as "../" or relative vs absolute paths, what "/" means, etc. I'm amazed at how often such things are missed and never understood, even by individuals who are in the "maintenance" side of IT and have bazillions of "certs" for everything from networking to administration.
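Those basics are easy to show concretely. A tiny illustration of relative vs absolute paths (hypothetical, using only Python's standard library so it behaves the same on Windows and unix):

```python
# Minimal illustration of absolute vs relative paths and what "/" and ".." mean.
from pathlib import Path

p = Path("logs/../etc/config.ini")   # relative: resolved against the current directory
print(p.is_absolute())               # False
print(p.resolve())                   # absolute path with ".." collapsed,
                                     # e.g. /home/me/etc/config.ini
print(Path("/").resolve())           # "/" is the filesystem root on unix
                                     # (a drive root like C:\ on Windows)
```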
"Windows greybeards" who originally came from DOS also know the whole thing about navigation etc. I admit that for a long time (before Powershell was introduced) the default shells of GNU/Linux were far better than cmd.exe of Windows NT (not even to speak of command.com of Windows 9x). Whether with Powershell this is still the case or not is yours to decide.
> Is the anti-Windows/.NET bias really that common? ... I feel like .NET devs are looked down upon
I don't have evidence of a wide random sampling of "trendy" employers but based on anecdotes[1][2], I think it is.
Many trendy companies (a.k.a. "hot startups") overwhelmingly favor a Linux + open source stack. If the hiring managers at these companies see a resume with just ".NET/Winforms/WPF/ASP.NET", they perceive a negative signal.
Yes, writing a loop to display a list in ASP.NET should be Turing equivalent to writing a loop in Node.js/Golang but people don't think purely on equivalences of computer science concepts. They also think about "culture" and a resume that's exclusively ".NET" looks "corporate" and "enterprisey" instead of "cutting edge" and "hip".
My advice to potential job seekers who only have "MS .NET platform" as a skill: start a github profile with projects using Linux/Node/Golang/etc to help counteract the adverse cultural selection bias.
[2]"And the .NET developers are mostly enterprise folks who don’t like working in rough-and-tumble startup world. Remember, I worked at Visual Studio magazine and had a front-row seat." from http://scobleizer.com/myspaces-death-spiral-due-to-bets-on-l...
It's not just cultural. Writing idiomatic C# does not mean you'll write idiomatic Javascript. Nor does it mean you'll write idiomatic Python.
Being in the .NET ecosystem exclusively can also mean a reliance on IDEs and .NET-specific tooling to be productive (or things like Eclipse refactoring for Java).
And there's the simple fact that there are a lot of qualified candidates who do more than just .NET/just Java. Polyglot programmers are more likely to be flexible and be in the right environment.
Of course people can be good enough, but it's not just about the looks. Things happen differently depending on the tooling.
Now, the extended part of this discussion is that maybe it's fine, and companies should be able to onboard people into a technology. Because, hey, it's just programming, right? But that could mean a multi-month training period before you're looking at someone as productive as the person who's familiar with the tech already. Not all experience in programming is equivalent.
An aside: I feel like this discussion usually has a lot of "web programming is easy" undertones (it hasn't been explicit but we all know what "trendy" means). But I don't think there would be as much of this discussion if it were something like "web-only developer applies to work on embedded systems with basically no C experience".
"Programmers who used Java or C# (when interviewing with us) go on to pass interviews with companies at half the rate of programmers who use Ruby or JavaScript. (The C# pass rate is actually much lower than the Java pass rate, but the C# numbers are not yet significant by themselves.)"
I struggle to understand, from a practical standpoint, where it comes from. The performance of ASP.NET is becoming outright ludicrous[1]. I've personally rewritten a lot of C++/CLR in C#, and we've seen performance gains of over 20% each time. C# is a language that hasn't gone stagnant; regular iterations on the language are made to meet real-world demands.
Now that the OS is no longer an issue, why isn't C# seeing more startup usage? Just look what StackOverflow does with it.[2] Startups shouldn't be judging developers for knowing it, they should be looking at it themselves.
The problem is that with .Net experience comes a very big reluctance to "re-use" existing, open-source libraries. I've seen it first-hand, devs that "only" know the .Net stack. You come at them with any solution that involves "free" due to a library/project being open-source, and their brains nearly explode (bit of hyperbole, there).
Really? Everyone I know in the .Net world is busily building upon open source libraries via Nuget, which is an NPM equivalent. Most of that open source code is hosted on Github.
Now that .Net is also open source, it would seem that the only proprietary part of the stack is Windows itself, and shortly .Net is going to be running on any platform.
In my experience, trying to get people to use NuGet is like pulling teeth. When OSS dependencies are considered cromulent, vendoring, sometimes just straight-up binary vendoring, has been my experience with .NET shops. Those are the above-average places, though--I've seen a ton of reticence around using anything with an OSS license.
Coming from a Mono background, this was very weird to me.
> I've seen a ton of reticence around using anything with an OSS license.
Very true. This used to be a point of contention where I work, but was easily fixed by having an automated process for getting our lawyers to approve OSS code (with a Nuget or Github URL). We still have to do it on a per-project basis (not per-license), but we're using lots of OSS code now and I've made a little headway on open-sourcing some of our own stuff.
These days OSS advocacy isn't that hard, especially with the hard facts demonstrated by CoreCLR.
And that reluctance comes from being mentored by senior software engineers who have been burned by using the latest shiny, and who appropriately mentor the junior developers against using the latest shiny. We .Net developers build mission-critical enterprise apps using only the .Net stack, and most senior .Net devs would applaud us for doing so. Now I'll admit that the .Net stack is a moving target. But I'd rather have one commercially supported moving target than a dozen OSS moving targets. No rational person likes having their head explode. Being "free" has nothing to do with it.
The old perception of .NET devs is that they're provincial in their approach to tech outside of Microsoft. Stacks are always top to bottom straight from MSDN or P&P: jQuery, Razor, ASP.NET MVC, WCF, EF, MSSQL, TFS. Part of this is MS's fault: the silver/gold partner licenses provide a financial incentive for businesses to stay in the fold by being MS-stack proficient. I've seen BizTalk shoe-horned into a product mostly because it was included in the quarterly MSDN mailout CD package and hey, it's free!
To some degree, it's true. Many of my colleagues never touched node.js til Microsoft started using it. Or Linux til Azure supported it. Or functional programming til F#, or git etc.
Java developers seem more comfortable searching online and piecing together a solution from Oracle, Apache Commons, and other third parties. It feels like they're better at critically inspecting packages for performance, applicability, and adoption, since they don't have a mother corp creating or blessing every product. NuGet has helped a lot of .NET devs come out of their cave and explore the great unknown that is open source.
Hopefully Microsoft's recent reinvention of itself changes both this perception of .NET devs and whatever reality is behind it.
> Many of my colleagues never touched node.js til Microsoft started using it.
Couldn't the reason also be that, before Microsoft became active in Node.js, its Windows support was really bad? (Bad Windows support is a typical problem of lots of open source projects. I could understand it if OS X support, being proprietary to at least the same degree, were similarly bad, but this is usually not the case.)
Node is very attractive for a variety of reasons, and TypeScript complements it like hand in glove (which is completely intentional, kudos to MS on that part).
Transitioning to Node+TS from .Net is fairly straightforward (setting aside the completely new tooling, building, open source package resources, etc), and you get microservices + much better performance with minimal friction.
Granted, large-scale systems of the Node variety are a world of difference from .Net, but that is not insurmountable.
The appeal is more than simply improving compatibility -- the whole microservices movement had been happening under Microsoft's nose, and this is their response to it. It is nothing short of impressive how they have expanded the .Net stack in order to remain current.
I think that depends on where you live really. In the area I'm in being a .NET developer is clearly an advantage.
One reason I, and others like me, have looked down on .NET developers is that a large number of them aren't any good. I've met brilliant .NET developers -- some of the most talented developers I know are .NET developers -- but the crappiest developers I know are .NET developers as well.
Some developers are completely ignorant with respect to computers, the Internet, algorithms, and new technology in general, and they are almost exclusively working on Windows and, by extension, .NET. These are developers that can make code work, but take Visual Studio away from them and they cease to be developers.
It's by no means fair, because as I said, some of the best developers I know are .NET developers. It's just that I've never seen anyone working on Linux/Unix be completely ignorant of other languages, tools, and platforms.
Replace .NET with Java and Visual Studio with IDEA/Eclipse and you'd probably be right again with the same post. It's not really a .NET problem, more like a bell curve and popularity problem.
> These are developers that can make code work, but take Visual Studio away from them and they cease to be developers.
Ha, during my brief stint at a windows-shop after graduating, I did meet a few who couldn't get their heads around the concept of having a build happen without someone clicking a green triangle button in VS.
There is nothing inherent about the .Net stack that biases it for or against CI. Other than the fact that you can "do it" in a CI context, I can't think of anything else. Care to enlighten me?
Bear in mind, a lot of .Net developers don't even know it's possible to compile their projects outside of Visual Studio. Msbuild, Nant, csc.exe, "what are those?"
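For the curious, it's just a couple of commands -- a minimal sketch, with hypothetical file names:

    rem Build a whole solution the same way a CI server would
    msbuild MyApp.sln /t:Rebuild /p:Configuration=Release

    rem Or compile a single file directly with the C# compiler
    csc /out:Hello.exe Hello.cs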
The standard for .NET shops is TFS. When you create a TFS project you set up CI and automated builds.
"Bear in mind, a lot of .Net developers don't even know it's possible to compile their projects outside of Visual Studio. Msbuild, Nant, csc.exe, "what are those?" There's probably plenty of java developers like that as well.
I've never seen Team Foundation Server in a .NET shop. It's either been SVN, Git, or Perforce. That being said, I think it's very individual, but many small companies won't pay for TFS; they barely want to pay for Visual Studio Professional/Enterprise edition.
It probably varies by geography, but the .Net scene I was in really didn't have a bias toward CI at all (or testing, or almost any decent development practices).
I won't argue about it; our experiences differ, and that's cool :)
Those sorts of developers exist on every platform. The trendy platforms are simply too new for non-adaptable developers to have yet made themselves visible.
I think the reason .NET is looked down upon is that many have a severe lack of experience with it. I too have made assumptions about .NET because my experience was only WebForms in a small course at my university.
Ironically, I have recently started using .NET MVC for my projects since I think it's really great. It beats many alternative MVC-frameworks in my opinion.
I think most startups I've seen are using some form of POSIX back end. I've been advising corporate techs with only .NET experience to start learning some bash at least.
I moved from commercial .NET coding to startup-land a few years ago, and there's definitely a learning curve. I had to ditch Windows on my personal PC and start using Linux at home to really get the hang of it. YMMV.
Funny to read, because I have the reverse experience. I have long been using Linux and classic open source tooling for my work and am now moving to Windows and .NET.
I have immense issues understanding the platform and how everything works. For example, scheduling jobs is something I still don't fully understand.
> The mess of terminal windows and text editors that I code in now is frustrating at times.
Just use emacs for everything! It runs in X, it runs in a TTY, it runs everywhere.
My own stack is st[1], tmux & emacs, with Firefox as my browser, within a tiling window manager. I have a single terminal open at any point in time, with tmux multiplexing the sessions. I use C-z as the tmux escape key, because it's easy to type and doesn't do any harm if I accidentally type it outside of tmux. I autostart emacs in server mode from a systemd unit[2] and run it under X.
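(The prefix remap is just a few lines of ~/.tmux.conf -- roughly this, assuming a stock tmux:)

    # Use C-z instead of the default C-b as the tmux prefix
    set -g prefix C-z
    unbind C-b
    # Pressing the prefix twice sends a literal C-z through
    bind C-z send-prefix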
Sometimes I use emacs's TTY or shell modes. This is something I'd like to explore more. If I did this, I might be able to have one window for both shells and text editing.
Regardless, I have one terminal window to rule them all and one editor window to rule them all. With Firefox tabs, I only need one web-browser window too (although I'm considering using emacs-w3m and/or eww more).
>Sometimes I use emacs's TTY or shell modes. This is something I'd like to explore more. If I did this, I might be able to have one window for both shells and text editing.
However, I still use the terminal via ConEmu with Git Bash to get git in the terminal, because I don't know how else to do it. But you mention bash and awk; I have never in my years of experience really used these tools a lot either.
Maybe some small bash script to back something up, or similar, but nothing else really. You don't have to use it, and most of the time it's easier to write a Python script or whatnot.
I haven't really bothered to learn any PowerShell except "Update-Database" in the Package Manager Console.
DLL Hell could have got a lot better since I last had to deal with it. I occasionally bump into a dependency problem with Ruby gems that reminds me of it.
I don't use bash a lot, but there is an expectation in Linux-land that everyone has some familiarity with it. That's not true in the Windows world so much. In my experience anyway. I'd be interested to see if a much more functional shell environment generates more use of it.
> I think most startups I've seen are using some form of POSIX back end.
Typical hip startups use "tools" (frameworks, libraries, etc.) that abstract the underlying platform to such an obscene degree that it doesn't really matter if the platform is POSIX-compatible or not.
On the contrary, when these tools invariably break, knowing your way around a POSIX environment is indispensable. Something as simple as invoking strace on a binary to find out where this damn thing looks for the files it needs can save you hours.
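A sketch of that one-liner (binary and file names hypothetical; strace writes to stderr, hence the redirect):

    # List every file-open attempt the program makes, following child processes
    strace -f -e trace=open,openat ./myapp 2>&1 | grep conf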
Ex Amazon here. I've done a lot of interviews and I've never seen candidates being refused an on-site interview for having extensive Windows experience.
Still, the hire rate would be really low. Most would have little knowledge of what exists outside of an IDE: OS internals, system design, networking.
14 years .NET developer here. Oh, I regularly am involved in LAMP and other uncool stacks, but I'm still labeled .NET developer.
After all these years, I still am mystified by the whole technology/stack religion that pervades software engineering. I come from a mechanical engineering background. There's many ideologies on how to solve mechanical problems, but I would never hear "oh I see you have AutoCAD on your resume...we are looking for more of a life-cycle-engineering type of engineer" during an interview. A discussion like that would not even make sense.
The whole .NET/LAMP/XAMPP/GemStone/whatever solution stack you use does not define who you are.
I still consider myself a Mechanical Engineer who happens to be in the software engineering field who happens to choose .NET to solve problems since I'm highly productive on that platform.
The vast majority of people who claim .NET experience can launch Visual Studio, then run some wizards to generate a skeleton of the app, and then fill in something to have a CRUD "solution".
They spoil it for the folks who are truly experienced. The claims in the CVs are so inflated (Dunning-Kruger at the extreme) that it's simpler to just ignore .NET experience.
"The wast majority of people, who claim .NET experience can launch Visual Studio, then some wizards to generate skeleton of the app" -
This is the ignorance. There are people who are pretty bad on other stacks as well. I know many competent .NET developers. And many terrible PHP developers.
The average .NET developer is more skilled than your average Python developer. This has about as much substance as your claim.
I wouldn't say it's an "Anti .NET" bias, as much as it's an "anti too many years in the same exact thing" bias.
1. Choose a different language for your next side project.
2. Step in the door with C#, try to help out on projects or change positions inside the company.
3. Attend hackathons, contribute to opensource projects.
A good programmer can quickly feel comfortable in any programming language. Recruiters feel more comfortable when you can actually show proof of that.
> A good programmer can quickly feel comfortable in any programming language.
This is not true. It takes a not-insignificant amount of time to just learn to write idiomatic code, let alone making oneself familiar with all the quirks and idiosyncrasies of libraries and frameworks.
Additionally, if it's your first exposure to a non-Algol language, you're going to have a hard time... you definitely won't feel comfortable quickly. But at least going from, say, Java to Python, you can quickly feel fairly comfortable writing Python code just like you would Java! (Your coworkers who understand "Pythonic" might hate you, though.)
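To make that concrete, here's a toy sketch of the same loop written "Java in Python" versus idiomatically:

    names = ["ada", "grace", "barbara"]

    # Java-style: index-based loop and manual accumulation
    upper_names = []
    for i in range(len(names)):
        upper_names.append(names[i].upper())

    # Pythonic: a list comprehension says the same thing directly
    upper_names = [name.upper() for name in names]

Both work; only one of them will make your reviewers happy.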
I don't mean to be hurtful, but you sound like you have exactly the same biases the original post is about, and you're rationalizing. Seriously, not all good hackers fit any cookie-cutter definition of a hacker.
If you don't have 3-5 years experience in language/framework X, no jobs in language/framework X will be forthcoming. People hire based on what you know, not your ability to learn. A new hire who can "hit the ground running" is an asset; one who can't is a liability.
My first company (as a non-programmer) was very much like this. It was very corporate and they weren't willing to take a risk on cross training a hobbyist without the requisite boxes checked on a resume. It was frustrating but I also understand the CYA logic behind it.
There's someone willing to take a risk on you somewhere for some amount of money. Having a github with noob code in the language you are trying to get hired for helps. If you are just starting out, being able to show you want it bad enough to learn it off the clock is a strong signal vs "I'll start learning it if you start paying me" (which is...also a signal).
I have side projects in JS and TS (Node.js), Python, and C (~500 LOC projects). Mostly I care about solving a specific problem. Learning a new stack just so I can put it on my resume would require time I don't have :(.
I've strangely also found that this works the other way as well.
I have strong experience in the .NET ecosystem and good experience with *nix platforms and commonly related technologies (including Java and Ruby). When I interviewed for a part-time freelance gig with a company on Upwork, they were concerned about my *nix experience. They wanted to make sure that, since they were a .NET shop, I wouldn't try to get them to use anything other than traditional Windows-based technologies.
I told them that I'm not partial one way or the other - I just like to use the best tool for the job regardless of where it's hosted. I did end up working for them for a short period of time, but it didn't work out long term.
> Is the anti windows/.NET bias really that common?
I think it's less common now than it used to be, but back in their heyday Java devs looked down hard on .NET devs. I think it's because MS visual IDEs, going all the way back to VB6, made a lot of people think they were professional developers just because they could create a basic CRUD program with a functional GUI very easily. This ended up promoting a huge amount of unusually bad code/coders to production environments, and with that came the bad rap .NET devs get to this day. The perception is that developers outside of the MS ecosystem usually have to be a little bit more erudite in the way they go about learning their programming language of choice.
And they prefer using uncool tools intelligently to solve a complex problem; rather than using the next cool thing dumbly that creates more problems than it solves.
I'm not fantastic with Powershell but there's another potential reason: you're not allowed to run "random" exe files on $SERVER but (signed) Powershell scripts are fine.
I'm not sure where the cutoff point is between this needs to be an app / this can stay as a script, but Powershell seems to lend itself to going either way. I can either incrementally replace C# functions with Powershell (while importing types and functions that I need) or incrementally replace Powershell with C# with the same method.
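As a sketch of what I mean by importing types (the class and numbers here are made up), PowerShell can compile and call a C# snippet inline:

    # Hypothetical example: keep a C# helper but drive it from PowerShell
    Add-Type -TypeDefinition 'public static class PriceHelper { public static decimal ApplyDiscount(decimal price, decimal percent) { return price - (price * percent / 100m); } }'

    [PriceHelper]::ApplyDiscount(100, 15)   # -> 85

That lets you move code across the boundary one function at a time instead of rewriting a whole app.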
Sounds like a bunch of C# devs who didn't know Powershell and couldn't be arsed to learn it. Even though it has similar syntax. To some developers every problem requires the only hammer they have.
Must have been really bad. Why not just hire you on as a consultant?
And sometimes they are unbelievably set in their ways, for example using procedural code (STATIC ALL THE THINGS!) in object-oriented languages like Java.
And that's the rub. When you are hiring experienced devs, are you getting Mr I-use-staticz or are you getting Ms Let's-use-boring-tech-because-it-works?
What I think in such cases is "if I can't convince him that this is a bad idea, then I should let him do it". Yes, I understand that sometimes we aren't entirely sure why a thing is bad; we just have a gut feeling that it would break something, or is a bad practice.
So first off I try to convince them with everything I've got. That usually works if I have solid, concrete reasons. If I don't, I let them do it their way. Sometimes it turns out my instincts were right, and I get to learn exactly what was right about them. Sometimes it turns out I was wrong and their way was much more efficient, and I get to learn something new.
Either way, it doesn't make the junior dev feel I am imposing my will on him. People tend to work harder when they feel it was their idea.
When I was a junior dev (and some days I still feel like that), sometimes I needed to write a proof of concept to see why something wouldn't work. Luckily this kind of development (at least in the early stages of a project) was encouraged. I still throw together small scripts and apps to test ideas all the time (http://github.com/voltagex/junkcode)
I would assume most devs do the same all the time to benchmark technologies or approaches to problems.
(Then again I seem to meet far more devs who are telling me about using shiny new tech X and how amazing it is, than ones who actually tell me about how a specific tech solves their use case).
But about as annoying as senior devs who give bad advice or ignore half the words in your sentences and then give you bad advice, or who start giving you bad advice during a planning meeting after a 30 second brief about your problem that was directed to your manager.
The “Moneyball” quote is by far the most provocative. Never mind hanging around bashing TrendCo or trendy engineers. There is a market for programmers, and as in every market, some are overvalued and some are undervalued.
But unlike stocks, you don’t extract value from your programmers by selling them at what the market perceives it will pay. So you aren’t hunting for programmers that are valued by the market, you’re hunting for programmers that will be productive.
And because it costs you time and money to even talk to programmers, you want the programmers with the highest probability of accepting an offer and being productive. Which means that regardless of your budget productive programmers that are undervalued by the market are the very best ones to interview.
Think about that the next time you see an ad that “encourages women and people of colour to apply.” Maybe they’re raving SJWs who hate white men. Or maybe they’re playing Moneyball. (Where or is `||`, not `^^`, of course.)
In my experience, hiring managers, and almost all companies apart from extremely early-stage do-or-die start-ups, simply do not care about productivity. Sure, they'll use productivity as a political excuse to fire you, overlook you, or ding you on a bonus or something. But that's just a way of covering their political games with plausible-deniability paperwork. At the end of the day, they don't really care about productivity, and most businesses are not punished by consumers when the business is not productive, beyond a low, low threshold of productivity that non-do-or-die-stage businesses can mostly automate and extract from workers who only need to be semi-conscious to do the work. I've experienced this laissez-faire autopilot attitude first hand, even in a multi-billion dollar asset management company where there was "real money on the line every day" and blah blah blah.
Just look at the office space provided to knowledge workers. Open-plan offices demonstrably destroy productivity. You're literally throwing away something like 10-15% of a worker's productivity (and thus also the corresponding compensation you pay them) just by placing them in a loud, no-privacy open-plan space.
But companies can't get enough open-plan. They want on-site cafes, rock-climbing walls, elaborate decorations, game rooms, liquor-focused cultural activities, etc. etc. all taking place right on top of the space where work is supposed to get done. Can there be more of an unequivocal admission that they don't actually value productivity?
The idea that most businesses have to be really sensitive to bottom-line productivity is a Big Lie. They hire you because they think you'll be a political ally to their particular internal faction for their pet projects and power struggles. They may need to use your on-paper credentials to win arguments or to "certify" some things, but make no mistake, they absolutely don't care about what you can do.
Absolutely. I've had a contract where I could have done the job in two months; there were four people in the team and the contract was for six months. I spent most of my time reading articles on the internet :)
The reason for the over-budgeting? The team lead was a permanent employee trying to get promoted to manager; in order to do that, he needed a 1) successful and 2) significant project. Genuinely large projects risk being unsuccessful, but an overestimated one ticks both checkboxes.
"Look at that one, honey! He's wearing a Ramones t-shirt while head-banging and playing air guitar. Oh, and look over there! That whole section of the room appears to have fallen out of a Warby Parker catalog..."
But my cynicism says that managers often hire people who will be productive because a huge part of their political game is producing software that appears to be “strategic” or will generate the revenues that make their group more valued and garner more budget and power for said managers.
So yes, it’s cynicism all the way down, but no, that doesn’t mean that managers aren’t trying to hire productive programmers :-)
Somehow, I don't think we're going to find any agreement here, when your comments consistently dismiss anyone who disagrees with you as having an invalid perspective.
I am striving to back up my disagreement with cogent points. The excuses that are used to justify the anti-human way that we allow HR-type thinking to structure our working lives are something I am passionate about criticizing.
There are a lot of time-honored HR-type tactics used to try to justify and proliferate these anti-human patterns, and I do wholeheartedly acknowledge that my belief is that attempts to defend them are fundamentally invalid.
In the sense of principled sociological understanding of bureaucracy, such as in the book Moral Mazes, it's probably the aspect of technology culture that I am most interested in.
You are surely correct in at least some cases where my comments are expressed poorly or come with too little supporting information, but I strive for that to be the minority.
I'm a hiring manager at a mature healthcare software company. Productivity is my number one concern. Among regulatory overhead, testing, and legacy compatibility, a developer has to work hard and smart to move the ball forward here. Teams are lean. A single unproductive developer will cause us to miss the scheduled release date of a feature.
There are a number of buzzwords in your comment that cause me to doubt its veracity. Without getting into an extended debate about exactly how you detect this productivity in a candidate (something even the best tech firms openly admit is extremely hard to do), the best thing is just to agree to disagree.
I didn't say I could detect it in the hiring process -- far from it actually. I just said that I care about productivity, as a counter point to your claim that most companies do not care.
I can't agree to disagree about being called a liar. Can you be more specific about which "buzzwords" cause you to doubt which of my claims?
I didn't mean to call you a liar. You might believe you are screening for and prioritizing productivity. I'm saying that I don't believe that's actually what the hiring process you're a part of is screening for, even if the people who comprise that process believe it. Obviously, this is just my prior opinion, a default, since I don't have special evidence.
Regarding your comment, the most major issue is with that term "lean." In many cases, this means some usage of Agile/Scrum nonsense, which if true completely throws out any credibility that the position is focused on productivity in the least bit. Sometimes instead of Agile, "lean" is used for six-sigma like micromanagerial process, which suffer all the same criticisms as Agile.
Even in the best case, a team that self-identifies as "lean" is failing to make use of specialization of labor. Many of these teams hire people into roles titled "full-stack" or "generalist" and they say stupid shit like, "because our team is lean, you will have to wear many hats." (I am actually so sick of hearing the phrase "wear many hats" that it causes me to instantly reject a job out of hand at this point).
"Full stack" is arguably the single worst trend in all of software. It goes completely against the major benefits of software development: specialization and separation of concerns. Many organizations that use full-stack practices also believe that they don't need to provide meaningful job descriptions. They want to leave job descriptions vague ("many hats!!") and argue that candidates have to be adaptable if they want to cut it in this crazy dynamic world of ours.
This is all total bullshit. There might be a small period of time in a start-up life cycle when it pays to have many generalists and leave everyone's work assignments vague. But most start-ups, and certainly most established companies, have no business operating in "lean" mode. You need to empower employees to know the limits of their job descriptions, so that they won't be treated as catch-all, pan-everything work receptacles who are capped solely by the literal limits of their physical exhaustion (at which point you whip together another pan-everything job ad to hire yet another generalist to handle the undifferentiated work overflow).
This fails to respect the worker's speciality (which is something the worker absolutely has to protect to advance in their career). It also means the company is not extracting the full value from the worker that it could by doing the challenging job of actually managing them and setting up the workflows so that work requiring that specialist skill is routed to the right worker. If the company embraces a "full-stack" "we're lean so everyone wears many hats" attitude, it's an overt admission that the company couldn't give two shits about what you're actually capable of doing for them, and instead only cares that you do what they happen to tell you to do right now, even if it's hilariously underutilizing you, or is hilariously inappropriate for someone with your particular skills, or is using you in a way that fails to address critical business needs you've identified.
Basically, I see "lean" and I immediately think, "managers believe they can throw a bunch of so-called full-stack developers on a Scrum team and then flip on autopilot." The managers are going to get mad if that machine learning expert they assigned to clean up the legacy Rails codebase ever breathes a word of dissatisfaction over not being fully utilized or utilized in their speciality.
"lean" is by far the buzzword that is most negative from your comment, but I also see the phrase "move the ball forward" and I immediately picture those trite motivational posters with eagles and someone passing a baton in a relay race and I just roll my eyes. We're not in middle school. We go to work to do work. We can speak about the work we do in grown up terms. Not "moving the ball forward." To me, this communicates a very top-down attitude about what progress means. There are some high-level, likely paternalistic or even misogynistic, ideals about company progress and what a good little worker must do to be productive. No thank you! Even if this language is not indicative of the worst kinds of problems, it still is extremely infantilizing.
"A single unproductive developer ..." oh boy, don't even get me started. Right away this sounds like someone with a way over-inflated opinion of the work their team does. "Our work is so very important that we can't abide even a single person who isn't amazingly productive." Yeah, OK. For one, you just said your team is lean, so whose fault is it that you don't have adequate redundancy built into your technical resources (your tech staff). If someone said their distributed database "couldn't tolerate even a single node failure" you'd ask them why they don't create more nodes and add redundancy to have a safety factor.
Why would a team agree to be lean if they are also worried that a single bad link in the chain will cause a problem? Either there's some kind of extreme budget constraint, or else this is someone who just read a Malcolm Gladwell-like pop book about "lean" and "Agile" and decided that's the shiny new management thing they just had to have.
Lastly, I'm not sure why "release date of a feature" was the example you chose to go with. This could be innocuous, but more often than not, when I see people who think in terms of release dates and features, it's a huge red flag. Most teams need to actively constrain the set of features they agree to support, tell customers and internal business stakeholders "no" way, way more often, and address overall technical debt and architectural quality concerns far more than shipping features. If I were in an early interview stage and someone was already asking me how I make sure I always cram out all the features on time, that would be a huge red flag of a dysfunctional process driven more by short-term business managers seeking bonuses that are tied to on-paper accomplishments (e.g. we shipped X, Y, and Z) than by the engineering reality (X, Y, and Z were technically delivered "on time" but they suck, and now everyone's asking for W, which we can't even do because of how fragile the implementation of X, Y, and Z was to get it out the door on time).
Let me qualify all of this by saying that yes, absolutely, 100% this sort of detailed dysfunction can be inferred from very small amounts of buzzwordy HR text and job ads. I've seen it time and time again, and even been part of the teams responsible for drafting job ads and seeing first hand the thinking processes of HR as they inject all of this awful stuff into it.
There is even a major qualitative social science study about exactly how this kind of vague, symbolic HR-approved linguistic atmosphere is heavily, heavily related to the skewed ethical views held by executives and managers. I strongly urge you to read it if you have not already:
I'm not saying that this definitely applies to you (I don't know you). I'm saying that my experiences tells me that the odds are that the team you are representing to "move the ball forward" with a "lean" team that cannot bear "a single unproductive developer" is a very dysfunctional one, and that many of these buzzwords actually imply the opposite of the image they are invoked to create.
> If the company embraces a "full-stack" "we're lean so everyone wears many hats" attitude, it's an overt admission that the company couldn't give two shits about what you're actually capable of doing for them, and instead only cares that you do what they happen to tell you to do right now, even if it's hilariously underutilizing you or is hilariously inappropriate for someone with your particular skills, or is using in a way that fails to address critical business needs you've identified.
Another possibility is that the value of shared context and mutual learning is seen as desirable outcomes, on par with production outcomes.
I'm not saying specialisation is unnecessary. But it comes with costs of its own, so some overhead from wasting the specialist's time is worth it.
"There were even a couple of cases where I had really strong internal referrals and the recruiters still didn’t want to talk to me, which I found funny and my friends found frustrating."
This has to be a firing offense.
A recruiter's job is to find people the engineers want to work with and who the engineers believe could be a strong contributor. If the engineers have already found someone they want to interview, and the recruiters are saying no, the recruiters are just blocking the very process they were hired to facilitate.
In interviews I actively try to downplay the companies I've worked for and the things I've done with them. The reason I do this is that I don't want people to come in with unrealistic expectations and expect me to live up to them. I also don't believe in preparing for interviews. I'm OK with the rough edges I have. I want my imperfections to come through on some level. Yes, I have a preference for certain technologies. I love what I do. They should be more concerned if I didn't have an opinion. It doesn't always work out in my favor. *shrugs* Rarely do I serve up great interviews back to back on different days. I'm a front-end guy because I love building them. That's all.
I struggle with this as well, and it worries me. I tend to be painfully realistic about things I can and cannot / have not done. However, sometimes I feel like I need to do a better job of selling myself. Walking the line between the two extremes is quite difficult.
“Experience is too random, with payments, mobile, data analytics, and UX.”
This one made me chuckle.
The same thing usually happens with my résumé: either they completely discard the part where I cite strong OCaml knowledge plus functional/logic programming interest, and/or they completely discard the part where I mention experience with ARM PRU assembly and the fact that I ported an educational OS to MIPS during my degree.
It usually comes down to:
- "So you are a C++ programmer?"
- "Yes, I have a good experience with C++."
- "Good, we'll like to hire someone with your knowledge, we have PHP and Perl codebases to maintain and your profile seems a good fit."
I'm currently in the process of applying at Facebook (which I guess qualifies as a TrendCo that recently transitioned to BigCo). I haven't even got to the technical phone screen yet, but the recruiter tried to get me to attend an "interview prep session" and then sent me a pair of videos from it totalling 2.5 hours when I couldn't, which of course I felt obligated to watch on the assumption that all the other candidates will have.
Ridiculous enough. Then there's the content. The videos are a presentation by Gayle McDowell, the author of Cracking the Coding Interview. Lots of tips about interviewing, 95% of which were pretty obvious but most of which seemed useful enough, on the assumption that the process is a given. But McDowell would repeatedly say things like "most interviewers won't expect you to write perfect syntax on a whiteboard, but some will" as if this was perfectly reasonable. And she gave "if you're writing in Python, it's a red flag if you use ++ (instead of += 1)" as an example of something she cares about herself! So both the recruiter and McDowell recommended doing a number of pen-and-paper practice coding problems before your interview, specifically on the basis that this is something you never do in your day job.
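(For anyone wondering how ++ could even come up in Python: it isn't a syntax error, it's a silent no-op, which is presumably why she flags it.)

    x = 5
    x += 1      # the Python idiom; the language has no ++ operator
    print(x)    # 6

    y = 5
    ++y         # legal, but parses as +(+y): a double unary plus, does nothing
    print(y)    # still 5

    # (y++ on the other hand is an outright SyntaxError)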
If I want to be charitable, this is Facebook trying to even out some of the biases introduced by their (apparently unchangeable) whiteboard algo puzzle interview questions, by ensuring everyone is prepared.
If I want to be less charitable, the demand for hours and hours of preparation is just (further) hazing to prove dedication.
In either case, I think this is evidence that BigCos are not necessarily any more rational about hiring.
> If I want to be less charitable, the demand for hours and hours of preparation is just (further) hazing to prove dedication.
Come to reddit's /r/cscareerquestions one day. Tons of stressed out 18-22 year olds practicing CTCI and LeetCode for hours just for a chance at passing the Facebook/Google hazing. Because these people are usually "high achievers" who spent hours in high school at SAT preps, this feels natural to them, which is why they'll give questions like this when they hire in the future.
For what it's worth (I'm not speaking on Facebook's behalf, etc etc), I've done hundreds of interviews for Facebook and I don't think it's a red flag if you write ++ in Python.
Unless you're interviewing as a Python syntax expert or something like that.
Also in the hundreds of interviews club for Facebook and I also do not care about ++ or standard library memorization stuff. I think the recruiter is just trying to do as much as he/she can for you, because our recruiters are inherently invested in the candidates for obvious reasons.
I will review the video you mentioned to make sure we are not misleading any candidates unnecessarily. I have done 500+ interviews at Facebook, 100 for coding. I have also read thousands of pieces of interview feedback, and not a single time has the point about "++ instead of += 1" come up. So something is obviously lost in translation here, and we will correct that. Thanks for the feedback!
I've done close to 200 interviews for Facebook. If you're coding in Python, then presumably you think it's your best language. If you aren't sure how to increment in your best language, it looks a little funny.
But just that: a little funny. It's certainly not a deal-breaker; we don't expect perfect syntax at all. For all I know you often switch between Python and JS in your day-to-day work and could easily make such a small error in a high pressure situation. That said, I'll notice it and take note. If one of four or five interviewers says "<name> incremented funnily" nobody will care. If every interviewer is saying "<name> doesn't seem very fluent in their chosen language" then the red flags start coming out.
Not directly related but also relevant to the OP: half the reason I'm applying to Facebook in the first place is that I know telling recruiters at other places that I have a Facebook offer will do more to make me attractive than any actual work I've done at non-TrendCos.
Sounds about right. I work at a company a lot of people are trying to work for. We had a rash of Googlers interviewing with us not too long ago. Turned out, most of them weren't actually interested in the work we do. Instead, they were trying to get counter offers to stay at Google. Brutal.
Likewise, I've seen our recruiters shit themselves trying to sell candidates because they had a competing offer from another A-list company. Getting competing offers is excellent leverage to maximize your starting salary / equity.
Being in the IT job market off and on for the last 18 months and interviewing at about 40 companies in one of the major tech hubs in the USA gave me a similar experience to the OP's. I came to several conclusions:
1) In my case trendiness is more valuable than the school you've graduated from (I graduated from a school overseas that nobody in the USA has heard of). I worked for one of the companies in the area notorious for its difficult tech interviews, and I got a bunch of calls from recruiters (including from Google) during my tenure there and immediately after I left. Two years and a few short gigs at not-that-cool companies later, I don't get that amount of calls.
2) Most of the "technical" recruiters have no clue how to evaluate technical candidates. For many of them the main (for some, the only) way to judge your skills is how much money you make/how much money you want. I'm not talking about a deep understanding of what engineers do; I'm talking about basic stuff.
3) If you have had 1-2 short gigs in the last couple of years, you are under suspicion that you cannot be trusted enough to be hired as an FTE.
4) If you are unemployed at the moment of the interview -- see 3).
I believe now that hiring in the IT industry is largely a stochastic process. I was hired and got offers from reputable companies and organizations, and was rejected even in the first round of interviews by no-name/less-than-mediocre companies. Go figure.
I completely agree with the article. I worked for EC2 for >3 years and consistently exceeded team expectations. I'd get pinged by recruiters all the time while at Amazon. Now that I work for a smallish not-for-profit, no recruiter seems to be interested any more. What's interesting is that I'm now more active in OSS projects and have far more experience than I had when I left Amazon. Totally weird!
Recognizing these difficulties and biases, and watching the swaggering, overconfident, surface-level way coworkers around me talk about their interview candidates as if they had awesome powers of judgement, my approach has evolved into:
Pick an LSAT-style logic problem. First just give the candidate the problem, and see if they can work it out. No programming, just simple logical thinking skills. Then get into writing some code (or even pseudocode) that can solve some version of this problem.
There will be some conversation after that, maybe about how you'd scale the code, and then branching off into other things. But mainly I just see if a) they could solve the logic problem and b) they could translate that solution into some reasonable code. I try to apply this consistently across candidates.
I was unemployed once, and I knew that I know how to do this stuff. It was incredibly frustrating to me that the people interviewing me were holding me up to unreasonable and irrelevant standards. I keep that in mind and try to give people the benefit of the doubt.
Back when the web was still young and I had less than 1 year of experience, I interviewed at IBM for a web developer position. They gave me an aptitude test, on which they said I scored among the highest they had ever seen. This got me through several rounds of interviews, but at the very end I was rejected. When I asked for feedback, the hiring manager said that I didn't have enough experience. It felt like a complete waste of time, because they KNEW I didn't have enough experience from the beginning, so the entire process was really off-putting.
I'm not one to believe the official explanations given. It is easier to lie than to give the truth, even when that truth isn't bad. So I wouldn't put too much weight on you being inexperienced. Instead, I would say that they had a different reason for not choosing you, but that was the simplest lie they could come up with on the spot by looking over your resume.
yeah I'm pretty sure that any approach I proposed, on a spur of the moment answer on an HN thread, without spending years of research and data analysis with a team of diverse experts in the field, would end up being a method to hire people who were more like me.
Not as if I had time to do that kind of research either :)
That'd probably always hold for some definition of "like me". However, my approach has resulted in thumbs ups for people different than me in ethnicity, age, gender, and career background, which should count for something I think.
As I get older (I am currently thirty), this is something that terrifies me, mainly because it is something that I have experienced from both sides. I got a B.S. from a major CS university, then started on a Ph.D. in Astronomy. Due to my dissatisfaction with that program, the fact that programming is my passion, and various life events, I decided to leave after obtaining a M.S. and began to look for work.
I attended my university's CS job fair, only to find that I had become almost completely unemployable to tech companies (granted, I only had a 3.3 GPA in CS; damn you, WoW!). I ended up taking a tech support job with a scientific computing software company after I was promised a quick transition into development. However, that too evaporated, ironically due to the stigma of working in tech support.
After about 8 months, I interviewed for a developer position at a Midwest branch of a major tech company, but was again turned down. Luckily, I was instead hired as a QA contractor on a different team, whereupon I quickly transitioned into a non-contract developer role on the original team that didn't hire me -- about a month before the company's new CEO announced that we would not take anyone with lower than a 3.5 GPA. I then mentioned this fact at every opportunity to one of my co-workers who had originally voted not to hire me.
After about a year, I started interviewing new candidates at this company. You would think that, given my prior experiences, I would be more forgiving than my peers in looking at past experience. While I didn't make any judgments based on school or GPA (in fact, those were filtered out before the resumes ever made it into my hands), I did turn down several candidates who were "too enterprisey", i.e. did not have experience or interest in whatever tech was trendy at the time. Granted, these candidates also had trouble understanding and debugging simple perl scripts I showed them, so I was already not on board, but when those candidates were borderline hires/no-hires, that was something that swayed me in the "no-hire" direction.
Now, as a married thirty-year-old with two kids and several non-coding hobbies, I have neither the time nor inclination to use my free time to learn today's trendy languages and frameworks. My work keeps me relevant for now, but if that ever changes and I need to look for work, I'm terrified that my experience and ability to learn quickly will be irrelevant to prospective employers, who will just want to see the right technologies appear on my resume.
Hello, just want to chime in and say that you shouldn't have to feel anxious about potentially falling behind if something new comes along.
Life is too short to worry about things that can't ultimately care about you (a weird thing to say, I know, but the interests you care about, even non-human ones, are ones you fall in love with because they give you something back -- excluding the emotions of fear of missing out, the pressure to fit in, or the greed to get ahead).
I'll go out on a limb and say that most of the new Javascript web development frameworks are created by really zealous young people fresh out of school, eager to explore the brave new world of open source and using Github as a medium to make their mark (hence why there are so many new solo frameworks and forks, and not enough collaboration); or by old-timers in that community who for whatever personal reasons really enjoy doing it and have been doing it. Go to your local open mic for poetry or a music jam for an "In Real Life" representation of this phenomenon.
Speaking for myself, I am just not interested in learning arbitrary new frameworks that aren't intellectually interesting, that are simply syntactic sugar (remapping the textual recipe for rendering a button from desktop development, to Web 1.0, to Web 2.0, to React.js), or that don't open up a new outlet to other subjects I want to learn about (e.g., machine learning, bioinformatics).
Practically speaking, if it turns out that in 5 years all software companies mandate the banning of lamer frameworks such as PHP or .net and only allow Node.js/React.js/Flux, I'll wait for the idiot's crash guide to these frameworks, for them to be watered down for "plebes" like the rest of us, and try to learn the keyboard remapping in a couple of weeks; I'll let the hipster kids on Github who "did it first" take the street cred; and hopefully then, I can still manage.
Grammatical and logical issues with "we only hire the best" aside, once you make it into TrendyCo or BigCo you're faced with the dilemma of Plato's cave. Here is a code base written by a group of average people with big egos whose reality is limited. Your perspective as an outsider to the cave is not going to be valued. Even in very open-minded teams I've found that once the way is established, people will struggle to keep it that way forever.
As a developer, trendy or otherwise, you take a big risk sitting down at the back of that cave and joining in the group.
There really isn't a way to avoid this other than only working in places where you already agree 100% with the way they do things. If you build your own company, you will also have your way of doing things that you will impose on everyone working for you, and you will not change it because of some new guy. Not significantly, anyway.
There is a way to avoid it but I think software development is unique in that we haven't been able, so far, to establish what the state of the art is. And so it seems like, as you say, every company is in its own cave and the only way to have the freedom to have your own ideas is to start your own cave.
It's a little bit backwards, in my opinion. Instead of it being commonly accepted that we should be checking mathematical models of our systems, we have "senior engineers" debating whether unit tests are a waste of time. We have developers either being trendy or forcing others to pretend they are. And trendy changes in each ecosystem (BigCo vs TrendyCo).
Companies can, and should, change. The AWS teams managed to see that formal specifications are a good idea and have been better for it since. You can be sure there were highly experienced senior developers who resisted their introduction with the best arguments they could muster. Even very smart people want to stay in the cave and watch the shadows.
I find that "how can he improve?" is probably a much better question than "Why don't you want him"?
Ask the first - and you'll get real actionable items to improve on.
Ask the second - and you'll get the recruiter's equivalent of "you're cute, but I already have an imaginary boyfriend".
While a company might pride itself in hiring generalists and "we do a bit of everything", very often they're just looking for someone to fill a specific position for a very specific need.
Those needs might change in 3 months, (and then they'll be glad you're a generalist), but they're hiring someone to fill the gap NOW.
This is especially true for startups, who have limited funding, believe in "move fast" mantras, are afraid of hiring someone that might need to be fired, and are afraid to take time for ramp up.
> While a company might pride itself in hiring generalists and "we do a bit of everything", very often they're just looking for someone to fill a specific position for a very specific need. Those needs might change in 3 months, (and then they'll be glad you're a generalist), but they're hiring someone to fill the gap NOW.
So in other words: The startups present plain lies in their job requirements and complain that the people that submit an application don't fit what they really want.
IMHO the easiest solution would simply to be honest...
S/he must be good because s/he worked for X known company.
It's a pervasive bias, especially in tech, where certain brands are lionised to the point of nearly becoming sacred. Brand association alone will get many a person through an assessment process, sometimes with the complete abandonment of any semblance of due diligence.
Remember the Scott Thompson scandal? Basic stuff like lying on your resume did not stop Yahoo! from making a pedigree hire. Must be good because PayPal.
We have to all work very hard to educate ourselves that we are all inherently susceptible to this bias.
My favorite question from bosses is, "So, Napper, what kind of developer are you?". This is about the time I start listing all of my experience and they're just like... I don't know what to do with that. - I can do anything, just give me a check and I'll make you profits like a good Ferengi.
This practice of hiring for tech glamour is bubble behavior. I'm an old guy, so I remember back in 1999 some nitwit hiring companies were demanding five years of J2EE experience.
Don't worry; if the overfinancing bubble bursts, or even deflates gently, people with salaries and get-it-done tech experience from outside the Silicon Valley reality distortion field will once again have an unfair advantage.
I think we're starting to see the need for 'laborer' programmers. There's a lot of relatively unskilled glue/laborious coding that needs to be done. When you need someone to hammer a bunch of code out for you, you don't need an experienced and flexible software engineer as much as you need a kid with a well-trodden neural pathway that lights up when they write code in your tech stack.
I think code schools are effectively training programs for this kind of work at the companies that use them as hiring pools. I would be surprised if these companies didn't influence the schools' curricula in some way. If they aren't doing this explicitly, they should, and they should absorb some of the cost that is currently being levied against the students. Right now, we're at this weird point where students are paying for their own training at these third-party facilities that more or less kind of guarantee them maybe a job at a list of very specific companies, but with no actual guarantee... Something feels a bit sleazy about it. I don't know how transferable the education you receive at a code school will be once you leave the company that hired you from it.
I think the field of software engineering/programming/whatever has broadened so much that you have to have some distinguishing, domain-specific feature to get a job somewhere--the distinguishing feature of the 'trendy' programmers is usually the tech stack alignment. Maybe general knowledge of writing good, maintainable software is moot in the face of an efficiently trainable workforce, short-lived code, and a diaspora of tech stacks.
> I think we're starting to see the need for 'laborer' programmers.
The mass offshoring effort in the US through the 00s was exactly this. The problem that many companies discovered is that you really get what you paid for. If you spec something out to a 'T'[1], you'd get exactly what you ask for - warts and all. If you don't provide enough detail to do that, you get to go back-and-forth until you do. And if you want anyone to troubleshoot a mildly complex issue... well, good luck. Maybe one in ten of the people who wrote the code from your spec have that ability.
Some equilibrium has returned, but not before the damage was done - causing a lot of unfortunate misconceptions about the abilities or lack thereof for entire demographics of people.
> code school
For the most part, these are an attempt to create the same thing without the offshoring - 'teaching code' of that sort has been relatively common in offshore markets I've worked with in past jobs. I don't think that's their intention, but it does appear to be the end result.
[1] To the point where you're more or less pseudo-coding the solution
> I think we're starting to see the need for 'laborer' programmers. There's a lot of relatively unskilled glue/laborious coding that needs to be done. When you need someone to hammer a bunch of code out for you, you don't need an experienced and flexible software engineer as much as you need a kid with a well-trodden neural pathway that lights up when they write code in your tech stack.
What do you think those Indian programmers were doing for the past decade?
There used to be a kind of acceptance of this - i.e. architects designing the class topology and sometimes even the method names, and handing it down. From what I've witnessed, this started to fall out of favor around the early 2000s, if not before.
The problem (at least as I imagine it) was that this is inefficient when the designs need to change, and one person gets a really soft, easy job while the others get a lot of detestable grunt work.
Now we have the opposite problem - the "architect" title is kind of despised by people who know what it means, and everybody designs - but often those designs are at odds with each other, and usually too entwined with ego to come to compromises that make different sections of the code work well together.
There is a complex balance between finding the right level of top-down design and order and using all the assets of the team to make sure things are good and stable, and in particular, maintainable to folks who are going to be new to the code.
> I think we're starting to see the need for 'laborer' programmers. There's a lot of relatively unskilled glue/laborious coding that needs to be done.
I have a computer for that! Seriously, that's one of the things a computer excels at. This is what is so nice about a language with strong dynamic typing: it makes writing the sort of code which writes glue code reasonably easy.
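To make that concrete, here's a minimal Python sketch of the idea (the class name and URL are made up for illustration): instead of hand-writing one wrapper per endpoint, the dynamic language manufactures the glue at runtime.

    import json
    import urllib.request

    class RestGlue:
        # Generates endpoint wrappers on the fly instead of
        # hand-writing one glue function per endpoint.
        def __init__(self, base_url):
            self.base_url = base_url

        def __getattr__(self, name):
            # api.users(id=42) becomes GET <base_url>/users?id=42
            def call(**params):
                query = "&".join("%s=%s" % kv for kv in params.items())
                url = "%s/%s?%s" % (self.base_url, name, query)
                with urllib.request.urlopen(url) as resp:
                    return json.load(resp)
            return call

    api = RestGlue("https://example.invalid/api")
    # api.users(id=42)  -- no users() method was ever written by hand

You can get the same effect with build-time code generation in static languages, but the dynamic version is a few lines.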
How do you get your computer to solve the problem of "this text needs to be 2 points larger, a darker shade of red and 5 pixels to the right" without involving a programmer?
I've worked on the Avon AU website and we had a few hundred tasks like that. It's why (to link back to my current pet peeve) I'm upset that my relatives are unwilling to learn a tiny bit of web design because "it's too hard". There is a market for people who ONLY know how to do this. (There were two people in the company I was working for and my team only got access to them one hour every week - they were that busy.)
I'm living this right now. I'm unhirable right now due to my salary requirements (having 13 years of practical experience means that I won't take a job for $60k), and I was turned down for an internal transfer because "you have the skills and experience, but your answer to the interview question was too inefficient."
That will teach me to not memorize algorithms I haven't used since college.
> having 13 years of practical experience means that I won't take a job for $60k
Where are you based? If you're in the US and you have as much experience as you say you do, even if you're in places that aren't major tech hubs, you should be able to find remote work that pays more than $60k/year.
> That will teach me to not memorize algorithms I haven't used since college.
The debate about the merits of algorithm-driven interviews aside, the fact is that most companies still do this for their interviews. This is starting to change, but slowly. When I was last looking for a job, every company but one[0] asked questions about complexity classes at some point during the interview stage. Unlike the last time I was on the interview circuit, though, more than half had at least one question that was explicitly focused on something else - so it's changing, but still slowly.
So, while it may be information you never access again while on the job, if it's the knowledge that will secure you that well-paying job in the first place, consider it a worthy ROI.
> should be able to find remote work that pays more than $60k/year.
Key word being "should". The "companies willing to take risks" from the OP don't accept remote people most of the time, and those who do allow remote frequently don't believe that people outside of SV are worthy of the payscales you see in SV (even after adjusting for the cost of living).
> asked questions about complexity classes at some point during the interview stage
Yeah, and I am reasonably certain that I passed 3 of the 4 questions of that class, but I had forgotten the minutiae of Dijkstra's pathfinding algorithm. As a result, my answer for the 4th was O(n^3) when it should have been O(n^2). At least I won't forget that one again.
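(For anyone else rusty on that point: the classic array-based Dijkstra - the version interviewers usually seem to have in mind - does n rounds of an O(n) minimum scan plus an O(n) relaxation pass, hence O(n^2) on a dense graph. A rough Python sketch:)

    import math

    def dijkstra(adj, src):
        # adj[u][v] = edge weight, math.inf if no edge.
        n = len(adj)
        dist = [math.inf] * n
        dist[src] = 0
        visited = [False] * n
        for _ in range(n):                      # n iterations...
            u = min((v for v in range(n) if not visited[v]),
                    key=lambda v: dist[v])      # ...each an O(n) min scan
            visited[u] = True
            for v in range(n):                  # ...plus an O(n) relax pass
                if dist[u] + adj[u][v] < dist[v]:
                    dist[v] = dist[u] + adj[u][v]
        return dist

(Swap the array scan for a binary heap and the same algorithm is O((V+E) log V), so the "right" answer also depends on which representation the interviewer assumes.)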
It was pretty funny in one of the interviews: we spent 50 minutes going over a logic puzzle (the kind where, if you don't know the answer, it's unlikely you'll figure it out in an hour), and then they went on to say how the job was about creating an interface between a DB and RabbitMQ, and creating a common application framework for the rest of the company. We ended up having a good (but short) conversation about how to do exactly that, and those 10 minutes seemed much more valuable than the time spent on the logic puzzle.
I think that's the big thing: the existing employees who are drafted to do candidate interviews were themselves interviewed with this style, so it's how they go about it. They simply haven't stopped to think about what biases these interviews introduce, or think those biases are a good thing.
> I'm unhirable right now due to my salary requirements
There has to be more to this story.
"13 years practical experience" reads like someone who who describes themselves but doesn't have a formal CS background (just how it reads to me). I could speculate wildly, but it boils down to knowing how to market yourself.
Move to Seattle, we could use a dozen of you here.
Minor nitpick: how difficult is it to slap a date somewhere? Especially with this design it is impossible to tell when in the last 30 years this post was written (for the curious, it looks like it was posted today).
"Contractors are generally not the strongest technically"
Nonsense. Contractors are actually much stronger technically, because they usually come in to complete a specific task and are not usually trained by the company.
At least that is my experience here on the east coast.
At least at Google, it was almost impossible to hire an engineer with domain specific knowledge. Nearly everybody we tried to hire who had the domain specific experience our team needed didn't make it past HC due to some BS gotcha question in the interview. On the other hand, contractors were trivial to hire.
So in a sense, the contractors were not "as strong", since they didn't have to make it through the absurd Google hiring process. As such, they were generally treated as second-class citizens.
Success or failure of an employee is based on what the company does with them after they are hired. It doesn't matter how "great" an employee is when they come in. I've seen people come in as senior devs, but you'd never know it. I've seen some employees hired and then run off when they couldn't clean up a hot mess in record time.
Then don't work at those companies. It seems there's always .NET jobs - I get recruiters all the time bugging me about those gigs, but I get nothing on any of the other languages I've worked with (and the .NET work I've done is a very small % of my career). Perhaps it's a location thing - Houston is definitely a .NET town.
And the San Francisco Bay area is _not_. That's been frustrating. Fortunately, I had a lot of PHP in the past so that helped me find a place when I decided to roll the dice and leave Texas.
Now back to scraping up some Python and coming up with a good project to throw on GitHub to show "Yes, I are engineer. Here is me saying 'shibboleth'."
Some areas are location specific. Richmond VA seems to be a .NET and Java town mostly. My team's stack is Apache httpd PHP and Java mostly on Linux. A little scripting in Python and bash.
It would be cool to have a map showing what stack/language is most marketable in what area, so that one can adjust time spent on learning new tools depending on where they want to go. But maybe these things evolve faster than the time scale involved in getting enough experience to be attractive for employers.
I have been in a similar frustrating position. I have a degree in marketing. I taught myself to code and spent five years building a huge variety of native and hybrid Android and iOS applications. At one point I would have rated myself as a 9/10 at iOS and Objective-C, at a time when these skills were in severe demand. Unfortunately, all recruiters saw was my marketing degree.
Part of me, a large part, wanted to simply tweak my profile to say "computer science." If I had done that, I am certain I would have been making $150k years before I finally got there. No one would have known or bothered checking. I was able to blow through fairly technical interviews at smaller companies that weren't as picky, including answering all manner of algorithms, data structures and sorting questions on the whiteboard. I actually bothered to backfill all the knowledge I had missed by reading college textbooks in my own time. Yet... marketing degree. I was even told that I wasn't technical enough after absolutely acing a particular interview, which blew me away.
The filtering system is corrupt and broken; if they are going to discriminate against me based on the wrong indicators, why should I have to play their game? Unfortunately, my sense of ethics prevented me from doing what seemed logical. I made it on my own time, the hard, boring way. Maybe I am the dumb one for not cheating.
Just remove your education from your resume. I'm only a high school grad, so I'm kind of in the same position. The moment I started leaving my education off my resume, no one ever asked about it again.
Overall, I get the gist of the article, and it makes sense. Some counter-points:
> Mike has worked on systems that can handle multiple orders of magnitude more load, but his experience is, apparently, irrelevant.
Did Mike build/architect said system, or was he instrumental in said system reaching massive performance? As an MSFT contractor, it seems doubtful. Our team had a "Principal Architect" who did that sort of heavy lifting.
> (3), combined with (1), gets at what TrendCo’s real complaint with Mike is. He’s not their type.
Generalizing isn't good, but in my experience as a former MSFT contractor, there is a lot of stigma around A- employees, deserved or otherwise. Part of it happens because FTEs who are let go for performance reasons end up rehired as contractors by shops like VOLT. These folks know the ins and outs of MSFT, and the bar for hiring contractors is much lower.
It's super easy to apply for jobs - just click a button, and you're applied. My last open rec received over 100 submissions in less than a week. As a hiring manager, I need to quickly go through those submissions and get back to the candidates I want to speak with. For better or worse, I've come up with filters that I use to quickly sort candidates into a reject or contact bucket.
I'll definitely contact non-ideal candidates, but your resume should give me an indication that you're worth considering if you don't fit the standard mold of:
"Worked for a solid tech company, familiar with recent tech stacks, has some sort of successful side project, has a 4-year CS degree."
If you don't fit that mold but are awesome, great. Prove it (on your resume, please).
A lot of companies founded by young people hire other young people and believe the best tech is what they learned in college: command-line Python, JavaScript. They don't realize that teachers teach Python because it's easy to teach, rather than because it's the best language for the real world, or that tools make people more productive in the real world (where they can be afforded).
Being a self taught programmer with no formal education (but ~20 years of relevant experience by now) I can relate to this.
It can be hard to get hired since companies may have crazy requirements. But on the other hand, you don't really want to get hired by those companies anyway.
My experience is that great companies recognize great people regardless of background.
It's not about trendiness. It's about valuing experience with a specific stack out of all proportion to what that's worth - perhaps because it's one of the few things that's easy to measure.
I've looked at going in the other direction - onto .NET. There's a lot to like about the framework. But I can take a job on the JVM and be paid like the dev with 7 years' experience that I am, or I can take a job doing .NET and get paid like a fresh grad, despite the two systems being about as similar as it gets.
And I've watched plenty of people in suits get hired at trendy startups. Maybe the bar will be a little higher, but it's not an instant rejection the way wearing a t-shirt to many companies would be. A lot of tech companies really are better at this kind of thing than other companies; there is always room for improvement, but there's also room to celebrate the things we do get right. Stop listening to DHH.
I interviewed at Loopt (sama's YC company) 4½ years ago. When I went out to lunch with their tech guys, they joked about a guy who had come in before me in a suit, and how that really hurt his chances to get the job. Though I knew enough to dress casually, I was coming from a company in the midwest where I had to wear a suit every day. Both extremes seemed absurd to me.
I eventually turned them down in favour of another offer.
I interviewed at Pivotal labs and I'm pretty sure they rejected me because I use the Colemak keyboard layout and because I wasn't that great at foosball.
> Anyway, (1) is code for “Mike’s a .NET dev, and we don’t like people with Windows experience”.
To be fair, this is legitimate although poorly phrased. A more correct version might read: "We're interested in folks who have more experience with GNU/Linux and open technologies."
If you're a startup, and you (the founders) have any sense at all, you will screen out candidates from "top 10" schools. As a startup, you have absolutely no business competing with Google / Facebook / et al engineering talent. This isn't even approaching anything like moneyball, it's just common sense. The idea that where someone went to school makes two shits of a difference in job performance is ridiculous. Job performance in technology is mostly about a collection of other soft skills in addition to grades and technical ability (communication, willingness / ability to learn new technologies, ability to work in quickly adapting work environments, etc).
A lot of people treat employment status and credentials as measures of the inherent worth of individuals. But a large component of these markers of success, not to mention success itself, is luck.
This is one of the most accurate lines I've seen on a technical blog.
Related anecdote: a good friend of mine works at FB, and pushed and pushed and pushed for me to be okay with him passing off to a recruiter. Finally I said yes, and after much ball dropping, they rejected me without even a phone screen. I don't mind, but related to this - my background is odd, yet pretty strong: YC co-founder (Corp acquired by Google), top 5 in country PhD, bootstrapped several businesses, lead several concurrent teams of developers, strong ability to ship, among the top iOS BTLE developers in the world, etc. I just don't think I fit their mold, and told my friend so before he even tried. Guess I was right.
You have engineering managers in charge who are too inexperienced to really have a good instinct for talent. So they just throw all of their investor's capital at a very narrow range of candidates - basically people who remind them of themselves.
This will sort itself out. Companies like Google and Facebook will continue to bloat themselves up, just like almost every single conglomerate in the history of business. Their products' quality will start to deteriorate because their cultures are more interested in status signifiers than in great work.
One of the best threads I've read here in recent times. Wish all developers, managers, and HR folk would read it. I can't say that I read it word for word, so apologies if I missed it, but are there any tech startups that provide tools and resources for pursuing such "open book" coding interviews? Or is there a business model here - think "reCAPTCHA for hiring". Solve Stack Overflow bounties. Fix bugs in OSS projects. Everybody wins. Questions get answered. Bugs get fixed. Programmer candidates prove their mettle. Good candidates are identified.
As a candidate I'm looking for an employer that wants me (or that has a real urgent need or want for a good candidate). This is usually evident at interview or before. You can tell they really need someone, rather than having to grow for the sake of spending their VC money.
As an interviewer what I'm looking for in a candidate is someone who really has enthusiasm and passion and wants to work for me. It's less important how they look on paper, I want to know how much they want to work with us.
My team and I at Workshape.io are working hard to build a hiring platform for software engineers that addresses some of the various shortcomings in this space.
We have noticed over the past few weeks that interviewing techniques and hiring standards have been a hot topic on HN. We are interested in finding out from the community what your limits are when it comes to interviewing, as we feel there is little understanding of this - we all have our opinions on what is effective, but it would be incredibly useful to identify trends across different demographics in our industry.
We have created a quick, anonymous survey [1] to help us gather information so that we can produce a report for the benefit of everyone. It shouldn't take much longer than 10 minutes to complete. We'd be incredibly grateful for any contributions. (The survey has been designed with attention paid to a survey-design guide from Harvard [2])
"When I started programming, I heard a lot about how programmers are down to earth, not like those elitist folks who have uniforms involving suits and ties. You can even wear t-shirts to work!"
This alone dates the author to beginning in the eighties :D
Today, you basically must sport the "lumberjack" look to be cool :D :D :D
But seriously, yes, now as the industry matured, there is a dress code. It might vary from area to area, but it's there.
A friend of mine experienced something similar to this. She was advised to find a small upcoming startup and join them. "You need a win," was how one of her mentors phrased it. This did not necessarily mean a win for the company as much as it did mean a win for her and her career. I would strongly advise a similar approach to getting your leg in the door at TrendCo.
The abundance and heterogeneity of various technology stacks, frameworks, methodologies and whatnot fuels this hiring mess. Twenty-five years ago, nearly everyone who called himself a programmer knew C and an interviewer could judge the relative strength of two candidates by really examining their programming skills and not some trendy keywords.
This sounds like a falsehood, brought about by the belief that the past was somehow simpler, because the relics from the past that survive into the modern day are few in number.
I doubt that everyone knew C. Lots of big firms extensively used COBOL. Fortran was important in science/numerical computing. Pascal was popular. The list goes on.
Need I mention that for decades there were lots of competing assembly languages, and that only in recent years have we converged on three? That Harvard architecture and von Neumann architecture competed into the late 70s? That compiling a program on a machine with particular hardware could be seriously difficult? The conflict between big-endian and little-endian systems? Computing has always been full of heterogeneous, competing technology stacks, frameworks, methodologies, etc. Why? https://xkcd.com/927/
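(To make the endianness point concrete - here is the same 32-bit value serialized two incompatible ways, a mismatch every cross-platform format and protocol had to paper over. A quick Python illustration:)

    import struct

    n = 0x01020304
    print(struct.pack(">I", n).hex())  # big-endian bytes:    01020304
    print(struct.pack("<I", n).hex())  # little-endian bytes: 04030201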
And in the 90s in particular: do you really think that designing software for this so-called "internet" and newfangled HTML and JavaScript was free of trendy keywords?
Indeed there were lots of competing standards and technologies, but C was still the kind of "lingua franca" among most (professional?) programmers. Many people used to write really intricate programs in it without relying on anything but standard library functions. It was about raw talent and people used to take pride in the way a program was written and not just what its effect is.
And yes, it all started to change in the mid 90's with the advent of Windows and the web.
A very large number of software developers today don't actually write much new code. The job is more about wiring up libraries, frameworks and APIs rather than designing and implementing new things. Consequently you need more organisation and focus than "real" programming skills like math or logic.
I started programming professionally about 25 years ago, and it was exactly the same. Windows was rolling out into the office spaces, replacing old mainframe line-of-business apps.
MS Visual Basic was the new hot thing on the block. Delphi was the same thing but in Pascal. There were a bunch of database-with-front-ends like FoxPro and Access that you could build a windows program with pretty easily.
Actually, C occupied pretty much the same space as it does now: there's some people using it for serious stuff at serious companies, but it's not RAD so no-one else is interested.
I think it is fairly common these days to allow the candidate to use any language he or she finds convenient. But now there is another problem! Twenty-five years ago, everyone who called himself a programmer could do algorithmic stuff. And now there is a growing cohort of "programmers" who proclaim that they are above it, and instead take pride in their masterful ability to call APIs.
It's overly simplified, like "in 2010's nearly everyone who called himself a programmer knew JavaScript". From what I remember software stacks were far from simpler and standardised, between Mac toolbox, 68k assembler, x86 assembler, Object Pascal, DOS, Windows, MFC, VCL, OS/2, AppleTalk, IPX, NetBIOS, OpenDoc...
Sometimes they flat out just ask, though. When I interviewed at google, they asked me my college GPA. After telling them I left college before getting a degree, they just repeated the same question. Silliness.
I've given perhaps 50-100 interviews and I'm not sure how many I've been on the other end of. I still believe interviewing is hard.
One of the things I typically told my employees and coworkers to do was to avoid psycho-analyzing someone based on small wordings and behaviors in responses.
Really, hiring is imperfect - I did try to hire people who were high energy, friendly, and not likely trying to suck up or B.S. through an interview - but there's a lot of things you can't really tell.
Many interview questions - such as targeted selection - can find negatives to not hire someone, but are easy to bypass if someone knows how those questions work (always give answers that cast you in a favorable light, etc.) and is good at interviewing.
I agree that a lot of places do screen too much on cultural fit, but on the other hand, people want to work with people that will be good to work with - and they will have a LOT of time to spend with these people.
In days where we so closely identify with our work, I can see wanting people who could possibly become friends.
This becomes dangerous when it trends into ageism, or reduces the ability of very well qualified people to get a shot, and I've seen firsthand this result in some places hiring a lot of young folks that THOUGHT they knew a lot, but were really at a sophomoric level of experience.
Really good products come out of a diversity of experience and technical backgrounds. However, if someone is going to clash with the team, it's somewhat also better for them to not be working with that team - sometimes - but sometimes it may also work out great.
There's really not a good answer here, only tradeoffs.
People are squishy and complicated.
I tried to limit psycho-analysis and culture fit to "are these good, honest, and mostly friendly people".
I also believe smart people can learn almost anything, and often technical criteria or even small tech preferences that are not clear-cut can be used as code for really not hiring someone for culture fit.
Sometimes I feel a greater danger in interviewing is not allowing the interviewee a good enough picture of what is wrong about the company they are joining - often, interviews feel like sales pitches at the candidate, or focus primarily on the candidate's abilities and background, not hitting most of the things that will really say "will they be happy here".
I have seen this effect in action and it is why I don't have trouble hiring quality engineers in the Bay area. Of the last 15 or so people I've hired I only had one not work out and that was due to their extended family situation making actually working really really hard for them.
That said, it can be ridiculously hard for a lot of recruiters to figure this out. I have worked with good ones (who seem to be formerly technical), bad ones (who seem to be "people persons"), overworked ones (who seem to operate much like a random decimation of the input stream would work). Another comment here on HN identified jobs as a 'matching market' which is a thing I had not heard of before. I've started hunting down good references on how folks analyze such markets.
I tend to agree with most of this (university, GPA, subject, blah blah blah). Bits of paper that are signals of arbitrary things (family standing, ability to take on debt, country of birth).
Anti-Windows bias doesn't really feel like an example of 'pedigree' to me, though.
I can't imagine programming on Windows. I can't even imagine using the OS. I moved away from Windows ~a decade ago and haven't looked back - it seems to be getting worse every year (the business model for W10 appears to be surveillance?).
Anyone care to chime in? What's the appeal of Windows in this day and age? To me it really feels like 'big companies use it so their programmers do'.
- binary compatibility for GUI applications over decades (try getting an application that was linked against GTK+ 1.x to run under a current GNU/Linux system)
- I personally prefer the Visual Studio debugger by far over gdb (but YMMV)
- there is exactly one way to write GUI applications (WinAPI) - all other libraries, such as MFC and WTL, are just wrappers for it (see the sketch after this list). Try writing an X11 app against Xlib or xcb and you will find out why Windows is better on this point.
- the same holds for other GUI functionality such as clipboard and drag & drop (OK, admittedly it is a PITA under Windows, but far worse under GNU/Linux/X11)
- When developing GUI applications under Windows vs., say, GTK+ or Qt under Linux: if there is some functionality which is very low-level (say, support for multiple mice), under Windows there is Raw Input for this, and under X11 there is XInput2. While Raw Input feels quite well integrated into the style of the whole WinAPI, using XInput2 when developing with Qt or GTK+ feels really foreign, and I always fear that I will break something. In other words: when developing for Windows, the APIs often look much more homogeneous than under GNU/Linux, where I personally feel that there are different libraries with different styles (but you might feel differently)
- I find the MSDN documentation much more helpful than man pages, but this might be my personal preference, at least I feel that when under Windows something is documented it is nearly always documented really well, while the quality of documentation varies a lot under GNU/Linux and is (for things that change more often) often outdated
- Also, I consider it a bad idea that many Linux devs treat code as documentation, given that most "user-centered" GNU/Linux distributions will not, by default, install the source code for any installed package.
- I personally prefer the slightly "overengineered" style of the Windows API instead of having to hack something (sometimes ugly) together if the API will not provide what I need; but this is again a matter of preference.
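As a small illustration of the "exactly one way" point above: even from a dynamic language you can talk straight to WinAPI, with no toolkit in between. A sketch (Windows-only, using Python's ctypes; the strings are arbitrary):

    import ctypes

    # user32.dll is the same entry point that MFC, WTL and every
    # other Windows GUI toolkit ultimately wrap.
    user32 = ctypes.windll.user32  # ctypes.windll exists only on Windows
    user32.MessageBoxW(None, "Hello from raw WinAPI", "Demo", 0)

There is no wrapper-vs-raw-API impedance mismatch to argue about - they are the same thing underneath.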
that is a great presentation. the degree to which everything is asynchronous in NT is awesome. it's a shame techies don't engage with it or dig into it and just say that linux is better because it has a better shell.
Are the usual GNU/Linux shells still better than PowerShell (being better than cmd.exe is trivial ;-) )? I'm really interested in a serious evaluation from both Windows and UNIX nerds.
it also comes down to explorer vs anything else. I'm a huge fan of NT, but everything else is kind of mediocre. using someone else's windows computer recently I had the awesome experience of running an installer, being told I needed to install the ".NET 2.0 runtime", so I go to the control panel and select "remove programs" then select "add programs." why is add programs behind a button that says remove programs?!
the criticisms of the technology are skin deep but when you're using the system, you're interacting primarily with the skin, so...
Speed of setup, and a standard stack everyone knows.
You don't argue over which email server to use; you just set up Exchange.
You don't argue over nginx, Apache, etc.; you just use IIS.
It's also configured quite well out of the box, whereas on Linux you spend several days getting it all configured to perfection.
When you just want to focus on the business problem at hand, I find it brilliant. Whereas in other stacks you get caught up in technology for technology's sake.
> It's also configured quite well out of the box, whereas on Linux you spend several days getting it all configured to perfection.
Uhm, no. Most distros come with sane defaults; perfection is something you work hard to achieve on both Windows and Linux. Counter-example: try getting max performance out of 10Gbps Ethernet on Windows.
Not stuck on the CLI and text-terminal culture. The Windows developer community knows its way around bash, TechNet scripts and PowerShell, but we'd rather use nice comfy GUI tools.
C++ (moving away from C), .NET languages, COM and nowadays WinRT offer programming paradigms that follow Xerox PARC ideas and drive developer experience forward.
The kind of developer experiences being shown by Bret Victor.
I could say the same about Mac OS X developer experience.
For almost a decade I was a dedicated *BSD and GNU/Linux user as well, but they seem to be stuck being just yet another UNIX clone without any consistent plan for anything outside C and POSIX APIs.
> Not stuck on the CLI and text terminals culture.
See, those of us who use Unix & free software see that as a benefit, just as the literate find the written word to offer advantages over picture books.
With a keyboard, I am able to enter hundreds of discrete inputs: with a mouse or touchscreen I have one 2D input and a few buttons. With a keyboard I am able to have a conversation with my computer and my data; with a mouse I point and click, which isn't much different from pointing and grunting in human conversation.
See, some of us that were already into computers when Sun was a startup got bored of it.
Your answer is a typical example of UNIX CLI users needing to present themselves as more intelligent than GUI users, while forgetting that some of us have plenty of CLI experience and decided to grow out of it.
I can use macros and key-plus-mouse combos, FPS-style, across many GUI applications, and if I ever need a CLI, PowerShell or Terminal (Mac OS X) is just a click away.
Mouse use has a much higher risk of CTS and other work-related injuries. [1]
I've worked at startups and enterprise environments, and what has always remained constant at every Microsoft/enterprise company is the droves of obese, unhealthy, arthritic middle managers who can barely use a computer anymore because they got lazy and succumbed to using GUIs to get their basic work done.
Now, they are stuck on MS because any foreign GUI elicits a sudden "NOPE!" from them. They are constantly unhealthy because none of them have standing desks. They don't have standing desks because mouse use is extremely non-intuitive in such setups and in many cases dangerous [2].
By surrendering to the mouse and the GUI, my older co-workers are low-quality producers, exceedingly lazy, and otherwise slowly dying.
My console emulator has looked the same for the last ten years. I can SSH into any UNIX system and it's familiar to me. Innovation hasn't slowed me down at all.
> Yep that says it all. Hardly the example of what innovation means.
It appears you fundamentally misunderstood my point. Though innovation has occurred on my Linux system (and on my MS system) without fail for years, I haven't had to learn bizarre new mouse movements and locations of menu items. Rather than slow me down, I can begin taking ADVANTAGE of innovations right away, because they aren't implemented on CLI interfaces in a detestably arbitrary manner.
Also, I don't get your first point. If people got unhealthy and unemployable due to something OTHER than the mouse and GUI usage, then so what? That doesn't negate my point at all. It simply shows that yes, more than one cause exists for obesity and brain rot.
It's been shown through years and years of clinical study that the injuries received at the modern office have to do with the movements associated with mouse usage. Keyboard use is easy to adapt safely to the modern healthy workplace (standing desks, padded surfaces for wrists and feet, etc.) Mouse use, on the other hand, barely works correctly for the seated and unhealthy worker, and works even less properly for the worker trying to make healthy changes.
He said "not stuck on". I work the whole day in my dev env. I want something that hundreds of paid people at JetBrains or Microsoft created over several years. I use both a CLI and a dev env. Just as some things aren't easy to input with a mouse, some things are hard to visualize with a text UI. Which is easier: adding a CLI to a polished graphical application, or building a polished graphical experience on top of a CLI?
The appeal of Windows for me is inertia. I keep using it because I can keep using my preferred programs, namely: Winamp, MPC-HC, and my video game library. Perhaps once my desktop with Windows 7 installed dies I will put Linux on a new one as the idea of using 10 terrifies me.
I did try to use Linux a few times in the period between 2007 and 2010 but I never found a desktop environment I liked enough. The file browsers and music players always left much to be desired.
I can't say much about how the programming compares as I stick to the command line mostly but I do prefer the thought of using the Win32 API over whatever might, might be available on Linux. Do I choose GTK and its copious unique types (even for integers) or do I force myself to learn the abomination that is C++ to use Qt? What audio subsystem do I use from the dozen or so that is around?
Video game library is a fair point. I use W7 in a VM for that, though I find myself playing games less and less nowadays.
I guess it's that I don't do GUI programming. I mostly produce software that does things directly (rather than enabling others to do things).
I fundamentally have very little use for a GUI unless I were producing software for someone else to use (e.g. QuickBooks or something).
I think that's probably the barrier in both directions. GUI users don't usually see eye to eye with terminal users. It's vi-vs-emacs all over again. :P
.NET runtime and framework with Dev Studio are very very good. If you haven't tried for a long time, you owe it to yourself to take a look.
My stereotype: if you're in the corporate world and want developers who like users and spend a lot of time on complicated business problems, you're likely to see lots of "Windows developers".
If you want someone to write apps with lots of technical complexity and simple business logic, you're better off with "Unix developers".
Regarding the attraction of windows.... market-size and momentum.
Remember, a good softie is flexible... that includes getting experience on all OSes; don't limit yourself... Windows isn't all that bad (and ignore all the silly surveillance scare stories).
Good software should be platform-independent, good software engineers should also be platform-independent.
So far none of the responses have gone on a business tangent, yet usually business guys are the ones who decided on / enforced the MS-only doctrine. I assure you some MBA veep who's never touched a keyboard did not make the decision based on the accurate technical details in some replies.
The first one is hiring. Supposedly everyone knows Windows - great hiring pool. I dunno if I believe that anymore, but in the old days there was certainly no shortage of MCSEs to hire. It doesn't matter if the UI for the OS and office software changes completely every couple of years, because "everyone knows Windows", so training costs only exist for alternative OSes or alternative office software. When that Linux genius quits, he's irreplaceable, isn't he? That's a great position to be in if you're that guy (me), but business people hate that. To some extent this is a young-person-vs-old-person problem, because college campuses (campii?) are owned by Apple, not MS. MS is just there to run the dorm Xbox, that's all. This is strategic planning on the part of the decision makers.
The second is financial embrace and extend. Once you got one windows server or a couple desktops, inevitable licensing agreements make the marginal delta of "another windows box" extremely cheap (maybe zero) vs the measurable costs of mac etc. BSA / etc audits will make your life a living hell unless you participate in this scheme. This is bean counting on the part of the decision makers.
The third is interoperability. Unless you're the world's most boring company, you'll be doing something weird, and per the first topic above, everything weird has Windows clients or drivers or applications by default. There will be a Windows client for an EPROM programmer or an I2C driver. There might be alternatives, but it will work under Windows. Then see the topics above. You will need Windows boxes... you just will. This is pragmatism on the part of the decision makers.
The fourth is CYA. Like decades ago, "nobody ever got fired for buying IBM" today "nobody ever got fired for buying MS". You could get fired for buying a mac and it doesn't work, or installing linux on a server that crashes, but no quality of service, including zero, is low enough to get fired for buying microsoft. No matter how badly the system gets powned or how much stuff gets stolen, you can't get fired if you bought MS products. Just doesn't happen. If you can afford it, and you like being employed, the decision seems clear from a risk management perspective. Cowardice on the part of the decision makers.
The fifth is support. Business people think we get support by calling a script reader in India who tells us to wipe, reinstall, and reboot, and furthermore that this is worth money and worthy of driving procurement decisions. In the real world we know we use Google for support, and non-MS stuff googles better. But business people like the risk management of a nice security blanket of telephone support from a dude with a nice-looking certificate. It's not actually useful, of course, but business people don't understand that. Business people think computer support is like medical support - you call 911 and get fixed up - but 911 converted to computer support would be a pretty funny standup routine. Essentially this is plain old ignorance on the part of the decision makers.
> Sometimes people don’t agree (e.g., it took months to get the non-version-control-using-team to use version control)
There's an easy way to stop that sort of attitude. The next time that a bug occurs, you tell that team that they must restore the code back to what it was previously, and provide diffs that show what they did so that you can review their rollback.
When they say they can't, you tell them that's their problem as they haven't implemented version control because they believed they didn't need it, and they need to go do it anyway. After they go through this exercise, you then offer to help them to implement version control.
That's not actually going to achieve anything other than making an already stressed team more stressed with a side of hatred for you and your pointless request.
In this day and age I'm not sure how you do persuade a team to use version control other than mandating it and sucking up the hostility while they settle in, but taking the opportunity to make a smug point while a team is firefighting isn't the way.
Oh yes, truly pointless. Much like their insistence that they can get by without version control, really. The team has already pushed back on something that no professional software development team can get by without, and that even the most amateur of software projects use. I mean, you have to ask yourself why they are so stressed when you know they aren't using version control?!? Cry me a river!
And the thing that sucks is that a lot of the best places to get started as someone out of college are .NET or otherwise boring shops. Not everyone who is talented and intelligent has been programming since 7th grade or went to a brand-name school.
Unemployment after graduation for CS majors is among the highest because a lot of the time they aren't ready for the real world. They get started in a .NET shop and grow into better developers. I think we can all agree that if you consider yourself good at this, it's mainly because you had to or wanted to do it for real, and that's where they get the experience to be on their own.
I find that site a bit misguided. Any amount of CSS might still break, for example, w3m. It seems more sensible to have pure HTML sites with client-side styling within the browser.
I think cold hiring is dead: draining exchanges of info too often lead to nothing and leave both parties dissatisfied. Referrals are the easiest way to go.
The industry seems to be trending this way, but I think that is a very bad thing that should be fought at every stage. It is leading to ghettoization of talent and the perception of a talent shortage that probably doesn't exist.
If you have a resume you're not "trendy". You're "trendy" if you have a github account full of projects in Go and Node-y stuff, a twitter-stream with a few thousand followers, some medium rants and (maybe) a linked-in profile.
Unless you are specifically looking for social media skills, it can be a mistake to hire engineers based on social media presence. People who are good at self-promotion often play poorly in teams. Just sayin'...
my impression has been that node.js reached the top of its hip curve a few years ago, same with golang. not that they're disused, they're just not obsessive-compulsive topics. this is mostly based on what i (don't) see and hear at work and here.
again, based on what i see on hn, github is actually falling out of favor (we'll see if they can reverse the trend [sic] with the newly released features; the search removal inspires little confidence).
twitter has been around for too long to be trending upwards, it's like facebook. the populace split into users and abstinents has ossified. it's even more pronounced with linkedin which has been around much longer.
i have no idea what's supposed to be trending in programming instead though. recently it's felt to me like a plateau of sorts.
I'm 50 years old and my day job is teaching teams how to perform better. I still code - love coding. But several years back I decided I could do more good in the world by helping multiple teams at a time than simply coding.
My "real" job is, of course, startups. I save for a while, then work on various startup projects. I'm learning a lot.
So now that my latest startup project is coming to a close, I'm thinking of where to go next. Instead of hands-on teaching, I'm seriously considering getting back to coding. Sit down, make something happen, look at a job well done.
The problem here is this: all I know how to do is make solutions happen for people in a dozen or so different technologies. I haven't spent time trying to keep up with the cool kids. C++? Sure. STL? Not so much. All the basic web tech (HTML, CSS, JavaScript, etc.)? Sure. Angular/Meteor/etc.? Nope. F# and OCaml? Love me some functional programming. Haskell/Erlang/etc.? Nope. SQL Server, MySQL, some Oracle? Yep. NoSQL, Cassandra, Mongo? Not really. Once I knew enough to get the work done, well, I just got the work done.
There's a difference between coding to get things done and spending time with toolkits and frameworks because they're popular in the market. Back when I was a senior programmer, one of the most important things I did was try to guess what the cool kids would be doing in 3 years -- then study that and get some experience. This paid off over and over again. As somebody who just wants to make things happen, that's not a priority.
This leads me to my second observation, from sitting on the other side of the table looking for developers and working on staffing models. The dirty secret of the tech industry is that nobody has any idea how many developers they actually need. There's so much variability in teams, and you can break the same work down so many ways....there's a lot of risk. (Yes, there are several heuristics you can use. Insert long discussion here about how they're only heuristics.)
So at the end of the day, you have a lot of companies that want to be trend-setters chasing after what all the companies think are the cool kids -- the trend-setting hires. I remember sitting down with a SV CEO a while back talking about his 100-person shop. Everybody in the place had some cool-kid story attached to them. The place was a mess, but it was a trendy mess.
At times, I am not convinced that these companies would do much better with some kind of "sweep the street" approach to hiring, vacuuming up a huge number at a time and taking them through some structured filtering/training process. I've even thought through how such a model would work. Haven't found anybody willing to experiment with this yet, though :)
As arbitrary as the system is, I'm not sure it's any better or worse than any other system. And the simple fact of the matter is that buzzword-based, trendy hiring has been with us for decades. I doubt it's going anywhere soon.
> At times, I am not convinced that these companies would do much better with some kind of "sweep the street" approach to hiring, vacuuming up a huge number at a time and taking them through some structured filtering/training process. I've even thought through how such a model would work. Haven't found anybody willing to experiment with this yet, though :)
Didn't IBM and a few other big iron shops do this in the 50s/60s with secretaries? iirc, they could apply to be programmers and would get trained.
I don't like it but I do understand it from the employers perspective. Most hiring is done to avoid risk. Went to a top school, working at a popular company and already using the tech we use? Come in for an interview. Missing one of the three? We'll phone screen. Missing two out of the three? Not worth my time.
I understand the other two, but I'm curious why "working at a popular company" makes sense to you. What does that have to do with avoiding risk? Is it the assumption that those companies don't make hiring mistakes?
The OP's point is that "This only works if you pay the appropriate amount".
If you go with that process, but discard the "money" variable, you'll only interview the people who got rejected from every one of the well-paying companies.
My experience is that cultural fit is important, but attitude is more important. A can-do attitude will almost always fix the cultural issues unless the cultures are very distinct. Most cultures are just about being open, etc. But so few people are actually hungry and have can-do attitudes.
I just read this article and am in a similar situation to the one described. I tried to do a startup and it didn't work out. I'm in the job market now. In India it's a lot worse; some of my friends tell me that because I've been out of a job so long, it might be difficult to get a good package.
I think of this as "Hiring natural selection". If they don't hire you because "we don't like windows people", or "contractors are dumb", or <insert excuse>, then you really didn't want to be there either.
How many of you guys continue to interview when you sense that you're either being insulted or your interviewer acts like a jackass and thinks that he's God's gift to this earth? Has anyone politely walked out?
You can replace "trendy" with "good cultural fit" in many places. Cultural fit is arguably the most important factor in hiring someone. From my network, I see a hugely different culture at Microsoft and Google. Just saying.
The problem with "good cultural fit" is that you need to ask some really hard questions about what you mean by that phrase. In many companies it seems to boil down to "people I'm comfortable hanging out with."
Very strong engineering teams are not necessarily comfortable places, because they have members with different but complementary viewpoints. The cultural fit in this case is being able to debate vigorously but also to accept decisions and move on together. It's not that important to be friends outside of work.
If this is so important: why don't they specify the desired culture values beforehand in the job ad so that people who don't fit don't have to waste their time applying for the job?
What reason does the company have to think that young white men fit the job better? I can imagine that there are properties these kinds of people have that make them a better fit, but why not advertise those properties instead? That should probably not lead to a lawsuit.
I don't think so. For the reason why I disagree: lots of cultural values take a lot of time to internalize, but are easy to check/hard to fake. For example, if one property of "cultural fit" is "loves the UNIX way of thinking and writing programs", it is easy to check with a few questions whether he/she has internalized it or is faking it. Or if "cultural fit" implies "loves doing hackathons", the prospective employee will have spent lots of time at hackathons in the past. If there is nothing to be found on the internet about the hackathons he/she attended (say, on the hackathons' websites, pictures on social networks, source code on GitHub/Bitbucket, etc.), he/she is probably faking it. Etc.
No - it's not at all. Quite the opposite. People who fit into a specific work culture bind together from across many different nationalities and upbringings.
Great article with a lot of good points. Enjoyed the read.
Having said that, though, the first paragraph or so made me choke. In the "hypothetical" case of Mike, who after being laid off from Microsoft finds himself a little shocked that his "skills and experience" are not valued in the real world... well, it's really hard to have any sympathy for him.
Hiring on merit is one thing, but there also has to be some weight given to a candidate's integrity (or total lack thereof, as in this case). These Microsoft folk will never understand how much damage they have done, and how deep the resentment against them runs. It's not even worth trying to explain it to them.
For another example, let's consider the hypothetical case of "Gavin". Gavin is someone with brilliant skills in, say, early childhood learning, administration, media relations and marketing. He applies for a job as a senior admin at a childcare centre. The application is rejected!!! Gavin is appalled at this blatantly unfair treatment, so he decides to investigate further and marches into the office to demand an explanation.
Anyway, as it turns out, Gavin proudly made full disclosure on his CV about a number of recent highlights in his career, including filming and marketing child pornography, collecting healthy organs from child donors for resale in the Middle East, and also the collection and hiring out of particularly talented children to both the Catholic Church and the Freemasons.
"Gavin" sees these activities as mere expressions of his brilliance in the required fields. He is at a complete loss as to why all the staff and parents at the childcare centre are "discriminating" against him, and why some are even casting daggers with their eyes. He can't understand why. Perhaps they are just inferior to him.
I don't see this as being any different at all from the hypothetical "Mike" situation. Seriously - fuck Mike!
To anyone recently laid off from a job at MS who thinks they can jump bandwagons onto the next "trendy" thing in software, I think the most humane thing to do would be to tell them the rotten truth right up front: "Just fuck off, and get as far away from software dev as you can. Get a job plucking chickens, or collecting broken glass, or whatever pays the bills... but whatever you do, keep the fuck away from our computers. You are NOT wanted here, and we all have excellent reasons to hate your guts with a passion and intensity that you are too selfish and narrow-minded to ever comprehend."
... but they would still think this was unfair! Ridiculous.
Loved this article. Over a year ago, I decided to leave both my technology stack (in which I had 8 years experience), and the big corporate employment record behind. I had about 10 years of industry experience, and considered myself a generalist - capable in many technology stacks, and highly capable of learning and quickly becoming effective in a new one.
Also, I had a nice comfy savings account, so I decided not to work for 6 months while I focused on me-time and learning NewTechnologyStack. I fell in love with it, and haven't looked at OldTechnologyStack since. I wrote many demo apps, started making open source contribs, and engaged with NewTechStack communities online. Damn, that was fun! I joined local meetup groups and other communities to fill my sponge-like passion for NewTechStack to the max.
Let me also state that with OldTechStack, I never, ever had a problem finding a job - really, it usually takes me about a week to land a new job somewhere else in my bustling tech-hub city. My resume with OldTechStack was, well, very trendy. I could often tell in interviews that people were going to hire me based on my resume before they even spoke to me in person, and that the in-person interview was just to reassure themselves that I wasn't bullshitting them. Don't get me wrong, I'm a great programmer, but I know I was hired because my resume was nice 'n trendy (within the OldTechStack culture).
So, once I started looking for jobs in NewTechStack culture, the reality of the superficial hiring process really set in. Firstly, NewTechStack people have a cultural internalization that OldTechStack really sucks. Mind you, I am not referring to technical folk, but HR people, recruiters, entrepreneurial types, etc. Secondly, I had no idea that taking time off from work made me look really bad to most folks at the base level of recruitment. Because I was not currently employed, I was not making it past many screening interviews. Thirdly, there was this magic phrase, "x many years of PRODUCTION experience in NewTechnologyStack..." that was really screwing me over. Despite the fact that I am proficient in 4-5 major programming languages, and have experience in several more. Despite the fact that I have lots of demonstrable work, both public open source, and my own creations (living at live URLs). Despite that my career experience has given me challenges that are scaled many-powers-of-ten beyond what your rinky-dink logistics app's customers demand. Despite that I interview wonderfully, am willing to do programming challenges for free, and usually deliver above and beyond expectations... Despite basically being a wonderful candidate from your engineering team's perspective, I found myself being rejected for these and many other superficial reasons.
Right now I am elated that none of those companies hired me, as I found my way via freelance contracts; my life could not possibly be more balanced and I have never been happier in my career. Right now I also work on about 7 different NewTechnologyStack production websites with relatively complex e-commerce and marketplace challenges because, again, I handle generalism well, and I will pick up AnyTechnologyStack much better than your "senior dev" with a whopping "5 years production experience."
Guess what? Every single company that I interviewed with about 9 months ago is still hiring for the same damned position I applied for. Some of them have even reached out to me on Stack Overflow or other career sites, like zombies, unaware that they've already interviewed this candidate...
Why?
Because they're obsessed with an irrational, non-science-based, peacock-dance, superficial standard for hiring people - specifically with standards they are not at all knowledgeable about - that they somehow believe will lead them to "hiring the best" and becoming the next Google (with their rinky-dink logistics app and its 4K customers).
I am not suggesting some "science-based" checklist system take its place. I do think the peacock dance, and other completely inauthentic behavioral checks, need to be removed from evaluation. Don't get me started on all the dinosaurs still relying on "technical tests" that throw college-level abstract puzzle games at you, which have been scientifically proven to produce false positives as well as false negatives.
Pro tip: copy and paste documentation from YourTechnologyStack's official site to your blog, write one sentence saying "please leave a comment below!", and you're instantly more hireable (to recruiters at the screening level) because, well, you're a blogger, and you're clearly engaged in the community.
Companies are so obsessed with "hiring the best" that they basically buy a "hiring kit in a box" and hope it automates itself, when all they're doing is creating an absurd gauntlet that rejects lots of great candidates and leaves them without a single hire 6 months later. The "rockstar" programmer, mythical though she/he may be, is still the goose that lays the golden egg and turns your company into NextGoogle.
The level of absurdity in the software engineering hiring process is obscene. For people who spend most of their professional time dwelling in thought-realms of pure logic, getting hired is a maddeningly illogical process.
Yeah, this really affects old non-degreed codgers like myself. I find it difficult to get "a foot in the door", but luckily my performance always leads to comfortable retention once I do. As I get older, though, that's going to get much more difficult.
And both groups do each other a disservice when they cannot work together. There are a lot of new ideas that we greybeards can learn from the greenies, and there are a lot of tested foundations of which the newer generation is often unaware, especially those who were taught not to worry about resources.
Also, one of my old 40-something coworkers taught me how to do so many legit Unix things. He also told me what stocks to buy and gave hilarious relationship advice... Maybe I shouldn't have quit that job, haha.
I'm tired of software engineers getting shit on... the only reason software engineers can't make large amounts of money is that companies suppress wages.
All you need is someone business-savvy: partner up with them while you offer the technical know-how to realize the vision. For me, this path is much easier and the rewards are better. My odds are better in this environment because I'm naturally inclined toward danger, the unknown, and creatively approaching problems that can only be cracked by trial and error.
Whenever I read articles like this, it makes me glad I chose entrepreneurship over 9-to-5. With the rise of deep learning, as you've recently witnessed with AlphaGo, being a salary line in a business's operating costs doesn't have good prospects for the future.
It's time engineers began looking outside the well, though I fear my advice will largely be ignored and scoffed at. Entrepreneurship? Sales? Business? Surely these things don't matter in a world of commoditized open-source software?
Surely by one's late 20s, one has the wisdom to recognize the pitfalls of narrow-mindedness.
"Wall of text" is usually reserved for people who don't use paragraph breaks; such a complaint is unfair on a perfectly well-formatted page. If you want it narrower, maybe make your window smaller?
IMO, it's almost impossible to stay focused when the text is as poorly formatted as it was in this article. With just a couple of lines of CSS, the readability would increase tenfold.
I shouldn't have to make the window smaller in order to read the author's post. If people find the formatting too frustrating, that's the author's problem: the author wants to convey a message to me, not the other way around.
>...the TLDR seems to be not to hire based of trendy skills,
No, the essay is not about skills but about trendy pedigree. The examples of "pedigree" are things like having worked at a trendy company such as Google/Facebook/Apple instead of a stodgy one like Microsoft, or being a graduate of a top-10 university instead of a 2nd-tier one.
One possible way to outsmart this flawed thinking about pedigree is a "Moneyball" approach that ignores schools and past employers.
Well, I personally would be very cautious about inviting in someone who is familiar with .Net, out of concern that he would be a Trojan horse. He would be more familiar with proprietary software than with free software, and hence, when confronted under pressure with the choice to either bring in what he knows or learn something new, he would likely try to bring in something from Microsoft or another proprietary vendor.
No thank you. Free software is better for developers, better for my team, and better for our company. Yes, rarely it's missing a necessary capability, and sometimes it's missing a nice-to-have capability, but it's normally just as effective and, y'know, free.
I share his surprise about the concern over someone with too varied experience. In my own experience, that's generally a plus, since such folks often have surprising depth along with their breadth.
I also share his surprise about the concern over someone who's been contracting. I've run into contractors who basically just fill in the same template at every job, granted, but I've also run into contractors who are sharp and have — again — an amazing breadth and depth of experience. You just have to spend some time interviewing them to find out who is who.
But, again, I'd have huge concerns about bringing in someone whose experience has been with Microsoft or Oracle or IBM. I'd be very concerned that he was just a glue monkey assembling bits & bobs of proprietary software and not an actual developer.
That's an incredibly naive filter. I'd estimate the vast majority of developers don't care much whether their technology stack fits some arbitrary ideal. A good interviewer would focus on gleaning a person's understanding of the fundamentals and, more importantly, actual instances of problem-solving, creativity, and learning.
> I'd be very concerned that he was just a glue monkey assembling bits & bobs of proprietary software and not an actual developer.
You can remove the word proprietary and apply that claim to a significant number of developers, regardless of their chosen tech stack.
Any Windows developer will tell you about the time they spent four days tracking down a bug because, say, they thought that the memory size returned by LocalSize would be the same as the memory size they originally requested with LocalAlloc, or some similar bug they could have fixed in ten minutes if they could see the source code of the library.
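To make that LocalAlloc/LocalSize gotcha concrete, here's a minimal C sketch (my own illustration, not something from the comment above): the Windows allocator may round a request up to its internal granularity, so LocalSize reports the actual block size, which is only guaranteed to be at least what you asked for.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Request 10 bytes; the allocator is free to round this up
           to its internal granularity. */
        HLOCAL block = LocalAlloc(LMEM_FIXED, 10);
        if (block == NULL)
            return 1;

        /* LocalSize returns the actual size of the block: at least
           10 bytes here, but possibly more. Code that assumes it
           equals the requested size is the class of bug described
           in the comment above. */
        SIZE_T actual = LocalSize(block);
        printf("requested 10 bytes, LocalSize reports %zu\n",
               (size_t)actual);

        LocalFree(block);
        return 0;
    }

If your logic treats LocalSize as "the number of bytes I meant to use," it can read or write past your intended data, which is exactly the multi-day kind of bug when you can't inspect the allocator's source.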
And on the flip side, I've had potentially hours of source dives made unnecessary (because, spoiler alert, I've had the .NET Framework source code on hand for a couple of years now) by the MSDN documentation, which exists because Microsoft wants programmers to be productive, and which explained an edge case before I stepped on that rake. Compare that to plenty of other environments where the source dive was the only option because the documentation was an atrocity.
It's not nearly as cut-and-dried as you want it to be, and you will be better at your job when you internalize that environments you don't like have pluses of their own. I choose Unix for my own projects because it makes sense for the projects I'm working on (the stray game-development project aside), but I have declined to participate in the epistemic closure of those who think Windows and .NET have nothing to offer or to teach.
Would you feel the same way about someone who is currently doing .Net but has experience with other technologies in the past, and still learns and uses non-MS technology outside of their current job?
Take me, for example: I currently have a job doing .Net, but I also have several years of experience working in iOS, a year working with a Java stack, and a PHP background from way back in the day; I use Python all the time for personal scripts, and I'm currently learning React-Native to update my mobile experience and get a couple of new apps out there.
> Would you feel the same way about someone who is currently doing .Net but has experience with other technologies in the past, and still learns and uses non-MS technology outside of their current job?
No, certainly not. But I've reviewed resumes of folks who were clearly just cogs in the Microsoft machine (without any indication that they were doing things with Unix & free software in their free time). I really am concerned about bringing in someone like that.
I like your breadth of experience, particularly your usage of Python. Many of my concerns about .Net are applicable to iOS, and I have other concerns about Java, and still other ones about PHP, but anyone who uses Python is probably a good egg.
That's encouraging, thank you. I've had some frustration in interviews because interviewers have only really paid attention to my most recent tech stack and job, almost completely ignoring the rest of my resume, even though I've successfully worked across a wide variety of tech and fields (games, mobile, web, startups, enterprise).
Calling it a large open-source community is perhaps stretching the definition. When I look at the Java open-source universe, or the Ruby one, or even the Node one, I see a lot more activity and a lot more generally applicable, useful stuff coming out of any of those compared to .NET.
Which is not to say .NET isn't fine; I use it for many things because I enjoy the language and what it lets me effectively target (you're not targeting the PlayStation 4 with MRI or Node). But the community still, despite very strong and laudable efforts from Microsoft, strikes me as anemic.
Depends on what you count as the community, I guess. Open-sourcing a language that already has a very large community means that piece of open source starts with a very large community. Only a small fraction is active in developing open-source applications on it, but a large user community is also a great asset.
Enh. Fair point, but if the community is inactive, that kind of hurts at the same time: everybody waits for somebody else to fix a problem, and since nobody's contributing, why worry about issues or pull requests...