I got to the point that I am telling recruiters that "leet coding" on the whiteboard will double my financial demands.
I've done my fair share of problem solving on whiteboards, and answered very specific questions about libraries and frameworks. But I've also accumulated some experience. And if my experience isn't enough, if you can't tell from a discussion with me whether I'm faking it or not, then with all due respect I will tell you that I have better things to do with my time.
Your experience is not enough. I've seen candidates with a resume full of experience who are very good at talking fall flat on their face on incredibly simple coding problems. I can't tell from a discussion whether you faked it. Lying is not as difficult as you think it is.
Bit of a false dichotomy, don't you think? Surely an 'incredibly simple coding problem' is different from whiteboarding leetcode. Perhaps your questions aren't enough?
Not necessarily. Leetcode problems go from “just iterate over the collection collecting some data as you go” to “Knuth solved this once and thankfully he wrote it down in a book for us mortals”.
Maybe part of the problem with these threads is that posters don’t clarify exactly what they’re complaining about.
I think you can assess a person's knowledge better by asking meaningful questions than by asking them to write a binary search or sort integers in place. Unless you want to hire juniors.
For typical dev jobs, hardly anyone writes sorting/searching algorithms, or algorithms at all. In fact, if someone working on web/apps/database stuff is spending time on algorithms, they are doing it wrong. Algorithms are a sort of pleasant luxury for most devs.
Depending on details, it's similar to asking an author to recite, verbatim, Shakespeare verses and calling gotcha on every mistake.
There are situations where it is legitimate, but for app/web/db devs it's just a rigged trivia competition.
Then what should the interviewer ask when it comes to coding question?
Take home small project? "Half" hates that.
Leetcode? The other "half" of population can't stand that.
Fizzbuzz? Too easy.
No code? How can they tell you can code, naturally?
Framework-specific questions? Write something in Flask? What if you're looking for someone who's good at development but hasn't used your specific tech stack?
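For calibration, the FizzBuzz filter mentioned above really is just a few lines; a minimal Python sketch:

```python
def fizzbuzz(n: int) -> list[str]:
    """Return the FizzBuzz output for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

The point of the filter isn't difficulty; it's that a surprising number of candidates can't produce even this much working code.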
As the interviewer, I approach candidates with an open-ended friendly gesture. I tell them flat out I want to assess their coding ability, but on grounds they are comfortable and familiar with.
Before the interview, I tell them to bring whatever materials they want to the interview, whether it is virtual or not. Books, notes, videos, whatever, I don’t care. The materials are to support them showing off to me, in their favorite language, with their favorite ecosystem (editor, IDE, compiler, version control, etc.), solving a typical task they address that takes up about half a page (or more) of code on a standard US letter or international A4 sheet in 12-point Courier New, 1.5 inch or 3.8 cm borders, no lines that are entirely a comment (when counting just code).
The hyper-specific layout requirements emerged from all sorts of crazy responses I got in the early years after I started this approach.
Could be a standalone program, a function, a code fragment, a library, really anything. I’m pretty comfortable in a very wide range of coding environments, and always happy to learn new ones, so I’ve been comfortable navigating the responses I get.
I even tell people I don’t care whether it is their own code or something they literally copy and paste off the Internet. But they need to be ready to talk about it, and take it in different directions. Just tell me ahead of time the toolchain, materials and code they chose.
I tell them I’m going to ask them to teach me about the code, and ask questions to clarify for myself along the way as if I’m going to take it over and maintain it myself, and customize the code and they will help me troubleshoot the customizations.
Through this approach I’m looking for the ability to communicate about code. Without springing surprises upon them. Being able to code is table stakes and this approach quickly flushes out who cannot code at a specific level required for the position. But I work exclusively in settings where there are teams of coders, and even “lone wolf” coders must be effective at conveying what they’ve built to others in case they win the Powerball/Lotto.
I can work with a variety of levels within a certain band of coding ability. I can work with someone who isn’t familiar with my clients’ specific toolchains. I can work with poor documentation, commenting, and writing skills. What I’ve been unable to solve for so far though, is someone not only utterly incapable of communicating what they’ve coded, but completely indifferent to improving.
You’ve missed some other formats that are already used in production. Domain-specific pair programming exists for both web and mobile engineering interviews. There’s also “read this code and debug it” which is an underrated format.
I’m not even sure the Leetcode style needs to be abolished, so much as it can be tweaked to remove some of the additional pressures of the format (time limits, stress from monitoring, overuse of memorization)
> What if you're looking for someone who's good at development but have not used a specific tech stack?
Then you have an interview specific to that use case of candidate. It doesn’t invalidate the format used in interviewing other types of candidates.
There will always be, I think, people who dislike whatever way you choose to do software job interviews, simply because they're not good at that particular format.
> Take home small project
Hmm yes it's unfair that some people don't have the time. What if everyone got to start at the same time, say, 12:00 UTC the 1st Saturday each month, and everyone got exactly 2 hours. Maybe that could work also for people with kids -- after all, 2 hours isn't much more than travelling to and back from an on site interview (in the same city).
Time and days could vary, trying to make it work for everyone
"Half" the population on HN dislikes take-homes, with arguments ranging from "it wastes my time" to "pay us if you want us to do your job" (I highly doubt 98% of take-homes ask candidates to solve a real business issue the employer is facing, since a take-home has a deadline longer than an in-person interview, partly to take your time into consideration).
I've seen this counter-point come up so many times. It's true, sometimes people lie and are good at talking. Amongst programmers, though, that is simply not frequent enough to justify subjecting everyone to obnoxious whiteboard and leetcode interview formats.
And in the event of the [rare] mistake in hiring someone who does this, you will know quite quickly after they start and you will fire them. Problem solved.
Look, I only have anecdotes to offer, but based on my experience, you are simply wrong.
It might be a different place, a different market from yours, etc. But at least here (and I've verified this anecdote with numerous people), many people who have very high experience levels simply can't code, at all. Or at least, can't do it given the coding task we give them (2 hours with a computer and full internet access, to do a few tasks in Python).
I'm not saying that people are lying on their resumes - just that supposed experience in an impressive sounding company doesn't necessarily translate to real world coding ability.
That may be, and it could just be coincidence or location or whatever. Don't get me wrong, though: I have definitely interviewed highly (well, supposedly) experienced software engineers who, when faced with actually writing and reading code, could barely manage. So I acknowledge that it does happen; just, in my experience, not overly often.
In the times that I have seen it happen, though, I almost always wonder how the heck it's possible. Either they really did lie about their experience (possible, but let's assume not), or they worked at places that didn't interview well and somehow just skated by for years on end? Or maybe there is another explanation for why this happens.
This could perhaps be construed as arrogant, as though you're able to pick and choose whom you interview with, but now that I've been through such an annoying number of full interview chains until I was inevitably rejected, I think the opposite is true. The better place to spend your time, unless you're deliberately grinding data structures and algorithms problems, is on interviews at companies that aren't just hiring arbitrary programming/salary/status ladder chasers (not that there's necessarily anything wrong with being one of the first two, or maybe the third).
So many companies try and do the FAANG thing now, that if they aren't going to compensate you accordingly for the same work, when really what they should be looking for is experience and general competency, then you might as well not interview with them. It's truly an unbelievable amount of time and energy involved in some of these, and there's so much variability in how you're evaluated, that it's probably not worth it.
A graph of "Interview Effort vs Total Compensation" should become an industry meme, so that eventually HR are required to show on job ads where the advertised position lies on the graph.
How do they react to being told that? I don’t imagine many companies are prepared to rework their interviews on the fly like that, so my naive guess would be that they continue with the interview and proceed to offer the original salary they planned to
A few years ago I went to an interview at EA for an infrastructure engineering job, and the technical person/director conducting the interview asked me some FizzBuzz questions. I provoked him by saying: "let's take 30 minutes, you and me, and see who can write more FizzBuzz variants, and more performant ones." He replied that he didn't have time for bullshit. I asked him why he assumed that I had time for bullshit. And that was the end of the interview.
The previous answer he disliked was to the question of what I would do if something happened and he wasn't available for me to ask him to make a decision. He expected that I'd fire off an email to him. But instead my answer was that I would assume the responsibility and make the decision myself.
After the interview I was glad I did not work for them.
My reading of your story is that you decided during an interview that it wasn’t a good fit (totally fair!) and then responded to that situation by rudely provoking your interviewer. If you think that’s an appropriate response in a work setting I would suggest you work on your interpersonal skills.
Your coworkers have feelings and you want them to have positive feelings towards you since that makes it easier to work together. Deliberate rudeness kinda breaks that.
I am not rude to people. I want people to like me. I do respect people. One thing I don't tolerate, though, is being treated poorly.
By that point I was looking to terminate the interview. Looking back, I should have just stood up and thanked them for their time. That person rubbed me the wrong way and I couldn't stand the belittling.
Nah, some of these manager types are a little too full of themselves and could use a reality check. He did them a favor. If enough people do this, maybe late one night this manager may start to wonder if there may be a point. If we all smile and nod things will never change.
It's been a while since I interviewed but that second one has always come up for me, and I find it exceedingly horrible.
Why can't the business just say "the policy is that you email a manager/call another manager/assume responsibility" and that's the end of it? Getting quizzed on the "correct" answer to their business policies is absurd.
It's perfectly reasonable for interviewers to ask FizzBuzz type questions. Many candidates can't answer them, so it's a good quick filter.
I'm not surprised he didn't have time for your bullshit.
I agree the other question is stupid. Clearly it depends on what the something is. He shouldn't expect an email for every decision, so either he's a crazy micromanager or there's no valid answer.
I've just told numerous companies that I won't do it, and they don't care, because they're just trying to fill seats and the sellers market isn't what some people make it out to be.
I am searching for work, and these are largely what recruiters ask of me. It usually goes: 0.5) call with a recruiter, 1) coding test, 2) call with 1-3 other people, 3) call with the CTO or similar, 4) usually some other call.
Well, that gives me all the more reason to try and continue to avoid these tasks. It is genuinely distressing though how many mediocre companies are adopting these horrific interview chains. More often than not starting with some multi-hour coding task before speaking with anyone worth speaking to. It's pretty bad.
I interviewed with someone hiring for a web/app dev position and they had me make a 2D platformer game as the coding task... I still have no idea why. It was some education startup trying to find someone to write socket code. They were really nice, but by the end of it I wasn't super stoked on coding tasks.
>How do they react to being told that?
They tell me that this is how things work for their company. Which means I do not have to waste my time with them.
I don't like the expectation that all people should approach problem solving in a particular way. Everybody is different.
Personally, I like to continuously implement and clarify in small tight loops.
It might look something like: implement a rough structure with lots of empty functions and very basic control flow. Ask questions. Restructure. Ask more questions. Restructure again and add implementation detail. Etc etc.
This way of working is tied with my thinking. It reveals the problem to me over time instead of all at once. I get a deeper understanding this way.
There is an assumption by some people that my process is wrong. They would argue that I should instead break the problem down on paper, or in my head, or using test driven development, or with design documents etc etc. But this is the process that I've found works best for me.
I can force myself into a particular process to please interviewers (and I have). But solving the problem becomes much harder.
Even if your typical way of solving problems is via intuition and internal processing, materializing this stage on the whiteboard probably improves your odds in a coding interview. It's not a statement about how things should be, but how things are. Even if this isn't your preferred mode of problem solving, I'm confident that any qualified candidate is capable of doing what was demonstrated in this post, and if they really want the job, they probably should.
> Even if this isn't your preferred mode of problem solving, I'm confident that any qualified candidate is capable of doing what was demonstrated in this post, and if they really want the job, they probably should.
I get that Big Tech jobs are glamorized the way high-paying jobs in Big Law, High Finance, etc. have historically been glamorized, but unlike professions like law and finance, there are lots of opportunities for people with even decent development skills to make really good money today even if they don't work for Big Tech.
As an example, my friend's son is making well over $120,000 annually as a freelance PHP developer with a few years of experience. He's in his late 20s, doesn't have a comp sci degree and lives in a lower cost of living city where he's probably at least in the top 5% of earners.
On the flip side, I know people who are hiring for developers and they're more and more flexible, especially in regards to location and remote work, because the demand for decent developers greatly exceeds the supply.
Why someone should feel compelled to pretend that they problem solve differently to ace a whiteboard in this market is beyond me.
I think the late-20s self-taught PHP freelancer is far less common than big tech engineers. And $120k is still only a small fraction of what such engineers typically make at the same age. And FAANG jobs are far more secure against a recession than freelancing. I don't think it's that complicated why someone would want a FAANG job even in this market. There are other routes to moderate wealth, but few as easy and reliable.
This is what I like to call a "Valley Bubble" comment.
There are lots of people working in tech-related jobs at/for companies not named Facebook, Amazon, Apple or Google who earn well above average incomes.
How to make this look like underachievement? Claim that these people are fewer in number than they actually are, and suggest that there are hordes of 20-something FAANG engineers making $500,000/year+ such that a freelancer pulling in $120,000 is somehow a loser compared to his peers.
> There are other routes to moderate wealth but few as easy and reliable.
If it was as easy as showing up with a comp sci degree and a heartbeat, why is prepping for whiteboarding interviews even a thing?
You're reading a lot into my comment that's not there. I don't have any disdain for the freelancer in question, and I think it's quite an achievement to be earning $120k without training in the given field. But you said
> "Why someone should feel compelled to pretend that they problem solve differently to ace a whiteboard in this market is beyond me."
and I'm making observations that I feel adequately explain why someone might want to pass a whiteboard interview at a big tech company. It's completely valid to not want to do that as well, but you said that you couldn't understand why someone would want to, and I've explained why someone might.
> If it was as easy as showing up with a comp sci degree and a heartbeat, why is prepping for whiteboarding interviews even a thing?
I didn't say it was easy. I just said there are not other routes I'm aware of that are easier.
> I don't have any disdain for the freelancer in question, and I think it's quite an achievement to be earning $120k without training in the given field.
I was only pointing out that there are a lot more people who earn really good incomes in tech-related fields who don't have formal STEM educations. This isn't as uncommon as your comments seem to suggest.
Frankly, a lot of web and mobile app development doesn't require a computer science degree, nor is a formal computer science education the only form of "training" there is. Heck, assuming computer science curriculums are the same as when I was college-aged, I wouldn't even consider them "training" for a typical development job today.
> I didn't say it was easy. I just said there are not other routes I'm aware of that are easier.
I've dated both a lawyer (who worked at a Big Law firm) and an investment banker (who worked at a bulge bracket firm) in the earliest stages of their careers. Frankly, I don't think making a really good living in these professions is any more difficult than "FAANG worker". Different people have different aptitudes and when someone aligns his or her aptitudes to his or her career choices, everything sort of looks "easy".
> Frankly, I don't think making a really good living in these professions is any more difficult than "FAANG worker."
That doesn't match the reports of work life balance I have heard from practitioners of those professions vs. practitioners of programming. But I don't have any reliable, objective evidence to share on this point of view.
I'd say far more common, actually. The vast majority of the world's engineers do not work for big tech. Also, most of the engineers I've worked with never got a comp sci education, or even any related education. They all learned on their own or via courses/bootcamps. And what is moderate wealth anyway? Most engineers I know with 5+ years of experience are easily in the top 3% of earners in their countries. I'd say that's pretty decent moderate wealth.
I had a look at the linked spreadsheet¹ containing all the framework questions.
It's actually great advice for solving any programming problem — not only in case of a whiteboard interview.
Not that you need to write everything down explicitly for every problem. But in any case you should at least think through all of the things mentioned.
The other thing I've noticed is that a strong type system will, from the get-go, force you to describe exactly, down to the details, at least 80% of that stuff. Refinement types, for example, let you further constrain types with a lot of inhabitants. But even the usual things like optional types or, for contrast, non-empty types can help greatly. With some formal verification capabilities on top you could likely codify more than 90% of the things mentioned in that spreadsheet. Add some meaningful comments covering the remainder of the answers and you'll get almost perfect code, in my opinion.
I would bet the overall bug count in software would go down by more than 90%, if only all code actually looked like that in reality…
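To make the non-empty-type idea above concrete, here is a runtime-checked Python sketch. Python can't enforce this statically, so this only approximates what a real type system provides; the `NonEmpty` class and `maximum` function are illustrative, not from any library:

```python
from typing import Generic, TypeVar

T = TypeVar("T")

class NonEmpty(Generic[T]):
    """A sequence wrapper that guarantees at least one element.

    The constructor's signature (first, *rest) makes an empty
    instance unrepresentable, so callers of functions taking
    NonEmpty[T] never need a "what if it's empty?" branch.
    """
    def __init__(self, first: T, *rest: T):
        self._items = [first, *rest]

    def head(self) -> T:
        return self._items[0]  # safe: cannot be empty

    def to_list(self) -> list[T]:
        return list(self._items)

def maximum(xs: NonEmpty[int]) -> int:
    # No empty-input edge case to document, check, or test.
    return max(xs.to_list())

print(maximum(NonEmpty(3, 1, 4)))
```

The design choice is the point: the "can this be empty?" question from the interview checklist is answered once, in the type, instead of at every call site.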
It's important to understand the question and communicate that understanding before you start coding. For relatively junior positions (L3-L4 at Google), I would expect that a candidate would do roughly what was described in the linked post.
However, it's really important not to behave in a way that is overly formulaic or comes across as being heavily "coached". If your interviewer is engaged and competent, coaching like this will entice the interviewer to mix things up and ask questions that force you to break out of the coached pattern. If the interviewer isn't as good, it's more likely they will just give you a lukewarm score and say that you were coached.
For example, if a candidate said something like "can strings be multicharacter?", I'd probably ask why that would matter. In the example question it doesn't matter at all, because we're just looking for equality of members (the inputs could be arrays of any type and the question would be the same), so it would be a weird thing to seek clarification on. Maybe a better example would be if the candidate asked whether there's an upper bound on the size of the input array. In real programming, that's not the sort of thing that usually matters, so I'd expect the candidate to explain why it matters in this case (maybe they have an O(n^3) algorithm in mind that would be untenable if the input can be large).
I've done a few hundred interviews, over 100 at Google.
An attempt at a framework for solving whiteboarding/Leetcode problems is helpful, and it should invite us to ask what computer science pedagogy is missing. Any CS program worth its salt includes data structures and algorithms courses; they are simply fundamental elements of the discipline.
And yet, why are these interviews so difficult? I don't think it's simply an issue of many people having gone through these programs and failing to absorb or retain the material. Neither do I think there is a severe mismatch between the material covered and the interviews. Yes, some interview questions are awfully close to logic puzzles (having to know the two-pointer trick for detecting cycles, what CS program covers that industry-specific technique?), but others are fundamental applications of trees, graphs, dynamic programming, etc.
Could it just be that academic CS doesn't approach solving problems in the same way as these interviews do? Are there meta-problem solving techniques that these courses simply don't cover? Heuristics that must be applied when one approaches a general problem before the topic is narrowed down? "Ah this involves a tree, ah this requires sorting, ah we should use a hash table?"
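The two-pointer trick mentioned above is usually posed as cycle detection in a linked list (Floyd's tortoise and hare). A minimal Python sketch, assuming a simple hand-rolled Node class:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head) -> bool:
    """Floyd's tortoise and hare: slow advances 1 step, fast 2.

    If there is a cycle, fast eventually laps slow inside it;
    otherwise fast falls off the end. O(n) time, O(1) space.
    """
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

It is a fair question whether recognizing this trick measures problem-solving ability or just prior exposure to it.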
I'm absolutely baffled by some of the comments here. This is more akin to learning a language and learning its grammar structures. Once you have mastered the language you simply don't care about the linguistic structures. When you learn English you learn everything about verb placement, sentence composition and a whole lot of other things. Now, if you ask me about the meaning of grammatical structures, I only know some of them because I'm learning Chinese and Japanese; they're completely irrelevant to me when I'm thinking in Anglo-Saxon or Romance languages. Worse yet, most of those structures I wouldn't even be able to name or identify nowadays.
There is a reason why recent graduates perform better at these interviews than seasoned engineers. Even in computational math, where you have all the basics of calculus 1-3 and a bunch of other things where you have to prove the theorems in exams, it doesn't mean you recall all of it forever. It's like saying that if you really know how to use a fork, you should be able to make one yourself.
Totally agreed, and I don't even understand why people can't see your point. It's obvious enough. Grinding leetcode for interviews is like memorizing all the English vocabulary needed for the SAT/GRE. Taking CS courses at university is like learning how to write an essay properly.
Asking engineers to do leetcode in interviews is like asking journalists to retake the SAT. Obviously they will need to study for a few months for such things.
You don't have a mental visualization of code execution? Quicksort, merge sort, binary search, hashmaps etc are just one image each, as easy to remember as the face of a friend. I don't see how anyone would forget that, there is no way I'll accidentally rearrange things to think the nose is above the eyes etc, and that is how stupid it looks when people make mistakes in coding interviews.
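As a concrete example of the "one image" claim: binary search is a single picture (a shrinking half-open interval), yet it's famously easy to botch under interview pressure. A minimal Python sketch:

```python
def binary_search(a: list[int], target: int) -> int:
    """Return an index of target in sorted list a, or -1.

    The half-open invariant [lo, hi) is the "image" worth
    remembering; the classic interview bugs are mixing up
    lo <= hi vs lo < hi, or mismatched mid +/- 1 updates.
    """
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1   # target is strictly right of mid
        else:
            hi = mid       # target is left of mid (hi excluded)
    return -1
```

Whether a candidate holds this as an image or re-derives the invariant each time, the failure modes are the same few off-by-one mistakes.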
Describing or visualizing the algorithm is one thing.
Identifying that you need to use some variation of that algorithm and then writing flawless and optimal code that handles all corner cases in a 20 minute job interview, is an entirely different thing.
But being able to code up a bug free solution to a well defined problem seems very relevant to me when you apply to a software engineering job. If you can't do that then what can you do?
I think that part is there to test soft skills; being able to work under stress and talk to a person is part of being a good teammate. Straining both at the same time is a feature, since it is much easier to fake soft skills when you aren't distracted by a technical problem.
It is much more stressful than a real work situation, true, but people work much harder to appear nice and helpful in an interview setting as well so having a harsher test than real world situations to test soft skills seems appropriate.
There was a study conducted within the past year or so (it had a long discussion thread here, naturally) which found that when allowed to solve a whiteboard puzzle in a room alone, candidates performed far better. Now obviously communication soft skills must be tested. But perhaps the format can be tweaked so the candidate gets some time to take a crack at the problem on their own prior to being asked to explain it.
The "niceness" of the interviewers is irrelevant. The power differential at stake sparks a survival instinct that leads to stress. (e.g. they literally hold power over your future meal prospects.) Though perhaps in others, it invigorates them with a sense of purpose, cool, and collectedness. Perhaps that is truly the 10x engineer.
I have. I just overdid the practice the first time and got good enough to place well in international competitions, so to me all these interview problems are really easy; I've never struggled with an interview problem since then (of course, since I always pass there isn't much need to do many interviews, so I have less data than some others). I know most people won't do that, but you don't need to be nearly as good to pass interviews, so it should be possible, with modest effort, to get good enough to never need to practice again.
My Google interviews were basically me coding up a solution in 10 minutes, explaining what I did, and proving it works with the runtime etc. Then we spent the rest of the time talking about engineering problems, testing, what I did at previous jobs, etc. I am rustier now, many years later, but it's still good enough that I don't fail; it might take 20 minutes instead of 10, but there is still room to spare. So the bar is being maybe half as fast as I am when I'm rusty, which doesn't seem overly high to me.
So if you're good enough to do well in international competitions, why are you even participating in this discussion? You're clearly an outlier who finds this easy, so you're not in a place to understand the challenges for most people.
Not every great engineer can be good at this stuff to do well in international level competitions, by definition that's a very small group.
I assume you're not on the same level as William Lin or tourist, so it would be like them wondering why you struggle on a particular problem when they can do every one easily.
But it takes like half a year to get that good, if you just practice a bit in college you get there. I'm not sure why people complain so much. People just need to stop memorizing and start practicing how to understand problems instead.
> I assume you're not on the same level as William Lin or tourist, so it would be like them wondering why you struggle on a particular problem when they can do every one easily.
They have spent more than 10x as much time as me on that, though. I got to my level in about the 6 months I spent pivoting from math to programming; that isn't unreasonable effort for anyone, and most computer science grads have spent more time learning algorithms than I had.
Your leetcode may be A grade but your empathy, self-reflection, and frankly critical thinking skills need a ton of work.
If you had realized that
* in 90+% of job openings, whiteboard leetcode interviews optimize for the wrong thing, and neither the candidate nor the interviewer should be honored for its inclusion in the process
* live coding exercises are just as much an exercise in psychology - your willingness to submit to unreasonable demands and their willingness to subject you to them
* negative discrimination (i.e. weed out requirements) create biases in your hiring process and ultimately skill gaps in your personnel base
you would potentially be self-aware enough not to post this cavalier and self-aggrandizing comment.
As the person who replied to you says, you have a lot to learn about being a good, empathetic and kind human. I suggest for your own life you take some time to work on that if your technical skills are already good.
No I don't. AT ALL. It sounds to me like you've never encountered actually challenging engineering problems. If you have had anything to do with 3GPP, you'd know the algorithms used there are a lot more complex than these silly compsci interview algorithms. You don't really spend that much time thinking about things that should be considered basic vocabulary.
I also had countless experiences where I ended up either rewriting from scratch or massively refactoring code that was written by computer scientists that dumped all their algorithm ideas into code without understanding a thing about performance, the underlying infrastructure or how to trace it.
Your sorting algorithm will mean jackshit when you spend all your time doing I/O, reading files in and out of memory, for example. Your super cool distributed algorithm means nothing if you don't understand CPU P-states (I guess that's irrelevant nowadays) or NUMA and didn't set up the machine properly.
I for one have opted out of this BS and will actively avoid mediocre programmers trying to boost their ego with cookiecutter algorithm questions.
> It sounds to me like you've never encountered actual challenging engineering problems
I worked on low level machine learning infrastructure and distributed algorithms at Google. I know pretty well what challenging engineering problems looks like.
> No I don't. AT ALL.
You don't think your inability to visualize computation affects your ability to come up with solutions to problems at all?
Edit:
> Your sorting algorithm will mean jackshit when you spend all your time doing I/O reading files in and out of memory for example. Your super cool distributed algorithm means nothing if you don't understand CPU P-States(I guess that's irrelevant nowadays) or NUMA and didn't setup the machine properly.
I understand those things well; they aren't hard to learn. We run benchmarks using proper prod setups and see what works faster, and we know about CPU caches, memory overhead, file reading speed, etc. The only difference is that I am fluent in algorithms and you aren't. I'm not sure why you'd think that I wouldn't know those things just because I know algorithms; it takes like a few months to master algorithms, so there is still plenty of time to learn the other things.
I think it's easier to recall the "idea" of each algorithm, than visualizing the flow of the code. There's normally also some key implementation detail that's helpful to remember.
Quicksort - pivot - i <= hi
mergesort - merge - auxiliary array
When I started practicing DSA, I would code these algorithms from scratch every week to try and memorize them. Then, as I did more general leetcoding, I realized that these are just solutions to problems, and there's no need to memorize the code exactly; just knowing the key idea is enough.
People dump on leetcode because they think it's memorizing solutions to problems, but it's not practical to do that. It's more like memorizing one sentence per problem than memorizing a page of code. And when you see a new problem, you just adapt one of the "sentences" you memorized for a similar problem.
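To make the "one sentence plus one key detail" idea concrete: the quicksort sentence is "partition around a pivot and recurse on both sides", and the detail worth remembering is the `i <= j` boundary check in the partition loop. A minimal sketch (a Hoare-style in-place partition; variable names are mine, not from any comment above):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort: partition around a pivot, recurse on both sides."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:                      # the key boundary condition
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]    # move misplaced pair across the pivot
            i, j = i + 1, j - 1
    quicksort(a, lo, j)
    quicksort(a, i, hi)
    return a
```

Everything except the pivot choice and the boundary check falls out of the one-sentence description.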
But if you've never looked at an efficient algorithm for dependency resolution, it's going to be impossible to come up with a good solution for a related problem in an interview.
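For the curious: dependency resolution is essentially topological sorting. A sketch of Kahn's algorithm, one common approach (the input shape and names below are illustrative):

```python
from collections import deque

def resolve_order(deps):
    """Kahn's topological sort: deps maps each item to the items it
    depends on; returns a valid build order, or raises on a cycle."""
    items = set(deps) | {d for ds in deps.values() for d in ds}
    indegree = {item: 0 for item in items}       # unmet dependencies per item
    dependents = {item: [] for item in items}
    for item, ds in deps.items():
        for d in ds:
            indegree[item] += 1
            dependents[d].append(item)
    ready = deque(sorted(i for i in items if indegree[i] == 0))
    order = []
    while ready:
        item = ready.popleft()
        order.append(item)
        for dep in dependents[item]:             # this item is now satisfied
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)
    if len(order) != len(items):
        raise ValueError("dependency cycle detected")
    return order
```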
> I realized that these are just solutions to problems, and there's no need to memorize the code exactly, just knowing the key idea is enough.
Right, my pictures are what the code is supposed to do, not the code that is executing it. Then I can just take pieces of it and compose it with other things, you need some kind of intuition to do that, for me that intuition takes the form of pictures.
But yeah, the trick to solve leetcode properly is to not solve leetcode, but to learn to get better than leetcode, that way leetcode problems will feel trivial for the rest of your life.
It's one thing to visualize, another to recall at will. Your friend is someone whom you've spent sufficient time with on at least a semi-regular basis to form emotional and mental bonds with, deep memories. How many times does a CS undergraduate re-implement sort algorithms? Not that most interviews actually ask one to regurgitate that from one's memory.
> How many times does a CS undergraduate re-implement sort algorithms
You don't have to implement, you just need to know the theory of why it works. Then you just code it up based on that theory, it isn't hard to do at all.
For example, merge sort explains itself, you partition and merge. That is all you need to remember. Hashmap is the same, you use a deterministic function to label objects and then put those into buckets, then you find them in the bucket with the same label later. These are like the basics of the basics.
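That one-sentence theory really is enough to code it up. A toy sketch of the hashmap description above - deterministic labelling function, buckets, look up by the same label - ignoring resizing and everything a production table worries about:

```python
class ToyHashMap:
    """Buckets-and-labels sketch: hash() is the deterministic labelling
    function, and each bucket holds the entries sharing a label."""
    def __init__(self, n_buckets=16):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # same key -> same label -> same bucket, every time
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # overwrite an existing key
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):   # scan only the matching bucket
            if k == key:
                return v
        raise KeyError(key)
```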
Most such courses don't give a typical Leetcode problem in an exam with a 30-35 minute time limit, replete with continuous interruptions from the examiner.
It's a lot easier for me to do a Leetcode problem if I can explore and experiment a bit without having to verbalize my whole thought process.
Also, this is nothing unique to SW. I come from an engineering background, and we had to do calculus in almost every engineering course. Yet I'm sure within 12 months of graduation, most of my fellow grads would struggle with most calculus problems. I was once criticized by my peers for asking a basic calculus question while conducting an engineering interview.[1]
Just as with algorithms, you usually don't need calculus on the job. Looking at most of my SW development career, beyond the very basics (e.g. dictionary lookup is O(1) in Python), I probably needed algorithm knowledge on average once a year. And since almost none of my peers in most of the jobs have that knowledge:
1. Knowing it doesn't put me at an advantage within the company. It's a blind spot for everyone. Unless I happen to solve a serious business problem with that knowledge, which is very rarely the case (and likely why most people forget this knowledge).
2. My not knowing it won't put me at a disadvantage when it comes to career growth in the company.
This is the reality for most non-FAANG SW jobs. I suppose the one benefit of all these Leetcode interviews is that a lot more people have an incentive to review/learn this material.[2]
[1] Analogously, I was once criticized for being too tough when I asked a candidate to write a factorial function for a SW job.
[2] Although perhaps not really. Last week I interviewed a candidate who knew the theory really well - he understood complexity really well and seemed to understand data structures quite well too. But he couldn't write a basic function to split a comma-delimited string of numbers and return it as a list of integers. In both the languages he claimed proficiency in, there is a standard function to split a string, and not only did he not know the functions, he had no idea of the concept. A classic example of textbook knowledge vs experience (of which he had 3-4 years).
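For context, the task in [2] is close to a one-liner in most mainstream languages. In Python, for instance:

```python
def parse_numbers(csv_line):
    """Split a comma-delimited string of numbers into a list of ints."""
    return [int(field) for field in csv_line.split(",")]

parse_numbers("3,1,4,1,5")  # → [3, 1, 4, 1, 5]
```

`int()` tolerates surrounding whitespace, so `"3, 1, 4"` works too.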
Most people have to learn loads of information in a short time, and earlier material doesn't get practiced afterwards. In the span of 5 years, I learned trees at the end of year 1 and a bit in year 2. I didn't need them for anything I did past that. It wasn't until I sat down and practiced a little that I was able to think in trees again.
Not everyone covers the same things. Loads of Leetcode tests and interviews apparently cover linked lists and tricks with linked lists. I got zero exercises on linked lists. Some structures require specific approaches, so if you have trouble getting a foot in, you'll fail miserably at any questions related to those structures no matter how simple or difficult.
There are other obvious points, but CS is simply too large a field. What one considers fundamentals is something another never uses, not even in libraries or frameworks. It's easier to prove that the student can learn material deemed to be on an equal level with something else, and to give them the tools to learn more on their own should they need to.
This all assumes the student in question has actually proven to know the material of the course, and the course covers some part of what people consider fundamentals. Things obviously change when considering some people get by despite not having proven they know the material, or CS courses missing these.
I also can't help but feel there is some irony in CS courses trying to accommodate an ever increasing need for more practical skills, ditching CS fundamentals and practicing them to do so, only to be met with interviews which test CS fundamentals despite the work not needing any of it.
Why don't you think it's simply an issue of many people who go through CS programs failing to master the material? It's hard to imagine someone who did well in, say, Sedgewick's sequence at Princeton (which, by the way, is available on Coursera for free for anyone who didn't want to go to Princeton) struggling with entry-level tech interview questions.
Is it? If it's so straightforward, then why don't all of the people who are struggling with these questions simply take the Coursera course?
At the very least, these interviews require the applicant to be immediately familiar with any of the topics that might span a semester (or two) long course, so recall could be an issue.
You won't forget the basics of how to play chess either, even if you haven't played it for 20 years, as long as you played it many times back then. I don't think there is a significant difference here: if you learned something properly then you won't forget it.
The human mind requires practice to retain skill and proficiency. Period. People who were solid engineers and later become executives, 20 years down the line many say they no longer can code. I've seen this many times.
Where are you getting this concept that humans never forget skills? That's completely false.
Come to think of it, the basics of playing chess are of insufficient complexity to be compared to an algorithm interview. It's more like being able to mount basic strategies and chess moves. I'm not sure how the average player would be able to recall those if they haven't played chess in a long time, because they were instead playing games derived from basic chess moves but with very different game mechanics.
If you demand more complexity then you need to take people with more skills. Do you think that Magnus Carlsen would forget how to mount a basic defence if he didn't touch chess for 20 years? He wouldn't be as good, sure, but he wouldn't forget how to play, he would still beat most people.
My rule of thumb is that people remember things one or two layers lower than their max. If you learned calculus then you won't forget basic algebra. If you took a grad course on electromagnetism then you won't forget basic calculus. The same goes for algorithms: if you learned them once and then never had a course where you built upon them to make more advanced algorithms, you will forget them. But once you start to see them as basic building blocks for other things, you won't forget.
So from this rule, if you just learned the rules for chess you would forget them. But if you started trying to win chess games and viewed the rules as building blocks for strategies, then you will remember the rules for chess. Then you start to compose strategies etc.
The problem, then, would seem to be that the majority of software work is no longer building things so much as jury-rigging together APIs and frameworks, which leads to the building blocks falling out of use and being replaced by other components like design patterns or commonly used SDKs and libraries.
> Where are you getting this concept that humans never forget skills? That's completely false.
I never said humans never forget skills; I said humans never forget skills they master. Most people never master much at all - maybe 90-99% of software engineers would be in the never-master bucket. Which is why I get downvoted: most people never get good, and they get angry when you tell them that they can work to improve.
That is true, that most people don't master anything. The question is whether software engineers should be expected to master the interview material for anything other than interviews - whether it truly makes them better engineers who build better software. And if so, then we arrive at my original question: why is current computer science education and training failing to convey that vital information? And how can this situation be remedied?
In terms of working to improve: given the supposedly excellent filtering capabilities of algorithmic interviews, if people were taught right and knew how to master the material (not to mention what in the field they should master), surely it would not be so difficult to improve - though this would render the filter ineffective. More engineers would then have known from day one what to focus on (these interviews are broadly known to the public by now), and, having mastered the material, they would be passing the interviews easily.
I really think that college could be structured better, yes. I think that algorithmic fluency helps in many areas both in science and in the industry, and the way college is taught isn't good for reaching that stage. If nothing else it provides you with a good framework for how to think about and structure computation.
Is it in a language that one has not encountered in day to day work in a long time? Because if so, that is analogous to interviewees encountering an algorithmic problem they have not dealt with in a while.
> Yes, some interview questions are awfully close to logic puzzles - having to know the two pointer trick for detecting cyclic graphs, what CS program covers that industry-specific technique?
My comp sci degree covered this, and it isn't a particularly prestigious university either. Most leetcode problems are just extensions of (or exact duplicates of) problems covered in my Data Structures and Algorithms class.
It seems like some of the skill involved for passing leetcode interviews is pattern-recognition to quickly recognise the kind of problem and what the likely tools to solve it are.
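For reference, the two-pointer trick in question is Floyd's "tortoise and hare", sketched here for a singly linked list (where it usually shows up):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's tortoise and hare: advance one pointer by 1 and one by 2;
    the fast pointer can only meet the slow one again if the list loops."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False   # fast ran off the end, so the list is acyclic
```

It uses O(1) space, which is the whole point of the trick over a "seen" set.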
Some universities happen to use similar materials for their CS courses, and their students end up working for these big SV tech companies, where they dig out their old algo trick questions and ask them during interviews.
Not all schools picked up these tricks, unfortunately.
I completely gave up on giving whiteboard interviews after trying a couple of them. It's just so hard to get it right, and even if you do, I'm not sure you gain that much more useful knowledge as opposed to other interview styles.
Most college programs have very straightforward tests designed to be easy to pass, and students usually try to learn as little as possible, memorizing the material instead of internalising anything. This combination means that a large majority of students who graduate will have a horrible understanding of the material studied. And since students have a horrible understanding and try to memorize, if you design tests that aren't straightforward and easy then almost everyone will fail, so you can't do that...
And then these students graduate, think that the paper is proof that they actually learned these things, but if you prod their knowledge then it all just falls down since there is no substance there.
> That is certainly a factor, and there is also an emphasis more on the mathematical proofs of algorithms than applications in such courses.
Proofs are great if you learn to write your own, but courses mostly just want you to memorize proofs, which isn't terribly useful. If your tests have a lot of problems where you need to write your own proofs then it is a pretty good course, but the normal case is that the test wants you to write down proofs from memory.
I give a lot of FAANG interviews these days, and I'm stunned how many people skip this important step of understanding the problem, asking clarifying questions to surface important constraints, and making a plan before launching off into implementation. It's not rocket science.
Also because no one works this way. The coding challenges are contrived and don't really reflect typical work. Like I've never had to do anything like strings that occur more than k times.
And these clarifications are mostly dumb. If you pass an emoji into my string function I'm okay with it not working. Similarly if you pass in a negative number then that's your own dumb fault if it does something unexpected.
That's how the real world works. You go to the extent that covers 99.99% of cases. If that means it doesn't work in your crazy scenario, well, create a ticket and it'll sit in the backlog until we clean up old tickets.
Asking clarifying questions is a pretty key part of software development. You get a loose requirement the system must do X, you then need to drill into all the details. For this reason I think lots of these coding problems are deliberately under-specified.
One of the big things about being an experienced developer is knowing when things are important. If someone asks me to write a function that returns strings over X characters what counts as a char is going to be whatever the language counts as a char for the length function. Asking about emojis will just lead to a time-wasting discussion about something that doesn't matter.
> if you pass in a negative number then that's your own dumb fault if it does something unexpected
If a negative number isn't valid input that should be gracefully handled by the program (e.g. by responding with an appropriate error indicating what is valid and/or invalid) instead of doing something unexpected.
I'm not much of a fan of the current state of tech interviews but seeking clarity around validity of input and how to react to invalid input is one aspect that does (or at least should) mimic "real life"
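To make the negative-number case concrete: in Python, for instance, a negative slice bound silently means "count from the end", so the graceful option is an explicit guard. A sketch, using a hypothetical take_first helper:

```python
def take_first(items, n):
    """Return the first n items, failing loudly on invalid input.
    Without the guard, items[:-2] would quietly drop the last two
    elements - exactly the 'something unexpected' being debated."""
    if n < 0:
        raise ValueError(f"n must be non-negative, got {n}")
    return items[:n]
```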
Because the common expectation is two complex algorithmic problems fully coded up in 45 minutes? “Show me the incentives and I will show you the outcome.” And “it’s not rocket science.”
I have personal experience that it’s common at Amazon and have many corroborating examples it’s done at Google too. Maybe not years back, but definitely in the past couple years.
Can't you just ask the clarifying questions in the first minute and then code it up if you understand the problem? That is what I did when I got hired at Google, I don't think the other steps are necessary, a problem you can code a solution for in an interview doesn't require planning.
A lot of the people who I interview think they understand the problem, but I purposefully leave several things vague, like a product person would do, and I assure you they don't know it.
One of my simplest questions is that you have 2 files, one with vendor_sku and price and another with sku and vendor_sku; I want a file with sku and price. People will just start coding from there, or just assume they're getting it in arrays.
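For what it's worth, a minimal sketch of the shape of an answer, assuming the candidate asked and learned that the inputs are CSV files with header rows (which is exactly the kind of clarifying question the exercise is fishing for):

```python
import csv

def join_price_by_sku(prices_path, skus_path, out_path):
    """Join two CSV files on vendor_sku to produce (sku, price) rows.
    Assumes each input file has a header row naming its columns."""
    with open(prices_path, newline="") as f:
        # build a vendor_sku -> price lookup table
        price_by_vendor = {row["vendor_sku"]: row["price"]
                           for row in csv.DictReader(f)}
    with open(skus_path, newline="") as f, \
         open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["sku", "price"])
        for row in csv.DictReader(f):
            vendor_sku = row["vendor_sku"]
            if vendor_sku in price_by_vendor:   # inner join: skip misses
                writer.writerow([row["sku"], price_by_vendor[vendor_sku]])
```

Whether unmatched SKUs should be skipped, errored on, or emitted with an empty price is itself another clarifying question.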
But then they didn't use the first minute to ask clarifying questions, such as what the file format of those files are. I agree that asking clarifying questions is necessary, I don't think the other steps are necessary.
Yes, but that's what I want to see in the first 5 minutes: that you understand the question and have asked clarifying questions. Step 2 is explaining that you have an approach to the problem before you launch off. A lot of people come up with crazy solutions that won't work, like trees based on primes instead of HashMaps. This next 5 minutes helps me understand that they're able to communicate their design and discuss it, and that they've got a plan. It also verifies they really understood the problem. This is where I course-correct if needed.
Really, that 2nd 5 minutes is for YOU, the candidate. You can test out whether I'm even going to accept your solution; also, by talking it through, I'm going to give you partial credit even if you have issues coding it, since you've at least shown you could code it out.
Partial credit isn't a thing to sneeze at; stuff happens in an interview. Network issues, software issues. Over the 1k or so interviews I've given, I've had interviews where many things happened and the interview went sideways or just had to end.
I had an interview where I was the candidate recently where the system wouldn't log correctly, or log what failed, and wouldn't debug correctly. My code was correct except a flipped check, but the interviewer wanted it to run all the test cases. It took 15 minutes to debug the web ui to even see what test case was failing.
If I was the interviewer in that case, the partial credit for explaining the answer and the 98% code would have been fine and I would have called it the instant the candidate found an issue with the tool, and moved onto other questions.
I've also had interviews where the interviewer admitted they were trying out a new question and couldn't guide me or grade me well, that first 10 minutes counters that. Sure that's not really professional, however it happens.
I think it's ridiculous and dumb that otherwise smart people take months out of their busy lives to train these bullshit skills they'll never use outside these "interviews". FWIW, I never prepare for more than a couple of days, and my batting average is pretty good, albeit not perfect. And I'm not a genius by any means. I just line up the interviews from least to most desirable (so I get some "interview training" in), and go with the flow.
When companies that offer these interviews are often offering 2x or more compensation, taking the time to study seems to really work out.
Doing leetcodes once or twice a week for the rest of my career (if these tests persist) seems like a really cheap price to pay to make over 400k a year.
> I think it's ridiculous and dumb that otherwise smart people take months out of their busy lives to train these bullshit skills they'll never use outside these "interviews".
Just wait until you hear about people wasting 16 years learning mostly useless things they will never need again in their life... I'm not sure why spending a few months more is such a big deal; you only need to learn these things once if you do it properly. If you don't do it properly then you deserve to waste your time - not sure why you'd take that route, though. I guess it shows you don't know how to learn things properly, and that is also a filter.
If you mean school, the first 10 years of grade school are basically glorified daycare. The remaining 2 (hopefully) teach you how to think and prepare you for college. That's why in countries which actually do care about their high-IQ kids (US is not one of them) they remove kids who aren't pulling their weight and send them to vocational schools.
College (at least for me) was darn near useless - 95% of the skills I use daily were self-taught. I could just take calculus, linear algebra, and differential equations and call it a day. Most people don't even need that much.
I _really_ wish college wasn't a requirement in our field. There is no doubt in my mind whatsoever that for a sufficiently motivated, moderately smart person looking to learn how to code, college is a horrible waste of time and money, unless they're looking to learn something highly specialized, and/or are looking for an advanced degree. They'd be much better off pair programming with someone experienced, as an apprentice, and they'd be making money and acquiring real experience while learning. The same is true for quite a few other professions as well.
It is ridiculous, but at this point with the dumb money that is sloshing around in the industry, and is now actually being sent to engineer compensation packages (a lot of money that has gone into the public markets and thus into RSU offerings, it would seem), there definitely seems to be incentive to jump through these hoops for a piece of the pie.
I think it's fine - everyone gets to interview as they please.
What's unfortunate is there being a handful of companies that pay 2-3x what everyone else pays.
The very existence of such companies should ring 'ding ding ding anti-trust' but seeing how Microsoft is still kicking, better than ever with its disastrous operating system plaguing humanity, I don't have much faith in future legislators.
Usually 3-4. It's tough to line them up, so the ordering is not always complete, but I do get a couple of "doesn't matter" interviews done before I go for a "meaningful" one. Sorry for wasting your time, companies that don't matter. You wrote the rules, I merely play the game.
Thanks. I don't interview much, and now that I'm looking I realize it really hurts me. I'd love to know if it's best to apply for a few jobs a year or dozens. I haven't seen many stats on this; you just see people getting new roles without saying how many times they got rejected.
If you are otherwise competent, as you progress through your career, you will realize that most of the time (if not _all_ of the time) you get rejections not because there's something wrong with you, but because _their_ interview process failed. You know what you know. They don't. If they failed to ask you about things you know - that's their loss. I know it's hard to view it like that after a rejection, but believe me, you'll agree with me on this over time.
I don't want to doxx myself, but more than once I have been rejected by companies that were looking to do X, while I was demonstrably one of the world's foremost experts on X. Once I was rejected because I didn't know a tiny detail about CPU cache coherence protocols, for example. Something that I learned on my way back to my car in the parking lot. Makes zero sense to reject people with deep domain expertise over the trivia questions like that.
Another thing you'll see is that not every rejection is a downside. When we look forward to something we tend to paint an overly idealized picture in our minds for how it's going to be. But it rarely turns out quite like that, and you sometimes find out much later how much of a bullet you've dodged, thanks to a rejection you were super bummed about at the time. It's just work. It doesn't matter _that_ much.
I do not find this useful or correct. Furthermore, the author has another post that says to avoid Cracking the Coding Interview, which is actually a stellar book (even for those who are not interviewing).
I have generally avoided or balked at typical coding interview questions. They’re very seldom representative of real work, and have more value for helping big companies avoid lawsuits than helping anyone else evaluate candidates or prospective jobs.
That said… the article offers good advice for people going through the process. I’d add that it’s not much different from type- and test-driven development. Which, conceptually, might help readers who are reading this advice and feeling overwhelmed by the idea of trying to gather the information discussed while “buying time to think of a solution”.
Which is to say, if that feels like a lot of cognitive load on top of trying to navigate “soft skills” and/or other challenges in an interview, you may find it more grounding if you map the advice to ways you already work.
Another way to look at it that occurred to me: it doesn’t feel very different from a very short sprint (or new Kanban card or choose your working cycle abstraction).
1. Planning meeting to clarify user intent, acceptance criteria, whatever you need to get started.
2. Definition of smaller tasks and any feedback loop that might entail.
> They’re very seldom representative of real work, and have more value for helping big companies avoid lawsuits than helping anyone else evaluate candidates or prospective jobs.
The point of a coding interview is to eliminate, as fast as possible, people who simply can't code. I'm being completely serious here. They can even have a CS degree (or will claim to, but if you look closely they were in an easier-to-get-into program and took CS electives) but cannot write a simple program on the board in an hour.
> The point of a coding interview is to eliminate, as fast as possible, people who simply can't code.
I sincerely don’t believe this is true or that there’s any supporting evidence for it besides people repeating it. The point is to eliminate employment discrimination claims for large companies and this is what baseline they established to support the claim.
The point for everyone else is: we have too much to do, figuring out how to interview effectively and fairly is too much more work. Let’s use what everyone else uses.
I always thought the point of the coding interview was to promote a meritocracy, if you can do the job then you can have it. Fuck your "credentials" and the class you were born in, just be able to perform the job.
Now they very much care about your credentials and the way you got those credentials, but also jump through these extra hoops. No doctor, lawyer, nurse, etc is asked to do this asinine shit in their interviews (and yes, an RN in the bay area can bring in 200k/year...I know some).
What started out as a great idea has turned into an even more employer-centric way to sort the working class.
This seems completely pointless: either you understand how to solve the question, in which case you don't need any of this, or you don't, in which case all of this won't help you. Reminds me of Magnus Carlsen saying his biggest advantage is that he is better than his opponent at chess. The real "framework" for passing coding interviews is to get better at understanding and solving the problems.
This framework help people feel better about themselves when they fail the interview, since at least they wrote a lot of things on the board and didn't just get stuck.
Rather a droll and overly uncharitable reading, perhaps. Given the highly performative nature of these interviews, where the candidate is expected to explain their "way of thinking", writing out thought processes in a systematized, scientific investigative manner will at least earn partial credit.
> Asking questions like: “Can strings be multi-character?” can provide you with critical information to the problem.
I don't understand this example, which immediately leads to imposter syndrome kicking in, because this is supposed to be basic, right? But as I gain more experience, I begin to suspect it is just poorly written.
So can anyone help me understand what is meant by this question?
I think the author is referring to characters in a Unicode string that are combined to form a composite glyph.
For instance, a large number of emojis are formed by combining two Unicode characters with a zero width joiner character. Skin tones are a notable instance, along with gendered occupation glyphs (female doctor, say).
It would be a simplifying (and usually correct) assumption to ignore such composite characters in a basic processing task.
It's unfortunate that the question isn't itself posed, but I think you're overcomplicating it somewhat (because I think the question/explanation is bad).
The input is in the form Array<Str>, not Array<char>, yet the example is ['a', 'b', 'c', 'd']. By the API, an input could just as easily be ['apple', 'banana', 'cherry', 'durian'] by the types provided.
For this problem, I also think that's a weird clarification, because it doesn't really change how you approach the problem, which should be to construct a hash table keyed by string and whose values are the number of appearances. In python this would end up being something like `return max(collections.Counter(strs).values()) > k`. Whether the strings are single- or multicharacter isn't relevant unless you're doing something weird.
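Spelled out, the Counter approach above might look like the following (with the empty-input edge case handled, which the `max()` one-liner would choke on):

```python
from collections import Counter

def any_string_occurs_more_than(strs, k):
    """True if any string appears more than k times. Counter is a hash
    table mapping each string to its number of appearances, so single-
    vs multi-character strings genuinely make no difference here."""
    counts = Counter(strs)
    return any(c > k for c in counts.values())   # safe on empty input
```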
This is a great explanation, and perhaps this is the author's intent. I think this example is so specific it is not a good way to start. It is better to start by eliminating broad categories and narrowing in: can we assume ASCII, or do we allow Unicode?
What the author describes... is called drafting a spec.
It's something that's taught to first-year engineering students (map and understand all the constraints of a problem before attempting to solve it, or else pay the price later...).
I was brought in to a project on Friday to help figure something out. The gist of the project is that we have a system which generates tons of event data. Basically interaction logging. And until recently the system sent those logs to some other systems via nightly files. But someone in the business decided it made more sense to do it “real time” and so some folks have been diligently coding away and have built their solution only to find there was a gigantic miss:
The system is generating thousands, sometimes tens or hundreds of thousands, of events per hour. And the API that is to receive this data can only receive 100 “records” (events) per request, and the system that generates the data can’t do any sort of multithreading or multiple simultaneous requests… it can just fire off one request after another.
I did the math and on our busiest day this process would take something like 827 years to send all of the event data…
I say all of this to highlight your point that it’s a foundational skill to gather requirements… and yet I’m working with sr engineers who created this mess…
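The back-of-envelope math behind that kind of estimate is simple to sketch. The latency and volume figures below are made up for illustration, not the project's actual numbers:

```python
def backlog_drain_years(events, records_per_request=100,
                        seconds_per_request=0.2):
    """Time to push a backlog through a strictly sequential API that
    accepts a fixed number of records per call: throughput is bounded
    entirely by per-request round-trip latency."""
    requests = events / records_per_request
    return requests * seconds_per_request / (3600 * 24 * 365)

# e.g. a year of events at 100k/hour, drained at 200 ms per request:
backlog_drain_years(100_000 * 24 * 365)   # ≈ 0.06 years with these toy numbers
```

The real system's figures were evidently far worse; the point is that the estimate takes one line of arithmetic, which is exactly why it should have been done before building.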
> and yet I’m working with sr engineers that created this mess…
Engineers or "Engineers"? I'm asking because there's a trend in this industry to call everyone an engineer (6-month bootcamp front-end engineer). But then you see things like that...
Learning and building stuff won't help you pass these interviews.
Which is ironic but it also means that applying for a company that uses such interviews is a completely different task to becoming a better engineer.
Personally I avoid such places, there is sufficiently good compensation available elsewhere from people that don't expect me to perform parlour tricks on a whiteboard for their amusement.
I think even having a working knowledge of common algorithms and data structures is insufficient preparation for these. They often call for incredibly esoteric algorithms/data structures or test for patterns that are incredibly uncommon in industry, e.g. dynamic programming.
I would say that even a good background in implementing a proper production-ready database (B+ trees, bloom filters, journaling, etc.), while certainly enough to do the job of building a database, would be insufficient to pass the interview to build a database.
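For context on the kind of structure being named here, a bloom filter fits in a few lines. This is a toy sketch of my own (bit-array size, hash count, and the SHA-256 derivation are arbitrary choices), not production code:

```python
import hashlib

class BloomFilter:
    """Toy bloom filter: probabilistic set membership with no false
    negatives but a small chance of false positives."""

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = 0  # a plain int doubles as the bit array

    def _positions(self, item):
        # Derive num_hashes bit positions from one SHA-256 digest.
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.num_hashes):
            chunk = digest[i * 4:(i + 1) * 4]
            yield int.from_bytes(chunk, "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        # True means "possibly present"; False means "definitely absent".
        return all(self.bits & (1 << pos) for pos in self._positions(item))
```

Knowing how to write this is exactly the sort of database-building knowledge being contrasted with what the interview actually tests.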
In my experience, it's never a good sign when coding exercises are brought up when I'm being interviewed. I make a point of not doing that myself when I'm interviewing, because I consider it disrespectful and actively harmful. Most people that I'd actually like to hire would hate me doing that.
I've actually had interviewers apologize to me for their lame test during the interview, probably because they sensed my annoyance. Usually it's a thing that HR insists on as a screening mechanism because they've heard that it's something they should do. Also, a work culture where that is a thing breeds employees that believe it is actually a good thing. It's the old "A's hire A's, B's hire C's" thing. Usually, screening becomes necessary when you have recruiters supplying you a lot of C's with the occasional B. Those recruiters typically don't have access to either good candidates or good projects.
Either way, there's a class of engineers that will happily decline to take coding exercises because they have better options to choose from, because they just are that good, and are probably not even answering the spam that your recruiter is shoveling out by the thousands. That's actually the kind of people I'd like to hire. I don't spend a lot of time interviewing for jobs, and any hint of a coding exercise is a sure way for me to lose interest. I mostly ignore recruiter spam. Just like most half-decent engineers I know.
It's actually a good way to screen projects; there is a lot of mediocrity in this industry. I look for a certain level of quality in projects I join. Coding exercises as part of the process is a red flag. Not necessarily fatal but a red flag still. Especially when interviewing seniors with decades of experience. Never really ends well and I tend to end the process early when that happens.
Good engineers don't come begging for a position on your project/company. You actually have to work hard to get them onboard. I know, because I've actually hired a few people that were genuinely very good. Better than me even. I love working with smart people and I love it when I find someone that is clearly razor sharp. When you come across such a person, you become a sales person. It's a time critical process where during the brief moment this engineer is actually not busy working for somebody else, you have an opportunity to convince them to work for you. That's how I interview and that's what I'm looking for when I'm being interviewed. Anything less than that signals to me a level of indifference and mediocrity that I probably need to walk away from.
Confronting candidates with some lousy coding exercise is not a great way to sell them on the opportunity. It screens candidates alright: you lose many of the good ones that you probably don't even get to talk to. Beware the candidates that get really enthusiastic about your coding exercise. They might still be good but it's something to explore further during the actual interview.