I give a lot of FAANG interviews these days, and I'm stunned how many people skip these important steps: understanding the problem, asking clarifying questions to surface important constraints, and making a plan before launching into implementation. It's not rocket science.
Also because no one works this way. The coding challenges are contrived and don't really reflect typical work. I've never had to do anything like finding the strings that occur more than k times.
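For what it's worth, that kind of problem usually reduces to a frequency count. A minimal sketch in Python, assuming the task is "return the strings that appear more than k times in a list" (the function name and exact spec are my guess at the problem):

```python
from collections import Counter

def strings_over_k_times(strings, k):
    """Return the strings that appear more than k times (order not guaranteed)."""
    counts = Counter(strings)
    return [s for s, n in counts.items() if n > k]

# Only "a" appears more than twice.
print(strings_over_k_times(["a", "b", "a", "a", "b"], 2))  # ['a']
```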
And these clarifications are mostly dumb. If you pass an emoji into my string function I'm okay with it not working. Similarly if you pass in a negative number then that's your own dumb fault if it does something unexpected.
That's how the real world works. You go to the extent that covers 99.99% of cases. If that means it doesn't work in your crazy scenario, well, create a ticket and it'll sit in the backlog until we clean up old tickets.
Asking clarifying questions is a pretty key part of software development. You get a loose requirement that the system must do X, and you then need to drill into all the details. For this reason I think lots of these coding problems are deliberately under-specified.
One of the big things about being an experienced developer is knowing when things are important. If someone asks me to write a function that returns strings over X characters, what counts as a character is going to be whatever the language's length function counts. Asking about emojis will just lead to a time-wasting discussion about something that doesn't matter.
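To make that concrete, here's a minimal sketch under that assumption, deferring entirely to what Python's len() counts (the function name is mine):

```python
def strings_over(strings, x):
    """Keep strings longer than x, where a "character" is whatever len() counts."""
    return [s for s in strings if len(s) > x]

# len() counts Unicode code points in Python 3: most single emoji count as 1,
# but multi-code-point emoji (flags, ZWJ sequences) count as more.
print(strings_over(["hi", "hello", "👍👍👍"], 2))  # ['hello', '👍👍👍']
```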
> if you pass in a negative number then that's your own dumb fault if it does something unexpected
If a negative number isn't valid input, that should be handled gracefully by the program (e.g. by responding with an appropriate error indicating what is valid and/or invalid) instead of doing something unexpected.
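As a sketch of what "gracefully handled" could look like (the function and its cap are hypothetical, just to illustrate validate-then-work):

```python
def items_per_page(n):
    """Hypothetical example: validate input up front and fail with a clear error."""
    if not isinstance(n, int) or n < 0:
        raise ValueError(f"items_per_page expects a non-negative integer, got {n!r}")
    return min(n, 100)  # hypothetical upper bound on page size

try:
    items_per_page(-5)
except ValueError as e:
    print(e)  # items_per_page expects a non-negative integer, got -5
```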
I'm not much of a fan of the current state of tech interviews, but seeking clarity around the validity of input and how to react to invalid input is one aspect that does (or at least should) mimic "real life".
Because the common expectation is two complex algorithmic problems fully coded up in 45 minutes? "Show me the incentives and I will show you the outcome." And yet, "it's not rocket science."
I have personal experience that it's common at Amazon, and many corroborating examples that it's done at Google too. Maybe not years back, but definitely in the past couple of years.
Can't you just ask the clarifying questions in the first minute and then code it up if you understand the problem? That's what I did when I got hired at Google. I don't think the other steps are necessary; a problem you can code a solution for in an interview doesn't require planning.
A lot of the people I interview think they understand the problem, but I purposefully leave several things vague, like a product person would, and I assure you they don't understand it.
One of my simplest questions: you have 2 files, one with vendor_sku and price and another with sku and vendor_sku, and I want a file with sku and price. People will just start coding from there, or just assume the data is coming in as arrays.
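The vague bits are exactly the clarifying questions: file format, headers, what to do with unmatched rows, whether the data fits in memory. A minimal sketch assuming CSV files with headers and an in-memory join (all of those are assumptions, not part of the stated question):

```python
import csv

def join_sku_price(vendor_prices_path, sku_map_path, out_path):
    """Join on vendor_sku: (vendor_sku, price) + (sku, vendor_sku) -> (sku, price)."""
    # Assumes CSVs with headers and that the price file fits in memory.
    with open(vendor_prices_path, newline="") as f:
        price_by_vendor_sku = {row["vendor_sku"]: row["price"] for row in csv.DictReader(f)}

    with open(sku_map_path, newline="") as f, open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["sku", "price"])
        for row in csv.DictReader(f):
            price = price_by_vendor_sku.get(row["vendor_sku"])
            if price is not None:  # dropping unmatched skus is itself a choice worth clarifying
                writer.writerow([row["sku"], price])
```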
But then they didn't use the first minute to ask clarifying questions, such as what format those files are in. I agree that asking clarifying questions is necessary; I don't think the other steps are.
Yes, but that's what I want to see in the first 5 minutes: that you understand the question and have asked clarifying questions. Step 2 is explaining that you have an approach to the problem before you launch off. A lot of people come up with crazy solutions that won't work, like trees based on primes instead of HashMaps. This next 5 minutes helps me understand that they're able to communicate their design and discuss it, and that they've got a plan. It also verifies they really understood the problem. This is where I course correct if needed.
Really, that 2nd 5 minutes is for YOU, the candidate. You can test out whether I'm even going to accept your solution, and by talking it through you're going to get partial credit from me even if you have issues coding it, since at least you showed you could have coded it out.
Partial credit isn't something to sneeze at; stuff happens in an interview. Network issues, software issues. Over the 1k or so interviews I've given, I've had plenty where things went sideways or the interview just had to end.
I recently had an interview as the candidate where the system wouldn't log correctly, wouldn't show what failed, and wouldn't debug correctly. My code was correct except for a flipped check, but the interviewer wanted it to pass all the test cases. It took 15 minutes of debugging the web UI just to see which test case was failing.
If I had been the interviewer in that case, partial credit for explaining the answer plus the 98% code would have been fine; I would have called it the instant the candidate found an issue with the tool and moved on to other questions.
I've also had interviews where the interviewer admitted they were trying out a new question and couldn't guide me or grade me well; that first 10 minutes counters that. Sure, that's not really professional, but it happens.