Why don't you think it's simply an issue of many people going through CS programs without mastering the material? It's hard to imagine someone who did well in, say, Sedgewick's sequence at Princeton (which, by the way, is available for free on Coursera for anyone who didn't want to go to Princeton) struggling with entry-level tech interview questions.
Is it? If it's so straightforward, then why don't all of the people who are struggling with these questions simply take the Coursera course?
At the very least, these interviews require the applicant to be immediately fluent in any topic from a semester-long (or two-semester) course, so recall could be an issue.
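For concreteness, here is a sketch (in Python, my choice of language) of the kind of entry-level question being discussed: reversing a singly linked list, a staple of any first algorithms course that is nonetheless easy to fumble when the material has gone stale.

```python
class Node:
    """Minimal singly linked list node for illustration."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Reverse the list iteratively: O(n) time, O(1) extra space."""
    prev = None
    while head is not None:
        # Re-point the current node backward, then advance.
        head.next, prev, head = prev, head, head.next
    return prev

def to_list(head):
    """Collect node values into a Python list, for checking results."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```

The question tests nothing exotic, just pointer manipulation under time pressure, which is exactly where rusty recall shows.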
You won't forget the basics of how to play chess either, even if you haven't played it for 20 years, as long as you played it many times back then. I don't think there is a significant difference here: if you learned something properly, you won't forget it.
The human mind requires practice to retain skill and proficiency. Period. Plenty of solid engineers become executives and, 20 years down the line, say they can no longer code. I've seen this many times.
Where are you getting this concept that humans never forget skills? That's completely false.
Come to think of it, the basics of playing chess aren't complex enough to compare to an algorithm interview. A better analogue would be mounting basic strategies and move sequences. I'm not sure the average player could recall those after not playing chess for a long time, especially if they had instead been playing games derived from chess but with very different mechanics.
If you demand more complexity, then you need to take people with more skill. Do you think Magnus Carlsen would forget how to mount a basic defence if he didn't touch chess for 20 years? He wouldn't be as good, sure, but he wouldn't forget how to play; he would still beat most people.
My rule of thumb is that people remember things one or two layers below their max. If you learned calculus, you won't forget basic algebra. If you took a grad course on electromagnetism, you won't forget basic calculus. The same goes for algorithms: if you learned them once and then never took a course that built on them to make more advanced algorithms, you will forget them. But once you start to see them as basic building blocks for other things, you won't forget.
So by this rule, if you just learned the rules of chess, you would forget them. But if you started trying to win games and treated the rules as building blocks for strategies, you would remember them. Then you start to compose strategies, and so on.
The problem, then, would seem to be that the majority of software work is no longer building things so much as jury-rigging APIs and frameworks together, so the building blocks fall out of use and out of memory. Those vital building blocks get replaced by other components: design patterns, commonly used SDKs, and libraries.
> Where are you getting this concept that humans never forget skills? That's completely false.
I never said humans never forget skills; I said humans never forget skills they master. Most people never master much at all: maybe 90-99% of software engineers are in the never-mastered bucket. Which is why I get downvoted. Most people never get good, and they get angry when you tell them they can work to improve.
That is true: most people don't master anything. The question is whether software engineers should be expected to master the interview material for any reason other than interviews, i.e. whether it truly makes them better engineers who build better software. If so, then we arrive at my original question: why is current computer science education and training failing to convey that vital information? And how can this situation be remedied?
As for working to improve: given the supposedly excellent filtering power of algorithmic interviews, if people were taught properly and knew both how to master the material and what in the field to master, then surely improving would not be so difficult (though this would render the filter ineffective). More engineers would have known from day one what to focus on (not that they don't at present, since these interviews are now broadly known to the public), and, having mastered the material, would pass the interviews easily.
I really do think college could be structured better, yes. Algorithmic fluency helps in many areas, both in science and in industry, and the way college courses are taught isn't good for reaching that stage. If nothing else, it provides a good framework for how to think about and structure computation.
Is it in a language one has not encountered in day-to-day work in a long time? Because if so, that is analogous to interviewees encountering an algorithmic problem they have not dealt with in a while.