I think that’s the way we were taught in college / grad school. If the premise of the class is relational databases, the professor says: for the purpose of this course, assume the data does not fit in memory. Additionally, assume that some degree of normalization is a hard requirement.
The problem is that most students don’t register the first part, “for the purpose of this course.” The professor doesn’t elaborate because that is beyond the scope of the course.
FWIW, if they were a junior, I would've continued the interview, directed them with further questions, and observed their train of thought to decide whether they were a candidate worth pursuing further.
But no, this particular person had been working professionally for decades (in fact, he was much older than me).
I took a Hadoop class. We learned Hadoop, were told by the instructor we probably wouldn't need it, and learned some other Java processing techniques (streams, etc.).
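And honestly, streams cover a lot of that ground when the data fits on one machine. A rough sketch of the classic word-count exercise with plain Java streams instead of MapReduce (the input file name is just a placeholder):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Arrays;
    import java.util.Map;
    import java.util.function.Function;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    public class WordCount {
        public static void main(String[] args) throws IOException {
            // The classic Hadoop "hello world", but on a single machine with java.util.stream.
            try (Stream<String> lines = Files.lines(Paths.get("input.txt"))) {
                Map<String, Long> counts = lines
                    .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                    .filter(word -> !word.isEmpty())
                    .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
                counts.forEach((word, n) -> System.out.println(word + "\t" + n));
            }
        }
    }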
People can always find excuses to boot candidates.
I would just back-track from a shipped-product date and try to estimate who we'd need to get there, given the scope of requirements.
Generally, process people from a commercially "institutionalized" role are useless for solving unknown challenges. They will leave something like an SAP, C#, or MATLAB steaming pile right in the middle of the IT ecosystem.
One could check out Aerospike rather than try to write their own version (the dynamic scaling capabilities are very economical once set up right).
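Getting data in and out with the Java client is only a few lines, something like the sketch below (host, namespace, set, and bin names are placeholders, and policies are left at their defaults):

    import com.aerospike.client.AerospikeClient;
    import com.aerospike.client.Bin;
    import com.aerospike.client.Key;
    import com.aerospike.client.Record;

    public class AerospikeSketch {
        public static void main(String[] args) {
            // Connect to one seed node; the client discovers the rest of the cluster.
            try (AerospikeClient client = new AerospikeClient("127.0.0.1", 3000)) {
                Key key = new Key("test", "users", "user:42");  // namespace, set, user key
                client.put(null, key, new Bin("name", "Alice"), new Bin("visits", 1));
                Record record = client.get(null, key);
                System.out.println(record.getString("name") + " has " + record.getInt("visits") + " visit(s)");
            }
        }
    }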