Even pre-ChatGPT, I saw rampant cheating at my university, where it's popular for business undergrads to take computer science as a second major to get a leg up for Product Manager roles.
These people don't intend to ever write a single line of code after university. They aim to be just familiar enough with software development to be able to nod along and throw in a few buzzwords in interviews.
Big money attracts people who just want the money.
This makes credentials less meaningful than ever, even as tuition has grown astronomically.
The cost of a poor hiring decision can ripple through an organization and drag down its velocity. It's natural to want to minimize risk in a scenario like that, so reputable credentials instill confidence and get candidates past gatekeepers.
Wouldn’t it be nice if we could have a better way to prove domain expertise, adaptive reasoning, and collaboration skills?
I wish. Sadly, those are the most easily gamed signals. People memorize and regurgitate answers all the time, and that usually doesn't show reasoning, just how much money and effort you spent on CS fundamentals.
My team has weeded out a lot of bad candidates with super simple, practical tasks: explain DNS, order the keys of a JSON object, make a two-column layout in plain HTML. You would be amazed at how many candidates can't do those.
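For reference, the JSON task is essentially a one-liner. A minimal sketch in Python (one acceptable approach among many, using only the stdlib json module):

    import json

    # Parse the object, then re-serialize it with keys sorted alphabetically.
    raw = '{"b": 2, "a": 1, "c": 3}'
    obj = json.loads(raw)
    print(json.dumps(obj, sort_keys=True))  # -> {"a": 1, "b": 2, "c": 3}

Anyone who has actually touched Python should land on something like this in a minute or two, which is exactly what makes it a useful filter.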
What we don’t do is ask the same questions too many times, as we know some candidates compare notes and even publish the interview questions. With large language models doing the talking, we have already caught candidates who couldn’t describe the for loop Copilot wrote for them off-screen… so I guess the best system is to be good at being human and to have a go at actually working together on something.