These are the same people who would pooh-pooh teaching Excel and basic coding skills to non-STEM majors, or having CS students take ethics or GenEd classes.
AI/ML isn't going to completely reshape the world, but knowing how to do basic prompt engineering, how to validate output against hallucinations, and what the difference between ChatGPT and GPT-4o is, is valuable for people who don't have a software background.
It's more about knowing the tricks to get LLMs to give you the output you want.

However, there's no reason to think any of those tricks will still be relevant even a year from now. As LLMs get better, why wouldn't we just have them automatically rewrite prompts, applying the appropriate prompt-engineering tricks themselves?
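To make that concrete, here's a rough sketch of what "have the model rewrite the prompt itself" could look like, assuming the OpenAI Python client; the model name, the meta-prompt wording, and the function names are placeholders for illustration, not a recommendation:

```python
# Sketch of "auto prompt rewriting": ask the model to apply common
# prompt-engineering tricks to a raw prompt before answering it.
# Assumes the OpenAI Python client; model name and meta-prompt wording
# are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

META_PROMPT = (
    "Rewrite the following prompt so a language model answers it well: "
    "add missing context, ask for step-by-step reasoning where useful, "
    "and specify the desired output format. Return only the rewritten prompt."
)

def auto_rewrite(raw_prompt: str, model: str = "gpt-4o") -> str:
    """Have the model apply the prompt-engineering tricks on the user's behalf."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": META_PROMPT},
            {"role": "user", "content": raw_prompt},
        ],
    )
    return response.choices[0].message.content

def ask(raw_prompt: str, model: str = "gpt-4o") -> str:
    """Answer the rewritten prompt instead of the raw one."""
    improved = auto_rewrite(raw_prompt, model)
    answer = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": improved}],
    )
    return answer.choices[0].message.content
```

If something like this becomes a built-in preprocessing step, the hand-taught tricks mostly stop mattering to the end user.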
Gaining any kind of knowledge is a net win.