Hacker News

I'm teaching a programming class and students use LLMs all the time. I see it as a big problem because:

- it puts the focus on syntax instead of the big picture. Instead of finding articles or Stack Overflow posts explaining things beyond how to write them, the AI gives them the "how", so they don't think about the "why"

- students almost never ask questions anymore. Why would they, when an AI gives them code?

- AI output contains concepts, syntax, and APIs not covered in class, adding to the confusion

Even the best students have a hard time answering basic questions about what was covered in the last (3-hour) class.




The job market will test those students, but the outcome may be disheartening for you, because those folks may actually succeed one way or another. Think punched cards: they are gone, and the "you need to get it right on the first try" mindset went with them.


> but the outcome may be potentially disheartening for you, because those guys may actually succeed one way or another

Your sentence is contradictory, to say the least! I'll be very glad for each of them to succeed in any way they can.


students pay for education so that, at the end, they know something. if the job market filters them out because they suck, the school did a bad job teaching.

the teachers still need to figure out how to teach with LLMs around



