Let's say we are training ChatGPT 2: would you let it use ChatGPT for solving things? Do you think it would learn better if it had to learn on its own without relying on ChatGPT?
We learn by doing. Relying on smart tools puts you in a local optimum: it becomes hard to improve beyond those tools, whereas people who don't rely on them keep improving and can start using the tools later.
I was just trying to allude to the nature of the universities' business. (Namely, assessing that they've taught what they claim to teach.)
My PhD mentor is actually leading a discussion about ChatGPT use by students, and the tone is mostly concern. But it's not all bad; it's an incredible tool for quickly diving into a subject at a surface level and getting your bearings on what's important. It also saved my ass on one of my finals, when the rest of the questions were harder than I expected and ChatGPT helped me tear through the remaining T/F questions.