
> My O-chem professor … told me to quit when I didn’t just understand it immediately… He wanted to appear helpful, but then acted resentful when I asked questions, “wasting his time” …

Stories like that really leap out at me these days. For the past few months, I’ve been experimenting with learning through interaction with GPT-4. While some subjects work better than others and GPT-4’s lack of personality can make the lessons a bit dry, it never gets annoyed at my questions or belittles my ignorance. I suspect more and more people will find that they prefer learning from chatbots to being taught by humans.

(Later) As one example of how I have been learning from GPT-4, below is a conversation I have had with it over the past few days:

https://gally.net/temp/20241017exchangewithGPT-4.pdf



I think the core problem is time. Everyone below C-level is packed full with work, whether in academia, government, or the private sector.

So, when someone comes to me with an issue and shows they have at least done the basic legwork, I'll be more likely to help them than someone who just says "xyz doesn't work" but has done zero troubleshooting on their own, because I don't have time to guide a junior through utterly basic stuff. That is also why so many job postings require "x years of experience": the existing staff is drowning and can't handle the enormous workload of training juniors.

Obviously that has negative mid- to long-term consequences: when no juniors get trained, you'll eventually run out of seniors. But I have no idea how to fix this, especially not at a societal level.


What makes you think that the C-level is not packed with work?


I've been doing the same. Sometimes chatting with the bot gives me something I can confirm, and the process feels like I'm learning more than I would have by reading the textbook.

Sometimes the bot can't help, but by the time I come to that conclusion I'm now familiar enough to ask the professor in a way that doesn't feel like a waste of his time.

Your O-chem professor sounds like a jerk, but there is probably some merit, in general, to the idea that we can maximize the efficacy of teachers by thinking a bit harder about our questions first.


I read somewhere that "in the '40s and '50s they built computers for the computation, but it turned out the storage was almost more important.

Now we build AI for the cognition, but it will turn out that the patience is almost more important."


> but it will turn out that the patience is almost more important

You've never used a 3D printer or another kind of slow robot, have you?

Computers are infinitely patient and relentless. That has huge implications.


ChatGPT will also teach you the important skills of "trust but verify" and "how to detect straight-faced lying."

I've experimented with asking GPT-4 technical questions, and the moment you ask something moderately complex, you start occasionally getting outright incorrect information: library functions that do not exist, correct answers missed entirely, products from the wrong category, descriptions of completely different things...



