
The novelty to me is that the experience “seemed roughly on par with trying to advise a mediocre, but not completely incompetent, graduate student” in so many subject areas! I have found great value in using LLMs to sort things out. In areas where I am very experienced, they can be really helpful with tons of small chores. As Terence pointed out in his third experiment, if you break the problem down, the model does solid work filling in the smaller blanks, but you need the conceptual understanding yourself. Part of this is prompting skill. If you go into an area you don’t know, you have to build the prompts up: dive into something small and specific and work outward if the answer is known, or start specific and focused and work from the outside in. I have used this to cut through the conceptual layers of very complex topics I had zero knowledge of, and then verified my understanding with experts on YouTube, research papers, and other trusted sources. It is an amazing tool.


This has been my experience as well. I treat an LLM like an intern or junior who can do the legwork that I have no bandwidth to do myself. I have to supervise it and help it along, checking for mistakes, but I do get useful results in the end.

Attitudinally, I suspect the people who get value out of LLMs (paid ones; the free ones are no good) are those with experience supervising interns or mentoring juniors, rather than grizzled lone individual contributors who never learned how to coax value out of people -- a camp I was in myself for most of my early career.


> ... that I have no bandwidth to do myself.

One of the most interesting aspects of this thread is how it circles back to the fundamentals of attention in machine learning [1]. That is the key point: humans have intelligence, but our attention is inherently limited, which is why the idea behind Attention Is All You Need [2] is so relevant to what we're discussing.

My 2 cents: our human intelligence is the glue that binds everything together.

[1] https://en.wikipedia.org/wiki/Attention_(machine_learning)

[2] https://en.wikipedia.org/wiki/Attention_Is_All_You_Need
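
For anyone who hasn't dug into [2], here is a minimal sketch of the scaled dot-product attention it describes, in plain NumPy. The function name and the toy shapes are my own, purely for illustration; the point is just that "attention" here is a weighted allocation of a fixed budget over the inputs, which is what makes the analogy to limited human attention apt.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Q, K of shape (seq_len, d_k); V of shape (seq_len, d_v)."""
        d_k = Q.shape[-1]
        # Similarity of each query to every key, scaled to keep gradients sane
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax over keys: each query spreads a fixed budget of weight across the inputs
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Output is a weighted mix of the values
        return weights @ V

    # Toy example: 3 tokens with 4-dimensional keys/values
    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)

In the actual transformer, Q, K, and V come from learned projections of the token embeddings, but the core mechanism is just this weighted sum.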



