Hacker News

That writing is the only way to do deep, clear thinking simply isn't true.

Stephen Hawking is the first example that comes to mind.

He developed a remarkable ability to perform complex calculations and visualize intricate mathematical concepts entirely in his mind. He once mentioned that his ALS diagnosis, which limited his physical abilities, led him to focus intensely on theoretical physics, as it required more intellectual than physical effort.

But sure, writing (and drawing) is a great tool to aid in deep thinking. So are AI tools.




I think you have understood "writing" in a very narrow sense. As mentioned in other replies, Stephen Hawking was a very prolific author. He did not physically write much, but he sure knew how to write.

PG is obviously talking about the mental process of writing, i.e. of organizing a complex network of thoughts in a linear hierarchy that others can grasp, not the physical one.


> That writing is the only way to do deep, clear, thinking simply isn't true.

You're correct here.

> Stephen Hawking is the first example that comes to mind.

The post is obviously speaking of the general population, or at best the average professional. In my opinion, choosing one of the most brilliant and exceptional scientific minds of our lifetime is not a good counterargument to a piece about a potential problem with society at large.


As someone who teaches PhD students, who are quite far beyond the "average professional", I concur completely with PG on this one. Writing forces you to make concrete the ideas that feel like they make sense but are still fuzzy. It's certainly not the only way, but it's the most common and easiest way.


To use an overextended computer metaphor: serializing data structures to a wire format forces lazy evaluation, turning up any errors that were previously hidden by laziness.
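A minimal Python sketch of that metaphor (the lazy pipeline and the buggy record are hypothetical, just to illustrate the point):

```python
import json

# A lazy pipeline: the division by zero below is not evaluated yet,
# so the bug stays hidden for as long as nothing consumes the generator.
records = ({"id": i, "ratio": 10 / (i - 2)} for i in range(5))

# Serializing to a wire format forces every element to be evaluated,
# and the latent error finally surfaces.
try:
    json.dumps(list(records))
except ZeroDivisionError:
    print("error surfaced only at serialization time")
```

The same effect shows up with any eager sink: it is the act of forcing the whole structure into a concrete form that flushes out the hidden error.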


I don't disagree, just want to mention that, as someone married to someone who supervises PhD students: they're not by any means "far beyond average professional"... but perhaps you're on an exceptionally highly regarded faculty where that may be the case.


That is probably the case.


One of the most exceptional scientific minds of the time, who, I might add, despite not picking up a pen, nevertheless wrote books!

A strange example to pick of someone who did not write.


Best-selling author Stephen Hawking?


A Brief History Of Time?


Reading and writing are essential for the transfer and percolation of knowledge across society.

Stephen Hawking's thinking and imagination wouldn't have meant much had he not finally penned them for others to read, and neither would his ideas have been taken seriously had he chosen to explain them in TikToks or podcasts instead.


> That writing is the only way to do deep, clear, thinking simply isn't true.

You have committed the Fallacy of the Inverse.


But for most of the rest of us in practice I suspect that it is more true than false.

Most of us have neither the intellect of Hawking nor his situation.


It is weird that he doesn't think of AI as a deep-thinking tool at all.

Sure, some will thoughtlessly copy and paste, but for many, AI helps to structure their thoughts, and they think more clearly as a result.


I think what he's getting at is that while you CAN use an AI to assist with "ideation," we will inevitably create new, low-paying jobs where there is no ideation and the employee just operates an AI, because of economics. That will in turn create a large cohort within society who are functionally illiterate. Literacy profoundly alters the brain for the better, and this won't happen to those people.


Can you expand on that? I can't see any sense in which an LLM improves the structure of the user's thought process.


It's useful for ideation: suggesting ideas and concepts that you might not think of. A bit like a conceptual thesaurus. But it doesn't replace the hard work of thinking for yourself.


In the same way, it can detour your thoughts toward the mainstream and lead you away from a line of thought that might have ended up somewhere else.


Here are two examples:

a) No / little data: Whenever you are starting to think about a subject, you can ask it to give you a structure / categories.

b) Existing data: What I do very often is give it a lot of "raw data", like unstructured thoughts or an unstructured article, and then ask it to find suitable top-level categories.
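As a rough sketch of that second workflow (the helper name, the notes, and the prompt wording are all hypothetical; the resulting string would be sent to whatever chat-capable LLM API you use):

```python
def build_category_prompt(raw_notes: list[str]) -> str:
    """Wrap unstructured notes in a prompt asking an LLM for top-level categories."""
    bullets = "\n".join(f"- {note}" for note in raw_notes)
    return (
        "Below are my unstructured notes:\n"
        f"{bullets}\n\n"
        "Suggest 3-5 top-level categories that organize them, "
        "and list which notes fall under each."
    )

prompt = build_category_prompt(
    ["writing aids thinking", "AI as a structuring tool", "risk of bias"]
)
print(prompt)
```

The point is only that the human supplies the raw material and the model proposes a structure; the judgement about whether the categories fit stays with you.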


a) Doesn't that mean limiting oneself to bias, mediocrity, and preconceived judgements instead of actually thinking?


Well, it helps me get started. No more, but also no less.

For me it’s very important to emphasize that AI is a tool. You have to use it responsibly. But there is no reason not to use it.


I see; I don't want to shame this kind of use. It's almost like briefly talking something over with an educated person.

Until it's not.

I'm not the type who'd say "don't use AI". Use whatever works. I myself became really fascinated by transformer LLMs / GPTs in the winter of 2019, then again when ChatGPT was released, and for a good few months after that.

It's just that my interest and enthusiasm have almost vanished by now. Surely they will reemerge at some point.


Very good point. I often use AI to see things from multiple points of view. It is a good tool to check if you have included obvious things in your argumentation. Spell checking is just one of those obvious things.



