
> fireworks, cannons, jellyfish squeezing water out to accelerate, no studies of orbits from moons and planets, no chemistry experiments, no inspiration from thousands of years of flamethrowers

Fireworks, cannons, chemistry experiments, and flamethrowers are all human inventions.

And yes, exactly! We studied the orbits of moons and planets. We studied animals like jellyfish. We chose to observe the world: we extracted data, experimented, saw what worked, refined, improved, and succeeded.

LLMs are not capable of observing anything. They can only regurgitate and remix the information they are fed by humans — by us, because we can observe.

An LLM trained on 100% wrong information will always return wrong information for anything you ask it.

Say you train an LLM with the "knowledge" that fire can burn underwater. It "thinks" the step-by-step instructions for building a fire are to pile wood and then pour water on the wood. It has no conflicting information in its model, and it cannot go try to build a fire this way and observe that it is wrong. It is a parrot: it repeats the information you give it. At best it can find relationships between data points that humans haven't realized might be related.

A human could easily attempt this, realize it doesn't work, and learn from the experience. Humans are not simply parrots. We are capable of exploring our surroundings and internalizing things without needing someone else to tell us how everything works.

> That is, nobody taught humans to split the atom and then humans literally parroted the mechanism and did it, but you attempting to present splitting the atom as a thing which appeared out of nowhere and not remixing any existing concepts is, in your terms, absolute drivel

Building on the work of other humans is not parroting.

You outlined the absolute genius of humanity building from first principles all the way to splitting the atom, and you still think we're just parroting.

I think we disagree what parroting is entirely.




Your point is contingent on sensor availability to an LLM. LLMs are a frozen human mind until they behave like live ML algorithms.




