
My take is that science is defined by the scientific method:

- we want to be able to understand and predict things

- we can come up with various explanations that might be able to do this

- if our explanation fails to predict an observation, it is wrong and we need to update or replace it

If you're doing that, you're doing science.




This is very close to Richard Feynman's explanation of the scientific method: https://youtu.be/kBqemHR49-c?t=33

" Guess -> Compute Consequences -> Compare to Nature / Experiment / Experience / Observation

If it disagrees with experiment, it's wrong. In that simple statement is the key to science."
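
Feynman's loop is nearly pseudocode already. Here's a toy Python sketch of it; the hidden law, the tolerance, and the blind uniform guess() strategy are all made up for illustration, since real guessing is anything but random sampling:

    import random

    def guess():
        # Propose a hypothesis: here, a candidate slope for a hidden linear law.
        return random.uniform(0, 10)

    def compute_consequences(h, x):
        # Work out what the hypothesis predicts for input x.
        return h * x

    def observe(x):
        # Nature's answer: hidden law y = 3x, plus measurement noise.
        return 3 * x + random.gauss(0, 0.01)

    h = guess()
    for _ in range(10000):
        x = random.uniform(1, 10)
        if abs(compute_consequences(h, x) - observe(x)) > 0.1:
            # Disagrees with experiment: the guess is wrong, so replace it.
            h = guess()

    # Whatever survives is not "proven", merely not yet falsified.
    print(f"surviving guess: {h:.3f}")  # typically ends up near 3

The point of the sketch is the asymmetry: a single disagreement kills a guess, while any number of agreements only lets it survive.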


In a universe with spherical cows, maybe. But theories we view as true (at least within their scope) often fail to predict observations. That in itself is fine: we modify them with ad hoc hypotheses to round off the rough edges. The trouble is that all theories, true or false, can be patched this way. Copernican astronomy, for instance, was initially less predictive than Ptolemaic astronomy; a simple heuristic of rejecting whichever theory generates more incorrect predictions would have left us committed to a Ptolemaic universe. And yet we moved.


I'm not saying we should follow a purely incremental optimization path, which leads to local maxima. Thinking outside the box is clearly important for generating new ideas to test. We're not done until there's nothing left that we fail to predict.

I would say some of the ancients did science too, within the limits of what they knew.


> But theories we view as true (at least within their scope) often fail to predict observations. That in itself is fine: we modify them with ad hoc hypotheses to round off the rough edges.

This passage reminds me of how neural nets learn and often fail to generalise.
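
A minimal sketch of that failure mode, with polynomial regression standing in for a neural net (the sin(x) "law", the noise level, and the degree are arbitrary choices for illustration): give a model enough free parameters and it will "explain" every observation it has seen, the way ad hoc hypotheses do, while predicting unseen points badly.

    import numpy as np

    rng = np.random.default_rng(0)

    # Ten noisy observations of a simple hidden law, y = sin(x).
    x_train = np.linspace(0, 3, 10)
    y_train = np.sin(x_train) + rng.normal(0, 0.1, x_train.size)

    # A degree-9 polynomial has enough parameters to pass through all ten
    # points -- stacking ad hoc hypotheses until every known observation fits.
    coeffs = np.polyfit(x_train, y_train, deg=9)

    # Train error is near zero; error at points between the observations
    # (the model's actual predictions) is far larger.
    x_test = np.linspace(0, 3, 200)
    train_err = np.abs(np.polyval(coeffs, x_train) - y_train).mean()
    test_err = np.abs(np.polyval(coeffs, x_test) - np.sin(x_test)).mean()
    print(f"train error {train_err:.4f}, test error {test_err:.4f}")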



