Hacker News

Firstly: this can probably be neither proven nor disproven, but my intuition tells me that it's by definition impossible for mathematics to run out of problems.

Secondly:

> It cannot remain healthy with its incredible publication rate today of mostly useless generalizations.

So the issue isn't that mathematics is running out of problems. The issue is that there are more publications than there are new problems being discovered / solved, and, ergo, the majority of publications are of limited value / interest. And that isn't an issue unique to mathematics, that's just how academic research is in the 21st century!



My definition of "running out of problems", as I stated, was "running out of problems that more than a handful of people care about", and I think this is definitely true of math.


I don't want to be snarky, but I do seriously wonder how many people really cared about calculus when Newton/Leibniz developed it. It honestly couldn't have been more than a handful, because Newton sat on it for the better part of twenty years.

I honestly think Math as a field has always been defined by "problems that only a handful of people care about".

The only exception I can think of is maybe basic addition and multiplication.


Yes, exactly. A whole lot of math problems were not interesting until we got relatively fast digital computers, since solving them by hand would have taken until the end of the universe. Then suddenly they become something you can implement in a library and use to run simulations of reality. Now a lot more people are interested in the math, because improving the algorithm could lead to millions in power savings, or to far higher accuracy.

Simply put, it's very hard to predict the usefulness of math at the time it's created.



