"I can imagine a future where some future model is better at proving theorems than any human mathematician" Please do not overestimate the power of the algorithm that is predicting next "token" (e.g. word) in a sequence of previously passed words (tokens).
This algorithm will happily predict whatever it was fed. Just ask ChatGPT to write a review of a non-existent camera, car, or washing machine: you will receive a nicely written list of the item's advantages, even though the item does not exist.
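To make "predicting the next token" concrete, here is a deliberately tiny sketch, assuming nothing about ChatGPT's actual architecture: a bigram counter that always continues with whatever followed most often in its training text, with no notion of whether the camera it describes exists. The corpus and generation length are made up for illustration.

```python
# Toy next-token predictor: a bigram frequency model (an illustrative
# stand-in, NOT how ChatGPT actually works).
from collections import Counter, defaultdict

corpus = "the camera has a great lens the camera has a long battery life".split()

# Count which word follows which in the training text.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the continuation seen most often in the training data."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else "<unk>"

# Generate a short continuation one token at a time, trusting the data
# blindly: there is no check that the described product is real.
text = ["the"]
for _ in range(6):
    text.append(predict_next(text[-1]))
print(" ".join(text))  # e.g. "the camera has a great lens the"
```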
I can also write you a review of a non-existent camera or washing machine. Or anything else you want a fake review of! Does that mean I'm not capable of reasoning?
If you are not capable of distinguishing truth from falsehood, and not capable of the reflection that drives learning from past mistakes, then yes.