I specifically gave a reason why boats and swimming are an entirely different situation. Because the incentives differ, AI can take away opportunities for people to learn math the old-fashioned way, but boats did not do that to swimming precisely because the incentive for swimming (moving through water without a boat) is different. I added that in an edit before I saw your comment, though.
Those reasons exist for learning math: mental fitness, personal enjoyment, sport, etc.
But you’ve subtly changed your argument: before, you were arguing that the beauty was in creating mathematics, not merely in learning already-written mathematics.
My exact point is that learning surmathematics (math taken further by AI) is its own interesting pursuit, one that appeals to my sense of aesthetics and adventure more than piddling around merely to be able to say it was all done by human hands.
I’m not following where you believe the swimming-and-boat analogy breaks down: there are still the same personal reasons to learn and do mathematics as there are to swim; but learning surmathematics is an adventure to a whole new land.
- - - Edit - - -
Responding to a sibling comment as well:
> I am arguing that we should limit our intellectual journey, to preserve the humanistic aspects of the journey. That is exactly my position.
That’s exactly what I compared to swimming rather than boats — because you won’t reach the same places and it’s done for aesthetic reasons.
Some people (e.g., myself) want the surmathematics adventure.
> That’s exactly what I compared to swimming rather than boats — because you won’t reach the same places and it’s done for aesthetic reasons.
For some reason I can't explain, I do believe that people still value personal physical achievements even when machines can do them better, but the same is not true of mental achievements. I take it as an axiom.
> Some people (e.g., myself) want the surmathematics adventure.
That is where we fundamentally differ, again axiomatically. I think it's offensive. But even if you do like it, it will eventually lead down a path where AI is doing mathematics so well that no one will have much chance of understanding what it is doing at all. And that ultimate conclusion, or even a probable chance of it, is enough reason to scrap the whole thing.