The line is clear: everything today branded "AI" is just applied statistics. AI is a buzzword. I don't know what the definition of intelligence is, but I have a feeling it doesn't rest anywhere near concepts like function approximation, and that's all even the most sophisticated "AIs" at Google or Facebook or Apple boil down to.
What was not clear from your earlier comment, and is now, is that when you say AI you don't mean AI as practiced by most of academia and industry, but the vision of Artificial General Intelligence (AGI). If so, yes, that's a fair point to make. However, it is debatable whether the path of statistical learning won't lead to AGI, whether it is how our brains function, or whether the truth is partly statistical learning and partly something else. The Norvig-Chomsky debate is an example of the arguments on both sides.
I didn't make an earlier comment. You're replying to my one and only comment.
> when you say AI you don't mean AI as is practiced by most of academia and the industry but the vision of Artificial General Intelligence (AGI).
What I actually mean is that the people practicing what they call "AI" in academia and industry have co-opted the name to make what they do sound more interesting. First it was called "statistics". Then it was called "pattern matching". Then it was called "machine learning". Now it's called "AI". But it hasn't changed meaningfully through any iteration of these labels.
If you can define a problem rigorously, you've essentially defined a function. So "function approximation" is basically "general problem-solving approximation".
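To make that concrete, here's a minimal sketch (the divisible-by-3 problem and the hand-picked feature set are my own toy example, not anything from this thread): a rigorously stated problem is literally a function, and the learner only ever approximates that function from sampled input/output pairs.

    import numpy as np

    def problem(n: int) -> float:
        # The rigorously defined "problem", written down as a function.
        return float(n % 3 == 0)

    rng = np.random.default_rng(0)
    xs = rng.integers(0, 1000, size=2000)
    # Features: divisibility by 2..9. The learner never sees the rule above,
    # only these sampled input/output pairs.
    X = np.stack([(xs % k == 0) for k in range(2, 10)], axis=1).astype(float)
    y = np.array([problem(int(n)) for n in xs])

    # Fit a least-squares approximation of the function from the samples.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

    test = np.array([9, 10, 12, 14])
    Xt = np.stack([(test % k == 0) for k in range(2, 10)], axis=1).astype(float)
    print(np.round(Xt @ w))   # ~= problem() on inputs it never saw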
I don't really think that characterization is fair. Take GANs, for example: there is no dataset of correct input/output pairs for the function that is learned.
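Roughly, the setup looks like this (a toy 1-D sketch assuming PyTorch; the architectures, data distribution, and hyperparameters are made up for illustration): the generator is never shown a "correct" output for any noise input z; its only training signal is whether the discriminator was fooled.

    import torch
    import torch.nn as nn

    real_dist = lambda n: torch.randn(n, 1) * 0.5 + 3.0   # unknown target distribution

    G = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # noise -> sample
    D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # sample -> logit
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    for step in range(2000):
        # Discriminator: learn to tell real samples from generated ones.
        z = torch.randn(64, 1)
        real, fake = real_dist(64), G(z).detach()
        loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()

        # Generator: no (z, correct output) pairs anywhere -- only "did I fool D" feedback.
        z = torch.randn(64, 1)
        loss_g = bce(D(G(z)), torch.ones(64, 1))
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()

    print(G(torch.randn(1000, 1)).mean().item())   # drifts toward ~3.0 as G matches real_dist

So the generator ends up approximating a sampler for the real distribution without ever being told what the "right answer" for a given input is.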