> Backpropagation has zero creativity. It's an elaborate mechanical parrot, and nothing more. It can never relate to you on a personal level, because it never experiences the world. It has no conception of what the world is.
The problem is: it's not really clear how much creativity we have, and how much of it is better explained by highly constrained randomized search and optimization.
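To make "constrained randomized search" concrete, here's a minimal sketch (my own toy, a classic "weasel"-style hill climb, not anything the original commenter specified): pure random mutation plus a selection constraint is enough to produce output that looks directed.

```python
import random

TARGET = "methinks it is like a weasel"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate: str) -> int:
    # Count positions that already match the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    # Randomly perturb each character with small probability.
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in candidate
    )

def search(seed: int = 0) -> str:
    # Blind variation, constrained by "keep it only if it's no worse".
    random.seed(seed)
    best = "".join(random.choice(ALPHABET) for _ in TARGET)
    while fitness(best) < len(TARGET):
        challenger = mutate(best)
        if fitness(challenger) >= fitness(best):
            best = challenger
    return best

print(search())
```

No step in the loop "understands" the sentence, yet the trajectory of outputs reads like goal-directed refinement, which is the point of the comparison.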
> It can never relate to you on a personal level
Well, sure. Even if/once we reach AGI, it's going to be a highly alien creature.
> because it never experiences the world.
Hard to put this on a rigorous footing.
> It has no conception of what the world is.
It has imperfect models of the world it is presented with. So do we!
> At least a dog gets hungry.
I don't think "gets hungry" is a very meaningful way to put this. But, yes: higher living beings act with agency in their environment, whereas most deep learning AIs we build don't; they interact through rigid, discrete steps and form no memory of the interaction. Living beings also have mechanisms to seek novelty in those interactions. I don't view these as impossible barriers to leap over.
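The "no memory of the interaction" point can be sketched in a few lines (my framing, hypothetical names): a stateless model call versus an agent whose internal state is changed by each interaction.

```python
def stateless_model(prompt: str) -> str:
    # Each call is independent; nothing persists between interactions.
    return prompt.upper()

class MemoryAgent:
    """Toy agent that accumulates a history of its interactions."""

    def __init__(self) -> None:
        self.history: list[str] = []

    def respond(self, prompt: str) -> str:
        # The interaction itself changes the agent's future behavior.
        self.history.append(prompt)
        return f"{prompt.upper()} (turn {len(self.history)})"

agent = MemoryAgent()
agent.respond("hello")
agent.respond("hello")  # same input, different internal state
```

Most deployed models behave like the first function between training runs; the agency-and-novelty-seeking the comment describes requires something closer to the second.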