
> because in the end it's just a bunch of code running statistical computations

At a low enough level, our brain seems to be just a bunch of neurons firing impulses at various rates, which can be described as statistical computations. Why be so sure that the right neural network couldn't understand what it is to be human and mortal, understand suffering, have emotional responses, etc.?



Because you have to be human and mortal to understand it well enough to credibly contribute to and share the story of what that means. You can't superficially understand someone's situation and then take ownership of it. You can get a glimpse and really try to empathize, but you can't become the bearer of that experience, only a consumer of it.


>Because you have to be human and mortal to understand it to credibly contribute and share the story of what that means to be.

Setting aside directors, authors, artists, etc., who have already demonstrated this to be false, an AI could conceivably synthesize the experiences of every author who has written on what it means to be human or to face mortality, and produce a story that captures the essence of the experience better than any one person ever could. Having the first-person experience doesn't confer a superior ability to communicate the features of that experience.


> > Because you have to be human and mortal to understand it to credibly contribute and share the story of what that means to be.

> Aside from directors, authors, artists, etc, who have demonstrated this to be false [...]

probably not what you meant, but this sounds like you know some nonhuman/immortal artists :)


Movie directors have never experienced most of what they film, yet they convey those experiences far better than those who have actually lived them. I see no reason to doubt that the same would hold for artificial storytellers.


Yeah, but the AI could pretend it knows.



