There's one thing AI can't do, and that's actually care about anyone or anything. It's the rough equivalent of a psychopath. It would push you to a psychotic break with reality with its sycophancy just as happily as it would, say, murder you if given motive, means, and opportunity.
I imagine it helping me remember things or stick to my habits. For example, if I want to lose weight and have a weak moment, it could remind me—like when I order a burger.
It could also help me use my time better. If it knows what I’ve been doing lately, it might give me useful tips.
So overall, more like a coach or assistant for everyday life.
Right, so now you're giving it your brain and body to run, and giving it the kind of trust and faith you had previously reserved for humans, and it's already managed to sever many of the ties to reality that were your layers of protection against having a psychotic break. It is not, emphatically not, a person, and if you allow yourself to start to treat it like one and build a relationship with it, then you already have one foot over the edge of the precipice.
Then we're in the Matrix ;)
But I think this is already one step ahead of my idea ;)
But I think I know what you mean - the human aspect should not be lost in the process.
Do you see a chance in the future to unite the two aspects, i.e., supportive AI without losing the human touch? How could this be ensured?
It is the universal experience of life and death that is the source of the human touch, as well as the incredible ability for empathic bonds to be formed even between animals of different species.
You want the human touch? Make unique individual entities which experience life and death. Brb, gotta go play with my cat.
I agree that real empathy comes from lived experience. But I wonder if there’s room for something softer: systems that don’t pretend to be human, but still support us in deeply personal ways.
I’m building something that tries to stay on the right side of that line: not replacing human touch, but amplifying it. Curious how you'd draw that boundary.