
You're making claims about those systems not being autonomous. When we want to, we create them to be autonomous. It's got nothing to do with agency or survival instincts. Experiments like that have been done for years now - for example https://techcrunch.com/2023/04/10/researchers-populated-a-ti...


Yes, because they aren't. Against your fantasy that some might be brought into existence sometime in the future I present my own fantasy that there won't be.


I linked you an experiment with multiple autonomous agents operating continuously. It's already happened. It's really not clear what you're disagreeing with here.


No, that was a simulation, akin to Conway's cellular automata. You seem to consider being fully under someone else's control to qualify as autonomy, at least in certain cases, which to me comes across as very bizarre.


You seem to be talking about some kind of free will and perfect independence, not autonomy as normally understood. Agents can have autonomy within the environment they have access to. We talk about autonomous vehicles, for example, where we still want them to stay within some action boundaries. Otherwise we'd be discussing metaphysics. It's not like we can cross physical/body boundaries just because we've got autonomy.

https://en.wikipedia.org/wiki/Autonomous_robot

> An autonomous robot is a robot that acts without recourse to human control. Historic examples include space probes. Modern examples include self-driving vacuums and cars.

The same idea is used for agents - they're autonomous because they independently choose actions toward a specific or vague goal.
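For what it's worth, that industry sense of "autonomous" can be sketched in a few lines - a sense-decide-act loop where the program, not an operator, picks each action toward a goal, while still being confined to the environment it was given (the function and names here are a hypothetical illustration, not any particular agent framework):

```python
def autonomous_agent(position: int, goal: int, max_steps: int = 100) -> list[int]:
    """Toy 1-D agent: at each step it senses its position and
    independently chooses an action that moves it toward the goal.
    No human is in the loop, yet it operates entirely within the
    bounds of the environment it was handed."""
    trajectory = [position]
    for _ in range(max_steps):
        if position == goal:
            break
        # The agent, not an operator, decides the next action.
        action = 1 if goal > position else -1
        position += action
        trajectory.append(position)
    return trajectory

print(autonomous_agent(0, 3))  # the agent walks itself to the goal: [0, 1, 2, 3]
```

Whether that counts as autonomy or just automation is exactly the disagreement in this thread.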


I don't see the relevance of things that carry their own power supply either, and I still disagree that Conway automata and similar software exhibit autonomy.

I did not mention "free will and perfect independence".


You also carry your own power supply...

I could go into more detail, but basically you tried to call out some weird use of "autonomous" when I'm using the meaning that's an industry standard. If you mean something else, you'll need to define it. Saying you can't be autonomous under someone else's rules raises a serious number of issues to address before you get to anything AI-related.


Well, I disagree that computers exhibit intelligence, yet according to the "industry standard" they do, so in my view that standard does not carry any weight on its own.

Autonomy implies self-governance, not just any form of automaton.


Humans are not physical machines? Please explain.


Depends what you mean by "machine".


There are inputs and outputs. The human body can be analyzed systematically (not necessarily easily). There’s no “magic”.



