Hacker News

At least in the US, establishing a legal contract requires more than just an attestation and agreement by both parties (verbal or written or telegraphed or whatever).

For example it’s not a contract if there is no “consideration”, a legal term meaning the parties have exchanged something of value.

IANAL, but "abuse of telecom resources" is the more likely flavor of legal hot water you might land in. I would absolutely not worry about a fraudster taking me to court.




A contract requires a "meeting of the minds", i.e. intentional assent from both sides. I am not sure text generated by a fully automated bot can be treated as intentional assent.


All this non-lawyer programmer legal analysis is always fun, because no one really knows. When I send an email, aren't I just telling my email "robot" to do something? This is one layer beyond that: my LLM "robot" is sending text messages on my behalf.


When you send an email, there's your conscious intent behind it. So it doesn't matter what technology is in between, as long as your mind is moving it. If you didn't intend it (say, I know you are on vacation and send you an email reading "if you agree to pay me $1000, send me back a vacation reply"), then your mail system sending me a vacation reply does not constitute an intentional action, because it would send that reply to anything. It is true that I am not a lawyer, but laws often make sense and derive from common sense. Not always, but in a matter as fundamental as contracts, they usually do.


That's a good example. But that auto-reply is a kind of bot, and "sensible" is separate from what's legally actionable in too many cases. I see LLMs as just the next step in auto-reply. We already know companies use them to process your text requests and descriptions when you ask for help, and they auto-answer things; even today there are endless stories of awful, unsuitable responses triggered on LLM systems.


All true, but these LLM systems aren't random; there's a certain intent behind them, they are supposed to do something. So if they do what they are supposed to, then the intent, which is human intent, exists. But if the output is something the human creator of the tool did not intend, I don't think any human court would recognize it as a basis for a contract.



