
One runs the risk of being reductive when examining a mechanism's irreducible parts.

User expression is a beast unto itself, but I wonder if that alone absolves the service provider? I imagine Blizzard has an extensive and mature moderation apparatus to police and discourage such behavior. There's an acceptable level of justice and accountability in place. Yet there are even more terrible real-life examples of illicit behavior outpacing moderation and overrunning platforms to the point of legal intervention and termination. Moderating user behavior is one thing, but how do you propose moderating AI expression?

A digression from copyright: portraying models as a "blank canvas" is itself a poor characterization. Output might be triggered by a prompt, like a query against a database, but it's ultimately a reflection of the contents of the training data. I think we could agree that a model trained on the worst possible data you can imagine is something we don't need in the world, no matter how well-behaved your prompting is.



I do not propose moderating "AI expression" - I explicitly propose otherwise, and further propose mandating that the user be provided with source attribution information, so that they can choose not to infringe, should they be at risk of doing so and find that a concern (or even choose to acquire a license instead). Whether this is technologically feasible, I'm not sure, but it very much feels to me like it should be.

> A digression from copyright: portraying models as a "blank canvas" is itself a poor characterization. Output might be triggered by a prompt, like a query against a database, but it's ultimately a reflection of the contents of the training data.

I'm not sure how to respond to this, if at all; I think I addressed how I characterize the functionality of these models in sufficient detail. This just reads to me like an "I disagree" - and that's fine, but then that's also kinda it. Then we disagree, and that's okay.



