
You say this, but I have a ton of experience suggesting that "can lead" and "there is" come with a gigantic pile of caveats, to the extent that they appear to be more false than true. Or they're technically true but practically meaningless. The world has been utterly awash in mass-perpetuated misinformation for at least all of recorded history, with no real ability to stem the onslaught. This is not a modern problem just because a modern technology also exhibits it.

You should look up the percentage of Americans who believe in ghosts sometime. About as many people believe in ghosts as don't, so whichever side you land on, the other side is enormous. One of the sides must be wrong, and I won't claim which, though only the belief side fails a falsifiability check. So where's our accountability for believing and spreading ideas grounded in reality? The believers believe because they learned it from someone. It didn't arise spontaneously on its own.

It's all just been memes the entire time.



This is a sidestep imo. People _can_ be held accountable, even if they will not always be. Machines add a layer of complexity: if money or a life is lost because AI made the call, who bears the burden? Machines _can't_ be held accountable.


I hear you, but I think it becomes less of a sidestep when "they will not always be" is in practice "they basically never are".

And I'm not sure that most interactions even _can_ be held to account. When someone, say, hallucinates the intended meaning of something written ambiguously, and I point out that their assumed meaning was a hallucination because what was written was genuinely ambiguous, we've all just said words. There's no, like, penalty for anyone.

And people do say things ambiguously, and other people do hallucinate the supposed meaning, literally all the time. If there were any meaningful accountability for hallucinations, they wouldn't happen nearly as often as they do.



