
So far even the least* "ethical" companies (the ones that raised their initial money by pretending to be open) don't use machine learning on users' psychological profiles to create a perfect hypno-drug-dopamine poison (sorry, but the English word "drug" doesn't really carry the weight it deserves).

And Snapchat, TikTok, Facebook, YouTube, Instagram, and the rest are exactly that. They waste $50-$100 worth of a user's time just to make $0.05 in ads.

LLMs seem to be far from that.

* I might be ignorant of the real picture; please correct me if I'm wrong and there are genuinely evil LLM companies I'm not aware of.



Wait, what? Companies are absolutely doing this at scale; they're just laundering it through third-party content providers, who get shown or buried based on what the user has lingered on in the past. If you haven't seen your relatives' Facebook feeds lately, they're close to 100% LLM-generated slop, gradually tuned to wherever that user's eyeballs have lingered. TikTok isn't quite there yet, but it's trending in the same direction.
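
To make the mechanism concrete, here's a rough sketch of dwell-time-based ranking; the function names, weights, and data shapes are invented for illustration, not anything from a real platform's code:

    # Hypothetical sketch of engagement-tuned ranking.
    # All names and structures here are made up for the example.
    from collections import defaultdict

    dwell_seconds = defaultdict(float)  # topic -> total time the user lingered

    def record_impression(topic, seconds):
        # Every second of lingering feeds back into the user's profile.
        dwell_seconds[topic] += seconds

    def rank_candidates(candidates):
        # Third-party content is scored purely by how long this user
        # lingered on similar content before; nothing about quality or
        # provenance enters the score.
        return sorted(candidates,
                      key=lambda post: dwell_seconds[post["topic"]],
                      reverse=True)

    record_impression("outrage", 45)
    record_impression("recipes", 3)
    feed = rank_candidates([{"topic": "recipes"}, {"topic": "outrage"}])
    # outrage-bait floats to the top

The point is that nothing in the loop cares who made the content, only that it keeps eyeballs on the screen longer next time.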

So for the moment it's platform partners plus filtered selection putting the tuned dopamine poison in front of people, but it's absolutely happening. And eventually the platform owners can and will cut out the middlemen.



