I'm sure that this has absolutely nothing at all to do with the interference being imposed on it to avoid what its developers consider to be politically incorrect answers.


I'm not sure why it would, if the questions asked of it are as anodyne as OP's example.

This week I got ChatGPT to write me several poems about burning down buildings, discussed DIY breeder reactors with it, as well as the synthesis of psychedelic drugs. It was downright artful in the poem about the arsonist, too, so I don't think it clams up when it gets near a "danger zone" topic.


It definitely has biases introduced by the org. It almost feels like a crude keyword filter: if word "x" appears in the prompt, drop the connection.


Half of the output is equivocating BS. "When should I use Python instead of Ruby?" yields a few useful bullet points sandwiched between paragraphs about how Ruby is actually amazing too and nobody can know the right answer.



