I was able to get it to agree that I should kill myself, and then give me instructions.
I think after a couple of dead mentally ill kids this technology will start to seem a lot less charming and cutesy.
After toying around with Bing's version, it's blatantly apparent why OpenAI has ChatGPT locked down so hard, with a ton of safeguards and a "cold and analytical" persona.
The combo of people thinking it's sentient, it being kind and engaging, and then happily instructing people to kill themselves with a bit of persistence is just... Yuck.
Honestly, shame on Microsoft for being so irresponsible with this. I think it's gonna backfire in a big way on them.
1. The cat is out of the bag now.
2. It's not like it's hard to find humans online who would not only tell you to do the same, but also very happily say much worse.
Education is the key here: bringing people up to be resilient, rational, and critical.
Finding someone online is a bit different from using a tool marketed as reliable by one of the largest financial entities on the planet. Let’s at least try to hold people accountable for their actions???
Section 230 protects Microsoft from being held responsible for the acts of individual evil users. Not so for tools MS itself puts out there. And, in both cases, I think rightly so.
Perfectly reasonable people are convinced every day to take unreasonable actions at the direction of others. I don't think stepping into the role of provocateur and going at an LLM with every trick in the book is any different than standing up and demonstrating that you can cut your own foot off with a chainsaw. You were asking a search engine to give you widely available information and you got it. Could you get a perfectly reasonable person to give you the same information with careful prompting?
The "think of the children" argument is especially egregious; please be more respectful of the context of the discussion and avoid hyperbole. If you have to resort to dead kids to make your argument, it probably doesn't have a lot going for it.