
It appears to have custom instructions, given its insistence on responding in "New York direct" style. But wow, no wonder people get addicted to (and love) ChatGPT. I ignore the sycophancy because I've seen terrible hallucinations, but a lot of people blindly believe in "the genius in a box."


That is due to its memory, and the latest update seems to weave it into many conversations (it probably falls into the category of "this is how the guy who lives in NYC wants me to respond: straight talk, no bullshit"). I can see what it remembers, and this is just one stored fact. Funny that it often forgets my oft-repeated correction, which it has saved to memory several times: don't put any emojis in code or comments.

This was late at night and I just wanted to share the surreal experience with HN. The difference here is that I am actually an expert on the things I had it evaluate: I threw some code I've polished over the years at it, to see how it would respond, since LLMs can definitely pattern-match the concepts present in a block of, e.g., code and compare them to everything they've been trained on.

Here is almost exactly the same sequence, but with repeated instructions throughout to be brutally honest and objective: https://chatgpt.com/share/691b4035-0ed8-800a-bee3-ae68e2a63c...



