Hacker News

This is called prompt engineering. If you find a popular, frequently repeated code snippet and then fashion a prompt tailored to that snippet, then yes, the NN will recite it verbatim, like a poem.

But that doesn't mean it's the only thing it does or even that it does it frequently. It's like calling a human a parrot because he completed a line from a famous poem when the previous speaker left it unfinished.

The same argument was brought up against GPT too and has long been debunked. The authors (and others) checked generated samples against the training corpus and found that the model only rarely copies verbatim unless you prod it to.
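The check described above, comparing generated samples against a training corpus for verbatim overlap, is typically done with n-gram matching. Here is a minimal, hypothetical sketch of that idea (the function names and the whitespace tokenization are illustrative assumptions, not the authors' actual pipeline):

```python
# Hypothetical sketch: estimate how much of a generated sample appears
# verbatim in a training corpus, using n-gram overlap. Real dedup/copying
# analyses use proper tokenizers and much larger corpora; this only
# illustrates the basic technique.

def ngrams(tokens, n):
    """Return the set of all contiguous n-grams of a token list."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def overlap_ratio(sample, corpus_docs, n=8):
    """Fraction of the sample's n-grams found verbatim in the corpus."""
    corpus_grams = set()
    for doc in corpus_docs:
        corpus_grams |= ngrams(doc.split(), n)
    sample_grams = ngrams(sample.split(), n)
    if not sample_grams:
        return 0.0  # sample shorter than n tokens: nothing to compare
    return len(sample_grams & corpus_grams) / len(sample_grams)
```

A sample that merely paraphrases the corpus scores near zero, while a recited snippet scores near one, which is why a low average ratio across many unprompted samples supports the "rarely copies" claim.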



I don't know if I agree with your argument about GPT-3, but I think our disagreement is beside the point: if your human parrot did that, they would get sued for it (not just in theory but in actual fact; see all the cases of this in the music industry), even if they claimed they didn't mean to and it was merely a deeply entrenched memory.


The point is that many of the examples you see are intentional, produced through prompt engineering. The pilot asked the copilot to violate copyright, and the copilot complied. Don't blame the copilot.

There also are cases where this happens unintentionally, but those are not the norm.





