Hacker News new | past | comments | ask | show | jobs | submit login

Does it really? If you want an LLM to edit code you need to feed it every single line of code in a prompt. Is it really that surprising that having just learnt it has been timed out, and then seeing code that has an explicit timeout in it, it edits it?? This is just a claim about the underlying foundational LLM since the whole science thing is just a wrapper.

I think this bit of it is just a gimmick put in for hype purposes.




Join us for AI Startup School this June 16-17 in San Francisco!

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: