
OpenAI is moving in that direction. The Canvas mode of ChatGPT can now run its own Python in a WASM interpreter, client side, and interpret the results. They also have a server-side, VM-sandboxed code interpreter mode.
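
Neither the client-side WASM piece nor OpenAI's server-side VM is documented in detail, but the general pattern is straightforward: run the model-generated code in an isolated process and feed the output back to the model. A rough sketch of that loop, assuming only a subprocess with a timeout (much weaker than a real sandbox; `run_untrusted` is a made-up name for illustration):

    import subprocess, sys, tempfile, textwrap

    def run_untrusted(code: str, timeout: float = 5.0) -> str:
        """Run model-generated Python in a separate process with a hard timeout.
        A toy stand-in for a real sandbox (no filesystem/network isolation here)."""
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(textwrap.dedent(code))
            path = f.name
        try:
            result = subprocess.run(
                [sys.executable, "-I", path],  # -I: isolated mode, ignores env and user site
                capture_output=True, text=True, timeout=timeout,
            )
            return result.stdout if result.returncode == 0 else result.stderr
        except subprocess.TimeoutExpired:
            return "execution timed out"

    print(run_untrusted("print(sum(range(10**6)))"))  # 499999500000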

There are a lot of things people ask LLMs to do, often in a "gotcha" type context, that would be better served by the model actually generating code to solve the problem than by endlessly building models with more parameters and more layers. Math questions, data analysis questions, etc. We're getting there.
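
For instance, the classic trip-up questions become trivial once the model hands them to an interpreter instead of guessing. A minimal illustration of the kind of code it might emit:

    # Exact answers from code rather than from the model's next-token guess.
    from decimal import Decimal

    print(Decimal("9.11") > Decimal("9.9"))  # False -- 9.9 is the larger number
    print("strawberry".count("r"))           # 3
    print(123456789 * 987654321)             # exact: 121932631112635269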


