
The binaries themselves are available standalone at https://github.com/Mozilla-Ocho/llamafile/releases


Cool, this is more convenient than my workflow for building the binaries myself. I currently use make to build the llama.cpp server binary on my Intel iMac and on my M1 MacBook, then lipo the two together into a universal binary.
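
For context, here is a minimal sketch of that kind of universal-binary workflow. The target name, file names, and paths are assumptions for illustration (not the commenter's exact commands), and it assumes llama.cpp's Makefile exposes a server target on both machines:

    # On the Intel iMac (x86_64): build and keep a copy of the binary
    make -j server
    cp server server-x86_64

    # On the M1 MacBook (arm64): same build, different architecture
    make -j server
    cp server server-arm64

    # After copying both builds onto one machine, merge them into a
    # single fat (universal) Mach-O binary:
    lipo -create server-x86_64 server-arm64 -output server-universal

    # Confirm both architectures are present in the result:
    lipo -info server-universal

The resulting fat binary runs natively on either architecture, so one file can be shared between the Intel and Apple Silicon Macs.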



