
> llama.cpp already compiles and runs pretty much everywhere.

Well, it simplifies things when you don't need to compile anything.

Also, you literally can't download or compile the wrong binary by mistake: it's the same binary across the entire Cartesian product of supported processors and OSes.

> Why introduce one more container?

It makes distribution more convenient: the executable and the model weights travel together as a single file.

`application/zip` is also a ubiquitous standard. I doubt anyone is being "introduced to it".

I also appreciate the fact that tooling for handling `application/zip` is very widespread, so you don't need totally bespoke tooling to retrieve the models from inside a `llamafile`.
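The reason stock zip tooling works here is that a llamafile is an executable with a zip archive appended, and the zip format's central directory lives at the *end* of the file, so standard readers ignore whatever comes before it. A minimal sketch (the fake executable prefix and the `model.gguf` member name are made up for illustration; real llamafiles embed actual weights):

```python
import io
import zipfile

# Pretend this is the machine-code portion of a llamafile.
# (Purely illustrative bytes, not a real executable.)
fake_executable_prefix = b"\x7fELF...pretend this is machine code..."

buf = io.BytesIO()
buf.write(fake_executable_prefix)

# Appending in mode "a" to a non-zip file creates a new zip archive
# after the existing content -- the same trick self-extracting
# executables (and llamafile) use.
with zipfile.ZipFile(buf, "a") as zf:
    zf.writestr("model.gguf", b"pretend these are model weights")

# Off-the-shelf zip tooling can still read the combined blob, because
# readers locate the central directory by scanning from the end.
combined = buf.getvalue()
with zipfile.ZipFile(io.BytesIO(combined)) as zf:
    print(zf.namelist())              # ['model.gguf']
    weights = zf.read("model.gguf")
```

The same property is why a plain `unzip -l` on a llamafile lists its contents even though the file starts with executable code.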

> Who benefits from binary distribution of this sort?

Anyone who doesn't have a compiler toolchain on their computer.
