Hacker News

Thank you! I’m currently working on supporting local LLMs via llama.cpp, so cost won’t be an issue anymore.



Given that the Ollama API is OpenAI-compatible, that should be a drop-in replacement, no?
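(For context: "drop-in" here means an OpenAI-style client only needs its base URL pointed at Ollama's local endpoint, which by default serves an OpenAI-compatible API on port 11434; the request body is unchanged. A minimal stdlib sketch of the wire format — the model name `llama3` is just an example:)

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint (default local port).
OLLAMA_BASE = "http://localhost:11434/v1"

def chat_request(prompt, model="llama3"):
    """Build (but do not send) an OpenAI-style chat completion request."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("Hello")
# urllib.request.urlopen(req) would send it once a local Ollama server is running.
```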


Not really, I believe it’s missing function calling

Edit: and grammar support as well
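(For reference: the "grammar" feature in question is llama.cpp's GBNF constrained decoding, which forces the model's output to match a formal grammar; the OpenAI-style API had no field for it at the time. A minimal sketch of a GBNF grammar file restricting output to a yes/no answer:)

```gbnf
# llama.cpp GBNF grammar: constrain generation to exactly "yes" or "no"
root ::= "yes" | "no"
```

In llama.cpp this kind of file is passed to the CLI via its grammar options (e.g. `--grammar-file`).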


Ahh yeah gotcha



