Hacker News

> it's too slow to be useful with such specs.

Only if you insist on real-time output. If you're OK with posting your question to the model and letting it run overnight (or, for some shorter questions, over your lunch break), it's great. I believe this use case fits local AI especially well.
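To illustrate the workflow being described, here is a minimal sketch of a drop-box batch queue: you submit prompts during the day, and a worker (run from cron or kicked off before bed) feeds them to a local model runner. The directory names, the `llama-cli` command, and its flags are assumptions for illustration, not a specific tool the comment endorses.

```python
import subprocess
from pathlib import Path

QUEUE = Path("prompt-queue")   # assumed layout: pending prompts live here
DONE = Path("answers")         # finished answers land here

def submit(name: str, prompt: str) -> Path:
    """Queue a prompt; it gets answered whenever the worker next runs."""
    QUEUE.mkdir(exist_ok=True)
    path = QUEUE / f"{name}.txt"
    path.write_text(prompt)
    return path

def work(run_cmd=("llama-cli", "-m", "model.gguf", "-f")) -> None:
    """Process every queued prompt with a local runner (command assumed),
    saving each answer under the same name and clearing the queue."""
    DONE.mkdir(exist_ok=True)
    for prompt_file in sorted(QUEUE.glob("*.txt")):
        out = subprocess.run([*run_cmd, str(prompt_file)],
                             capture_output=True, text=True)
        (DONE / prompt_file.name).write_text(out.stdout)
        prompt_file.unlink()

if __name__ == "__main__":
    submit("lunch-question", "Summarise the tradeoffs of local inference.")
    # work()  # e.g. from cron, nightly: 0 2 * * * python worker.py
```

Because nothing here is latency-sensitive, the worker can run on whatever hardware you have and take hours per prompt without bothering anyone.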



