Hacker News

I built the Fluid app with exactly that in mind. You can run local AI on a Mac without really knowing what an LLM or Ollama is. Plug and play.

Sorry for the blatant ad, though I do hope it's useful for some people reading this thread: https://getfluid.app




Probably not the best choice of names: https://fluidapp.com. I don't think it's been updated in a while, but it still works nicely.


I'm interested, but I can't find any documentation for it. Can I give it local content (documents, spreadsheets, code, etc.) and ask questions?


> Can I give it local content (documents, spreadsheets, code, etc.)

It's coming roughly in December (maybe sooner).

The roadmap is as follows:

- October - private remote AI (for when you need a smarter AI than your machine can handle, but don't want your data logged or stored anywhere)

- November - web search capabilities (so the AI can do web searches out of the box)

- December - PDF, docs, and code embedding.

- 2025 - tighter macOS integration with context awareness.
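For readers unfamiliar with what "document embedding" means in this roadmap: the usual approach is to map each document to a vector and answer a query by finding the nearest document vector, then feeding that document to the model as context. Fluid's actual implementation isn't public, so this is only a toy illustration of the retrieval idea; real apps use a neural embedding model, while this sketch substitutes a hashed bag-of-words vector so it runs with nothing but the standard library.

```python
# Toy sketch of embedding-based document retrieval: embed each document
# as a vector, then return the document closest to the query by cosine
# similarity. The hashed bag-of-words "embedding" is a stand-in for a
# real neural embedding model.
import math
from collections import Counter

DIM = 256  # fixed vector size for the hashed bag-of-words

def embed(text: str) -> list[float]:
    """Map text to a unit-length vector via hashed word counts."""
    vec = [0.0] * DIM
    for word, count in Counter(text.lower().split()).items():
        vec[hash(word) % DIM] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query: str, docs: dict[str, str]) -> str:
    """Return the name of the document most similar to the query."""
    q = embed(query)
    def score(name: str) -> float:
        d = embed(docs[name])
        return sum(a * b for a, b in zip(q, d))  # cosine similarity
    return max(docs, key=score)

# Hypothetical local files, represented here by their extracted text.
docs = {
    "budget.xlsx": "quarterly budget revenue expenses forecast",
    "parser.py": "python code tokenizer parser syntax tree",
    "notes.pdf": "meeting notes action items deadlines",
}
print(retrieve("show me the revenue forecast", docs))
```

In a real pipeline the top-scoring document (or chunk of it) would then be passed to the local LLM as context for answering the question; the vector index also needs to be persisted and updated as files change, which is where most of the engineering effort goes.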


Oh awesome, thank you! I will check back in December.




