Spoken like someone who hasn’t spent hours trying to get LocalAI to build and run, only to find out that while it’s “OpenAI API compatible!” it doesn’t support streaming, so the Mattermost OpenAI plugin doesn’t work. I finally gave up and went back to ooba (which also didn’t work with the MM plugin… hmm). Next time I’ll just hack something on the side of llama.cpp.
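
For anyone who wants to check this before burning an evening on it: here’s a minimal Python sketch that probes an “OpenAI API compatible” server for streaming support. It assumes the server is on localhost:8080 (LocalAI’s default port) and that some model is loaded; the model name is a placeholder, not a confirmed value.

```python
# Probe an OpenAI-compatible endpoint for SSE streaming support.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "ggml-gpt4all-j",  # placeholder; use whatever model you loaded
        "messages": [{"role": "user", "content": "Say hi"}],
        "stream": True,  # ask for streamed chunks, like the MM plugin does
    },
    stream=True,  # don't buffer the whole response body
    timeout=60,
)
resp.raise_for_status()

# A server that actually streams replies with Server-Sent Events:
# repeated "data: {...}" lines ending in "data: [DONE]".
# A server that ignores "stream" dumps one JSON blob instead.
for line in resp.iter_lines(decode_unicode=True):
    if line:
        print(line)
```

If all you get back is a single JSON object instead of a series of `data:` lines, any client that hard-requires streaming (like the Mattermost plugin, apparently) is going to choke.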