Hacker News

Will it be possible to run this model family in Ollama?



Mamba is supported in llama.cpp, so it should be. (Edit: apparently it's not strictly the Mamba architecture; it's a mix of Mamba and transformer layers, so it looks like it would have to be ported to llama.cpp first.)



