
Not really. I run ollama on an AMD Radeon Pro and it works great.

Tooling for training models is a bit more difficult, but inference works great on AMD.
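For reference, here is a minimal sketch of what getting inference running looks like on a ROCm-capable AMD card under Linux. The model tag and the log check are assumptions about a typical setup; ollama ships ROCm support on Linux, so no extra configuration is usually needed:

```shell
# Start the ollama daemon (skip if it's already running as a systemd service)
ollama serve &

# Pull a model and run a prompt; first run downloads the weights
# (llama3.1 is just an example tag, swap in whatever model you use)
ollama run llama3.1 "Hello"

# To confirm the AMD GPU is actually being used, look for ROCm/VRAM
# lines in the server log (path assumes the systemd service install):
journalctl -u ollama | grep -i rocm
```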

My CPU is an AMD Ryzen and the OS is Linux. No problems.

I use Open WebUI as a frontend and it's great. I use it for everything that people use GPT for.
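If anyone wants to try the same setup: Open WebUI is easiest to run as a container pointed at the local ollama daemon. This is a sketch based on the project's documented Docker invocation; port numbers and the volume path are the defaults, adjust to taste:

```shell
# Run Open WebUI in Docker, reachable on http://localhost:3000.
# host.docker.internal lets the container reach ollama on the host (port 11434).
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

After that, open http://localhost:3000 in a browser; it should auto-detect the ollama instance on the host.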


