
I set up both Stable Diffusion and LLMs on my desktop without an Nvidia GPU, and everything went well. Stable Diffusion runs on the ONNX backend on my AMD GPU, and LLMs run in GGUF format through Ollama on the CPU, though model size and speed are limited.
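If anyone wants to try a similar setup, here's a rough sketch of the Stable Diffusion side: the Hugging Face optimum library can export the pipeline to ONNX and run it through an AMD-capable execution provider. The model ID and provider name are just examples (DirectML on Windows, ROCm builds on Linux), adjust to whatever your onnxruntime install actually supports.

    # Rough sketch: Stable Diffusion on an AMD GPU via ONNX Runtime.
    # Assumes: pip install optimum[onnxruntime] diffusers, plus a DirectML
    # (Windows) or ROCm (Linux) build of onnxruntime for GPU execution.
    from optimum.onnxruntime import ORTStableDiffusionPipeline

    pipe = ORTStableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # example model ID
        export=True,                        # export the PyTorch weights to ONNX
        provider="DmlExecutionProvider",    # or "ROCMExecutionProvider" on Linux
    )
    image = pipe("a watercolor of a lighthouse at dusk").images[0]
    image.save("out.png")

For the LLM side, Ollama serves GGUF models over a local HTTP API (port 11434 by default), so you can query it from Python with plain requests. The model name is just an example you'd have pulled beforehand with ollama pull.

    # Rough sketch: query a GGUF model served by a local Ollama instance.
    # Assumes "ollama pull llama3" has been run and the server is listening
    # on the default port 11434.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": False},
    )
    print(resp.json()["response"])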


