Macs don't support CUDA, which means all that wonderful hardware will be useless for anything AI-related for at least a few years. There's Metal, but it has its own set of problems, the biggest being that it isn't a drop-in CUDA replacement.
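
To make the porting gap concrete: in PyTorch, for example, CUDA and Metal (via the MPS backend) are selected with different device strings, so CUDA-specific code paths need explicit changes. A minimal sketch, assuming a recent PyTorch build with MPS support:

    import torch

    # Pick the best available backend: CUDA on NVIDIA GPUs,
    # MPS (Metal Performance Shaders) on Apple Silicon, else CPU.
    if torch.cuda.is_available():
        device = torch.device("cuda")
    elif torch.backends.mps.is_available():
        device = torch.device("mps")
    else:
        device = torch.device("cpu")

    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # the matrix multiply runs on whichever backend was selected
    print(device, y.shape)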


You can do LLM inference without CUDA just fine. Download Ollama and see for yourself.
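
For instance, once Ollama is running it serves a local HTTP API on port 11434, so you can query a model with no CUDA in sight. A minimal sketch, assuming Ollama is installed and a model such as llama3.2 has already been pulled:

    import requests

    # Ollama exposes a REST API on localhost:11434 by default.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.2",      # any locally pulled model works
            "prompt": "Why is the sky blue?",
            "stream": False,          # return one complete response
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])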


I'm assuming this won't support CUDA either?



