
It's not the same model, but for example GPT-OSS-120B is smarter than o1. The guide is: buy 128 GB of VRAM, then install LM Studio.
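For anyone curious what "install LM Studio" amounts to in practice, here's a minimal sketch, assuming LM Studio's OpenAI-compatible local server is running on its default port (localhost:1234) and the model has already been downloaded in the app; the model identifier below is illustrative, so substitute whatever name LM Studio lists for your copy:

    # Minimal sketch: query a model served locally by LM Studio.
    # Assumes the local server is running at its default address and
    # that a gpt-oss-120b build has been downloaded in the app.
    from openai import OpenAI

    # LM Studio exposes an OpenAI-compatible endpoint; the api_key is a placeholder.
    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    response = client.chat.completions.create(
        model="openai/gpt-oss-120b",  # replace with the identifier LM Studio shows
        messages=[{"role": "user", "content": "Explain mixture-of-experts in two sentences."}],
    )
    print(response.choices[0].message.content)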


Enough NVIDIA 5090s for 128 GB of VRAM is about $13k. It doesn't make any sense to run that at home when you can pay OpenAI $20/month instead (at that rate it would take more than 50 years to spend $13k at OpenAI).
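Spelled out, the break-even math, using the figures above (both of which are assumptions):

    # Quick break-even sketch: ~$13k of hardware vs. a $20/month subscription.
    hardware_cost = 13_000   # dollars, assumed GPU build cost
    subscription = 20        # dollars per month, assumed OpenAI plan

    months = hardware_cost / subscription
    print(f"{months:.0f} months ≈ {months / 12:.0f} years")  # 650 months ≈ 54 years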

So technically you might be able to run a six-month-old model at home, but it would be foolish to do so from a financial point of view.

Or is there a way to get 128 GB of VRAM for a lot less than that?


A Ryzen AI Max is $2,000, an M4 Max is $3,500, and a DGX Spark is $4,000. Still not really economically feasible, but I see it as an insurance policy. And that's for the most demanding local model; smaller models will run on any PC.




