You'll need a GPU, one with a LOT of VRAM, like an RTX 3090, which has 24 GB.



According to this post, it needs 6.9 GB, so the 3070, 3070 Ti, 3080, etc. can all run it. Sadly, my RTX 2060 (6 GB) is below that limit...
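
For anyone unsure what their card has, here's a quick way to check from Python (a minimal sketch, assuming a CUDA build of PyTorch is installed):

    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)  # first GPU
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")
    else:
        print("No CUDA-capable GPU detected")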


Apparently the model expands when loaded, and it won't fit very well on the 8 GB cards... I'm willing to give the max settings a spin on my 3070 Ti, but I'm not very hopeful.
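
If you want to see how much it actually takes once loaded, PyTorch can report the peak allocation (a sketch; the model-loading step is a placeholder, not the project's actual API):

    import torch

    torch.cuda.reset_peak_memory_stats()
    # ... load the model and run one generation here ...
    peak_gb = torch.cuda.max_memory_allocated() / 1024**3
    print(f"Peak VRAM allocated: {peak_gb:.1f} GB")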


It works great on my GTX 1080 (8 GB). They did an awesome job making this work on mainstream hardware.


It works on my RTX 2070 (8 GB).


Would a Mac Pro with a Radeon Pro W5700X (16 GB of GDDR6) work?


It says that NVIDIA chips are recommended but that they are working on optimizations for AMD. That suggests it relies on CUDA, so getting it to run on a Radeon could be difficult (I'm not an expert on the current state of CUDA-to-AMD compatibility, though).


AMD's answer to CUDA is called ROCm. I've been researching it for the past few weeks, and it seems flaky when it isn't outright broken. It's absolutely maddening that, after all this time, AMD still doesn't have proper tooling on consumer GPUs.
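
For what it's worth, the ROCm builds of PyTorch reuse the torch.cuda namespace via HIP, so you can at least check whether the stack sees your card (a minimal sketch, assuming a ROCm build of PyTorch):

    import torch

    print(torch.version.hip)          # HIP/ROCm version string (None on CUDA builds)
    print(torch.cuda.is_available())  # True if the AMD GPU is visible through ROCm
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))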


They’re also working on M1/M2 support.
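
If you want to check whether your PyTorch build can already use the Apple GPU, there's an MPS backend (a quick sketch; requires PyTorch 1.12 or later):

    import torch

    if torch.backends.mps.is_available():
        x = torch.ones(3, device="mps")  # allocate a tensor on the Apple GPU
        print(x.device)                  # mps:0
    else:
        print("MPS backend not available")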



