
Are you able to memory pool two 3090s for 48gb and if so what's your setup?

I looked into this previously[1] but wasn't confident it's possible, or what hardware would be required (2x x8 PCIe and official SLI support?). AFAICT it would still look like two GPUs to the system.

[1] https://discuss.pytorch.org/t/is-there-will-have-total-48g-m...



You can memory pool with the right software; however, PyTorch supports spreading large models over multiple GPUs out of the box. For example, pass the --gpu-memory parameter with two values (one per GPU) to oobabooga's text-generation-webui.
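To illustrate, here is a minimal sketch of what "spreading a model over multiple GPUs" means in plain PyTorch: manual model parallelism, where different layers live on different devices and activations are moved between them. The module name and layer sizes are hypothetical; the sketch falls back to CPU when two CUDA devices aren't available, so it runs anywhere.

```python
import torch
import torch.nn as nn

# Pick two devices; fall back to CPU if we don't have two GPUs.
if torch.cuda.device_count() >= 2:
    dev0, dev1 = torch.device("cuda:0"), torch.device("cuda:1")
else:
    dev0 = dev1 = torch.device("cpu")

class TwoDeviceMLP(nn.Module):
    """Hypothetical model split across two devices (model parallelism)."""
    def __init__(self):
        super().__init__()
        # First half of the model lives on device 0...
        self.part1 = nn.Linear(16, 32).to(dev0)
        # ...second half lives on device 1.
        self.part2 = nn.Linear(32, 4).to(dev1)

    def forward(self, x):
        # Move activations to whichever device holds the next layer.
        x = torch.relu(self.part1(x.to(dev0)))
        return self.part2(x.to(dev1))

model = TwoDeviceMLP()
out = model(torch.randn(8, 16))
print(out.shape)  # torch.Size([8, 4])
```

Each GPU only holds its own slice of the weights, which is why two 24 GB cards can serve a model that needs up to ~48 GB without any memory pooling at the hardware level; frameworks like Hugging Face Accelerate (`device_map="auto"`) automate this placement.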



