Thanks! Would "under 10G" also include 8 GB, by any chance? Although I do die inside a little every time I see "install Torch for your CUDA version", because I never managed to get that working in Linux.
It actually uses less than 3 GB of VRAM. One issue is that the research code loads multiple models instead of one, which is why it was initially reported that you need 8 GB of VRAM.
However, it can't serve the same use case because it's currently very slow: real-time usage isn't possible with the current release code, in spite of the 0.15 RTF claimed in the paper.
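For anyone unfamiliar with the metric: RTF (real-time factor) is processing time divided by output audio duration, so lower is faster and anything at or above 1.0 is slower than real time. A quick sketch of the arithmetic (the numbers below are illustrative, not measurements from this model):

```python
def real_time_factor(synthesis_seconds: float, audio_seconds: float) -> float:
    """Real-time factor: time spent synthesizing divided by duration of the audio produced."""
    return synthesis_seconds / audio_seconds

# The paper's claimed 0.15 RTF would mean 10 s of audio takes ~1.5 s to generate.
print(real_time_factor(1.5, 10.0))

# An RTF of 1.0 or more means slower than real time, which is the situation
# being described for the current release code.
print(real_time_factor(30.0, 10.0) >= 1.0)
```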
Thanks, but I don't think I'm going to reinstall my entire OS to run these. I'll see if I can get Docker working, it's been more reliable with CUDA for me.
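For what it's worth, the usual sanity check for Docker + CUDA is running `nvidia-smi` inside a CUDA base image (this assumes you have the NVIDIA Container Toolkit installed; the exact image tag is just an example):

```shell
# Verify the container runtime can see the GPU.
# Requires the NVIDIA Container Toolkit on the host.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If that prints your GPU, the PyTorch images (e.g. from Docker Hub or NGC) generally work without fighting the host's CUDA install, since the container ships its own CUDA libraries and only needs the host driver.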
Very good voice cloning capability. Runs on an NVIDIA GPU with under 10 GB of VRAM.