many text generation models run on my 11G 1080Ti
you can run quantized versions of these models
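For example, here is a minimal sketch of loading a 4-bit quantized model with Hugging Face transformers plus bitsandbytes. This assumes you have both libraries installed and a CUDA GPU; the model id is just a placeholder, swap in whatever model you're running:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-v0.1"  # placeholder, use your model of choice

# 4-bit NF4 quantization config; fp16 for the compute dtype
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,  # weights are quantized at load time
    device_map="auto",               # place layers on the available GPU(s)
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A 7B model loaded this way fits comfortably in 11 GB; the same model in fp16 would already be tight.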
if you aren't running them quantized, I'd say even 24 GB is not enough
If you want to get the most bang for your buck, you definitely need to run quantized versions. Yes, there are models that run in 11 GB, just like there are models that run in 8 GB or any other amount of VRAM; my point is that 24 GB is the sweet spot.
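To put rough numbers on the "sweet spot" claim, here's a back-of-the-envelope estimate of weight memory at different quantization levels. The flat overhead allowance and the specific model sizes are my own assumptions, not measurements; real usage also depends on context length and runtime:

```python
def est_vram_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: weight storage plus a flat allowance for
    KV cache / activations / CUDA context. Ballpark only."""
    weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

for params in (7, 13, 33):
    for bits in (16, 8, 4):
        print(f"{params:>3}B @ {bits:>2}-bit ~= {est_vram_gb(params, bits):5.1f} GB")
```

Under these assumptions a 13B model in fp16 is around 28 GB (too big even for 24 GB), while at 4-bit it's roughly 8.5 GB (fits in 11 GB), and a 33B model at 4-bit lands near 18.5 GB, which is exactly the range where 24 GB pays off.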