I got a 3090 in the depths of the 2021 GPU shortage and for my purposes (mainly JRPGs) I can still run basically every new game at 4k max settings. I don't really see much need to upgrade.
Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that GPUs won't be a bottleneck anymore.
I got a 3080 which I managed to pre-order at MSRP; up until ~1.5 years ago that thing was selling for more than I paid for it on the used market.
> Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that GPUs won't be a bottleneck anymore.
I dunno, it seems the scaling is different for AI. AI is more about horizontal scaling (more cards, more nodes), while gaming is more about vertical scaling on a single card (once you're at native 4k resolution).
An RTX 5090 is nowhere near enough if you want to play graphically demanding games on the latest high-resolution VR headsets. Many are close to 4k*4k per eye, so essentially an 8k-class screen that needs to run at a stable 90 fps (ideally higher) to avoid nausea.
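Back-of-envelope sketch (assuming ~3840x3840 per eye at 90 Hz, and ignoring tricks like foveated rendering and reprojection):

    # rough pixel-throughput comparison, not a benchmark
    per_eye = 3840 * 3840       # assumed ~4k x 4k panel per eye
    vr = per_eye * 2 * 90       # both eyes at 90 fps -> ~2.65e9 pixels/s
    flat = 3840 * 2160 * 120    # a 4k monitor at 120 Hz -> ~1.0e9 pixels/s
    print(vr / flat)            # roughly 2.7x the shading work of 4k/120

And that's before any supersampling, which VR runtimes often recommend on top of native panel resolution.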
Is it a gold rush to want to run open models on your own computer? This isn't like Bitcoin mining, where there is a direct monetary reward for hoarding compute.
I got a 2080 Ti and was looking to upgrade, and I also enjoy running local AI models. Given that the card only has 16GB of memory, I don't see it as a huge upgrade over my 2080 Ti: I can only fit marginally larger models in memory on it. And if the model fits in memory, the 2080 Ti is fast enough.
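For a rough sense of scale (assuming ~4-bit quantized weights at ~0.5 bytes/parameter, and ignoring KV cache and activation overhead):

    # rough VRAM needed just to hold the weights
    def weights_gb(params_billion, bytes_per_param=0.5):  # ~4-bit quantization assumed
        return params_billion * 1e9 * bytes_per_param / 1024**3

    for p in (7, 13, 24, 32):
        print(f"{p}B params -> ~{weights_gb(p):.1f} GB")
    # 7B -> ~3.3 GB, 13B -> ~6.1 GB, 24B -> ~11.2 GB, 32B -> ~14.9 GB

So going from the 2080 Ti's 11GB to 16GB mostly buys headroom for context, not a whole new class of model.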
Not really, given that it doesn't increase the amount of RAM compared to the old 4080 Super. If you want to do 'modern' AI on a (relative) budget, you should be looking at a 4090 or 5090. This seems to be the card targeted most squarely at gamers.
I heard Nvidia is gimping consumer-grade cards so they're not good at LLM training. Is this true? If so, are they gimped only for training, or also for running LLMs?
I guess the limited amount of RAM is also a way to limit the cards.
Many Nvidia "gaming" SKUs are already at the point where memory is the biggest likely limitation for the gaming use case, and adding more would make them noticeably better products for the consumer at a small cost increase.
So I'd say there's good evidence that something other than cost and value to the gaming use case explains why there are no higher-memory SKUs, and eating into "professional"-priced AI SKUs is an obvious possibility.
I doubt anyone outside Nvidia itself knows "for sure", but it's a pretty big indication.