Hacker News
Nvidia RTX 5080 is on average 8.3% faster than RTX 4080 SUPER – first review (videocardz.com)
22 points by 4k 3 months ago | 20 comments



I got a 3090 in the depths of the 2021 GPU shortage and for my purposes (mainly JRPGs) I can still run basically every new game at 4k max settings. I don't really see much need to upgrade.

Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that gpus won't be a bottleneck anymore.


Any additional capability will be quickly filled with bloat, especially in the video game industry, where optimization is for post-release.

Expect a lot of creatively bankrupt tech demos with eye-watering hardware requirements.


Man, I sure love some blurry mess of pixels that looks like it's straight out of the Wii U catalogue and requires a 4070 to run at 1080p60

Graphics programming is a lost art, buried deep below an *unreal* amount of abstraction layers


I see what you did there.


Oh, and DLSS 4.0: a lot of AI frames, with the actual responsive frame rate entering cinematic territory...


I got a 3080 which I managed to pre-order at MSRP; up until ~1.5 years ago that thing was selling on the used market for more than I paid for it.

> Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that gpus won't be a bottleneck anymore.

I dunno, it seems the scaling is different for AI. Like AI is more about horizontal scaling and gaming is more about vertical scaling (after you get to native 4k resolutions).


An RTX 5090 is nowhere near enough if you want to play graphically demanding games on the latest high-resolution VR headsets. Many are close to 4k*4k per eye, so essentially an 8K screen that needs to run at a stable 90 fps to avoid nausea, and ideally higher.
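Rough pixel-throughput math behind that claim (a back-of-the-envelope sketch assuming 4096x4096 per eye and a 90 fps target; real headsets and render resolutions vary):

    # Back-of-the-envelope: dual 4k*4k VR panels vs. a single 8K UHD screen (illustrative numbers).
    vr_pixels = 4096 * 4096 * 2        # two eyes, ~33.6M pixels per frame
    uhd_8k = 7680 * 4320               # ~33.2M pixels per frame
    print(vr_pixels, uhd_8k)
    print(vr_pixels * 90 / 1e9)        # ~3.0 gigapixels/second at a stable 90 fps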


Announced launch prices here are about what the 4080 SUPER has been, so in that regard it's not great but not terrible like the 5090.


It's targeted at the "AI" gold rushers, isn't it? Not at gamers.


Is it a gold rush to want to run open models on your own computer? This isn't like Bitcoin mining, where there is some direct monetary reward for hoarding compute.


Either gold rush or fear of missing out if you ask me.

I've successfully run (not trained) local models * on my Mac mini, which cost less than a single video card anyway.

* That fit in my RAM. They were probably slower than the FOMO hardware, but good enough.
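For what it's worth, a minimal sketch of this kind of local inference, assuming llama-cpp-python and a quantized GGUF file (the model name and settings here are illustrative placeholders, not what I actually ran):

    # Minimal local-inference sketch (pip install llama-cpp-python); model path is a placeholder.
    from llama_cpp import Llama

    llm = Llama(model_path="mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)
    out = llm("Why does fitting in RAM matter for local LLMs?", max_tokens=128)
    print(out["choices"][0]["text"])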


It is a gold rush from Nvidia's perspective, in that they're selling shovels.


I got a 2080Ti and was looking to upgrade, and I also enjoy running local AI models. Given that the card only has 16GB of memory, I don't see it as a huge upgrade over my 2080Ti; I can only fit marginally larger models in memory on it. And if the model fits in memory, the 2080Ti is fast enough.
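For a rough sense of why 16GB is only a marginal step up, here's the usual weights-only rule of thumb (it ignores context/KV-cache and runtime overhead, so real usage is higher):

    # Approximate weight footprint = parameters * bits per weight / 8 (weights only, no overhead).
    def approx_vram_gb(params_billion, bits_per_weight):
        return params_billion * bits_per_weight / 8

    for params in (7, 13, 34, 70):
        print(f"{params}B @ 4-bit ~= {approx_vram_gb(params, 4):.1f} GB")
    # 7B ~= 3.5 GB, 13B ~= 6.5 GB, 34B ~= 17.0 GB, 70B ~= 35.0 GB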


Not really, given that it doesn't increase the amount of RAM compared to the old 4080 Super. If you want to do 'modern' AI on a (relative) budget you should be looking at a 4090 or 5090. This seems to be the card targeted most squarely at gamers.


I heard Nvidia is gimping consumer-grade cards so they're not good at LLM training. Is this true? If so, are they gimped only for training, or also for running LLMs?

I guess the limited amount of RAM is also a way to limit the cards.


Many Nvidia "gaming" SKUs are already at the point where memory is often the biggest likely limitation on their gaming use case, and they'd be noticeably better products for the consumer, at a small cost increase, if more memory were added.

So I'd say there's good evidence that something other than cost and value to the gaming use case is why they don't offer higher-memory SKUs, and cannibalizing "professional"-priced AI SKUs is an obvious possibility.

I doubt anyone outside Nvidia itself knows "for sure", but it's a pretty big indication.


At least for Mistral 7B 128-token text generation, the 5090 is 58% faster than the 4090. https://www.phoronix.com/review/nvidia-rtx5090-llama-cpp/3


Nvidia's Digits is a better deal: 128GB/$3000.
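A quick cost-per-GB-of-memory comparison behind that (assuming the announced $3,000/128GB for Digits and the 5090's announced 32GB at $1,999; the memory types and bandwidth are very different, so this is capacity only):

    # Capacity-only $/GB comparison; announced prices, ignores bandwidth and compute differences.
    digits_per_gb = 3000 / 128     # ~$23 per GB of unified memory
    rtx5090_per_gb = 1999 / 32     # ~$62 per GB of GDDR7
    print(round(digits_per_gb), round(rtx5090_per_gb))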


There are plenty of deranged gamers spending 5-10k every few years, don't worry.


Well that's good news for current 4000 series owners I suppose.





