
How is there no mention of CPU?

Simulators, especially ones of this size, are usually much more CPU-intensive than GPU-intensive. So a lot of reviews obsess over how beefy the GPU is, but I suspect a lot of people have "top-heavy" rigs.

I play a lot of simulator games. Most of them are not paragons of optimization (well, except maybe Factorio). But these are not action games so FPS is not really something I personally optimize for and I don't necessarily understand the importance of benchmarking it against, like, Cyberpunk.

I wouldn't be surprised if Paradox's optimization fix is just to nerf the depth-of-field.



The issues in question here surface even with an empty city. At higher populations the CPU will almost certainly become the issue, but right now most of the complaints are unrelated to sim performance, and for most people the sim seems to perform quite well even at higher populations.


The sim performs OK, but to my knowledge it's still based on GameObjects (e.g. all entities are class instances interacting OOP-style).

A city sim could be made massively more performant using a data-oriented architecture like Unity DOTS or any other ECS-style engine.
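
A rough sketch of the difference, with made-up names (nothing here is from Cities Skylines 2 or the Unity DOTS API): the same per-citizen update written once as one heap object per entity and once as a flat component array, which is the layout ECS-style engines iterate over.

    // OOP style: one object per entity, fields scattered across the heap.
    class Citizen {
        int age;
        float happiness;
        void tick() { age++; }
    }

    // Data-oriented style: one component, many entities, stored as flat arrays.
    // The per-tick loop walks contiguous memory, which is the whole point.
    class CitizenAges {
        final int[] age;
        CitizenAges(int count) { age = new int[count]; }
        void tick() {
            for (int i = 0; i < age.length; i++) {
                age[i]++;   // cache-friendly linear scan, easy to parallelize
            }
        }
    }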


The speed of your simulation in a game like this should have minimal impact on the framerate of the game. You can paint the screen 144 times a second, but only tick the simulation 10 times a second and get a much better experience than painting the screen and updating the simulation 20 times a second. Maybe your city runs slowly, but you can still look around at the various bits moving slowly.
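
A minimal sketch of that decoupling, assuming made-up Simulation/Renderer interfaces and a fixed-timestep accumulator (not how any particular engine actually structures it):

    interface Simulation { void tick(double dtSeconds); }
    interface Renderer   { void draw(double alpha); }  // alpha = blend factor between ticks

    final class GameLoop {
        static void run(Simulation sim, Renderer renderer) {
            final double TICK = 0.1;           // 10 simulation ticks per second
            double accumulator = 0.0;
            long previous = System.nanoTime();

            while (true) {
                long now = System.nanoTime();
                accumulator += (now - previous) / 1e9;
                previous = now;

                // Catch up in fixed-size steps; the sim never sees a variable dt.
                while (accumulator >= TICK) {
                    sim.tick(TICK);
                    accumulator -= TICK;
                }

                // Paint as often as the GPU allows, interpolating between the
                // last two sim states so the camera stays smooth at 144 Hz.
                renderer.draw(accumulator / TICK);
            }
        }
    }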


Not trying to be rude, but this honestly reads like you don’t actually play simulation/4X games.

> Maybe your city runs slowly

If the simulation is running slowly, that has a much more detrimental effect on the quality of gameplay than jumping from 45 fps to 75 fps. Sure, it's a simulation, but it's primarily a game, not a weather model. And I say this as an early adopter of high-refresh-rate monitors and a frequent FPS player. I absolutely need high frame rates and low input latency in a competitive PvP game, but in a strategy game it's much more important that the tick rate is fast enough to stay interesting. A slow sim is boring, and it's not just me; people complain about this all the time in (late-game) Stellaris (another Paradox title).


If the issue is occurring even on the main menu and a completely new, bare level, that possibility can be safely eliminated. It's also highly doubtful that someone with a 4090 would have a CPU pitiful enough to excuse 20 FPS.


Also, the speed of the simulation should not be bound to the rendering speed.


FWIW I don't think that was the implication. Even when the simulation and rendering are asynchronous having a busy CPU will result in lower rendering performance, regardless of how busy the GPU is.


They probably should have mentioned the CPU specs at least, but it's very easy to look at utilization while the game is running and see that it's GPU bound even with high-end video cards.

Not to mention that this is on an empty map, so there's very little simulation even happening yet, and the big FPS gains come from turning down GPU effects like DoF and motion blur.


I'm wondering about any abuse of the GPU for non-graphics tasks in recent games.


If it has embarrassingly parallel tasks that it can dispatch to a massively parallel subsystem dedicated to solving embarrassingly parallel tasks, is that abuse or smart use of resources?

That being said, most simulation games are memory-latency and memory-bandwidth limited, not compute limited.


It looks like a perfect match. At first. Then you realize that you are not alone.

Very much like how using the Java parallel stream API in a webserver does wonders in dev but not in production, because many other threads serving other requests are also starved for CPU cores.
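
A sketch of that contention, with a made-up handler: parallelStream() runs on the JVM-wide ForkJoinPool.commonPool(), so a lone request in dev gets every core, while in production all in-flight requests fight over the same pool. Submitting the stream from inside a small dedicated pool is a common mitigation (widely relied on, though not an officially documented guarantee), and even then all pools still share the same physical cores.

    import java.util.List;
    import java.util.concurrent.ForkJoinPool;

    class ReportHandler {
        // Great in isolation, but every concurrent request schedules its work
        // onto the one shared commonPool().
        long scoreAll(List<int[]> districts) {
            return districts.parallelStream()
                            .mapToLong(ReportHandler::score)
                            .sum();
        }

        // Bound the damage: run the parallel stream from inside a small
        // dedicated pool so one request can't claim every worker thread.
        static final ForkJoinPool SIM_POOL = new ForkJoinPool(2);

        long scoreAllBounded(List<int[]> districts) throws Exception {
            return SIM_POOL.submit(() ->
                districts.parallelStream().mapToLong(ReportHandler::score).sum()
            ).get();
        }

        static long score(int[] district) {
            long total = 0;
            for (int v : district) total += v;
            return total;
        }
    }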


Most people who have the money to invest in a 4090 are also going to invest in a beefy CPU. It's not a secret within the gaming community that CPUs are bottlenecks for games.


The CPU is not the bottleneck in Cities Skylines 2, nor in most games, to be frank.


This is so far from reality it's not even funny.

I mean, it's common enough that the headline-grabbing feature from the market's undisputed largest GPU vendor is one literally designed to reduce the CPU's impact as a bottleneck.


Disagree. Many games are ultimately limited by the performance of a single hot thread (often physics). Some things just don't parallelize, so throwing 8/12/16/24 cores at them doesn't help.


Rimworld, especially after running RimPy and converting textures for GPU, is extremely fast/well optimized even with hundreds of mods.


> Most of them are not paragons of optimization (well, except maybe Factorio). But these are not action games so FPS is not really something I personally optimize for and I don't necessarily understand the importance of benchmarking it against, like, Cyberpunk.

In many of these games, FPS is correlated with simulation speed, so when the fps starts to chug, the simulation starts going slower too.


Not in Cities Skylines 2.


Uhhh, this doesn't even make sense. Simulation speed is usually controlled by a button.


It makes perfect sense. Simulations are done in discrete time chunks. As the framerate drops and those time chunks grow larger, there are two choices: make the simulation bad, or make the world slow down.

If you don't adjust the simulation rate, you start seeing more instances of objects phasing through each other, collisions not preserving energy correctly, and pathfinding just fundamentally breaking.
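
A toy illustration of that trade-off, with made-up numbers: a fast object heading for a thin wall either gets a huge dt when the frame rate drops (and tunnels straight through a naive overlap test), or the step gets clamped and the world simply advances less than real time per frame.

    // Hypothetical example, not from any real engine.
    final class Ball {
        double x = 0.0;
        double v = 5.0;                          // units per second, moving right
        static final double WALL_LO = 1.00, WALL_HI = 1.05;  // a thin wall
        static final double MAX_DT = 0.005;      // largest step the overlap test tolerates

        // Option A: keep real-time speed and accept whatever dt the frame gives.
        // At 2 FPS (dt = 0.5 s) the ball jumps from 0.0 to 2.5 in one step and
        // is never observed inside the wall: it tunnels straight through.
        void stepNaive(double dt) {
            x += v * dt;
            if (x >= WALL_LO && x <= WALL_HI) v = -v;  // naive overlap check, no sweep
        }

        // Option B: clamp dt. Collisions behave again, but each frame now
        // advances less than real time, i.e. the whole world slows down.
        void stepClamped(double dt) {
            stepNaive(Math.min(dt, MAX_DT));
        }
    }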


Well, ideally sim "ticks" are completely separate from rendering, but this is not the reality in many games, where they share the same thread or where cross-communication (like UI stuff) blocks enough to slow the other component down.

Even in Minecraft, with its completely separate server/client split, the rendering can bog down TPS due to JVM pauses and other reasons I don't even understand.


> Well, ideally sim "ticks" are completely separate from rendering

I have a counterexample: there was a game called der Omnibussimulator, in which the simulation and rendering were 1:1 coupled.


But the simulation can be done as fast as the CPU dictates. If the GPU has to drop frames to keep up, it doesn't impact the simulation.


If it were CPU-bound because of the world sim, changing the resolution would not improve performance. Also, there's no reason for the world to be that much more complicated than in the first game.


Can you recommend some favorites?


The Tropico games are a perfectly pleasant place to start. They get progressively easier and more casual.

The Anno games are my all around favorite. 1404 is my favorite, but honestly 1800 is probably the best.

Banished spawned an entire genre unto itself even as it hasn't aged gracefully. These survival sims have a lot more "bite" to them. Try Planetbase for a more streamlined experience. Timberborn if you like physics. Also have heard good things about Farthest Frontier.

And then there's Frostpunk, which is an all-around amazing experience. The theming and mood rival any first-person cinematic shooter.


> Banished spawned an entire genre unto itself

Reading the Wikipedia description[1] makes it sound like a slightly tarted up Settlers, not anything genre-defining. What have I missed?

[1] https://en.wikipedia.org/wiki/Banished_(video_game)


Your people in Banished die. A lot. So unlike most other economic sims, building your city is not the primary goal, population survival is. And you have to balance population and resources pretty intimately.

Calling Banished a tarted up Settlers would be similar to calling Dark Souls a tarted up Zelda game.


The DRM on Anno 1800 is dreadful. Avoid at all costs.


I'm not the guy you're responding to but...

Over the last 10 years, my favorite simulation games have been Factorio, Tropico (4 and 6), OpenTTD, and Two Point Hospital.


I'm an old blowhard and still think Tropico 1 is the best. It was so tough that you had to become a fascist every time, which made it a harder game but also a much sharper commentary.


Captain of Industry is pretty good fun.


Factorio, Satisfactory, Dyson Sphere Program, and Oxygen Not Included are all amazing.


Check out Oxygen Not Included


Sure it uses the CPU, but the desktop CPUs that gamers use are beasts. Even a mid-level CPU that's a few years old beats any console or mobile device by a long shot. They have dedicated tower or liquid coolers, so they can push a lot of power into the chips.


With sim games it's not purely about clock speed or TDP. The previous-generation 5800X3D outperformed the newest chips in simulation games for a while thanks to its very large cache.



