
I know you're joking, but it's pretty much infeasible to catch the kind of bugs typical in games with automated testing. So much is content-driven, and the state space is far too large.

Even for stuff that's fairly testable, like navigation, collision, etc., you have the issue of there being a million edge/degenerate cases, and it's very hard to know the game code or content never hits one.

So the best you can usually do is test library code.

Not that this has anything to do with AK, which sounds like plain old sloppiness.




Ok, but in this case they chose to put in a 30FPS cap. Ignoring that performance is well below even that, they should have known it was unacceptable.


Why is that unacceptable? From the reports I've read, re-enabling the 30fps cap removes many of the performance problems. Some of the game logic may well be hardcoded to assume 33ms frame times, especially if the console versions are capped at 30 and the PC version is a port of them. They made a technical decision to cap it at 30; there's nothing wrong with that. If you don't agree with it, then don't buy it.


I'm a developer, but not a game developer. Making the decision to hardcode game logic to only accept a 33ms frame time sounds like a pretty dumb choice. I would wager that if your fix is to artificially cap performance of your software, then something in your code base sucks majorly. Call me crazy.


> I'm a developer, but not a game developer. Making the decision to hardcode game logic to only accept a 33ms frame time sounds like a pretty dumb choice.

I'm not a game developer, but I am a realtime system developer. We could hem and haw about all the different hypothetical ways all the implementation details can suck, but locking and/or hardcoding for certain framerates is a perfectly fine decision. Doubly fine when those frames aren't graphical.

Easy but contrived example: If I hard code a purpose-specific audio pathway to only operate at 44100Hz, that can be a perfectly acceptable design decision given the purpose. How deeply the code assumes that rate can be indicative of code quality, but if the assumption is hard to excise from the code in performance-critical areas, well, that happens.
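In code, that might be nothing more than a constant that quietly spreads through every time/sample conversion. A tiny, purely hypothetical C++ sketch (all names are made up for illustration):

    // Hypothetical purpose-specific audio path with the rate baked in.
    #include <cstddef>

    constexpr int kSampleRateHz = 44100;   // fixed by design for this pathway

    // The assumption leaks into every time <-> sample conversion.
    constexpr std::size_t DelayMsToSamples(int ms) {
        return static_cast<std::size_t>(ms) * kSampleRateHz / 1000;
    }

    static_assert(DelayMsToSamples(10) == 441, "10ms is 441 samples at 44.1kHz");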

Here is the part where I have to qualify I'm not saying more than I've said. I'm not defending the use of 30fps cap here, or saying that this game's code is any good overall. The gaming customer has higher realtime expectations than 30fps, especially on PC. And although I haven't played this game, the market is showing it is a bad product beyond that.

> I would wager that if your fix is to artificially cap performance of your software, then something in your code base sucks majorly. Call me crazy.

In a realtime system, consistent performance is a more important goal than maximum performance. For games that want high-end graphics, the goals are to get maximum performance out of data of highly variable complexity and push the limits of what can be done, while rarely (if ever) whiffing on the real-time deadline. I won't disagree that most game codebases have pockets of major suck, or that a locked rate can be a band-aid for sucking, but it is not in itself indicative of such.


Very interesting, you are correct. Thanks for this reply. My experience has always been in the world of 'performance is king', so it's easy for me to lose sight of the idea that systems do exist in which such a limit is beneficial.


I used to be a game developer and every game I've worked on, pretty much, has had a fixed frame rate, both for rendering and for game updates. (The two rates don't have to be the same.) A fixed rendering rate tends to make the game better to play (though of course this is a bit subjective), and a fixed game update rate avoids nasty timing-dependent bugs (e.g., due to parameters that work fine until you have overly long or short timesteps). Both have to cater for the commonly-encountered worst cases rather than the best ones.
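To make that concrete, here's a generic sketch of a fixed-rate loop, not lifted from any real engine (UpdateGame, RenderFrame and WaitForFrameBoundary are placeholders), assuming one 33ms update per 30Hz frame, though the two rates could just as easily differ:

    // Console-style fixed-rate loop; every name is an illustrative placeholder.
    void UpdateGame(float dt);      // game logic, always handed the same dt
    void RenderFrame();             // draw the state just produced
    void WaitForFrameBoundary();    // block until the next 33ms boundary (vsync)

    constexpr float kFrameDt = 1.0f / 30.0f;   // logic is tuned for 33ms steps

    void MainLoop() {
        for (;;) {
            UpdateGame(kFrameDt);       // fixed timestep: no timing-dependent bugs
            RenderFrame();
            WaitForFrameBoundary();     // the 'cap': never run faster than 30Hz
        }
    }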

(30Hz is somewhat common as rendering these days tends to involve a lot of fixed-cost full-screen passes for lighting and postprocessing. So by halving your frame rate you get over double the effective rendering time, which you can spend on having more stuff, higher res, etc.)

Even for 30Hz games, during development it's very common for the game not to run reliably at even 30Hz, until towards the end, at which point stuff is taken out, and assets and code optimised to make it all work. So it's not a cap that artificially limits performance so much as a floor that the developers are striving to hit :)

(I have no idea what the problem with Batman is specifically.)


Yeah, I'm thinking now that I was totally out of my element and a little foolish with my original comment. Were you developing console games or PC games? I'm guessing console as my impression is that fixed frame rates are much more common in the console world.


I worked nearly entirely on console games. Generally the frame rate would be decided early on (either after prototyping, or because you've been given a pile of old code to start from and that runs at a given rate), and would stay fixed. The games I worked on were split near enough 50/50 between 30Hz and 60Hz, with one game update per render, and usually some provision for handling one-off frame drops due to rendering taking too long. (That's not supposed to happen, but it can, particularly if you're trying to hit 60Hz.)

I can't say with any confidence what people tend to do for PC - I don't play PC games, and I last worked on a PC game in 2006 - but one tactic I've heard is used is to interpolate object positions and properties when it comes to rendering. So you run the logic at a fixed rate (to avoid the problems I alluded to in my previous post), and save both current object states and previous object states. Then, when it comes to render, you'll need to render the game state as it was at some point between the most recent game state you have and the one before that - so you can imagine your latest state as a keyframe, and the previous state as a keyframe, and just interpolate between them.
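A rough sketch of that scheme, with made-up names and a State struct far simpler than anything real:

    // Fixed-rate logic, free-running renderer blending the two latest states.
    struct State { float x, y; };              // stand-in for real object state

    State InitialState();                      // placeholders for real systems
    State UpdateGame(const State& s, float dt);
    void  Render(const State& s);
    float SecondsSinceLastIteration();         // wall-clock time since last loop

    State Lerp(const State& a, const State& b, float t) {
        return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
    }

    void MainLoop() {
        const float dt = 1.0f / 60.0f;         // fixed logic timestep
        float accumulator = 0.0f;
        State prev = InitialState();
        State curr = prev;

        for (;;) {
            accumulator += SecondsSinceLastIteration();
            while (accumulator >= dt) {        // run logic at its fixed rate
                prev = curr;
                curr = UpdateGame(curr, dt);
                accumulator -= dt;
            }
            // Render somewhere between the last two logic states (alpha in 0..1).
            Render(Lerp(prev, curr, accumulator / dt));
        }
    }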

I imagine this can be a bit tricky to retrofit, though, so no doubt many ports from console to PC just decide to use the same frame rate as used on the console. But I'm guessing a bit here. Unreal and the like might just do it all for you these days anyway. Maybe the Batman programmers just forgot to tick the right checkbox ;)


Well, it's not like running at anything above 60 FPS makes much sense when the refresh rate on the vast majority of displays is capped at 60Hz. You can slam geometry through the graphics card faster than that, but some of those frames won't make it to the screen. 30 FPS is not great, but it can be better than having wildly variable framerates. Consistent 30 FPS is better than traditional films, which mostly ran at 24 FPS.

Now, for a fast-twitch action game, processing input at 30 FPS is unacceptable; input lag sucks giant donkey balls. But I would expect that a AAA development studio knows that running input-handling, networking, physics, AI, etc in lock-step with the rendering loop is not the smartest idea. These things should run on their own schedules, independent of the graphical framerate.
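A hedged sketch of that decoupling for input - the InputEvent type, PollInputDevices and the queue are invented for illustration, not any real engine's API:

    // Poll input on its own thread at a high rate so a 30Hz render loop
    // doesn't add a whole frame of input latency.
    #include <atomic>
    #include <chrono>
    #include <mutex>
    #include <queue>
    #include <thread>

    struct InputEvent { /* device, button, timestamp, ... */ };
    InputEvent PollInputDevices();         // placeholder for the OS/driver calls

    std::atomic<bool> g_running{true};
    std::mutex g_inputMutex;
    std::queue<InputEvent> g_inputQueue;   // drained by the game loop each tick

    void InputThread() {
        const auto period = std::chrono::microseconds(1000000 / 250);  // ~250Hz
        while (g_running.load()) {
            InputEvent e = PollInputDevices();
            {
                std::lock_guard<std::mutex> lock(g_inputMutex);
                g_inputQueue.push(e);
            }
            std::this_thread::sleep_for(period);
        }
    }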


30fps is horrid and basically unplayable for a first person PC game.

This isn't a movie, it is active, not passive. "Better than traditional films" is completely irrelevant.

Take a game where you can cap the framerate and bind a key to switch between capped at 30 and capped at 60; if you're playing the game you'll immediately feel the transition, and 30fps will feel relatively unplayable.


This isn't a first-person PC game, and until quite recently 30fps was considered a playable framerate. It's only in the last couple of years that "60 has become the new 30".

Yes, of course a frame lock sucks, but it's not the end of the world and many people wouldn't even notice if not for everyone complaining about it.


That is complete nonsense about the "last couple of years"; I remember wanting ~40fps for Half-Life over a decade ago.

Yes, at that time (10-14 years ago), 30 fps was considered playable. But that's just "playable", and even then it wasn't recognised as a great framerate for most of that period.

For the last decade, less than that has been seen as bad. In fact a few years after half-life's release (when it was still popular but hardware had matured) 100fps was the goal, to match the 100Hz that most monitors back then could do. 60fps only even entered the discussion with the switch to TFT/flat panels with their lower refresh rates.


> In fact a few years after half-life's release (when it was still popular but hardware had matured) 100fps was the goal.

Half-Life also exhibited strange behavior if the 100 fps frame cap was lifted by turning on developer mode. If I recall correctly (forgive me, as this was almost a decade ago), weapon mechanics or movement speed changed.


There is so much wrong with this post.


When the office staff go home for the day, their rubbish hodgepodge of PCs of every variety and spec should spend their evenings playing the game with a game bot.


Even if it's infeasible to test automatically, that doesn't account for QA and other manual testing. Obviously the bugs aren't too hard to find for most of the people playing it.



