Hacker News

Speaking as a huge fan of Ruby, I think it's absolutely the wrong language to write games in, because games are enormously sensitive to runtime fluctuations, and as you found out, Ruby has them. In particular, a stop-the-world GC is just never going to give you satisfactory performance, unless every GC pass happens in <1ms and never happens more than once per frame.

That said, you might consider Lua as a primary scripting engine for your game. It's got a lot of your standard scripting language perks, but it's also obnoxiously fast and has an incremental garbage collector. Furthermore, there's no GIL, and it's designed to be embeddable, making it a very powerful choice for game development.



Usually the way you solve GC issues is either by not allocating anything after some initial massive allocation, or by running the GC constantly so it cleans up bits and pieces every frame. I don't know enough about Ruby to say whether that's possible there, but this is how you handle GC in Java and C#.
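Both strategies can be sketched in plain Ruby with the stock `GC` module (the particle pool, `emit_particle`, and the sizes here are made-up illustrations, not anything from the thread):

```ruby
# Strategy 1: allocate everything up front, then reuse instead of allocating.
PARTICLES = Array.new(10_000) { { x: 0.0, y: 0.0, live: false } }

def emit_particle(i)
  p = PARTICLES[i % PARTICLES.size]  # reuse a pooled object; no new allocation
  p[:live] = true
  p
end

# Strategy 2: trigger a collection at a moment you choose (say, end of frame)
# rather than letting one land in the middle of your update/render work.
1_000.times { |i| emit_particle(i) }
GC.start  # explicit full collection at a point we picked
```

Whether strategy 2 actually helps depends on how long a full pass takes, which is exactly the problem discussed below.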

As for Lua, Garry's Mod runs almost entirely in Lua (outside of the Source engine).


You can run GC manually in Ruby, but at the end of the day, it's a stop-the-world GC; your program doesn't get to do anything until it decides that it's finished. If that takes too long, you drop that frame. Additionally, Ruby doesn't have the concept of primitives (mostly), and all objects are stored on the heap! So, every time you use a float (which you do a lot of in games), you end up with a new struct on the heap that has to be garbage collected eventually. Ouch.

(By way of demonstration:)

    ree-1.8.7-2011.03 :002 > 123.456.object_id
     => 33681660
    ree-1.8.7-2011.03 :003 > 123.456.object_id
     => 33674740
Languages with incremental GCs yield between phases of a GC pass, letting program execution continue even while collection is in progress. That helps ensure you don't drop frames, which vastly improves the player experience.
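To put a rough number on the stop-the-world pause described above, you can time an explicit full pass with `Benchmark.realtime` (a sketch against MRI; the object count is arbitrary, and the actual pause will vary wildly by Ruby version and heap size):

```ruby
require 'benchmark'

# Create a pile of heap objects, then drop the only reference so they
# become collectable garbage.
garbage = Array.new(200_000) { rand.to_s }
garbage = nil

# Time one full stop-the-world mark-and-sweep pass.
pause = Benchmark.realtime { GC.start }
puts "GC pause: #{(pause * 1000).round(2)} ms"
```

If that printed number regularly exceeds your frame budget (about 16 ms at 60 fps), dropped frames are inevitable no matter when you schedule the pass.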


I'll look into LUA. Never used it back in my games dev days as it was just coming into vogue.


Mike Pall's LuaJIT[0] is excellent from a game dev perspective--an incremental GC (so you can spend a fixed amount of your frame budget in it) and a JIT with a huge number of optimisations make it fairly compelling to write the top 50% of your game in (and not a terrible choice for the bottom 50%).

I think you were absolutely right to switch from Ruby; I can't see the problems you had getting any better unless Ruby lets you manage object allocation (in which case you could allocate into a per-frame memory buffer and just nuke it at the end of every frame).
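Ruby doesn't expose real arena allocation, but the per-frame buffer idea can be approximated: reuse one scratch array for all per-frame temporaries and clear it at frame end, so everything it held becomes collectable together (a sketch; `frame` and its contents are made up for illustration):

```ruby
# One shared scratch buffer, allocated once.
FRAME_SCRATCH = []

def frame
  FRAME_SCRATCH << [rand, rand]  # temporary per-frame data lands here
  # ... simulate, render, etc. ...
  FRAME_SCRATCH.clear            # "nuke" the buffer at the end of the frame
end

3.times { frame }
```

This doesn't avoid the allocations themselves, so it's a pale imitation of a real arena, but it does concentrate the garbage so a collection has an obvious cheap moment to run.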

[0] http://luajit.org/


Lua's popularity in the games community stems from the fact that it is, by design, meant to be embedded in larger programs. For example, it will compile on any platform with a conforming ANSI C compiler.

And for x86/x64 and ARM, you can try LuaJIT to get even more zing (though you probably won't need to).


Just because the Lua community will ream you out for the mistake (though it's the only thing they're mean about), it's "Lua" (Portuguese for "moon"), not an acronym. :)


The joys of iPhone autocorrect :) thanks.


I think it's absolutely the wrong language to write games in, because games are enormously sensitive to runtime fluctuations

No one in this entire discussion is nailing down the definition of "game". For an FPS or a game full of live special effects, sure. But for a chess game, a puzzler, or even a graphic adventure? Ruby's GC wouldn't be a problem at all (although Ruby has other issues that could well crop up).



