Why not Lua (danjou.info)
82 points by sedachv on April 26, 2011 | 43 comments



I had to chuckle when I read this snippet:

"Being small is not an excuse

One common argument to choose Lua is that it has a small footprint. Yeah, that's true, but that's useless. Bummer! When I program, I don't have any resource usage pressure. People who have such pressure are either paranoids or playing in the world of embedded computers. This is also a no more existing conception since quad core processors equiped phones are coming into the market. I'm rather confident that what we used to call embedded devices are just dead and are now plain computers"

How I would savor watching him sit in one of our engineering meetings where we discuss the byte and cycle budgets available to subroutines on our network routing layer. And I'd really like to see what he thinks of our 1 MByte firmware limits and 4 MByte flash footprints...

Ironically, we've only recently had the memory budget to consider Lua.


I use Lua on similarly-sized embedded systems. As a result I get readable code, easily ported (and debugged!) under Linux. It really changed the scope of what we could embed.


The truth is that one should be concerned with how much memory is consumed even in our world of big-RAM consumer computers. To do otherwise is egocentric, plain bad citizenship: it assumes your software is the only thing running on the system that needs memory. Of course, this concern has to be weighed against all the other usual concerns of optimization, but one should not flagrantly waste memory where it is unnecessary. Memory is a shared resource, not a limitless supply.

We heard this argument a decade ago with CPU. "The hardware will get faster. Moore's Law, lol." Oops?


That sounds positively luxurious... I recently worked on a project where the microcontroller had 768 bytes of RAM and around 32k flash. The hardware team were really pushing for a slightly cheaper version of the microcontroller with only 512 bytes of RAM, but since we software folks were feeling pinched even at 768, they got overruled.


This will quickly devolve into the Four Yorkshiremen skit, but...

> I recently worked on a project where the microcontroller had 768 bytes of RAM and around 32k flash.

Luxury! I'm in an entertainment industry startup. Our main product uses an AVR board with 512 bytes of RAM and 8k of flash. Program size optimization for 8k is a significant challenge for us.

We also use an embedded networking device with 8M of flash, which is about the sweet spot for Lua...Perl, Python, and Ruby are all too bloated for this context.

To the author, embedded apparently means "phones." With the rise of Arduino and hardware hacking in general, I think there are more and more opportunities involving development under extreme constraints, not fewer.


> We also use an embedded networking device with 8M of flash, which is about the sweet spot for Lua...Perl, Python, and Ruby are all too bloated for this context.

That sounds plausible. We had a 512k dsPIC (I think that's as big as they come) and considered Lua on it, but it wasn't quite practical.


I have done stack-based development integrating C and Lua.

While I might agree with Julien that Lua has a few quirks that can reduce your productivity, I have never had any problem finding resources to solve my specific problems.

I think that avoiding Lua because of these limitations is quitting too soon, or throwing the baby out with the bathwater. I think Lua is a very useful tool with wide applicability.

The IRC channel #lua on Freenode is a friendly and welcoming place.

The official Lua mailing list is also a great place to explore the particulars of a problem and receive the benefit of many expert eyes on your code. [0]

There is example code in various places on the Internet showing good ways to do careful stack management with C/Lua. I found these to be the most helpful. [1]

If you are used to stack-based programming in languages like Forth or PostScript, you will probably feel at home.

If you are not, you will have a learning curve before you start to feel productive.

0: http://vlists.pepperfish.net/cgi-bin/mailman/listinfo/lua-l-...

1: http://lua-users.org/wiki/SampleCode (see "Lua C API samples")

1: http://cc.byexamples.com/2008/11/19/lua-stack-dump-for-c/

1: http://docclox.blogspot.com/2011/01/about-lua-stack.html
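For a flavor of what careful stack management looks like, here is a minimal sketch of my own (not from the linked samples; the names are made up): a C function callable from Lua that keeps its pushes and pops balanced.

    #include <lua.h>
    #include <lauxlib.h>

    /* get_name(t) -> t.name
       Exactly one value is pushed (by lua_getfield) and returned as the
       single result, so the stack leaves this function balanced. */
    static int get_name(lua_State *L) {
        luaL_checktype(L, 1, LUA_TTABLE);  /* validate the argument */
        lua_getfield(L, 1, "name");        /* pushes t.name onto the stack */
        return 1;                          /* Lua consumes it as the result */
    }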

Edit: clarity, tone, links, more links, typo


Thanks for the links! This is one of the reasons I love HN: even an article that I have a <meh> response to can generate lots of useful responses and great links. (He said, awaiting the inevitable Vim post that will once again provide a couple of idioms or links he hadn't run into before.)


- the stack-based API makes it reasonably easy not to goof with references and introduce hard-to-find GC bugs.

- reference counting simplifies some things but introduces other bug classes (e.g. circular references). Moreover, ref-counting GCs are generally outperformed by mark-and-sweep GCs, because most objects have very short lifespans in dynamic languages.

- to create mutual references between userdata, you have to go through the stack. It's cumbersome, but it ensures that the GC state is always consistent: the choice is between cumbersome user code, buggy user code, and buggy GC code (see the sketch after this list).

- the lack of paradigms (or an "officially preferred way to do things", as masterfully established by the Python community) is indeed annoying. It would be a fatal flaw in a general-purpose language. In a language intended to be embedded, it is merely a debatable choice, although one I don't support: the language's purpose is limited to providing basic building blocks, and APIs and paradigms are to be supplied by the embedding application. I wish preferred paradigms were strongly suggested, too.
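To illustrate the third point, a minimal hedged sketch against the Lua 5.1 API (indices and names made up): userdata A keeps userdata B alive by parking B in A's environment table, with every step going through the stack so the GC always has a consistent view.

    #include <lua.h>

    /* a and b are absolute stack indices of two userdata. */
    static void link_userdata(lua_State *L, int a, int b) {
        lua_newtable(L);        /* fresh environment table for a      */
        lua_pushvalue(L, b);    /* copy b to the top of the stack     */
        lua_rawseti(L, -2, 1);  /* env[1] = b; pops the copy of b     */
        lua_setfenv(L, a);      /* make env a's environment; pops env */
    }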

It would be possible to define a very elegant and productive programming ecosystem around Lua, one which would probably be marginally better than Python or Ruby at many tasks. But I'm not convinced the improvements would be dramatic enough to let that Lua++ carve out its own niche among the established, general-purpose, stand-alone languages. Better for Lua to stay the best language to embed in C, IMO.


The author says:

> "They claim it's very easy to integrate Lua into your C application, because it does not use pointer, nor references counting, nor anything that requires a minimum amount of skills to be used."

And then, immediately afterwards, he says that keeping track of the stack is too difficult. It sounds like there is a minimum amount of skill required after all.

I've done huge projects using Lua as an embedded language and I really enjoyed the experience. I found the stack API easy to grasp, and the performance was outstanding. SWIG (swig.org) made wrapper generation a snap, and having Lua objects with access to the C++ data model allowed us to write test code that used introspection and reflection very naturally.

This article strikes me as a case of "This language isn't my favorite, so I don't think anyone should use it."


Well, I think there is some impedance mismatch here. He kind of suggests resetting the stack "to remove left-over items". He also mentions forgetting to pop an item. These kinds of comments are really... weird. I mean, what is Lua supposed to do if you forget to pop something?
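(The answer is "nothing", of course. The defensive pattern on the C side is simply to note the stack height on entry and restore it on exit - a sketch:)

    int top = lua_gettop(L);   /* stack height on entry */
    /* ... push values, call into Lua, etc. ... */
    lua_settop(L, top);        /* restore it, discarding any leftovers */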

This is an example where a valid personal opinion ("I don't like stack-based apis") crosses the line into something else ("Lua is defective").


Indeed. My perspective is that if you can grok Forth (which is unprotected stacks all the way down), embedding Lua should be cake, especially if you aren't planning on doing something fancy.

And realistically...how often is it going to come up that you need the embedding to pass around more than a few numbers and strings?


It actually is a case of that.

In his own words: "I liked Python object model and wanted to have it in Lua, and spending time rewriting Python is just not worth it. I probably should have chose Python, not Lua. YMMV."

I guess he needs to learn more about compiler/interpreter implementation to understand that the things he dislikes about Lua are the same things that make it fast, small, and embeddable.


Over my last 10 years with Lua I have never once done the kind of thing described in the author's first argument: calling C functions with side effects on the Lua stack. Things like:

lua_pushnumber(L, mycomputingfunction(L, ...));

are just calling for trouble: mycomputingfunction may itself push or pop values on L, so the surrounding stack bookkeeping silently breaks.
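The mechanical fix is to separate the call from the push (same made-up helper as above):

    /* Let the helper do whatever it wants with the stack first,
       then push the result once the stack is quiet again. */
    lua_Number n = mycomputingfunction(L, ...);
    lua_pushnumber(L, n);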

In my mind Lua and C code remain in two different spaces. C handles low-level stuff and returns a status, simple objects or just references. Lua is there to manipulate handles on C-level objects. Code separated this way has never given me any trouble for debugging.

The argument blaming Lua for using too few resources and being uselessly fast is so preposterous that I would rather consider it either a bad joke or sheer inexperience.


Interesting; I developed a very large kiosk platform in Lua and the experience was fantastic. I found the stack-based C integration far preferable to any other FFI I've ever used. Also, the whole complaint that Lua is somehow paradigm-less (whatever that means) is very strange. Nobody who worked on our system had an issue with that.

There are very few libraries for Lua and that's its main weakness. However, because the C interface is so awesome, that never really hurt in practice - at least for our application. For something like a web app it would be completely untenable.


Lack of a paradigm is likely part of why there aren't many libraries - it's as though there are multiple languages or dialects and you have to subdivide your potential audience when you choose one for your library.


The bulk of his issues seem to be with binding Lua to C/C++, in which case there are quite a few helper libraries he could use to simplify the process (like LuaBind and toLua).

As for the language itself, I love Lua. The syntax is easy to learn and the concepts aren't too difficult. Plus, the (relatively) easy binding can lead to a lot of interesting opportunities, like scripting NES games with FCEUX. I actually started working on a genetic algorithm in FCEUX a while ago and had a lot of fun doing it.


I really have to wonder what he's doing in his C functions that the stack model is a problem. Most of mine are very simple: pop some values, maybe read some table fields or pcall a function, then do whatever the C function actually does, and push some values back. Yeah, lua_settable is a little awkward, but how complex is your code that calls it?
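(For the unfamiliar, the awkwardness is only that lua_settable wants the key and value pre-pushed. A sketch, assuming the table sits at stack index idx:)

    lua_pushstring(L, "key");  /* push the key */
    lua_pushnumber(L, 42);     /* push the value */
    lua_settable(L, idx);      /* t["key"] = 42; pops key and value */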


I couldn't tell you, but...

Look at awesomewm (the dude wrote it):

http://awesome.naquadah.org/download/


After a very cursory scan, it appears that he's shadowing a lot of standard Lua functions to provide his own versions, like a replacement for ipairs that checks an __ipairs metamethod first, that sort of thing. It doesn't actually look that complicated to me, but I've also done a bunch of embedding-Lua-in-stuff projects before.
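(Roughly this kind of thing - a hedged Lua sketch of the pattern, not his actual code:)

    -- ipairs replacement that honors an __ipairs metamethod if present.
    local rawipairs = ipairs
    function ipairs(t)
        local mt = getmetatable(t)
        if mt and mt.__ipairs then
            return mt.__ipairs(t)  -- let the object supply its own iterator
        end
        return rawipairs(t)        -- otherwise fall back to the stock one
    end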


I'm back to Lua (actually LuaJIT) after not using it for a few years, and this time using C only through LuaJIT's FFI. So far only one issue (not a small one): no callbacks from C to Lua, so you have to be careful about which libraries you choose.

And connecting "C" and "LuaJIT" through FFI - that's quite easy. I've started a little project with such bindings http://github.com/malkia/ufo - OpenGL, OpenCL, AntTweakBar (OpenGL "property" dialog), ZeroMQ, GLFW (GLUT like library that can be used without callbacks), and few others.


Could you expand some on "no callbacks from C to Lua"? I have C code calling Lua code all the time, but I'm also not using LuaJIT, so I don't know if that's a limitation of it.


For example "void glutDisplayFunc( void (func)(void) ):"

ffi.cdef[[ void glutDisplayFunc(void (func)(void)); ]]

glut = ffi.load( "glut.dll/so/dylib/etc" ) glut.glutDisplayFunc( you_cant_put_lua_callback_function_here )

There are probably some ways around it using coroutines (as Mike explained), or, later, if LuaJIT lets libffi build the trampoline (though Mike said that right now luajit -> C -> luajit would not work within the same LuaJIT context). But maybe a different context would work, and LuaJIT can create a different LuaJIT context using itself through the FFI.

All this is an alternative to the natural Lua C-binding way described in the article.

Right now everything I'm doing is without any such binding; it's all purely experimental.

I've also done a small port of the zile editor (the Lua version) to use curses directly (curses.dll) instead of going through bindings:

github.com/malkia/luajit-zile

The curses.dll is not there yet.


Ah, I see, that's a different thing than I've been doing. I have a bridge for Cocoa that lets Lua post and observe NSNotifications, and so I have a stub class in ObjC that can call back into Lua in response to receiving a notification. I'm not using FFI at all.

The way I do it is, there's a C function that takes a Lua function as an argument (through the stack) and sticks it in the Lua registry, then stores the (int) index that it's stored at. Later on I can pull the function back out into the Lua stack and pcall it.
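(In API terms, the pattern is roughly this - a sketch against Lua 5.1, with error handling trimmed:)

    /* The Lua function is on top of the stack: stash it, keep the handle. */
    int ref = luaL_ref(L, LUA_REGISTRYINDEX);   /* pops the function */

    /* Later, when the notification arrives: */
    lua_rawgeti(L, LUA_REGISTRYINDEX, ref);     /* pushes it back */
    if (lua_pcall(L, 0, 0, 0) != 0)             /* call with no args */
        lua_pop(L, 1);                          /* discard the error message */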


Thanks for the ideas!! I have to try it out...


I'm still writing it, but my plan is, have all communication between Lua and everything else go through NSNotifications, to enforce the idea that the game logic (I'm writing a game) should be in Lua and the animations / etc. in ObjC. If you'd like to look at the bridge, it's here:

https://github.com/randrews/ballmaze/blob/master/Ballmaze/Lu...


I had the completely opposite experience when using Lua: tables felt like a great structure to base a language on.


Tables/dictionaries have been working great for Python and JavaScript, too. :D


Python and JavaScript have arrays and classes, too.


Lua tables combine arrays and dictionaries, though. They have a dictionary-style interface, but basically, if you use consecutive integer keys, that portion of the dictionary is stored in an array with O(1) access and can be iterated in order.
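A tiny illustration:

    local t = {}
    t[1], t[2], t[3] = "a", "b", "c"   -- consecutive keys: array part
    t.name = "mixed"                   -- string key: hash part, same table
    print(#t)                          --> 3
    for i, v in ipairs(t) do print(i, v) end   -- iterates 1..3 in order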


Lots of people are commenting on the stack/size issues, but does anyone have experience with the metaprogramming points he mentions?

From doing bad things to Python, I know it's very frustrating when a language suddenly becomes "less dynamic than expected" because something is hardcoded (usually for sensible performance-related reasons - but that doesn't stop it hurting). How serious is this issue with Lua (e.g. the inability to redefine # as described in the article)?


The length thing is a known issue and will be fixed in 5.2. You can also fix it pretty easily NOW, if (as is common) you're using a local fork of Lua inside your codebase. Either way, it's not a big deal in practice.

It's just a small change in lvm.c (http://www.lua.org/source/5.1/lvm.c.html) - search for OP_LEN. Yeah, really - that's it.
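What the fix buys you, sketched in Lua (5.2 semantics, where # consults __len even for plain tables):

    local proxy = setmetatable({}, {
        __len = function() return 42 end,
    })
    print(#proxy)  --> 42 in 5.2 (or a patched 5.1); 0 in stock 5.1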

The Lua codebase is pretty good reading, by the way.


Very, very little is hardcoded, and you can override almost anything - or even change the source, if you are embedding it. Most people seem to manage, but there are a few areas that seem odd, like the length one.


Yeah, it's arguably a matter of preference, but my experience (working on a team that used some Lua alongside other languages) is that Lua's preference for closed ranges and 1-based indices led to lots of off-by-one errors that probably would not have occurred in, say, JavaScript or C.


How about the fact that Lua has no operator for the xor of two integers? Seriously! Once I tried to write a Wireshark plugin in Lua and found out I had to implement my own xor function! Stupidest thing I have ever done...
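(For the record, the pure-Lua version is short, just annoying to have to write at all. A sketch, for non-negative integers:)

    -- Bitwise xor, one bit at a time; works on non-negative integers
    -- that fit exactly in Lua's number type.
    local function bxor(a, b)
        local result, bit = 0, 1
        while a > 0 or b > 0 do
            local x, y = a % 2, b % 2
            if x ~= y then result = result + bit end
            a, b, bit = (a - x) / 2, (b - y) / 2, bit * 2
        end
        return result
    end
    print(bxor(5, 3))  --> 6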


See http://bitop.luajit.org/

It comes standard as part of LuaJIT, which is likely what should be used for anything like a Wireshark plugin (I don't know what Wireshark actually uses, but LuaJIT is a drop-in replacement for the most part).

Bitop (or something similar) is included in Lua 5.2 as well.
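Usage is a one-liner; this assumes the standard bitop module name:

    local bit = require("bit")   -- ships with LuaJIT
    print(bit.bxor(5, 3))        --> 6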


On the one hand, by (interpreter compile-time) default all numeric values in Lua are floating point, and xor on floats doesn't make much sense. On the other hand, given the popularity of embedded Lua (where the numeric type is probably int), the lack of bitwise operators gets frustrating. I can attest to this, having used Lua in an embedded context.


Actually, by default numeric values are doubles, which can easily handle a 32-bit int without losing precision. And as I posted above, the Lua bitop library (which is standard in LuaJIT and Lua 5.2) gives you bitwise operations.


I did use bitop extensively. Having every bitwise operation be a function call was not as pleasant to type (or read) as dedicated bitwise operators but it did get the job done.


There is a version in a few lines of pure Lua by the author of the language:

http://lua-users.org/lists/lua-l/2002-09/msg00134.html

If you can load a C extension module then there's:

http://bitop.luajit.org/

Yes, Lua is that small.


Sounds more and more like Forth - even down to the complaints about the stack-based API.


No, that's only if you're dealing with the C API - it uses a stack-based interface to move data between C and the Lua VM.

Lua itself is nothing like Forth - it's more like a cleaned-up hybrid of Python and Javascript.


TIL that Lua has been around since 1993 http://www.lua.org/news.html



