A contrary view, just to be fair and balanced. I really didn't enjoy using Lua, though it had its moments.
A. Lua's source code is pretty inscrutable, and written in a style that was quite alien to me. It was also a bit horrid to single-step through because of all those damn reference-copying macros. But it's actually fairly easy to modify and extend, and that was what convinced me to go with Lua in the end; look-mummy-my-first-Lua-hack was surprisingly easy to put in.
B. Lack of variable declarations and global-by-default was a terrible idea. Languages should require declarations, or at least be local-by-default, and have a special interactive mode for when you want the no-declarations-and-global-by-default behaviour. Because making things "convenient" is fine, until it isn't, at which point it just becomes a bug magnet. For a language supposedly designed for use by people who don't know what they're doing, this borders on criminal; when you've got years of experience behind you, it's merely a huge pain in the arse. MaxScript gets this wrong too.
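A minimal illustration of the bug magnet, using made-up names:

    local function accumulate(values)
        local total = 0
        for _, v in ipairs(values) do
            -- typo: "totl" silently creates a brand-new global
            -- instead of tripping over an undeclared name
            totl = total + v
        end
        return total             -- always 0, and no warning anywhere
    end

    print(accumulate({1, 2, 3})) --> 0

A language that required declarations would reject "totl" outright.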
C. 1-based indexing is annoying. If switching makes no difference, switch to zero, because then it matches every other language; if switching makes a big difference, that's all the more reason to switch to zero, so the mismatch stops being so painful. Just because you can convert from one convention to the other doesn't make 1-based indexing right! MaxScript gets this wrong as well. But 1-based indexing would be wrong regardless of which languages happened to use it.
D. Lack of a proper array type is silly; you have this ugly thing that's like an array, but not, and it's easy to make it stop behaving like an array. Because it's not an array... it's a mapping. A mapping with some very odd ideas. Come on people, mappings and sequences aren't the same! Shape up!
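For instance, here's how easily the thing stops behaving like an array (the table and names here are mine, just a sketch):

    local items = {"a", "b", "c", "d"}
    items[2] = nil               -- poke a hole in the "array"

    print(#items)                -- may print 4 or 1: the length of a
                                 -- table with holes is unspecified
    for i, v in ipairs(items) do -- ipairs stops at the first nil,
        print(i, v)              -- so "c" and "d" are never visited
    end

A real sequence type would either forbid the hole or keep tracking its own length.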
(At this point in my argument I used to be in the habit of adding, "after all, you wouldn't skip having separate integer and floating-point types, now, would you?" - because Lua famously had just the float variety. But these days, as of Lua 5.3, they have integers too. So I'm still in the habit of adding it, even though it's out of date now.)
(The Lua guys' insistence on minimizing the number of data types can also be seen in the lack of a separate symbol type. Even MaxScript doesn't get that wrong.)
E. The C-side API is kind of weird, but on the plus side it's actually quite hard to misuse from C. So... I guess that's a wash. Maybe there shouldn't be a point E at all, but... well. The C-side API really does stick in my mind.
Aside from that, no complaints!
(This post is at least partly just so that other people who don't like Lua know that they aren't alone.)
As someone who chose the Lua runtime as the basis for his own toy language, here are my observations:
A. True. Paradoxically, Lua's source is well documented in places that arguably didn't need documentation in the first place, yet tends to contain no comments whatsoever in tricky places like the VM or the implementation of the internal data objects.
B. Global-by-default has bugged me as well. Thankfully, though, it's not difficult to patch the source so that declarations are mandatory and local.
C. It's a matter of taste in the end. I struggled for a while over whether to convert it to 0-based indexing, but so far I've left things as they are. Depending on your programming style, you might end up not using indexes directly all that much.
D. The distinction between the separate array and hash parts of a table kind of works, but only if you make the conscious decision never to work with numerical indexes directly. Generally, I think this kind of array/map hybrid is something PHP got right with its ordered hash maps, whereas Lua's way of doing things has a few gotchas.
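One such gotcha, as a rough sketch with throwaway names: PHP's arrays remember insertion order, while iteration order over a mixed Lua table is unspecified.

    local t = {}
    t.zebra = 1
    t.apple = 2
    t[1]    = "first"

    for k, v in pairs(t) do   -- walks the array part and the hash part,
        print(k, v)           -- in no guaranteed order
    end

If you need a stable order, you end up keeping a separate array of keys alongside the table.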
[] For a lot of applications, having only one number type works surprisingly well. I just wish they had opted for morphing the internal number representation according to how it is used in the script; instead, Lua 5.3 now has a separate integer type, which feels a little out of place for my taste.
[] I really don't think Lua needs a separate symbol type.
[] Things I didn't like: how often you have to check for certain conditions to avoid raising fatal errors; not being able to use keywords inside the table dot notation; the fact that the self operator works on a function in the table instead of its metatable; the sparse core library, especially when it comes to tables; the fact that a require()'d file doesn't know its own file name; no default to-string serialization for tables; the rule that control-flow statements always require full blocks rather than single expressions; and goto statements.
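Two of those in miniature - a throwaway table of my own, just to show the shape of the annoyance:

    local config = {}
    -- config.end = true      -- syntax error: "end" is a keyword, so
    config["end"] = true      -- you fall back to bracket syntax

    print(config)             -- prints something like "table: 0x...",
                              -- not the contents; you bring your own
                              -- serializer or pull in a library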
Here's a link to my project which addresses my personal Lua pain points: http://np-lang.org/
You should have a look at Squirrel (http://www.squirrel-lang.org/) - it addresses most of the points above but is still very lightweight and easy to embed. The main disadvantage is that the documentation and community are not as well developed as Lua's.
Variables being global by default is definitely a wart. You can mitigate it a lot with a linter or with runtime tests (extending the metatable of the global table). Local by default would be a bad idea and worse than global by default, IMO, as it would mess up lexical scoping. The ideal would be a compile-time error by default.
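A sketch of the runtime-test approach, in the spirit of the well-known strict.lua idiom (the declare() helper is my own naming):

    -- trap reads and writes of undeclared globals by giving _G a metatable
    local declared = {}

    function declare(name)            -- whitelist hook, deliberately global
        declared[name] = true
    end

    setmetatable(_G, {
        __newindex = function(t, name, value)
            if not declared[name] then
                error("assignment to undeclared global '" .. name .. "'", 2)
            end
            rawset(t, name, value)
        end,
        __index = function(_, name)
            if not declared[name] then
                error("read of undeclared global '" .. name .. "'", 2)
            end
        end,
    })

It only catches mistakes at runtime, of course, which is exactly why a compile-time error would be nicer.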
It would be fine, I think. Just create the variable in the scope you want it, then assign to it somewhere else. If you get it wrong, you get it wrong, and there's scope for weird bugs from that - but the effect is at least localized, lexically.
(Of course, the right thing to do is have explicit variable declarations...)
Localized bugs are still bugs. Better to always require a variable declaration and have no defaults at all.
Usually languages use local-by-default to avoid the need for variable declarations in the first place (misguidedly, IMO), not to avoid globals.
A. C is a beautiful language. I believe a more productive attitude towards any use of C - as a binding/engine layer, or whatever - is to assume there is no single familiar style. (What the Lua source does have is consistent compliance with its own coding rules.)
B. You control the VM, completely. The Lua language is designed with the VM integrator in mind, and Lua frameworks are not something you inherit. So why does global namespacing versus local definition bother you? Show me your offensive global/local case; in every single case it can be solved with better design. At the very least, walk _G yourself.
C/D. It's the 21st century: 1-based indexing makes sense to most of the non-hacker world, since you can't have a "0th" box in which to store things in the real world. Except of course you can - you just make one and label it, arbitrarily, the '0' box. In fact you can index everything 0-based yourself in Lua, so have at it. Lua provides the one-type-to-rule-all-things type, the Lua table{}, to give you the options you need. Want an array? Use your new_array{} as an array. Want a string dictionary? Have at it. Need a hash? Ditto. Bonus points: understand what you're doing and get features from the table type, like scatter detection, packing/indexing/ordering operations, and so on. Lua metatables are your friend, not an enemy. Learn to use the table type and you will become a convert.
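For instance, here's one way (of several) to put metatables to work and get a 0-based view over a plain table - a sketch only, and new_array is just my name for it:

    -- user-visible indices start at 0; storage stays a normal 1-based sequence
    local function new_array(...)
        local storage = { ... }
        return setmetatable({}, {
            __index    = function(_, i) return storage[i + 1] end,
            __newindex = function(_, i, v) storage[i + 1] = v end,
            __len      = function() return #storage end,   -- needs Lua 5.2+
        })
    end

    local a = new_array("x", "y", "z")
    print(a[0], a[2])   --> x    z
    print(#a)           --> 3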
E. C-side API. You do have to understand the design constraints of the VM. That's about all you have to do. Is that really so weird?
Arrays start at zero because what you're counting is how far you're moving a pointer from the initial address in memory, not the items themselves. array[i] is literally 'read what's at the base address plus i * sizeof(element)', so array[0] is the element sitting right at the start. Zero-based indices make perfect sense when you know what they actually indicate.
Although to be fair, starting at one and counting the items themselves makes sense too, and seems more intuitive - and the pointer argument may not even apply to Lua at all. But one-based indexing is still wrong regardless.
a) It doesn't matter, and b) Lua doesn't have these pointer things you speak of. If you're going to hate on 1-based indexing, at least do it with the fine style of EWD.
Off-by-one errors are enough of a problem without having to remember that Lua decided [1] means something other than what [1] means in almost every other language, so plan accordingly.
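A concrete way it bites, with a throwaway table - just an illustration:

    local t = {"a", "b", "c"}

    for i = 0, #t - 1 do               -- written on 0-based autopilot:
        io.write(tostring(t[i]), " ")  -- reads t[0] (nil), never reads t[3]
    end
    print()                            --> nil a b

    for i = 1, #t do                   -- the Lua-native form
        io.write(t[i], " ")
    end
    print()                            --> a b c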
Symbol types[1] are useful for storing identifiers, method names, or enums. They are common in Lisps, and in Lua you use (immutable) strings for a similar purpose[2].
The main advantages of symbol types are that they have a more restricted API than strings and that they might be more efficient (an integer vs. a string representation).
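The Lua-flavoured stand-in is just interned strings used as tags; a rough sketch with made-up names:

    -- strings doing a symbol's job: enum-ish tags plus dispatch on them
    local STATE_IDLE    = "idle"
    local STATE_RUNNING = "running"

    local handlers = {
        [STATE_IDLE]    = function() print("waiting") end,
        [STATE_RUNNING] = function() print("working") end,
    }

    local state = STATE_RUNNING
    handlers[state]()             --> working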
Since you pointed out that Lua strings are immutable, you're aware of what I'm about to say, but just to make it explicit:
The efficiency argument is a common one for symbols in general, but in Lua it doesn't really hold, because string comparison has constant cost (i.e., it's a simple pointer test, since strings are interned - which is what their immutability makes possible).
(This may or may not be an issue, but it still offended me somewhat, since you're paying at runtime the interning cost a symbol type would pay at compile time, and getting none of the expressiveness in return. Not a tradeoff that impresses me, personally, even if the speed is fine - but it takes all sorts.)