Why I Don't Use Lua


I sometimes find myself having to defend my choice to never use Lua, both professionally and in my more personal work. Rather than explain it at length every time, I figured I would put all my misgivings into a single post. I don’t think Lua will ever improve or fix these issues, as it takes a different mindset from that of other languages, and has quite a devoted following in its community. Frankly, Lua and I just don’t mesh.

This post can be seen as a rant, and should be taken as opinion. It is also only a brief and shallow discussion of the many other, smaller grievances I have regarding Lua. This post is not a call for discussion. It is a short rant on why I don’t like Lua, and why I don’t use it.

Let’s start with why people like Lua. It’s a small language, built for embedding, with a small runtime written in C89. This is, if nothing else, its primary appeal. It is also released under the permissive MIT license. Lua has also evolved over time, and code written for Lua 5.1 is quite different from code written for Lua 5.3, both at the language level and at the runtime (C API) level.

But that is where I find the first of many issues with Lua. Because Lua is so simple to modify, there is no single canonical Lua implementation in practice. Nearly every game I’ve ever seen written using Lua targets a specific version, or has made enough adjustments that the name Lua barely applies any more. The effects of a community split like this can be seen in the Python community with Python 2 and Python 3. This issue occurs in other places as well, such as the Scheme language community. However, the difference with the Scheme community is that the split there is caused by diverging implementations, each from a different set of contributors. Lua, however, is effectively three implementations (5.1, 5.3, and LuaJIT), each competing for use amongst all members of the Lua community. Lua was so successful with its 5.1 branch that it will be used for years to come. Any improvements made to the language or its runtime will simply not be seen in many places.

Now, as a language, there are many things I find fault with in Lua. Let’s look past the common argument about 1-based indices. There are more pressing, more dangerous quirks within Lua that one must be concerned about. For starters, the declaration of all variables is global by default. When this is brought up, the counter-arguments are many, although one of the language’s authors says:

Local by default is wrong. Maybe global by default is also wrong, [but] the solution is not local by default.

However, my argument is that there shouldn’t be a default at all. For each kind of variable declaration, there should be an equivalent keyword to declare said variable. JavaScript has solved its scoping issues (more or less) by adding a let keyword to aid in the declaration of lexically scoped values. However, because Lua permits x = 4 to be not just an assignment but also a variable declaration, it has backed itself into a language design corner.
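
As a small illustration of the trap (a sketch of my own using the standard embedding API; the helper and variable names are mine), a missing local inside a function silently creates a global that the host can later observe:

    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    int main(void) {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);

        /* 'leak' is never declared local, so the assignment inside the
           function quietly creates a global variable. */
        luaL_dostring(L,
            "local function helper()\n"
            "  leak = 4\n"
            "end\n"
            "helper()");

        lua_getglobal(L, "leak");                    /* fetch _G.leak */
        printf("leak = %g\n", lua_tonumber(L, -1));  /* prints: leak = 4 */

        lua_close(L);
        return 0;
    }

With a required declaration keyword, the assignment inside helper would be an error rather than a brand new global.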

There are other issues as well. The use of ~= for a not-equals operator, when the ! character is not used anywhere else in the language. Can you guess what one of the most popular modifications to Lua is, given that it’s a single-character change?

And let’s just do a rapid fire set of linguistic issues I have with Lua:

There is also the issue of a famous competing implementation, LuaJIT, which is a Frankenstein’s monster of all of these features. But LuaJIT differs from Lua at the runtime level as well, and it is this runtime, the C API, that I take the most issue with in Lua.

Lua has this bizarre idea of handing the user a stack of arguments, instead of passing a pointer to the first argument and a length, as most languages do. This push-and-pop interface is, for reasons that are completely alien to me, well liked by its users. Aside from adding yet another set of possible off-by-one errors for the user, there is simply no purpose to providing access to a “stack” that is not even really one.
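
For concreteness, this is what a function exported to Lua looks like with the stock C API (a sketch; the function name is mine): every argument is addressed by its position on the interpreter’s stack, and results are pushed back before returning a count.

    #include <lua.h>
    #include <lauxlib.h>

    /* Callable from Lua as add(a, b).  Arguments arrive as stack slots
       1..n rather than as a pointer plus a length, so every access is an
       index into the stack, and every result is a push. */
    static int l_add(lua_State *L) {
        lua_Number a = luaL_checknumber(L, 1);   /* first argument  */
        lua_Number b = luaL_checknumber(L, 2);   /* second argument */
        lua_pushnumber(L, a + b);                /* push the result */
        return 1;                                /* how many results were pushed */
    }

    /* Registered from the host with something like:
         lua_pushcfunction(L, l_add);
         lua_setglobal(L, "add");                */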

There is also the issue of coroutines, lauded by so many as a useful feature. Lua’s coroutines are useful if you are writing C. However, because of the guarantees that must hold in a C++ context (stack unwinding has to run destructors, so longjmp is off the table), compiling Lua as C++ results in a try-catch around every coroutine. This causes performance degradation on certain platforms when games are involved (an area where Lua is most loved).
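
As I understand the reference implementation, every protected call, and every coroutine resume, passes through an error boundary whose mechanism is chosen at compile time: setjmp/longjmp in a plain C build, and try/throw/catch when Lua is compiled as C++. The sketch below is not Lua’s actual source, just an illustration of that dispatch with names of my own:

    #include <setjmp.h>
    #include <stdio.h>

    #if defined(__cplusplus)
    /* C++ build: every protected boundary becomes a try block. */
    #define PROTECT(jb, body)   try { (void)(jb); body; } catch (...) { /* record error */ }
    #define RAISE(jb)           throw 1
    #else
    /* C build: the same boundary is a setjmp, and errors longjmp back to it. */
    #define PROTECT(jb, body)   if (setjmp(jb) == 0) { body; }
    #define RAISE(jb)           longjmp(jb, 1)
    #endif

    static jmp_buf boundary;

    static void script_step(void) {
        RAISE(boundary);        /* stands in for an error or an unwind */
    }

    int main(void) {
        PROTECT(boundary, script_step());
        puts("control returned to the protected boundary");
        return 0;
    }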

Add in that coroutines are a compile-time option (not even available to the user unless provided by the host, if embedded), and their usefulness quickly wanes, especially in comparison to more useful language and runtime features such as continuations.

My biggest gripe with Lua is its allocator system. There is only one function, which takes a set of parameters and, depending on their values, must allocate, free, or reallocate a block of memory. If this was meant to keep Lua small, they focused on the wrong place. Lua calls this function for every little thing, allocating a few bytes at a time, putting considerable strain on what could have been a slab-allocated system. Arguments that this isn’t necessary fall on deaf ears when SQLite has somehow managed to permit slab allocation while remaining a small and powerful tool with a fantastic C API.
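
For reference, the entire interface is a single callback handed to lua_newstate; whether a given call means allocate, resize, or free has to be decoded from its arguments. A minimal version (essentially the realloc-based example from the reference manual) looks like this:

    #include <stdlib.h>
    #include <lua.h>

    /* The one allocation hook Lua offers.  'ud' is the opaque pointer given
       to lua_newstate, 'ptr'/'osize' describe the existing block (if any),
       and 'nsize' is the requested new size. */
    static void *l_alloc(void *ud, void *ptr, size_t osize, size_t nsize) {
        (void)ud; (void)osize;        /* unused in this minimal version */
        if (nsize == 0) {             /* nsize == 0 means "free this block" */
            free(ptr);
            return NULL;
        }
        return realloc(ptr, nsize);   /* otherwise allocate or resize */
    }

    /* Installed when the state is created:
         lua_State *L = lua_newstate(l_alloc, NULL);   */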

I also take issue with LuaJIT. It is an amazing piece of software, and Mike Pall is extremely good at what he does. However, LuaJIT cannot allocate memory for its own use outside the lower 2GB of a program’s address space. While this may not be a concern for some, I have worked on software using hundreds of gigabytes of memory, a place where LuaJIT just isn’t welcome.

For some games (and developers), Lua and LuaJIT meet their needs. However, I find Lua to be beneath what I want, and desire, in a language. When scripting, the language used should permit the host environment to decide how strings are handled, how lists and dictionaries work, and even whether there should be an object system at all. At the very least, it should match, or permit matching, the host language’s environment. Lua, I feel, does not permit this. I don’t want to use it in my personal work if I can avoid it.
