Metalua Manual


Chapter 4  Samples and tutorials

4.1  Advanced examples

This section will present the implementation of advanced features. The process is often the same:
  1. ``That language has this nifty feature that would really save my day!''
  2. Implement it as a metalua macro.
  3. ???
  4. Profit!
The other common case for macro implementation is the development of a domain-specific language. I'll eventually write a sample or two demonstrating how to do this.

4.1.1  Exceptions

As a first non-trivial example of extension, we'll pick exceptions: Lua already offers a mechanism, pcall(), which essentially provides the raw functionality to catch errors when some code is run, so enhancing it into full exceptions is not very difficult. We will aim at:
  • having a proper syntax, the kind you get in most exception-enabled languages;
  • being able to easily classify exceptions hierarchically;
  • being able to attach additional data to an exception (e.g. an error message);
  • not interfering with the usual error mechanism;
  • supporting the ``finally'' feature, which guarantees that a piece of code (most often dealing with resource release) will be executed.


There are many syntax variants for exceptions. I'll pick one inspired by OCaml, which you're welcome to dislike. And in case you dislike it, writing one which suits your taste is an excellent exercise. So, the syntax for exceptions will be something like:

try
   <protected block of statements>
with
  <exception_1> -> <block of statements handling exception 1>
| <exception_2> -> <block of statements handling exception 2>
| <exception_n> -> <block of statements handling exception n>
end
Notice that OCaml lets you put an optional ``|'' in front of the first exception case, just to make the whole list look more regular, and we'll accept it as well. Let's write a gg grammar parsing this:

trywith_parser = 
   gg.sequence{ "try",  mlp.block,  "with",  gg.optkeyword "|",
                gg.list{ gg.sequence{ mlp.expr,  "->",  mlp.block },
                         separators = "|", terminators = "end" },
                builder = trywith_builder }
mlp.lexer:add{ "try", "with", "->" }
mlp.block.terminators:add{ "|", "with" }
We use gg.sequence to chain the various parsers; gg.optkeyword lets us allow the optional ``|''; gg.list lets us read an undetermined series of exception cases, separated by keyword ``|'', until we find the terminator ``end'' keyword. The parser delegates the building of the resulting statement to trywith_builder, which will be detailed later. Finally, we have to declare a couple of mundane things:
  • that try, with and -> are keywords. If we don't do this, the first two would be returned by the lexer as identifiers instead of keywords; the latter would be read as two separate keywords ``-'' and ``>''. We don't have to declare ``|'' explicitly, as single-character symbols are automatically considered to be keywords.
  • that ``|'' and ``with'' can terminate a block of statements. Indeed, metalua needs to know when it has reached the end of a block, and introducing new constructions which embed blocks often introduces new block terminators. In our case, ``with'' marks the end of the block in which exceptions are monitored, and ``|'' marks the beginning of a new exception handling case, and therefore the end of the previous case's block.
That's it for syntax, at least for now. The next step is to decide what kind of code we will generate.

The fundamental mechanism is pcall(func, arg1, arg2, ..., argn): this call will evaluate
func(arg1, arg2, ..., argn), and:
  • if everything goes smoothly, return true, followed by any value(s) returned by func();
  • if an error occurs, return false, and the error object, most often a string describing the error encountered.
We'll exploit this mechanism, by enclosing the guarded code in a function, calling it inside a pcall(), and using special error objects to represent exceptions.
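For readers who haven't used it, here is pcall() in action; this snippet is plain Lua, independent of the extension we're building:

```lua
-- Successful call: pcall returns true, then func's results.
local ok, val = pcall (function (x) return x * 2 end, 21)
print (ok, val)  --> true    42

-- Failing call: pcall returns false, then the error object
-- (here a string built by error(), including position information).
local ok2, err = pcall (function () error "something bad" end)
print (ok2)      --> false
```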

Exception objects

We want to be able to classify exceptions hierarchically: each exception will inherit from a more generic exception, the most generic one being simply called ``exception''. We'll therefore design a system which allows us to specialize an exception into a sub-exception, and to compare two exceptions, to know whether one is a special case of another. Comparison will be handled by the usual < > <= >= operators, which we'll overload through metatables. Here is an implementation of the base exception exception, with working comparisons, and a new() method which allows us to specialize an exception. Three exceptions are derived as an example, so that exception > exn_invalid > exn_nullarg and exception > exn_nomorecoffee:

exception = { } ; exn_mt = { } 
setmetatable (exception, exn_mt)

exn_mt.__le = |a,b| a==b or a<b 
exn_mt.__lt = |a,b| getmetatable(a)==exn_mt and 
                    getmetatable(b)==exn_mt and
                    a.super and a.super<=b

function exception:new()
   local e = { super = self, new = self.new }
   setmetatable(e, getmetatable(self))
   return e
end

exn_invalid      = exception:new()
exn_nullarg      = exn_invalid:new()
exn_nomorecoffee = exception:new()
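With these definitions, the comparison operators should behave as follows (a small usage sketch, assuming the code above is loaded and implements the ordering exception > exn_invalid > exn_nullarg stated earlier):

```lua
assert (exn_nullarg < exn_invalid)   -- a sub-exception is smaller than its parent
assert (exn_invalid < exception)     -- every exception is a special case of exception
assert (not (exn_nomorecoffee <= exn_invalid)) -- unrelated branches don't compare
```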
To compile a try/with block, after having put the guarded block into a pcall() we need to check whether an exception was raised, and if it has been raised, compare it with each case until we find one that fits. If none is found (either it's an uncaught exception, or a genuine error which is not an exception at all), it must be rethrown.

Notice that throwing an exception simply consists of raising it as an error:

throw = error
To make the picture clearer, here is some simple code using try/with, followed by its translation:

-- Original code:
try
   throw (exn_invalid)
   print("You shouldn't see that")
with
| exn_nomorecoffee -> print "you shouldn't see that: uncomparable exn"
| exn_nullarg      -> print "you shouldn't see that: too specialized exn"
| exn_invalid      -> print "exception caught correctly"
| exception        -> print "execution should never reach that far"
end

-- Translated version:
local status, exn = pcall (function ()
   throw (exn_invalid)
   print("You shouldn't see that")
end)

if not status then
   if exn <= exn_nomorecoffee then
      print "you shouldn't see that: uncomparable exn"
   elseif exn <= exn_nullarg then
      print "you shouldn't see that: too specialized exn"
   elseif exn <= exn_invalid then
      print "exception caught correctly"
   elseif exn <= exception then
      print "execution should never reach that far"
   else error(exn) end
end
In this, the only nontrivial part is the sequence of if/then/elseif tests at the end. If you check the doc about AST representation of such blocks, you'll come up with some generation code which looks like:

function trywith_builder(x)
   -- Get the parts of the sequence:
   local block, _, handlers = unpack(x)

   -- [catchers] is the big [if] statement which handles errors
   -- reported by [pcall].
   local catchers = `If{ }
   for _, x in ipairs (handlers) do
      -- insert the condition:
      table.insert (catchers, +{ exn <= -{x[1]} })
      -- insert the corresponding block, executed when the condition holds:
      table.insert (catchers, x[2])
   end

   -- Finally, put an [else] block to rethrow uncaught errors:
   table.insert (catchers, +{error (exn)})

   -- Splice the pieces together and return the result:
   return +{ block:
      local status, exn = pcall (function() -{block} end)
      if not status then
         -{ catchers }
      end }
end

Not getting lost between metalevels

This is the first non-trivial example we see, and it might require a bit of attention in order not to be lost between metalevels. Parts of this library must go at metalevel (i.e. modify the parser itself at compile time), other parts must be included as regular code:
  • trywith_parser and trywith_builder are at metalevel: they have to be put between -{...}, or to be put in a file which is loaded through -{ require ... }.
  • the definitions of throw, the root exception and the various derived exceptions are regular code, and must be included normally.
The whole result in a single file would therefore look like:

-{ block:
   local trywith_builder = ...
   local trywith_parser  = ...
   mlp.stat:add ...
   mlp.lexer:add ...
   mlp.block.terminator:add ... }

throw     = ...
exception = ...
exn_mt    = ...

exn_invalid     = ...
exn_nullarg     = ...
exn_nomorecoffee = ...

-- Test code
try
   ...
with
| ... -> ...
end
Better yet, it should be organized into two files:
  • the parser modifier, i.e. the content of ``-{block:...}'' above, goes into a file ``ext-syntax/exn.lua'' of Lua's path;
  • the library part, i.e. throw ... exn_nomorecoffee ... goes into a file ``ext-lib/exn.lua'' of Lua's path;
  • the sample calls them both with metalua standard lib's extension function:

-{ extension "exn" }
try
   ...
with
| ... -> ...
end


This first attempt is full of bugs, shortcomings and other traps. Among others:
  • Variables exn and status are subject to capture;
  • There is no way to put personalized data in an exception. Or, more accurately, there's no practical way to retrieve it in the exception handler.
  • What happens if there's a return statement in the guarded block?
  • There's no finally block in the construction.
  • Coroutines can't yield across a pcall(). Therefore, a yield in the guarded code will cause an error.
Refining the example to address these shortcomings is left as an exercise to the reader; we'll just give a couple of design hints. However, a more comprehensive implementation of this exception system is provided in metalua's standard libraries; you can consider its sources as a solution to this exercise!


Addressing the variable capture issue is straightforward: use mlp.gensym() to generate unique identifiers (which cannot capture anything), and put anti-quotes at the appropriate places. Eventually, metalua will include a hygienization library which will automate this dull process.

Passing parameters to exceptions can be done by adding arbitrary parameters to the new() method: these parameters will be stored in the exception, e.g. in its array part. Finally, the syntax has to be extended so that the caught exception can be given a name. Code such as the following should be accepted:

try
   throw (exn_invalid:new "I'm sorry Dave, I'm afraid I can't do that.")
with
| exn_invalid e -> printf ("The computer choked: %s", e[1])
end
The simplest way to detect user-caused returns is to create a unique object (typically an empty table), and return it at the end of the guarded block. When no exception has been thrown, test what was returned: if it is anything other than that unique object, it is a genuine return, and must be propagated (by returning it again); if not, do nothing. Think about the case when multiple values have been returned.
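The hint above can be sketched as follows; this is only an outline, with <guarded block> standing for the spliced user code, and RETURN_SENTINEL a name made up for the occasion:

```lua
local RETURN_SENTINEL = { }  -- unique table, cannot be forged by user code
local results = { pcall (function ()
   <guarded block>
   return RETURN_SENTINEL    -- only reached if the block didn't return by itself
end) }
if results[1] and results[2] ~= RETURN_SENTINEL then
   -- the guarded block performed a genuine return: propagate it
   return unpack (results, 2)
end
```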

The finally block poses no special problem: just go through it, whether an exception occurred or not. Think also about going through it even if there's a return to propagate.

As for yielding from within the guarded code, there is a solution, which you can find by searching Lua's mailing list archives. The idea is to run the guarded code inside a coroutine, and check what's returned by the coroutine run:
  • if it's an error, treat it as a pcall() returning false;
  • if it's a normal termination, treat it as a pcall() returning true;
  • if it's a yield, propagate it to the upper level; when resumed, propagate the resume to the guarded code which yielded.
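That idea can be outlined as follows (an illustrative sketch only, not the actual solution from the mailing list; guarded_func and step are made-up names):

```lua
local co = coroutine.create (guarded_func)
local function step (...)
   local r = { coroutine.resume (co, ...) }
   if not r[1] then
      return false, r[2]          -- error: treat like pcall() returning false
   elseif coroutine.status (co) == "dead" then
      return true, unpack (r, 2)  -- normal termination: like pcall() returning true
   else
      -- a yield: propagate it to the upper level, then feed the
      -- values of the outer resume back into the guarded code
      return step (coroutine.yield (unpack (r, 2)))
   end
end
```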

4.1.2  Structural pattern matching

FIXME: refer to the official match extension instead of re-explaining what pattern matching is about.

Basic principles

In many languages, including Lua, ``pattern matching'' generally refers to the ability to:
  • analyse the general shape of a string;
  • capture specified parts of it into temporary variables;
  • do some computation when the string matches a certain pattern, generally using the captured temporary variables.
In languages derived from ML, pattern matching can be done on arbitrary data, not only strings. One can write patterns which describe the shape of a data structure, bind some interesting sub-parts of it to variables, and execute some statements associated with a given pattern whenever the data matches. This sample aims to implement this capability in metalua. It will discuss:
  • How to describe data patterns (it turns out that we'll hijack metalua's expression parser, which is a great time-saving trick when it can be pulled).
  • How to compile them into equivalent code.
  • How to wrap all this up into a working compiled statement.
  • What improvements can be done to the proposed design.


A match statement consists of a tested term, and a series of (pattern, block) pairs. At execution, the first pair whose pattern accurately describes the tested term is selected, and its corresponding block is evaluated. Other blocks are ignored. We will keep the implementation of patterns as simple as possible. To do that, we will put a first constraint: the AST of a pattern must be a legal expression AST. This way, we save most of the syntax handling trouble. More specifically:
  • numbers are valid patterns, which only match identical numbers;
  • strings are valid patterns, which only match identical strings;
  • tables are valid patterns if all of their keys are integers or strings, and all of their values are valid patterns. A table pattern matches a tested term if:
    • the tested term is a table;
    • every key that appears in the pattern also appears in the tested term;
    • for every key, the value associated to this key in the tested term is matched by the sub-pattern associated to this key in the pattern;
  • variables are valid patterns, and match everything. Moreover, a variable captures the term it matches, i.e. in the corresponding block, that variable is set to the matched term.
Notice that since tag is a regular field in metalua (albeit with an optional dedicated syntax), it doesn't need a special case in our pattern definition.
Let's consider implementing an evaluator for Metalua expressions. We will restrict ourselves to binary operators, variables and numbers; we will also consider that we have a values table mapping variable names to their value, and binopfuncs which associate an operator name with the corresponding function. The evaluator would be written:

function eval(t)
   match t with
   | `Op{ op, a, b } -> return binopfuncs[op](eval(a), eval(b))
   | `Number{ n }    -> return n
   | `Variable{ v }  -> return values[v]
   end
end

Pattern compilation

A pattern in a case will translate into:
  • tests: testing that the type of the tested term is the same as the pattern's type, for numbers, strings and tables;
  • local declarations: a variable must be bound to the tested term's value;
  • nested tests for every pattern key/value pair, when the pattern is a table. Moreover, since there might be multiple tests to run against the tested term's field, it should be put in a temporary variable.
For instance, consider the following pattern:

{ tag = "Foo", 10, { 20 }, { x = a }, b }
It corresponds to the following series of tests and assignments:

let v1 = <tested_term>
type(v1) == "table"
let v2 = v1.tag
v2 ~= nil
type(v2) == "string"
v2 == "Foo"
let v2 = v1[1]
v2 ~= nil
type(v2) == "number"
v2 == 10
let v2 = v1[2]
v2 ~= nil
type(v2) == "table"
let v3 = v2[1]
v3 ~= nil
type(v3) == "number"
v3 == 20
let v2 = v1[3]
v2 ~= nil
type(v2) == "table"
let v3 = v2.x
v3 ~= nil
local a = v3
let v2 = v1[4]
v2 ~= nil
local b = v2

which will be collapsed into the following nested statement:

let v1 = tested_term
if type(v1) == "table" then
 let v2 = v1.tag
 if v2 ~= nil then
  if type(v2) == "string" then
   if v2 == "Foo" then
    let v2 = v1[1]
    if v2 ~= nil then
     if type(v2) == "number" then
      if v2 == 10 then
       let v2 = v1[2]
       if v2 ~= nil then
        if type(v2) == "table" then
         let v3 = v2[1]
         if v3 ~= nil then
          if type(v3) == "number" then
           if v3 == 20 then
            let v2 = v1[3]
            if v2 ~= nil then
             if type(v2) == "table" then
              let v3 = v2.x
              if v3 ~= nil then
               local a = v3
               let v2 = v1[4]
               if v2 ~= nil then
                local b = v2
end ... end -- (16 times)
Notice that the relative order of tests and assignments is meaningful: we cannot put all assignments on one side and all tests on the other; e.g. ``let v2 = v1.tag'' on the list's third line doesn't make sense if the test ``type(v1) == "table"'' on the second line fails. We will compile patterns in several steps: first, accumulate all the tests and assignments in order; then, collapse them into a single big nesting of ``if'' statements. At first, we won't try to optimize the form of the final term. The first list above is collapsed into the second one, a single compound statement.
Accumulating constraints and tests
This is done by a parse_pattern() function, which does just what is described in the bullet list above:

      -- Turn a pattern into a list of conditions and assignments,
      -- stored into [acc]. [n] is the depth of the subpattern into the
      -- toplevel pattern; [pattern] is the AST of a pattern, or a
      -- subtree of that pattern when [n>0]. The tested term at depth
      -- [n] is held in the variable returned by [var(n)].
      local function parse_pattern (n, pattern)
         local v = var(n)
         if "Number" == pattern.tag or "String" == pattern.tag then
            accumulate (+{ -{v} == -{pattern} })
         elseif "Id" == pattern.tag then
            accumulate (+{stat: local -{pattern} = -{v} })
         elseif "Table" == pattern.tag then
            accumulate (+{ type( -{v} ) == "table" } )
            local idx = 1
            for _, x in ipairs (pattern) do
               local w = var(n+1)
               local key, sub_pattern
               if x.tag=="Key" 
               then key = x[1];           sub_pattern = x[2]
               else key = `Number{ idx }; sub_pattern = x; idx=idx+1 end
               accumulate (+{stat: (-{w}) = -{v} [-{key}] })
               accumulate (+{ -{w} ~= nil })
               parse_pattern (n+1, sub_pattern)
            end
         else error "Invalid pattern type" end
      end
This function relies on the following helper functions:
  • var(n) generates the variable name ``vn'', which is used to store the tested term at depth level n. Indeed, sub-patterns in table fields are matched against sub-terms of the tested term. It also remembers the biggest n it ever received, and stores it into max_n (this will be used to know which local vars have to be generated, see below);
  • accumulate() just stores an additional code snippet in a dedicated list;
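These two helpers are not listed in this section; a minimal version could look like this (an assumption about their implementation, not the actual sources):

```lua
local acc, max_n = { }, 0

-- [var(n)] returns the AST of the identifier "v<n>", and remembers
-- the deepest level ever requested in [max_n].
local function var (n)
   if n > max_n then max_n = n end
   return `Id{ "v" .. n }
end

-- [accumulate(x)] appends a test or assignment quote to [acc].
local function accumulate (x)
   table.insert (acc, x)
end
```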
In the quotes, you might notice the parentheses around the variable in ``(-{w}) = -{v} [-{key}]'': they're here to let the compiler know that what's in -{...} is an expression, not a statement. Without them, it would expect w to be the AST of a statement (since we're in a statement context at this point), then choke on the unexpected ``='' symbol. This is a common idiom, which you should think about every time you generate an assignment to an anti-quoted identifier.
Collapsing the accumulated quotes
As written above, our collapsing function will be kept as simple as possible, and will not try to minimize the amount of generated code. It takes as parameters n, the index of the quote currently collapsed in the accumulator, and inner_term, the statement block to put inside the innermost part of the test. It calls itself recursively, so that the collapsed term is built inside out (generally speaking, working with trees, including ASTs, involves a lot of recursive functions). acc is the list filled by the accumulate() function:

      -- Turn the list of tests and assignments in [acc] into a
      -- single term of nested conditionals and assignments.
      -- [inner_term] is the AST of a term to be put into the innermost
      -- conditional, after all assignments. [n] is the index in [acc]
      -- of the term currently parsed.
      -- This is a recursive function, which builds the inner part of
      -- the statement first, then surrounds it with nested 
      -- [if ... then ... end], [local ... = ...] and [let ... = ...]
      -- statements.
      local function collapse (n, inner_term)
         assert (not inner_term.tag, "collapse inner term must be a block")
         if n > #acc then return inner_term end
         local it = acc[n]
         local inside = collapse (n+1, inner_term)
         assert (not inside.tag, "collapse must produce a block")
         if "Op" == it.tag then 
            -- [it] is a test, put it in an [if ... then ... end] statement
            return +{block: if -{it} then -{inside} end }
         else
            -- [it] is a statement, just add it at the result's beginning.
            assert ("Let" == it.tag or "Local" == it.tag)
            return { it, unpack (inside) }
         end
      end
To fully understand this function, one must remember that test operations are translated into `Op{ <opname>, <arg1>, <arg2>} AST. That's why we didn't have to put an explicit reminder in the accumulator, remembering whether a quote was a test or a statement: we'll just check for `Op's presence.

Match statement compilation

We already know how to translate a single pattern into a Lua test statement. This statement will execute a given block, with all pattern variables bound appropriately, if and only if the tested term matches the pattern. To build a complete match statement, we need to test all patterns in sequence. When one of them succeeds, we must skip all the following cases. We could do that with some complex ``else'' parts in the tests, but there is a simpler way to jump out of some nested blocks: the ``break'' statement. Therefore, we will enclose the cases into a loop that executes exactly once, so that a ``break'' will jump just after the last case. Then, we will add a ``break'' at the end of every statement that doesn't already finish with a ``return'', so that other cases are skipped upon successful matching. To sum up, we will translate this:

match <foo> with
| <pattern_1> -> <x1>
| <pattern_2> -> <x2>
| <pattern_n> -> <xn>
end
into this:

repeat
  local v1, v2, ... vx -- variables used to store subpatterns
  let v1 = <tested_term>
  if <compilation of pattern_1> ... then x1; break end
  if <compilation of pattern_2> ... then x2; break end
  if <compilation of pattern_n> ... then xn; break end
until true
First, we add final break statements where required, and we compile all (pattern, block) pairs:

      -- parse all [pattern ==> block] pairs. Result goes in [body].
      local body = { }
      for _, case in ipairs (cases) do
         acc = { } -- reset the accumulator
         parse_pattern (1, case[1]) -- fill [acc] with conds and lets
         local last_stat = case[2][#case[2]]
         if last_stat and last_stat.tag ~= "Break" and 
            last_stat.tag ~= "Return" then
            table.insert (case[2], `Break) -- to skip other cases on success
         end
         local compiled_case = collapse (1, case[2])
         for _, x in ipairs (compiled_case) do table.insert (body, x) end
      end
Then, we can just splice it into the appropriate quote:

      local local_vars = { }
      for i = 1, max_n do table.insert (local_vars, var(i))  end
      -- cases are put inside a [repeat until true], so that the [break]
      -- inserted after the value will jump after the last case on success.
      local result = +{ stat: 
            repeat
               -{ `Local{ local_vars, { } } }
               (-{var(1)}) = -{tested_term}
               -{ body }
            until true }
      return result
      return result
There is one point to notice in this quote: body is used where a statement is expected, although it contains a list of statements rather than a single statement. Metalua is designed to accept this, i.e. if a, b, c, d are statements, the ASTs `Do{ a, b, c, d } and `Do{ a, { b, c }, d } are equivalent. This feature partially replaces Lisp's @(...) operator.

Syntax extension

To use this, we provide a syntax inspired by OCaml:

match <foo> with
  <pattern> -> block
| <pattern> -> block
| <pattern> -> block
end
For this, we need to declare new keywords match, with and ->. Then, we build the (pattern, block) parser with gg.sequence{ }, and read a list of them, separated with ``|'', thanks to gg.list{ }. Moreover, we accept an optional ``|'' before the first case, so that all cases line up nicely:

mlp.lexer:add{ "match", "with", "->" }

mlp.stat:add{ "match", mlp.expr, "with", gg.optkeyword "|",
              gg.list{ gg.sequence{ mlp.expr, "->", mlp.block },
                       separators  = "|",
                       terminators = "end" },
              builder = |x| match_parser (x[1], x[3]) }
Now, if you try this... it won't work! Indeed, Metalua needs to know what keywords might terminate a block of statements. In this case, it doesn't know that ``|'' can terminate a block. We need therefore to add the following statement:

mlp.block.terminators:add "|"
And that's it: we have implemented a working pattern matching system in 75 lines of code!

Possible improvements

Here are a couple of suggestions to further improve the pattern matching system presented above. Some of these proposals can be implemented very quickly, some others more complex; all of them present some practical interest. The code of the basic version, as presented here, is available at
Booleans
Boolean constants aren't handled in the system above, neither as table keys nor as patterns. They're easy to add, and doing so will help you get gently into the code.
Gotos considered beneficial
Gotos might be harmful in hand-written programs, but they're a bliss for machine-generated code. They would slightly simplify the code of pattern matching as presented above; and for many of the extension proposals listed below, they make reasonably easy some things which would otherwise be awfully contrived. Exercise: simplify the implementation above as much as possible by using gotos. Labels and gotos in metalua ASTs are represented as `Label{ id } and `Goto{ id } respectively, with id an identifier, typically generated by mlp.gensym(). It is always safe to jump out of a block; jumping into a block, however, is not guaranteed to interact well with local variables and upvalues.
collapse() optimization
Instead of systematically nesting if statements, two nested ifs without else branches can be merged into a single one whose condition uses an and operator. Not sure it would change the bytecode's efficiency, but it's a good exercise in AST manipulation.
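The intended transformation, shown on source code rather than on ASTs for readability:

```lua
-- Instead of generating:
if type(v1) == "table" then
   if v1.tag == "Foo" then
      <case block>
   end
end

-- generate the equivalent:
if type(v1) == "table" and v1.tag == "Foo" then
   <case block>
end
```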
Superfluous assignments
When parsing a table entry, we assign it to a variable, then recursively call parse_pattern() on that variable; in the frequent case where the entry is simply a variable pattern, this assigns it yet again. This could be optimized away.
Checking table arity
In many cases, it is practical to check the number of elements in the array-part of the table. Here is a simple modification proposal: by default, a table pattern matches a table term only if they have the same number of array-part elements. However, if the last element of the pattern is `Dots (a.k.a. +{...}), then the term simply has to have at least as many array-part elements as the pattern.
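Under that proposal, the ``Table'' branch of parse_pattern() could accumulate one extra test, along these lines (a hypothetical sketch; counting only the array-part elements of the pattern, i.e. skipping `Key pairs, is left to the implementer):

```lua
local last = pattern[#pattern]
if last and last.tag == "Dots" then
   -- pattern ends with `Dots: require at least as many array elements
   accumulate (+{ # -{v} >= -{ `Number{ #pattern - 1 } } })
else
   -- otherwise require exactly the same number of array elements
   accumulate (+{ # -{v} == -{ `Number{ #pattern } } })
end
```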
Adding guards
It is sometimes desirable to add arbitrary conditions for a pattern to match, conditions which might not be expressible by a pattern. OCaml allows adding them with a ``when'' keyword:

match n with
| 0               -> print "zero"
| n when n%2 == 0 -> print "even number"
| _               -> print "odd number"
end
I'd advise you to prefer if as a dedicated keyword, rather than when: it's unambiguous in this context, and saves a keyword reservation.
More bindings
The way pattern matching is currently implemented, one can either bind a subterm to a variable, or check its structure against a sub-pattern, not both simultaneously. OCaml provides an ``as'' operator, which allows doing both (Haskell calls it ``@''). For instance, in the following example, any ADT whose tag is "RepeatMe" will be replaced by two occurrences of itself, while others will remain unchanged:

match something with
| `RepeatMe{ ... } as r -> { r, r }
| x -> x
end
``as'' will have to be declared as an infix operator, whose meaning will remain undefined in expressions which are not patterns. As an alternative, you can reuse an existing infix operator, thus avoiding messing up the expression parser. For instance, use * instead of as. You can go further, and implement + as an ``or'' operator (pattern1 + pattern2 would match if either of the patterns matches), although this would significantly complicate the implementation of parse_pattern(). The + operator might prove tricky to implement, if you don't convert your code generator to gotos and labels first.
Linear bindings
We should check, when compiling a pattern, that it is left-linear, i.e. that variables don't appear more than once in the pattern. People might be tempted to write things like this to check whether a tree is symmetric:

match t with
| `Tree{ x, x } -> print "Symmetric!"
| `Tree{ x, y } -> print "Not symmetric"
| `Leaf{ _ }    -> print "Symmetric!"
end
However, this would work in Prolog but not with our pattern matching, as two occurrences of the same variable in the pattern don't cause an equality test to be added. We should detect such non-linear variables, and implement a suitable reaction:
  • throw an error, or at least a warning;
  • or add an equality test between the terms matched by the non-linear variable;
  • or offer a syntax extension that lets the user provide his own equality predicate.
Notice that the latter choice would drive us towards a Prolog unification algorithm, which opens interesting opportunities. You might make an exception for the variable ``_'', which is often intended as a dummy, unused variable. Non-linear occurrences of it should probably be silently accepted, without even performing the corresponding binding.
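Detection itself only needs a small addition to the ``Id'' branch of parse_pattern(); a hypothetical sketch, assuming a [seen] table reset before compiling each pattern:

```lua
-- In the "Id" branch, before accumulating the local declaration:
local name = pattern[1]
if name ~= "_" then
   if seen[name] then
      error ("Non-linear variable " .. name .. " in pattern")
   end
   seen[name] = true
end
```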
Generalized assignments
Yet another OCaml-inspired feature: an assignment such as ``foo = bar'' is almost a special case of pattern matching with only one case: the left-hand side is the pattern, and the right-hand side is the tested term; the ``raw'' ``foo=bar'' assignment is simply the case where the pattern is a lone variable. Seen this way, it allows writing things such as ```If{ cond, block } = some_ast'' to assign cond and block to the subparts of some_ast (if we know that some_ast is the AST of an if statement). If you go this way, however, make sure that the code generated for simple assignments is as efficient as before! Moreover, there is an (easy) scoping issue: the variables assigned belong to the scope of the surrounding block.
Pattern matchings as expressions
Pattern matchings are currently statements, and take statements as right-hand sides of cases. We could allow pattern matchings where expressions are expected: these would take expressions instead of statements as right-hand sides. There are two ways to implement this: the dirty one (hack with functions to change match statements into expressions), and the clean one (refactor the existing code, so that it is agnostic about its right-hand side type, and provide two specialized versions of it, for statements and for expressions).
Bootstrap it
That's something language designers love to do, for largely mystic reasons: writing a language's compiler in the language itself. Here, the idea is to re-implement the pattern matching extension by using pattern matching, and compile it with the older version. Comparing the first and second versions of the code will give you an idea of how much code clarification the pattern matching extension brings you.
Pattern conjunction
Another feature to take from OCaml is multiple patterns for a single block. Instead of associating one block with one pattern, cases associate a block with a (non-empty) list of patterns. All of these patterns have to bind the same variables, except for _. The first pattern in the list to match the tested term does the binding. Patterns are separated by ``|''. Example:

match x with
| 1 | 2 | 3 -> print(x)
| n -> print "more than 3"
end
(Hint: put the block in a local function. 2nd hint: sort bound variables, e.g. by lexicographic order. Or much simpler and more effective: convert your code generator to gotos+labels first).
XML munching
Ever tried to transform some XML document through XSLT? Did you feel that this was even more kludgy than XML itself? Here is a challenging proposal:
  • Realize, if you didn't already, that Metalua's ADTs are isomorphic to XML, if you identify string keys in tables with attributes, and limit their content to strings and numbers. For instance, ``<foo bar=3><baz/>eek</foo>'' easily maps to `` `foo{ bar=3, `baz, "eek" }'';
  • compare what ML-style pattern matching does with what XSLT does (and with what you'd like it to do);
  • design, implement, publish. You might want to google ``CDuce''3 for neat ideas.
If you do this, I'd be really interested in including your contribution in the next version of Metalua!


Most of the improvements proposed here are actually implemented in the match library provided with metalua. Check its (commented) sources!

4.2  Digging in the sources

This section is dedicated to people who want to dig into Metalua sources. It presents the main files, their current state, and where to start your exploration.

4.2.1  gg

The real core of Metalua is gg, implemented in a single file, gg.lua. Understanding it is really the key to getting the big picture. gg is written in a rather functional style, with a lot of functors and closures.

Gg is a restricted version of Haskell's parser combinator library parsec: parsec allows one to build complex parsers by combining simpler ones with concatenation, alternative choices, etc. Among other things, it handles backtracking, i.e. a given parser can have several interpretations of a single sentence, and parser combinators handle this non-determinism by choosing the interpretation which allows the combined parser to yield a result.

Gg intentionally doesn't support backtracking: not only would it be slightly harder to implement in a non-lazy language such as Lua, but it isn't required to parse Lua. More importantly, built-in backtracking would encourage people to create ambiguous syntax extensions, which is an awfully bad idea: indeed, we expect different extensions to cohabit as smoothly as possible, and two extensions with ambiguous grammars generally cause a lot of chaos when mixed together. Finally, a lot of Lua's essence is about friendly, unsurprising, clear syntax. We want to encourage people to respect this spirit as much as possible, so if they want to introduce chaotic syntax, Metalua won't actively prevent them from doing so, but it certainly won't help by providing the tools.

Gg offers no atomic parsers, besides keyword parser generators; it's up to the programmer to provide them. Parsers are simply functions which take a lexer as a parameter and return an AST. Examples of such functions are provided for mlp in mlp_expr.lua. Lexers are objects with a couple of mandatory methods: peek, next and is_keyword. The lexer API will be discussed in the part about mll.
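For instance, an atomic parser might look like the following sketch (it assumes the token-stream API just described; see mlp_expr.lua for the real thing):

   -- A minimal atomic parser: consume one token with lx:next(),
   -- check that it is a number, and return the matching AST node.
   local function number_parser (lx)
      local tk = lx:next ()
      assert (tk.tag == "Number", "number expected")
      return { tag = "Number", tk[1] }
   end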

gg.lua is correctly refactored and commented, and should be readable by anyone with some notions of Lua and functional programming. Having dealt with parsec might help a bit, but is definitely not required.

Going further
From gg, there are two main leads to follow: either look down to mll, Metalua's lexer, or look up to mlp, the Metalua parser implemented on top of gg.

4.2.2  lexer, mlp_lexer

As stated above, gg relies on a lexer respecting a certain API. We provide such a lexer for Metalua parsing, which deals with the usual lexing tasks: getting rid of spaces and comments, and discriminating between keywords, identifiers, strings, numbers etc. It's implemented in the file lexer.lua.

This lexer can be parameterized with a list of keywords: mlp_lexer.lua defines the mlp lexer, i.e. it uses the generic lexer to create a derived lexer, and adds the Lua keywords to it.
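The derivation can be sketched as follows (hypothetical code; the actual definitions live in mlp_lexer.lua and may differ):

   -- Clone the generic lexer, then register Lua's keywords, so that
   -- ``while'', ``function'' etc. are lexed as keywords, not identifiers:
   local mlp_lexer = lexer.lexer:clone ()
   mlp_lexer:add{ "and", "break", "do", "else", "elseif", "end",
                  "false", "for", "function", "if", "in", "local",
                  "nil", "not", "or", "repeat", "return", "then",
                  "true", "until", "while" }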

lexer.lua is somewhat extensible by someone willing to inspect its sources carefully. Among other things, playing with the list lexer.extractors will let you create readers for new lexical entities. However, the global architecture of the lexer still deserves a serious refactoring.

4.2.3  mlp

Mlp is a very important part of Metalua, the one most people will actually have to deal with. Understanding it requires understanding gg. Mlp is split into several parts:
  • mlp_expr.lua parses expressions, except literal tables and quotes. It includes other constants (booleans, strings, numbers), the compound expressions built by combining them with operators, and function bodies. Most of its complexity is handled by the expression parser generator gg.expr.
  • mlp_table.lua parses tables. Not much to say about this, this is probably the simplest subpart of mlp.
  • mlp_stat.lua parses statements. Except for assignments, every different statement is introduced by a distinct initial keyword, and it should remain that way.
  • mlp_ext.lua gathers the parts of the metalua syntax that aren't regular Lua: customizable assignments, short lambda syntax etc.
  • mlp_meta.lua handles the meta-operations: splicing and quoting.
  • mlp_misc.lua contains various little bits that wouldn't fit anywhere else. In other words, it's sort of a mess.

4.2.4  Bytecode generation

Bytecode generation in Metalua is a quick and dirty hack, which sort of does the trick for now, but should eventually be largely rewritten. The current scaffolding has been hacked from Kein-Hong Man's Yueliang project, but Yueliang's design rationale doesn't really fit Metalua's needs. Indeed, Yueliang is a translation, almost statement by statement, of the official C compiler included in the Lua distribution. This has the following consequences:
  • it helps to understand the original compiler in C;
  • it's easy to backport extensions from the C version to Yueliang (I had to do this, since Yueliang targeted Lua 5.0 and I needed 5.1);
  • it's rather easy to get bitwise-identical compilations between what Yueliang produces and what the C version does. And that's good, because testing a compiler is a non-trivial problem.
  • being in Lua, it's much easier to debug and extend than C.
The big drawback is that the code is structured very much like C, and is therefore big, memory- and time-hungry, harder than necessary to understand and maintain, and not suitable for clean and easy extensions. Two possible evolutions could be considered for Metalua:
  • either port the bytecode producer to C, to speed up things;
  • or rewrite it in ``true'' (meta)Lua4, i.e. by leveraging all the power of tables, closures etc. This would allow the bytecode producer to be extensible, and interesting things could be done with that. Whether it's a good idea, I don't know. The drawback is that it would be much harder to keep compatibility with future Lua VM versions.
So, the files borrowed from Yueliang are lopcode.lua, lcode.lua, and ldump.lua. They've been quickly hacked to produce 5.1 bytecode rather than 5.0. Finally, compile.lua, which uses the previous modules to convert AST to bytecode, is a grossly abused version of Yueliang's lparse.lua.

Notice a big current limitation: it only dumps bytecode for little-endian platforms with double floats and 32-bit integers. 64-bit and/or big-endian platforms are currently not supported (although it shouldn't be hard to extend ldump.lua to handle them).

4.2.5  The bootstrapping process


4.3  Abstract Syntax Tree grammar

block: { stat* }

stat:
| `Do{ block }
| `Set{ {lhs+} {expr+} }
| `While{ expr block }
| `Repeat{ block expr }
| `If{ (expr block)+ block? }
| `Fornum{ ident expr expr expr? block }
| `Forin{ {ident+} {expr+} block }
| `Local{ {ident+} {expr+}? }
| `Localrec{ {ident+} {expr+}? }
| `Goto{string}
| `Label{string}
| `Return{ expr* }
| `Break
| apply

expr:
| `Nil | `Dots | `True | `False
| `Number{ number }
| `String{ string }
| `Function{ { ident* `Dots? } block } 
| `Table{ ( `Pair{ expr expr } | expr )* }
| `Op{ binopid expr expr } | `Op{ unopid expr }
| `Paren{ expr }
| `Stat{ block expr }
| apply
| lhs

apply:
| `Call{ expr expr* }
| `Invoke{ expr `String{ string } expr* }

lhs: ident | `Index{ expr expr }

ident: `Id{ string }

binopid: "add" | "sub" | "mul"    | "div"
       | "mod" | "pow" | "concat" | "eq"
       | "lt"  | "le"  | "and"    | "or"

unopid:  "not" | "len" | "unm"
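As an example of how this grammar reads, the statement ``local x = f(1)'' corresponds to the following tree, built from the stat, expr and ident rules above:

   `Local{ { `Id{ "x" } }, { `Call{ `Id{ "f" }, `Number{ 1 } } } }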

Footnotes:
2. For instance OCaml, SML, Haskell or Erlang.
3. It is actually the same syntax as OCaml's, except that we introduced an explicit end terminator, to stay homogeneous with Lua.
4. Pure Lua would probably be a much better idea: it would keep bootstrapping issues trivial.
