Dynamic nginx upstreams with Lua and Redis

In my previous post I tried to implement simple yet flexible functionality to route HTTP requests to different backends based on data from redis. With a fairly small amount of code it seemed like a good solution, but after some real-world testing I decided not to use it. For some reason (I didn't investigate) requests were hanging on node's side even though the backend (rails) responded properly. The issue might have been with response buffers, and the only fix was to restart the whole server, which meant dropping all cached data.

Today, I dug around the Lua extension for nginx. I'm not familiar with the Lua language, but it turned out to be pretty simple. The extension provides a lot of functionality and is really flexible when it comes to dealing with headers and responses. Plus, each Lua handler (inline or loaded from an external file) runs in its own isolated environment.
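
As a quick illustration (not part of the setup below), a minimal handler in an external file could look like this, assuming it is wired into a location with the content_by_lua_file directive (the /hello path and file location are made up):

-- /etc/nginx/lua/hello.lua
-- Wired up with: location /hello { content_by_lua_file /etc/nginx/lua/hello.lua; }
-- Sets a response header and writes a plain-text body back to the client
ngx.header["Content-Type"] = "text/plain"
ngx.say("Hello from Lua, " .. (ngx.var.remote_addr or "unknown"))
ngx.exit(ngx.HTTP_OK)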

Basically, I wanted to "replicate" the functionality of my node.js alternative. First, you'll have to compile nginx with the third-party modules. I was using a clean Ubuntu 11.10 64-bit Vagrant box. You can check more boxes here.

Setup

To build nginx from source, we need to install all the dev packages first:

apt-get -y update && apt-get -y upgrade
apt-get install -y build-essential autoconf libssl-dev curl libcurl4-gnutls-dev zlib1g zlib1g-dev libxml2 libxml2-dev libxslt-dev libreadline6-dev

Also, we need Lua:

apt-get install -y lua5.1 liblua5.1-0 liblua5.1-0-dev
# Make sure to symlink the library so the linker can find it
ln -s /usr/lib/x86_64-linux-gnu/liblua5.1.so /usr/lib/liblua.so

The complete nginx installation script:

#!/bin/bash

# Download all files
wget -O "nginx-1.0.15.tar.gz" "http://nginx.org/download/nginx-1.0.15.tar.gz"
wget -O "pcre-8.30.tar.gz" "ftp://ftp.csx.cam.ac.uk/pub/software/programming/pcre/pcre-8.30.tar.gz"
wget -O "zlib-1.2.7.tar.gz" "http://zlib.net/zlib-1.2.7.tar.gz"
wget -O "openssl-1.0.1c.tar.gz" "http://www.openssl.org/source/openssl-1.0.1c.tar.gz"
wget -O "nginx_devkit.tar.gz" "https://github.com/simpl/ngx_devel_kit/tarball/v0.2.17rc2"
wget -O "nginx_lua.tar.gz" "https://github.com/chaoslawful/lua-nginx-module/tarball/v0.5.0rc31"

# Extract everything
tar -zxf nginx-1.0.15.tar.gz
tar -zxf pcre-8.30.tar.gz
tar -zxf zlib-1.2.7.tar.gz
tar -zxf openssl-1.0.1c.tar.gz
tar -zxf nginx_devkit.tar.gz
tar -zxf nginx_lua.tar.gz

# Configure and build nginx
cd nginx-1.0.15
export LUA_LIB=/usr/lib
export LUA_INC=/usr/include/lua5.1

./configure --prefix=/usr/local/nginx \
            --with-http_gzip_static_module \
            --with-http_ssl_module \
            --with-http_stub_status_module \
            --with-zlib=../zlib-1.2.7 \
            --with-pcre=../pcre-8.30 \
            --with-openssl=../openssl-1.0.1c \
            --add-module=../simpl-ngx_devel_kit-bc97eea \
            --add-module=../chaoslawful-lua-nginx-module-ead78b7

# Build and install
make && make install

Nginx is now ready to roll, installed under the /usr/local/nginx folder.

Lua extensions

# Install socket library
apt-get install liblua5.1-socket2

# Install redis library
mkdir -p /usr/local/lib/lua/5.1
cd /usr/local/lib/lua/5.1
wget https://raw.github.com/nrk/redis-lua/version-2.0/src/redis.lua
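
To check that both libraries are picked up by the stock Lua interpreter (and that redis is reachable), a quick sanity-check script helps; it assumes redis is running on localhost:6379, and the key name is arbitrary:

-- redis_check.lua: sanity check for the luasocket + redis-lua install
-- run with: lua redis_check.lua
local redis  = require "redis"
local client = redis.connect("localhost", 6379)

client:set("lua_check", "ok")
print("redis says: " .. client:get("lua_check"))  -- expected: redis says: ok
client:del("lua_check")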

Dynamic Upstreams

Now, on to the implementation. The idea behind dynamic upstreams is simply to invoke a standard proxy_pass to a backend address:port pair defined in a redis key. Let's assume we have a few sample keys:

site1.com 192.168.1.1:3000
site2.com 192.168.1.2:4000
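
These keys can be set through redis-cli, or with a short Lua script using the client we just installed (the hostnames and backends above are only placeholders, so adjust accordingly):

-- seed_routes.lua: populate the sample routes in redis
-- run with: lua seed_routes.lua
local redis  = require "redis"
local client = redis.connect("localhost", 6379)

local routes = {
  ["site1.com"] = "192.168.1.1:3000",
  ["site2.com"] = "192.168.1.2:4000",
}

for host, backend in pairs(routes) do
  client:set(host, backend)
  print("route set: " .. host .. " -> " .. backend)
end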

Nginx configuration should look like this:

server {
  listen 80;
  server_name _;
  server_name_in_redirect off;
  port_in_redirect off;
  root /root/html;

  location / {
    set $upstream "";
    rewrite_by_lua '
      -- load global route cache into current request scope
      -- by default vars are not shared between requests
      local routes = _G.routes

      -- setup routes cache if empty
      if routes == nil then
        routes = {}
        ngx.log(ngx.ALERT, "Route cache is empty.")
      end

      -- try the cached route first; on a miss, fall back to a redis lookup
      local route = routes[ngx.var.http_host]
      if route == nil then
        local redis  = require "redis"
        local client = redis.connect("localhost", 6379)
        route        = client:get(ngx.var.http_host)
      end

      -- cache the resolved route and hand it to proxy_pass, or bail out with a 404
      if route ~= nil then
        ngx.var.upstream = route
        routes[ngx.var.http_host] = route
        _G.routes = routes
      else
        ngx.exit(ngx.HTTP_NOT_FOUND)
      end
    ';

    proxy_buffering             off;
    proxy_set_header            Host $host;
    proxy_set_header            X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_redirect              off;
    proxy_connect_timeout       10;
    proxy_send_timeout          30;
    proxy_read_timeout          30;
    proxy_pass                  http://$upstream;
  }
}

If you noticed, there is a _G variable in use, which points to the global environment. This quick hack lets us persist route data between requests, since the Lua extension otherwise works in per-request mode.
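
Because the cache lives in _G, it can also be dropped at runtime instead of restarting nginx. One way (just a sketch; the /flush-routes location is made up and should be locked down) is a tiny admin handler. Keep in mind that every nginx worker has its own Lua state, so a single request only clears the cache of the worker that served it:

-- flush_routes.lua
-- Wired up with something like:
--   location /flush-routes {
--     allow 127.0.0.1;
--     deny all;
--     content_by_lua_file /etc/nginx/lua/flush_routes.lua;
--   }
-- Drops the per-worker route cache so fresh values are fetched from redis
_G.routes = nil
ngx.say("route cache flushed for this worker")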

To test that it works, set a sample route in redis (this one points to a Google IP):

SET localhost 74.125.225.136:80

Test:

curl -i "http://127.0.0.1/" # -> 404 page
curl -i "http://localhost/" # -> shows content from google

Summary

With a little overhead in the form of a custom compilation process, nginx can be used as a base for more advanced routing logic powered by Lua. Maybe it isn't as flexible as a general-purpose web language like node.js or ruby, but it is battle-hardened and has proven itself as a stable and performant solution. In fact, the modifications made here don't really touch any routing logic under the hood; they only replace the static upstream declaration. Give it a try.