Meet Mentat, the thinking machine which thinks about how best to serve a website on a local machine while you work on it. I was reading Dune around the time I was putting the compose file together, so there are a bunch of references in there. Except the passwords; no, those are all left as the defaults ;). At the moment Mentat is not available on the internet. There isn’t anything super special about it, but currently there isn’t a nice neat way to share it.

Mentat consists of two parts really :

  1. A Docker compose file
  2. A bash script to “provision” new sites

The compose file

The logic (and this is different from what other projects do) is that there is one set of Docker containers running all the websites, rather than a compose file for each project. The compose file runs the following services :

  • nginx
  • php8.0 (php-fpm, composer, wp-cli)
  • mariadb
  • elasticsearch (off by default)
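For flavour, a minimal sketch of what a single shared compose file like this can look like — the images, ports and volume paths here are my illustrative guesses, not the actual file:

```yaml
# Illustrative sketch only — images, ports and paths are assumptions.
services:
  nginx:
    image: nginx:latest
    ports:
      - "80:80"
    volumes:
      - ./nginx/site-confs:/etc/nginx/conf.d   # one generated conf per project
      - ./www:/srv/www

  php80:
    build: ./php   # custom image with php-fpm, composer and wp-cli
    volumes:
      - ./www:/srv/www

  mariadb:
    image: mariadb:latest
    environment:
      MYSQL_ROOT_PASSWORD: password   # defaults, as mentioned ;)

  # elasticsearch omitted here — it sits behind a compose profile
  # so that it is off by default
```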

The bash script

Each new project gets its own nginx .conf configuration generated by the bash script. The script will also pop an entry in /etc/hosts for you, although it won’t manage or remove any of this if a project is removed. The script is called ./thufir (get it!?) and is some really nice hacky bash. The projects are defined in their own file, which only holds an associative array:

#!/usr/bin/env bash
declare -A Sites;


There is some pretty simple templating which takes this data and feeds it through some text files to generate nginx/site-confs/domain.tld.conf, as well as adding an entry to /etc/hosts.
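The templating itself isn’t shown here, but a minimal sketch of the idea — with hypothetical placeholder names and paths, not the real ./thufir internals — could look like:

```shell
#!/usr/bin/env bash
# Sketch of the templating idea — placeholder names and the directory
# layout are guesses, not the actual script.
declare -A Sites
Sites[example.test]="/srv/www/example"

# In the real script this would live in a separate text file.
template='server {
    listen 80;
    server_name {{DOMAIN}};
    root {{ROOT}}/public;
}'

for domain in "${!Sites[@]}"; do
    conf="${template//'{{DOMAIN}}'/$domain}"
    conf="${conf//'{{ROOT}}'/${Sites[$domain]}}"
    printf '%s\n' "$conf"            # would be written to nginx/site-confs/$domain.conf
    hosts_line="127.0.0.1 $domain"   # would be appended to /etc/hosts
    echo "$hosts_line"
done
```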

The containers

Most of the containers are off-the-shelf containers from Docker Hub. WordPress provides some good ones which have everything you need to run WordPress. Perhaps a little too much, as they install WordPress for everything, even if you only want to use wp-cli. A good starting point for anything Docker is to look up existing containers, as the maintainers do a good job of standardising a host of services with good documentation.

Recently I took the plunge and wrote a custom Dockerfile based on the official PHP image. This let me get a version of PHP that doesn’t kick up any warnings and is suitable for both WordPress and Laravel applications. It also let me bundle in wp-cli as well as composer, to make running these commands easier and quicker than before. It really was quicker: we save about a second on every command, as it no longer has to spin up, run and spin down a container each time. The PHP container is already running, you see?
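I don’t have the Dockerfile to hand here, but the shape of it is roughly this — the base image tag, extensions and download URL are my assumptions:

```dockerfile
# Hypothetical sketch — built on the official php-fpm image, with
# composer and wp-cli baked in so they run inside the long-lived container.
FROM php:8.0-fpm

# Extensions WordPress/Laravel commonly need
RUN docker-php-ext-install pdo_mysql mysqli

# Composer, copied from the official composer image
COPY --from=composer:2 /usr/bin/composer /usr/local/bin/composer

# wp-cli
RUN curl -sSL -o /usr/local/bin/wp \
        https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar \
    && chmod +x /usr/local/bin/wp

WORKDIR /srv/www
```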


Elasticsearch

This service is used by only one or two websites but takes up a bunch of resources to keep running. For this reason the service is off by default. You can turn it on using its profile :

sudo docker compose --profile elasticsearch up -d && sudo docker compose restart

# accessible at http://docker:9200
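In the compose file this works via a profile on the service. Assuming the service is simply named elasticsearch, the stanza looks something like this — image tag and settings are illustrative:

```yaml
# Hypothetical compose stanza — image tag and settings are assumptions.
  elasticsearch:
    image: elasticsearch:7.17.9
    profiles: ["elasticsearch"]   # only started when --profile elasticsearch is passed
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
```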

PHP commands

In the PHP 8 container there is php-fpm for actually running the sites; this is used by nginx and referenced in the nginx configuration files for each site. If you want to run a different version, you can spin up another PHP container and update the relevant configuration to point to that container instead.
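In each generated site config, that reference is the fastcgi upstream pointing at the PHP container by name. A sketch of the relevant block — assuming the container is called php80 and listens on the default php-fpm port:

```nginx
# Illustrative snippet from a generated site conf — php80 resolves via
# Docker's internal DNS on the compose network.
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass php80:9000;   # point at e.g. php74:9000 to use another version
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
```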

You can run PHP scripts inside the container using docker exec. For example, running an artisan migration might look like this :

$ sudo docker exec -it php80 /bin/bash

> cd /srv/www/project/public/

> php artisan migrate


I am pretty pleased with the way this works and have been adding improvements and conveniences over time. I think there is still scope to make it a bit better, so it will trundle along nicely for a bit yet. What will push me to change is something like Nix, which is reproducible (it works on your machine now) and perhaps doesn’t take up so much disk space.

Memory Lane (Bonus Content)

Here is a bit of a bonus: these are largely all the server stacks I have used for local web development over the years. I might have skipped over one or two things here and there, but largely this is how it has gone.

  1. Browser files
  2. MAMP
  3. Codekit
  4. Vagrant
  5. Docker

Browser files

I have a huge fondness for those days, in the same way you remember your Nokia 5140. We used to write a file with a .html extension and drag it into the browser. It was easy. I remember the day I learnt that a server looks for a file called index.html and having to rethink how I named my stuff.


MAMP

YAWN. It worked fine. Bit boring though. We did set it up so it ran on port 80, which required you to enter your admin password when you started or stopped the server. This was enough to make me want to look elsewhere. That, and I didn’t understand it very well. Probably would have been happy on MAMP for longer than we were on it, tbh.


Codekit

This was the first step which made me rethink my approach. Why was I doing all this trial and error to get things working when MAMP was doing fine? I imagine the logic was that learning how it all pieced together made me a better tech person. Which is somewhat true, but I don’t know if it is a bit self-fulfilling: I wouldn’t need to be better at tech if we stuck to tools like MAMP and Codekit. Perhaps it is a bit like hiding the office stationery; they can’t fire you if you are the only one who knows where it is.

Vagrant

It took (not even kidding) about 15 minutes to build each morning when you turned it on. I learnt a lot about how a server needed to be set up and how to configure each piece, which did work out well on at least one occasion: I was able to provision and deploy a new service whilst on the phone to a client, to test what they were asking about. Good brownie points that day.

Ultimately I got really tired of debugging why Vagrant got slow (how it mounted file systems) and waiting an age for each change to take effect (no persistence really; it needed to check everything every time).

Which leads us to …


Docker

Hello! I really resisted Docker, and probably will one day add another technology stack to this list, as switching would be easier than dealing with its failings. The major shortcomings are :

  • Too much guess work
  • Networking is hard at the best of times
  • Too much persistence (when do my changes take effect?!)
  • Doesn’t actually solve the problem of “it works on my machine” (better than Vagrant did though)
  • It takes a lot of disk space