Installing OpenWRT on a Netgear D7800 (Nighthawk X4S) router

I had blogged back in October of last year about setting up DNS over HTTPS, and it’s been very reliable, except for the parts where I’ve had to run Software Update on the Mac mini to pick up security updates; while it’s restarting, all of our DNS resolution stops working! I’d come across OpenWRT a while back, an open-source and very extensible firmware for a whole variety of routers, but a bunch of searching hadn’t turned up any reports of people fully-successfully using it on our specific router, the Netgear D7800 (also known as the Nighthawk X4S), just people having various problems. One of the reasons I was interested in OpenWRT is that it’s Linux-based and extensible, so I’d be able to move the DHCP and DNS functionality off the Mac mini back onto the router where it belongs, and in theory bring the encrypted DNS over as well.

I finally bit the bullet and decided to give installing it a go today, and it was surprisingly easy. I figured I’d document it here for posterity and in the hopes that it’ll help someone else out in the same position as I was.

Important note: The DSL/VDSL modem in the X4S is not supported under OpenWRT!

Installation

  1. Download the firmware file from the “Firmware OpenWrt Install URL” (not the Upgrade URL) on the D7800’s entry on OpenWRT.org.
  2. Make sure you have a TFTP client; macOS comes with the built-in tftp command-line tool. This is used to transfer the firmware image to the router.
  3. Unplug everything from the router except power and the ethernet cable for the machine you’ll be using to install OpenWRT from (this can’t be done wirelessly).
  4. Set your machine to have a static IP address in the range of 192.168.1.something. The router will be .1.
  5. Reset the router back to factory settings by holding in the reset button on the back until the light starts flashing.
  6. Once it’s fully started up, turn it off entirely, then hold the reset button in again and, while still holding it, turn the router back on.
  7. Keep the reset button held in until the power light starts flashing white.

Now the OpenWRT firmware file needs to be transferred to the router via TFTP. Run tftp -e 192.168.1.1 (-e turns on binary mode), then put <path to the firmware file>. The router will receive the file, install it, and reboot; this will take several minutes.
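For reference, the whole exchange looks something like this (keeping the same placeholder for the firmware path):

$ tftp -e 192.168.1.1
tftp> put <path to the firmware file>
tftp> quit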

Once it’s up and running, the OpenWRT interface will be accessible at http://192.168.1.1, with a username of root and no password. Set a password then follow the quick-start guide to turn on and secure the wifi radios — they’re off by default.

Additional dnsmasq configuration and DNS-over-TLS

I mentioned in my DNS-over-HTTPS post that I’d also set up dnsmasq to do local machine name resolution. This is trivially done in OpenWRT under Network > DHCP and DNS: put the MAC address, desired IP, and machine name under the Static Leases section, then hit Save & Apply.

The other part I wanted to replicate was having my DNS queries encrypted. In OpenWRT this isn’t easily possible with DNS-over-HTTPS, but it is with DNS-over-TLS, which gets you to the same end state. It requires installing Stubby, a DNS stub resolver that forwards DNS queries on to Cloudflare’s DNS.

  1. On the router, go to System > Software and install stubby.
  2. Go to System > Startup, ensure Stubby is listed as Enabled so it starts at boot.
  3. Go to Network > DHCP and DNS, and under “DNS Forwardings” enter 127.0.0.1#5453 so dnsmasq will forward DNS queries on to Stubby, which in turn reaches out to Cloudflare; Cloudflare’s DNS servers are configured by default. Stubby’s configuration can be viewed at /etc/config/stubby. (See the command-line equivalent after this list.)
  4. Under the “Resolv and Hosts Files” tab, tick the “Ignore resolve file” box.
  5. Click Save & Apply.
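If you’d rather do steps 3 and 4 over SSH instead of through LuCI, the equivalent UCI commands should look something like the following. This is a sketch based on my understanding of how those settings map to /etc/config/dhcp, so double-check it against what LuCI writes out:

# Forward dnsmasq's queries to Stubby on port 5453...
uci add_list dhcp.@dnsmasq[0].server='127.0.0.1#5453'
# ...and ignore the resolv file (the "Ignore resolve file" checkbox)
uci set dhcp.@dnsmasq[0].noresolv='1'
uci commit dhcp
/etc/init.d/dnsmasq restart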

Many thanks to Craig Andrews for his blog post on this subject!

Quality of Service (QoS)

The last thing I wanted to set up was QoS, which allows for prioritisation of traffic when your link is saturated. This was pretty straightforward as well, and just involved installing the luci-app-sqm package and following the official OpenWRT page to configure it!
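For reference, installing it from the command line over SSH is just the following; the configuration then shows up under Network > SQM QoS in LuCI:

opkg update
opkg install luci-app-sqm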

Ongoing findings

I’ll update this section as I come across other little tweaks and changes I’ve needed to make.

Plex local access

We use Plex on the Xbox One as our media player (Plex Media Server runs on the Mac mini), and I found that after installing OpenWRT on the router, the Plex client on the Xbox couldn’t find the server anymore despite being on the same LAN. I found a fix on Plex’s forums, which is to go to Network > DHCP and DNS, and add the domain plex.direct to the “Domain whitelist” field for the Rebind Protection setting.
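The same setting lives in /etc/config/dhcp if you’d rather edit it over SSH; as I understand it, it ends up as a rebind_domain entry in the dnsmasq section, something like this:

config dnsmasq
	option rebind_protection '1'
	list rebind_domain 'plex.direct'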

Xbox Live and Plex Remote Access (January 2020)

Xbox Live is quite picky about its NAT settings and requires UPnP to be enabled, or you can end up with issues with voice chat or multiplayer gameplay; Plex’s Remote Access similarly requires UPnP. This isn’t provided by default with OpenWRT but can be installed with the luci-app-upnp package, and the configuration shows up under Services > UPnP in the top navbar. It doesn’t start by default, so tick the “Start UPnP and NAT-PMP service” and “Enable UPnP” boxes, then click Save & Apply.
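The command-line equivalent should be something like the following sketch; I’ve only done this through LuCI myself, and the exact option names in /etc/config/upnpd may vary between OpenWRT releases:

opkg update
opkg install luci-app-upnp
# Equivalent of ticking the "Start UPnP and NAT-PMP service" and "Enable UPnP" boxes
uci set upnpd.config.enabled='1'
uci commit upnpd
/etc/init.d/miniupnpd restart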

Upgrading to a new major release (February 2020)

When I originally wrote this post I was running OpenWRT 18.06, and now that 19.07 has come out I figured I’d upgrade, and it was surprisingly straightforward!

  1. Connect to the router via ethernet and make sure your network interface is set to use DHCP.
  2. Log into the OpenWRT interface and go to System > Backup/Flash Firmware and generate a backup of the configuration files.
  3. Go to the device page on openwrt.org and download the “Firmware OpenWrt Upgrade” image (not the “Firmware OpenWrt Install” one).
  4. Go back to System > Backup/Flash Firmware, choose “Flash image” and select your newly-downloaded image.
  5. In the next screen, make sure “Keep settings and retain the current configuration” is not ticked and continue.
  6. Wait for the router light to stop flashing, then renew your DHCP lease (assuming you’d set it up to be something other than 192.168.1.x like I did).
  7. Log back into the router at http://192.168.1.1 and re-set your root password.
  8. Go back to System > Backup/Flash Firmware and restore the backup of the settings you made (then renew your DHCP lease again if you’d changed the default range).

I had a couple of conflicts with files in /etc/config between my configuration and the new defaults, so I SSHed in, manually checked through them to see how they differed, and updated them as necessary. After that it was just a case of re-installing the luci-app-sqm, luci-app-upnp, and stubby packages (see below), and I was back in business!
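The re-installation is a one-liner over SSH, and as far as I’m aware opkg saves its version of any conflicting config file alongside yours with an -opkg suffix, which makes the comparison easy:

opkg update
opkg install luci-app-sqm luci-app-upnp stubby
# Compare your config against the new defaults, e.g.:
diff /etc/config/dhcp /etc/config/dhcp-opkg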

Cardiovascular health and a new shiny: Apple Watch Series 5

We bought a treadmill back at the start of 2014 and it came with a heart rate monitor that you wear around your chest, which is pretty cool. I gave the treadmill a pretty good go and was doing one of those Couch to 5K programs, but I kept having issues with my knees where running messes one of them up. We bought an elliptical in August last year — which apparently I didn’t post about here — and I’ve been thoroughly enjoying using it. The one we have has a tablet holder right at eye level, so I’ve been watching TV shows on Netflix while using it, and it really helps pass the time.

The downside was that I had no heart rate monitor, as the one that came with the treadmill only works with the treadmill (it shows your current heart rate right alongside the distance and estimated calories burned and such). I’d been going pretty hard on it but had noticed that I was getting some heart palpitations, and had a couple of feeling-dizzy moments a while after I’d finished exercising. I went to the doctor and she suggested cutting down on caffeine to start with — I was on four, admittedly only instant, coffees a day — and seeing if that improved things, and if not we could get an EKG done.

Quite conveniently timed, the Apple Watch Series 5 was announced on the 10th of September this year, and it comes with an always-on display. Prior models had their display totally black and would only light up when you’d either raise your wrist or tap on the screen. I’d been eyeing the Apple Watch off for a couple of years, and finally decided I’d jump on board because it’d be usable as a regular watch even if the screen doesn’t fully light up. I got the 40mm stainless steel with the black leather Modern Buckle band and it looks classy as hell.

(I also realised after my first workout that I needed to get one of the cheaper non-leather bands as well because man do I get sweaty wrists when I’m exercising 😛).

Apple has been leaning pretty hard into the health thing with the Apple Watch in recent years, and as well as the heart rate monitor — which takes your heart rate periodically throughout the day, and continuously when you start a workout — it comes with an app called “Activity” on the iPhone to help motivate you to keep moving. The way it works is that there are three “rings” you should try to close each day, called Move, Exercise, and Stand. Move is just generally getting up and about and not sitting on your arse, and is set to 1422kJ for me based on my height and weight. Exercise is 30 minutes of brisk movement — I walk fast enough that I get a few minutes counted towards it each time I’m walking to or from the station or taking the stairs at work. The Stand goal is standing up and moving for at least a minute in each of 12 separate hours during the day; if you’ve been sitting around for the first 50 minutes of a given hour, you get a little buzz on your wrist at ten minutes to the next hour reminding you to stand up and move around a bit.

Apple must have done a whole lot of psychological research into what’s most satisfying in terms of motivation, because god damn, closing those rings feels good. You get a little round fireworks animation in the given ring’s colour when you fully complete one for the day, and one with all three colours when you’ve finished all of them. I bought the Watch on the 23rd of September and every single day since then I’ve closed all three rings! You get little badges called “Awards” when you complete certain goals, like a full week of closing all three rings, which has meant that when I’ve been working from home I’ve been jumping on the treadmill or elliptical for a quick half hour to get my exercise goal done. I also downloaded an app for the Watch called HeartWatch that gives you a little speedometer of heart rate when you’re exercising and helps you keep it in the correct zone — not too fast and not too slow — for what you’re trying to do, in my case just generally being fitter.

I completed October with every single day’s rings fully closed, which I’m pretty chuffed about!

A screenshot of the Activity app for Apple Watch showing every single day in October having all three rings closed

We’d also bought a set of smart scales last year that sync with the Health app on iOS; I’ve been weighing myself each morning, and as a result of all of this fitness I’m hovering around 70.2kg, which is a weight I don’t recall being for many years now (I was at 82kg a few years back). The heart palpitations have definitely decreased as well, and I haven’t had any dizziness since I’ve been monitoring what my heart rate has been while exercising.

I don’t do much by way of outdoor exercising, but the Apple Watches all come with GPS as well so you can keep maps of the routes you’ve taken and see the speed you did during each section. Overall I’m wildly impressed with this bit of technology! I hadn’t worn a watch since about 2001 when I got a job and bought my first mobile phone, but now I feel naked without it, haha.

An artistic update

I posted back in February about some of the stuff I’d been doing in Procreate on my iPad, and I’m overdue for another post! I haven’t been doing as much in the intervening months, as there have been lots of other things taking up my time and I haven’t felt as inspired, but I still managed to do a few.

I’ve quite enjoyed using Procreate’s Acrylic brush; you can get some really nice layer and lighting effects with it, and I used only that brush for this one:

A painting of a window at night, from inside a room. There's sheer curtains over the window, a candle is on a small table at the right casting light, and there's a tall cupboard at the left in the shadows.
The Window

I don’t actually remember the brush I used for this next one, but I definitely took full advantage of Procreate’s symmetry guides so I could get it properly even:

A painting of a cybernetic woman, her eyes look like blue glass and she has green and very shiny "skin". She has a purple hood over the back of her head.
Cybernetic Woman

This next one is interesting: I intended the main structures that take up the top two-thirds of the image to look like a big craggy mountain range, but I showed it to Kristina and she can’t see it as anything but a tornado coming down!

A painting of a craggy grey mountain range in the top two-thirds of the image, with a river of fire making its way the whole way across the image, and a bunch of conifers at the bottom.
The River

I quite enjoy doing epic-looking landscapes, and this one started out in a very different place from where it finished. It was much more brown, the feature in the middle was a river, and the sky was a sunset that I didn’t manage to get looking how I wanted. In the end it became very much inspired by the aesthetic of the Hive from Destiny!

A painting looking down a desolate grey rocky valley. A deep black rift runs down the middle with a sickly green glow at the bottom, at the left is a crystal embedded in the ground with the same green glow coming from it. At the right is a cave entrance in the valley wall with another glowing crystal. The sky is awash with stars, and the moon peeks from behind the valley peak at the far left.
The Emergence

The paintings above were all done from about March to half-way through May, then there was a bit of a break until July.

I decided to take advantage of Procreate’s drawing guide again, this time with the perspective guide. I was aiming for buildings in a futuristic city but the thing that I always struggle with is details and a sense of scale, so it didn’t turn out to be anything but big blocks. Still pleased with the shadows and sense of lighting though.

A very clean geometric painting of grey and blue city buildings. The sky is purple and the light is coming from the very right, the buildings casting shadows to the left.
City Buildings

This next one was a “speed-painting” that I did in about 45 minutes! It was a combination of the acrylic brush and a palette knife brush from a big third-party brush pack I bought.

A painting of a volcano erupting atop a hill, the hill is surrounded by taller mountains all around, and the sky above is filled with striated dark orange clouds.
Volcano

Then lastly, this one was done in August, again with Procreate’s symmetry guide on! I was going to give her a witch’s hat but couldn’t get it looking right.

A head and shoulders portrait painting of a white woman with piercing green eyes, long red hair, and dark green lipstick. She’s wearing a dark purple top, and there’s a bright light shining behind her that’s lighting up her shoulders and the very edges of her hair.
The Witch

I also had a burst of inspiration and got some more miniature painting done! I’m still working my way through the Dark Imperium box set I got nearly two years ago, but the main impetus here was Games Workshop releasing their “Contrast” line of paints. They’re essentially a base coat plus wash combined into a single coat, and they’re seriously incredible. Dark Imperium comes with twenty poxwalkers, which I was dreading having to paint, but the Contrast paints made them far quicker to deal with! There are twenty models (but only ten unique ones), and I’ve done half of them so far.

As part of doing this, I also discovered how much better the miniatures look when you apply a varnish to them! The Contrast paint specifically comes off a lot more easily than regular paint, so varnish is a necessity, but it also really makes the colours pop; they’re a lot more vibrant than without it.

Poxwalker 1
Poxwalker 2
Poxwalker 3
Poxwalker 4
Poxwalker 5

I also finally finished off the Plague Marine champion that’d been sitting there mostly-finished for months, and I’m really happy with the base I did. I had a bunch of really old Space Marines from a starter painting box that a friend had given me, so I sacrificed one of them and cut him up to adorn the base, and it looks absolutely fantastic.

Plague Marine Champion

It’s fascinating seeing the evolution of Games Workshop’s plastic miniatures: back when I started (*cough*24 years ago*cough*) plastic was the cheap and crappy option, and the pewter (or lead as they were back then!) miniatures were much more detailed. Nowadays it’s very much the reverse: the plastic is INSANELY detailed — have a look at the full-size poxwalkers on Flickr and zoom all the way in — and the pewter ones are a bit shit by comparison.

There’s also a small-scale Warhammer 40,000 game called Kill Team that I’ve started playing at work with some people, and I’ve bought the new box set that was released in September. It’s similar to Shadespire in that your squads only have a small number of miniatures, so it’s much more feasible to get them painted, but it comes with a bunch of absolutely amazing-looking terrain. I put it together and took a couple of photos prior to it being painted, just to get a sense of the scale and what the terrain looks like.

A photo of some Death Guard and Space Wolves miniatures on the new Kill Team starter box terrain. The terrain itself is unpainted grey plastic but is towering over the miniatures and has a very steampunk aesthetic to it.
A photo of some Death Guard and Space Wolves miniatures on the new Kill Team starter box terrain. The terrain itself is unpainted grey plastic but is towering over the miniatures and has a very steampunk aesthetic to it.

I’ve finished painting a couple of pieces of it, but it’s so big that I don’t have a large enough white backdrop that’ll fit the whole terrain piece! Photos will definitely be forthcoming once I do get said backdrop though.

Installing Linux Mint 19.1 on a Late-2010 MacBook Air

(Update December 2022: As suggested in the latest comments, this entire blog post is pretty much redundant now! Linux Mint 21.1 installs without a hitch, even using Cinnamon, and I have fully-functional brightness and sound keys straight out of the box.)

(Update December 2020: I successfully upgraded from Linux Mint 19.3 to Linux Mint 20 by following the official Linux Mint instructions. The only additional post-upgrade work I had to do was re-adding the Section "Device" bit to /usr/share/X11/xorg.conf.d/nvidia-drm-outputclass-ubuntu.conf as described below to get the brightness keys working again.)

(Update May 2020: I’ve re-run through this whole process using Linux Mint 19.3 and have updated this blog post with new details. Notably, no need to install pommed, and including the specific voodoo needed for the 2010 MacBook Air from Ask Ubuntu regarding PCI-E bus identifiers.)

We have a still perfectly usable Late-2010 MacBook Air (“MacBookAir3,2”, model number A1369), but with macOS 10.14 Mojave dropping support for Macs older than 2012 (it’s possible to extremely-hackily install it on older machines but I’d rather not go down that route), I decided I’d try installing Linux on it. The MacBook Air still works fine, if a bit slow, on macOS 10.13 but I felt like a bit of nerding!

Installation

My distribution of choice was Linux Mint, which is Ubuntu-based but without the constant changes Canonical keeps making. The first hurdle right out of the gate was which “edition” to choose: Cinnamon, MATE, or Xfce. There was zero info on the website about which to pick; I started with Cinnamon, but that kept crashing when booting from the installation ISO and giving me a message about being in fallback mode. It turns out Cinnamon is the one with all the graphical bells and whistles, and it appears an eight-year-old ultralight laptop’s video card isn’t up to snuff, so I ended up on the MATE edition, which looks pretty much identical but works fine.

My installation method was using Raspberry Pi Imager to write the installation ISO to a spare SD card (despite the name, it can be used to write any ISO: scroll all the way down in the “Choose OS” dialog and select “Use custom”). Installing Linux first requires you to partition the SSD using Disk Utility; I added a 2GB partition for /boot and another 100GB one to install Linux itself onto. It doesn’t matter which format you choose, as it’ll be reformatted as part of the installation process.

After partitioning, reboot with the SD card in and the Option key held down, and choose the “EFI Boot” option. The installer is quite straightforward, but I chose the custom option when it asked how to format the drive: I formatted both the 2GB and 100GB partitions as ext4, with the 2GB one mounted at /boot and the 100GB one at /. The other important part is to install the bootloader onto that /boot partition, to make it easy to get rid of everything if you want to go back to single-partition macOS and no Linux.

Post-install

The next hurdle was video card drivers. Mint comes with an open-source video card driver called “Nouveau”, which works but isn’t very performant, and there was lots of screen tearing as I’d scroll or move windows around. This being Linux, it was naturally not as simple as just installing the official Nvidia one and being done with it, because that resulted in a black screen at boot. 😛 I did a massive amount of searching and eventually stumbled across this answer on Ask Ubuntu which worked where nothing else did: I followed those instructions and was able to successfully install the official Nvidia drivers without getting a black screen on boot!

(Update May 2020: I honestly don’t remember whether I had to go through Step 1 of Andreas’ instructions, “Install Ubuntu in UEFI mode with the Nvidia drivers”, but check for the existence of the directory /sys/firmware before running the rest of this. That directory is only created if you’ve booted in EFI mode. If it doesn’t exist, follow the link in Step 1).

I’m copying the details here for posterity, in case something happens to that answer, but all credit goes to Andreas there. These details are specifically for the Late 2010 MacBook Air with a GeForce 320M video card, so using this on something else might very well break things.

Create the file /etc/grub.d/01_enable_vga.conf and paste the following contents into it:

cat << EOF
setpci -s "00:17.0" 3e.b=8
setpci -s "02:00.0" 04.b=7
EOF

Then make the new file executable and update the grub config files:

$ sudo chmod 755 /etc/grub.d/01_enable_vga.conf
$ sudo update-grub

And then restart. Double-check that the register values have been set to 8 for the bridge device and 7 for the display device:

$ sudo setpci -s "00:17.0" 3e.b
08
$ sudo setpci -s "02:00.0" 04.b
07

Next, load up the “Driver Manager” control panel and set the machine to use the Nvidia drivers; once it’s finished doing its thing — which took a couple of minutes — restart once more, and you’ll be running with the much-more-performant Nvidia drivers!

At this point I realised that the brightness keys on the keyboard didn’t work. Cue a whole bunch more searching, with the fix being to add the following snippet to the bottom of /usr/share/X11/xorg.conf.d/nvidia-drm-outputclass-ubuntu.conf:

Section "Device"
  Identifier     "Device0"
  Driver         "nvidia"
  VendorName     "NVIDIA Corporation"
  BoardName      "GeForce 320M"
  Option         "RegistryDwords" "EnableBrightnessControl=1"
EndSection

And now I have a fully-functioning Linux installation, with working sleep+wake, audio, wifi, and brightness!

I’m certainly not going to be switching to it full-time, and it feels a lot more fragile than macOS, but it’s fun to muck around with a new operating system. And with 1Password X, I’m able to use 1Password within Firefox under Linux too!

Nginx, PHP-FPM, and Cloudflare, oh my!

I use my Linode to host a number of things (this blog and Kristina’s, my website and Kristina’s, an IRC session via tmux and irssi for a friend and me, and probably another thing or two I’m forgetting). Kristina started up a travel blog a few months ago which I’m also hosting on it, and shortly after that point I found that maybe once every two weeks or so my website and our blogs weren’t running anymore. I looked into it and it was being caused by Linux’s Out-Of-Memory Killer, which kicks in when the system is critically low on memory and needs to free some up, killing the Docker container that my website runs in as well as MariaDB.

The main cause was Apache and MariaDB using entirely too much memory for my little 1GB Linode; it was evidently sitting just this side of stable with two WordPress blogs, and adding a third tipped it over the edge. The reason MariaDB and my website’s Docker container were being killed is that although Apache was using a heap of memory, it was spread over a number of worker threads, so individually none of those were high, and MariaDB and my website were the largest on the list. There are lots of tweaks you can do, several of which I tried, but all that happened was that it delayed the inevitable rather than entirely resolving it. Apache is powerful, but low-resource-usage it ain’t. The primary low-resource-usage alternative to Apache is Nginx, so I figured this weekend I’d have a crack at moving over to that.

Overall it was pretty straightforward; this guide from Digital Ocean was a good starting point, and the bits where it fell short were mostly just a case of looking up all of the equivalent directives for SSL, mapping to filesystem locations, etc. (I have ~15 years of history of hosted images I’ve posted on the Ars Technica forums and my old LiveJournal—which is now this blog—and wanted to make sure those links all kept working).

One difference is with getting WordPress going. WordPress is all PHP, and Apache by default runs PHP code inside the Apache process itself via mod_php, whereas with Nginx you have to use PHP-FPM or similar, which is an entirely separate process running on the server that Nginx talks to in order to process the PHP code. I mostly followed this guide, also from Digital Ocean, though there were a couple of extra gotchas I ran into when getting it fully going with Nginx for WordPress (there’s a sketch of the resulting PHP handler block after this list):

  • Edit /etc/nginx/fastcgi_params and add a new line with this content, or you’ll end up with nothing but an empty page: fastcgi_param PATH_TRANSLATED $document_root$fastcgi_script_name;
  • Remember to change the ownership of the WordPress installation directory to the nginx user instead of apache.
  • The default settings for PHP-FPM assume it’s running on a box with significantly more than 2GB of RAM; edit /etc/php-fpm.d/www.conf and change the line that says pm = dynamic to be pm = ondemand; with ondemand PHP-FPM will spin up worker processes as needed but will kill off idle ones after ten seconds rather than leaving them around indefinitely.
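For completeness, here’s roughly what the PHP handler ends up looking like in the Nginx server block once PHP-FPM is in the mix. The socket path here is an assumption; use whatever the listen directive in your /etc/php-fpm.d/www.conf says:

location ~ \.php$ {
  include fastcgi_params;
  # Hand the request off to the PHP-FPM process over its socket
  fastcgi_pass unix:/run/php-fpm/www.sock;
  fastcgi_index index.php;
  fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}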

Additionally, Nginx doesn’t support .htaccess files, so if you’ve got WordPress set up to use any of the “pretty”-type permalinks, you’ll end up with 404s when you try to view an individual post. The fix is to put the following into the server block:

location / {
  try_files $uri $uri/ /index.php?$args;
}

This passes the correct arguments to WordPress’ index.php file. You’ll also want to block access to any existing .htaccess files:

location ~ /\.ht {
  deny all;
}

The last thing I did with this setup was to put the entirety of my website, Kristina’s, and our respective blogs behind Cloudflare. I’d already had great success with their DNS-over-HTTPS service, and their original product is essentially a reverse proxy that caches static content (CSS, JavaScript, images) at each of their points of presence around the world, so you’ll load those from whichever server is geographically closest to you. For basic use it’s free and includes SSL; you just need to point your domain’s nameservers at the ones they provide. The only other thing I needed to do was to set up another DNS record so I could actually SSH into my Linode, because the host virtualwolf.org now resolves to Cloudflare’s servers, which obviously don’t have any SSH running!

Overall, the combination of Nginx + PHP-FPM + Cloudflare has resulted in remarkably faster page loads for our blogs, and thus far significantly reduced memory usage as well.

GPG and hardware-based two-factor authentication with YubiKey

As part of having an Ars Technica Pro++ subscription, they sent me a free YubiKey 4, which is a small hardware token that plugs into your USB port and allows for a bunch of extra security on your various accounts because you need the token physically plugged into your computer in order to authenticate. It does a number of neat things:

  • Generating one-time passwords (TOTP) as a second-factor when logging in to websites;
  • Storing GPG keys;
  • Use as a second-factor with Duo;

And a bunch of other stuff as well, none of which I’m using (yet).

My password manager of choice is 1Password, and although it allows saving one-time passwords for websites itself, I wanted to lock access to the 1Password account itself down even further. Their cloud-based subscription already has strong protection by using a secret key in addition to your strong master password, but you can also set it up to require a one-time password the first time you log into it from a new device or browser so I’m using the YubiKey for that.

I also generated myself GPG keys and saved them to the YubiKey. It was not the most user-friendly process in the world, though that’s a common complaint levelled at GPG. I found this guide that runs you through it all and, while long, it’s pretty straightforward. It’s all set up now, though: my public key is here, I can send and receive encrypted messages and cryptographically sign documents, and the master key is saved only on an encrypted USB stick. The GPG agent that runs on your machine and reads the keys from the YubiKey can also be used for SSH, so I’ve got that set up with my Linode.
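With a modern GnuPG (2.1 or later), the SSH side only needs a couple of lines; a minimal sketch of what I understand the guide sets up:

# ~/.gnupg/gpg-agent.conf
enable-ssh-support

# In your shell profile, point SSH at gpg-agent's socket instead of ssh-agent's:
export SSH_AUTH_SOCK=$(gpgconf --list-dirs agent-ssh-socket)

After that, ssh-add -L should list the key stored on the YubiKey.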

The last thing I’ve done is to set the YubiKey up as a hardware token with Duo and put my Linode’s SSH and this blog (and soon Kristina’s, though hers not with the YubiKey) behind that. With the Duo Unix module, even sudo access requires the YubiKey, and the way that’s set up is that you touch the button on the YubiKey itself and it generates a code and enters it for you.
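The Duo Unix setup boils down to a config file plus a PAM line. A minimal sketch, with the three values being placeholders that come from the Duo admin panel:

; /etc/duo/pam_duo.conf
[duo]
ikey = <integration key>
skey = <secret key>
host = <API hostname>

And then in the PAM config for whatever you’re protecting (e.g. /etc/pam.d/sudo):

auth required pam_duo.so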

It’s all pretty sweet and definitely adds a bunch of extra security around everything. I’m busily seeing what else I can lock down now!

Setting up DNS over HTTPS on macOS

Back in April, Cloudflare announced a privacy-focused DNS server running at 1.1.1.1 (and 1.0.0.1), and that it supported DNS over HTTPS. A lot of regular traffic goes over HTTPS these days, but DNS queries to look up the IP address of a domain are still unencrypted, so your ISP can still snoop on which servers you’re visiting even if they can’t see the actual content. We have a Mac mini that runs macOS Server and does DHCP and DNS for our home network, among other things, and with an upcoming version of it removing those functions and suggesting regular non-UI tools as replacements, I figured now would be a good time to look into moving us over to use Cloudflare’s shiny new DNS server at the same time.

Turns out it wasn’t that difficult!

Overview

  1. Install Homebrew.
  2. Install cloudflared and dnsmasq: brew install cloudflare/cloudflare/cloudflared dnsmasq
  3. Configure dnsmasq to point to cloudflared as its own DNS resolver.
  4. Configure cloudflared to use DNS over HTTPS and run on port 54.
  5. Install both as services to run at system boot.

Configuring dnsmasq

Edit the configuration file located at /usr/local/etc/dnsmasq.conf, uncomment line 66, and change it from server=/localnet/192.168.0.1 to server=127.0.0.1#54 to tell it to pass DNS requests on to localhost on port 54, which is where cloudflared will be set up.
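In other words, line 66 goes from the commented-out example to pointing at cloudflared:

# Before:
#server=/localnet/192.168.0.1
# After:
server=127.0.0.1#54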

Configuring cloudflared

Create the directory /usr/local/etc/cloudflared and create a file inside that called config.yml with the following contents:

no-autoupdate: true
proxy-dns: true
proxy-dns-port: 54
proxy-dns-upstream:
  - https://1.1.1.1/dns-query
  - https://1.0.0.1/dns-query

Auto-update is disabled because that seems to break things when the update occurs, and the service doesn’t start back up correctly.

Configuring dnsmasq and cloudflared to start on system boot

dnsmasq: sudo brew services start dnsmasq will both start it immediately and also set it to start at system boot.

cloudflared: sudo cloudflared service install, which installs it for launchctl at /Library/LaunchDaemons/com.cloudflare.cloudflared.plist.

Updating your DNS servers

Now that dnsmasq and cloudflared are running, you need to actually tell your machines to use them as their DNS servers! Open up System Preferences > Network, hit Advanced, and in the DNS tab click the + button and put the local IP address of the machine running dnsmasq in. (You’ll want to make sure that machine has a static IP address, of course.) Repeat the process for everything else on your local network to have them all send their DNS traffic through dnsmasq and on to Cloudflare as well.

You can confirm that all your DNS traffic is going where it should be with dnsleaktest.
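You can also do a quick local sanity check that dnsmasq is answering and forwarding correctly before touching the rest of the network (dig ships with macOS):

$ dig +short example.com @127.0.0.1

If that comes back with an IP address, the dnsmasq → cloudflared chain is working.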

And done!

I was surprised at how straightforward this was. I also didn’t realise until I was doing all of this that dnsmasq also does DHCP, so with the assistance of this blog post I’ve also replaced the built-in DHCP server on the Mac mini and continue to have full local hostname resolution as well!

The spiritual successor to SimCity, Cities: Skylines

I first played the original SimCity Classic back in the early 1990s on our old Macintosh LC II, and absolutely loved it. Laying out a city and watching it grow was extremely satisfying, and the sequel, SimCity 2000, was even more detailed. I played a bit of SimCity 4, which came out in 2003, but the latest entry in the series, titled just “SimCity”, by all accounts sucked: the maps were significantly smaller, and it required an internet connection and was multiplayer to boot.

It’s actually possible to play SimCity 2000 on modern machines and I definitely got stuck into it a few years ago. This is a screenshot of my most recent city!

Screenshot of SimCity 2000, zoomed out and showing as much of my city as possible.

If you’re wanting a proper modern SimCity 2000-esque experience though, Cities: Skylines is what you’re after. It came out in March of 2015 on desktop and was ported to Xbox One in April of 2017, and they did a damned good job of it; the controls are all perfectly suited to playing on a controller as opposed to with a mouse and keyboard.

The level of detail of the simulation is fantastic, you can zoom all the way in and follow individual people (called “cims”, as opposed to SimCity’s “sims”) or vehicles and see where they’re going. There’s a robust public transport system and you can put in train lines (and buses, and trams, and a subway, and in the most recent expansion called Mass Transit, even monorails, blimps, and ferries!) and see the cims going to and from work, and how many are waiting at each station and so on.

We recently upgraded to the Xbox One X and a shiny new OLED 4K TV (quite the upgrade from our nine-year-old 37″ giant-bezeled LCD TV!), and it makes for some very nice screenshots. These are from my largest city, Springdale, currently home to ~140k people!

Nostalgia and the Classic Mac OS

I’ve been a Mac user my entire life, originally just because my dad used them at his work and so bought them for home as well. My earliest memories are of him bringing his SE/30 home and playing around in MacPaint. We also had an Apple IIe that we got second-hand from my uncle that lived in my bedroom for a few years, though that doesn’t count as a Mac.

The first Mac my dad bought for us at home was the LC II in 1992 (I was 9!), and I can remember spending hours trawling through Microsoft Encarta being blown away at just how much information I could look up immediately. I also remember playing Shufflepuck Café and Battle Chess, and I’m sure plenty of others too that didn’t leave as large an impression. There was also an application that came with the computer called Mouse Practice that showed you how to use a mouse, and we had At Ease installed for a while as well until I outgrew it.

After the LC II we upgraded to the Power Macintosh 6200 in 1995, which among other things came with a disc full of demos on it including the original Star Wars: Dark Forces (which I absolutely begged my parents to get the full version of for Christmas, including promising to entirely delete Doom II which they were a bit disapproving of due to the high levels of gore), and Bungie’s Marathon 2: Durandal (which I originally didn’t even bother looking at for the first few months because I thought it was something to do with running!). Marathon 2 was where I first became a fan of Bungie’s games, and I spent many many hours playing it and the subsequent Marathon Infinity as well as a number of fan-made total conversions too (most notably Marathon:EVIL and Tempus Irae).

The period we owned the 6200 also marked the first time we had an internet connection (a whopping 28.8Kbps modem, no less!). The World Wide Web was just starting to take off around this time; I remember dialing into a couple of the local Mac BBSes, but at that point they were already dying out anyway and the WWW quickly took over. The community that sprang up around the Marathon trilogy was the first online community I was really a member of, and Hotline was used quite extensively for chatting. Marathon Infinity came with map-making tools which I eagerly jumped into, and I made a whole bunch of maps and put them online. I was even able to dig up the vast majority of them; there’s only a couple that I’ve not been able to find. I have a vivid memory of when Marathon:EVIL first came out: it was an absolutely massive 20MB, and I can recall leaving the download going at a blazing-fast 2.7KB/s for a good two or three hours, constantly coming back to it to make sure it hadn’t dropped out or otherwise stopped.

After the Marathon trilogy, Bungie developed the realtime strategy games Myth: The Fallen Lords and its sequel Myth II: Soulblighter, both of which I also played the hell out of and was a pretty active member of the community in.

After the 6200 we had a second-gen iMac G3, and then a “Sawtooth” Power Mac G4 just for me, as my sister and I kept arguing about who should have time on the computer and the Internet. 😛 The G4 was quite a bit of money as you’d imagine, so I promised to pay dad back as soon as I got a job and started working.

macOS (formerly Mac OS X then OS X) is obviously a far more solid operating system, but I’ve always had a soft spot for the Classic Mac OS even with its cooperative multitasking and general fragility. We got rid of the old Power Mac G4 probably eight years ago now (which I regret doing), and I wanted to have some machine capable of running Mac OS 9 just for nostalgia’s sake. Mum and dad still had mum’s old PowerBook G3 and I was able to get a power adapter for it and boot it up to noodle around in, but it was a bit awkwardly-sized to fit on my desk and the battery was so dead that if the power cord wasn’t plugged in it wouldn’t boot at all.

There was a thread on Ars Technica a few months ago about old computers, and someone mentioned that if you were looking at something capable of running Mac OS 9 your best bet was to get the very last of the Power Mac G4s that could boot to it natively, the Mirrored Drive Doors model. I poked around on eBay and found a guy selling one in mint condition, and so bought it as a present to myself for my birthday.

Behold!

Power Mac G4 MDD

Dual 1.25GHz G4 processors, 1GB of RAM, 80GB of hard disk space, and a 64MB ATI Radeon 8500 graphics card. What a powerhouse. 😛

There’s a website, Macintosh Repository, where a bunch of enthusiasts are collecting old Mac software from yesteryear, so that’s been my main place to download all the old software and games that I remember from growing up. It’s been such a trip down memory lane, I love it!

More miniatures: Warhammer 40,000 edition

Warhammer 40,000 used to be quite the complicated affair: lots of rules, looking things up on different tables to check what dice roll you needed for different effects, and needing many hours to finish a game. The 8th Edition of the game came out last year, was apparently extremely streamlined and simplified, and seems to have been received very well. Since I’d been doing well with Shadespire, I decided to get the 8th Edition core box set as well, and had almost exactly enough in Amazon gift card balance for it! It comes with Space Marines, as always, but the opposing side is Chaos this time: seven Plague Marines, a few characters, a big vehicle, and about twenty undead daemon things. I decided to alternate between painting a handful of each side at once so as not to get bored, and have gone with Space Wolves (big surprise, I know) as the paint scheme for the Imperial side.

Space Wolves Intercessor

There’s another five of these Space Marines but they’re all identical apart from the poses so I didn’t take photos of all of them.

The Plague Marines are all unique though, so I’ve been taking photos of each of them, my first batch was four of them.

Plague Marine 1

Plague Marine 2

Plague Marine 3

Plague Marine 4

My mobile painting table has been a great success, but after the first batch of Space Marines I realised I was getting a sore neck and back from hunching over towards the miniatures as I was painting them because everything was too low. Another trip to Bunnings, and lo and behold…

Painting table from the side, showing the two vertical blanks to give it some height

Problem solved!

I also realised the other day why I was enjoying painting my miniatures a lot more now than I used to… it’s thanks to being able to combine my hobbies of painting and photography. 😛 I can paint the miniatures and be happy with my work, but then also take professional-looking photos of them and share them with the world!