Art update, February 2019

As previously mentioned, I’ve been enjoying the hell out of my iPad Pro and Apple Pencil, and have been doing a whole lot more drawing/sketching/painting since then. I moved my Mastodon account over to a much smaller instance, mastodon.art, which as you’d expect is very focused on art and has a lot of extremely creative people on it, and I’ve been getting a daily dose of inspiration. (As a side-note, I can’t recommend enough moving to a smaller Mastodon instance as opposed to one of the huge ones like mastodon.social… it really does feel much more like a community, and the local timeline is something you actually can keep up with and interact with).

I found a really good series of tutorials on drawing people and facial expressions, and drew this!

Four cartoon-style heads: a white girl with long blonde hair looking unimpressed, a bald Asian man looking shocked, a black man with big hair and a goatee looking sleepy, and an Indian man smirking with one eyebrow raised.

After that were a bunch more extremely munted-looking people which I’m not going to post, but then I eventually got my proportions better.

A portrait sketch of a pale red-haired woman with green eyes, smirking and raising one eyebrow.

One of the artist-types I follow on Mastodon is Noah Bradley, who's done paintings for all sorts of places, including Magic: The Gathering! He said his biggest piece of advice was “Use more reference!”, i.e. have an actual picture/photo/whatever next to the thing you’re drawing so you can get the proportions and such correct. I took that advice to heart, and painted this picture of what ended up as a queen!

A painting of a white woman from the chest up, lit from the left side while her right side is heavily shadowed. She has long braided blonde hair and is wearing a purple high-collared dress. Behind her and to her left and right are glowing red eldritch runes, giving a faint red tinge to her outline.

She looks nothing like the original photo, of course, but it really helped to get the angles of everything correct. I also really started getting the hang of lighting; I’m so pleased with the light from the runes reflecting off her hair, especially on the right.

Lily’s Christmas present was the first-gen Apple Pencil for her own iPad and she’s been absolutely drawing her heart out as well. One of the things she likes drawing is My Little Pony characters… she’s never watched the show but enjoys making up her own characters. She set me a challenge of making a character myself, with the theme of “neon”, and I took that opportunity to do some more practice with lighting (see the linked image for the full effect). I found an outline online and traced over that for the shape, but the colour is all me!

A side-on painting of a My Little Pony character on a black background, coloured in dark blue with neon blue lightning at the top of her hooves and a dark purple tail with a purple neon light running down it.

We also took my old teddy bear Neddy out, who I’ve had since I was a year old, and used him for some lighting practice.

A three-quarter profile of a brown teddy bear on a black background, the edges of the left side of him are lit from a bright white light source to the left of the painting.

Then my latest work was Maria Franz from the band Heilung! They do epic pagan/folk music and it’s absolutely fantastic (see the video of their live show). I introduced Lily to them and she’s now completely obsessed. She sent me a picture of Maria Franz that she’d traced over and coloured in, and I realised that’d be a perfect bit of subject matter.

A painting of a woman from the chest up, she has antlers on her head and long red hair, and has headgear that goes over her forehead, with tassels obscuring her eyes.

I had a picture of her open in Safari in split-screen view so I could get the outline and proportions right, and Procreate tells me this was nearly 10 hours all up! I’m absolutely stoked with how it ended up, and just seeing the difference between my earliest stuff and now is great, even though it’s only been two months. The “Use reference” mantra is one that I’m definitely taking to heart.

Five years of Beanie!

As of today, we’ve had Beanie for five years!

We got him as a rescue dog when he was a year old; he’d already had three previous owners before us, and the house he was living in when we got him had two other much larger dogs that apparently took most of the food and generally bullied him. The idiot woman who owned him also had him on some “raw chicken” diet where he’d eat basically just raw chicken pieces, and his hair was all short and his tail was permanently very tightly curled — which is how he gets when he’s nervous or anxious — and didn’t wag at all for the first three or so weeks that he was living at ours.

A very young-looking Beanie when he was one year old at the end of our hallway in the midst of running back to us.
One of the very first photos of Beanie shortly after we got him.

He had quite bad separation anxiety; we eventually went to the vet and got some anti-anxiety medicine, because he would scratch at the door repeatedly when we’d leave the house, to the point that he was scratching the paint off it. The medicine, combined with giving him a marrowbone treat whenever we leave for more than an hour or two, has definitely helped; he now gets excited when we’re getting ready to leave because he often gets a treat. 😛

He’s never liked other dogs, and generally still doesn’t, but we were able to take him to a puppy training session where he at least got some exposure to dogs around his size, and that led to our now-regular visits with Leo!

Leo, a Jack Russell, and Beanie facing each other about to leap at each other.
Leo and Beanie playing in Leo’s backyard.

Because Kristina and I are out of the house for a good 10-11 hours a day, we have a dog walker who comes every day after lunch to let him out and give him a walk. We have her come around even if one of us is working from home or sick, and he gets so excited when she arrives.

Beanie is a good boy, he never gets into any trouble whatsoever: he doesn’t chew things, he goes to bed at night when we do and doesn’t get out of bed in the morning until we’re up (or even after we’re up on occasion), if one of us is sick he’ll happily snuggle up on the lounge with us. One of his favourite things to do is to “up” at us, which he frequently does after we’re lying in bed and getting ready to go to sleep!

Beanie sitting up on his hind legs, smiling.
He thinks he’s people!

The only irritation is his bark… there is ZERO wind-up or warning, and it’s an extremely loud and sharp bark that gives you a heart-attack! If he’s bored he’ll find things to bark at which can be super-annoying, but usually a quick walk around the block will tire him out and he calms down afterwards. Given how destructive other dogs can be though, I think we have it pretty good with Beanie. 🙂

Installing Linux Mint 19.1 on a Late-2010 MacBook Air

(Update December 2022: As suggested in the latest comments, this entire blog post is pretty much redundant now! Linux Mint 21.1 installs without a hitch, even using Cinnamon, and I have fully-functional brightness and sound keys straight out of the box.)

(Update December 2020: I successfully upgraded from Linux Mint 19.3 to Linux Mint 20 by following the official Linux Mint instructions. The only additional post-upgrade work I had to do was re-adding the Section "Device" bit to /usr/share/X11/xorg.conf.d/nvidia-drm-outputclass-ubuntu.conf as described below to get the brightness keys working again.)

(Update May 2020: I’ve re-run through this whole process using Linux Mint 19.3 and have updated this blog post with new details. Notably, no need to install pommed, and including the specific voodoo needed for the 2010 MacBook Air from Ask Ubuntu regarding PCI-E bus identifiers.)

We have a still perfectly usable Late-2010 MacBook Air (“MacBookAir3,2”, model number A1369), but with macOS 10.14 Mojave dropping support for Macs older than 2012 (it’s possible to extremely-hackily install it on older machines but I’d rather not go down that route), I decided I’d try installing Linux on it. The MacBook Air still works fine, if a bit slow, on macOS 10.13 but I felt like a bit of nerding!

Installation

My distribution of choice was Linux Mint, which is Ubuntu-based but without the constant changes that Canonical keeps making. The first hurdle right out of the gate was which “edition” to choose: Cinnamon, MATE, or Xfce. There was zero info on the website about which to pick; I started with Cinnamon, but it kept crashing when booting from the installation ISO and giving me a message about being in fallback mode. It turns out Cinnamon is the one with all the graphical bells and whistles, and an eight-year-old ultralight laptop’s video card isn’t up to snuff, so I ended up on the MATE edition, which looks pretty much identical but works fine.

My installation method was using Raspberry Pi Imager to write the installation ISO to a spare SD card (despite the name, it can be used to write any ISO: scroll all the way down in the “Choose OS” dialog and select “Use custom”). Installing Linux requires you to partition the SSD first, which you can do with Disk Utility: I added a 2GB partition for /boot, and another 100GB partition to install Linux itself onto. It doesn’t matter which format you choose, as they’ll be reformatted as part of the installation process.
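For the command-line-inclined, this is roughly the Disk Utility equivalent; a sketch only, assuming the internal SSD is disk0 with the macOS volume at disk0s2, and with sizes you’d adjust to suit your drive:

# Find the right disk and partition identifiers first
diskutil list

# Shrink the existing macOS volume and carve two new partitions out of the
# freed space: 2GB for /boot and 100GB for Linux. The formats are placeholders,
# since the Linux installer reformats them as ext4 anyway.
sudo diskutil resizeVolume disk0s2 150g JHFS+ Boot 2g JHFS+ Linux 100g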

After partitioning, reboot with the SD card in and the Option key held down, and choose the “EFI Boot” option. The installer is quite straightforward, but I chose the custom option when it asked how to format the drive: I formatted both the 2GB and 100GB partitions as ext4, with the 2GB one mounted at /boot and the 100GB one at /. The other important part is to install the bootloader onto that /boot partition, which makes it easy to get rid of everything if you ever want to go back to single-partition macOS with no Linux.

Post-install

The next hurdle was video card drivers. Mint comes with an open-source video driver called “Nouveau”, which works but isn’t very performant, and there was lots of screen tearing as I scrolled or moved windows around. This being Linux, it was naturally not as simple as just installing the official Nvidia one and being done with it, because that resulted in a black screen at boot. 😛 I did a massive amount of searching and eventually stumbled across this answer on Ask Ubuntu which worked where nothing else did: I followed those instructions and was able to successfully install the official Nvidia drivers without getting a black screen on boot!

(Update May 2020: I honestly don’t remember whether I had to go through Step 1 of Andreas’ instructions, “Install Ubuntu in UEFI mode with the Nvidia drivers”, but check for the existence of the directory /sys/firmware/efi before running the rest of this. That directory only exists if you’ve booted in EFI mode. If it doesn’t exist, follow the link in Step 1).

I’m copying the details here for posterity, in case something happens to that answer, but all credit goes to Andreas there. These details are specifically for the Late 2010 MacBook Air with a GeForce 320M video card, so using this on something else might very well break things.

Create the file /etc/grub.d/01_enable_vga.conf and paste the following contents into it (scripts in /etc/grub.d are run when the grub config is regenerated, and whatever they print is included in the generated grub.cfg, so these setpci commands end up running at boot):

cat << EOF
setpci -s "00:17.0" 3e.b=8
setpci -s "02:00.0" 04.b=7
EOF

Then make the new file executable and update the grub config files:

$ sudo chmod 755 /etc/grub.d/01_enable_vga.conf
$ sudo update-grub

And then restart. Double-check that the register values have been set to 8 for the bridge device and 7 for the display device:

$ sudo setpci -s "00:17.0" 3e.b
08
$ sudo setpci -s "02:00.0" 04.b
07

Next, load up the “Driver Manager” control panel and set the machine to use the Nvidia drivers. Once it’s finished doing its thing — which took a couple of minutes — restart once more, and you’ll be running with the much-more-performant Nvidia drivers!

At this point I realised that the brightness keys on the keyboard didn’t work. Cue a whole bunch more searching, with the fix being to add the following snippet to the bottom of /usr/share/X11/xorg.conf.d/nvidia-drm-outputclass-ubuntu.conf:

Section "Device"
  Identifier     "Device0"
  Driver         "nvidia"
  VendorName     "NVIDIA Corporation"
  BoardName      "GeForce 320M"
  Option         "RegistryDwords" "EnableBrightnessControl=1"
EndSection

And now I have a fully-functioning Linux installation, with working sleep+wake, audio, wifi, and brightness!

I’m certainly not going to be switching to it full-time, and it feels a lot more fragile than macOS, but it’s fun to muck around with a new operating system. And with 1Password X, I’m able to use 1Password within Firefox under Linux too!

New shiny: 11″ iPad Pro and Apple Pencil 2

I bought an iPad mini 2 back in April of 2014, which was the first mini with a retina display. It got fairly slow with the upgrade to iOS 11, and even though iOS 12 gave it a bit of a shot in the arm, it still ultimately struggled to do much beyond very basic web browsing and social media things — not entirely surprising given it’s five years old at this point. Apple announced the latest version of the iPad Pro at the end of October last year, along with an updated Apple Pencil, and all the reviews said the iPad was absolutely gobsmackingly fast (to the point where it beats all but the highest-end Core i9 15″ MacBook Pro in a number of CPU benchmarks), so I decided to finally retire the iPad mini and upgrade.

Holy. Crap.

It’s honestly one of the most impressive pieces of technology I’ve used in recent years; almost the entire thing is screen, there’s only enough bezel to comfortably hold the edges and no more, and it’s about as thin as I recall the iPhone 4/4S being, with the same industrial design. I’m still on the iPhone 7 so hadn’t used Face ID before, and it works like magic. The screen has a 120Hz refresh rate as opposed to the standard 60Hz of most displays, which means that everything feels just subtly more fluid and responsive. Everything I do on it is totally effortless; it responds immediately without any hint of lag or hesitation.

However, I think my favourite part so far is the Apple Pencil. It’s much the same as the original in terms of usage, but it magnetically pairs with and charges on the right side of the iPad, and has an option to double-tap the Pencil itself to switch between your current drawing tool and the eraser. It has pressure and angle sensitivity, so it can behave exactly like an actual pencil: turn it sideways and use the edge and you can do subtle shading, and the harder you press the darker the shade.

The built-in Notes app has basic Pencil support, but I’ve been using Procreate — which I actually originally bought on my iPad mini but didn’t use very much, due to it being a pain trying to do any sort of detail with a finger — and it’s so awesome. My art is absolutely not going to win any sort of awards, but I’ve done two things so far and am really happy with both of them.

The first was done with the drawing assist turned on and an isometric grid, followed by a whole bunch of layers to get the lighting looking right.

And the second is just a pencil sketch. Like I said, objectively it’s not very good, but not having done anything like this before, I’m still very pleased.

I had a photo open in Safari in split-screen view beside Procreate; the woman in my sketch looks absolutely nothing like the photo, but it was more just to get the angles generally right. 😛 Having done photography definitely helped with the shading, because I could easily visualise in my head how the shadows would fall.

I need to make sure I keep up the practice so I can improve!


I also remembered that GarageBand on iOS is a thing, and goddamn, it’s also impressive. It’s the same idea as GarageBand on the Mac, adding preset loops of instruments and combining them together (or recording your own), but they’ve done an amazing job of translating that to something that’s usable with a touch interface. I’d made a couple of songs in GarageBand for Mac previously (nearly twelve years ago now, jesus), but not really anything since.

I bought a USB-C to 3.5mm headphone adapter for the iPad so I could use my big Audio-Technica headphones — doing audio work requires very low latency, and Bluetooth has way too much, to the point where if you try to use Bluetooth headphones with GarageBand on iOS it’ll give you a big warning to that effect — and did a bunch of dabbling the other day, and made another new song!


I’m very interested to see what Apple does with iOS 13 this year, because the iPad Pro’s hardware is astonishingly capable, but it feels like the software could be doing more. I’ve hooked up our spare Bluetooth keyboard and dabbled around with that, and it’s neat, but there’s not enough support for keyboard shortcuts, even in Apple’s own applications. In Messages, for instance, you can use Cmd-↑ and Cmd-↓ to switch between conversations, but there’s no way to get the focus back to the input field once you’ve done so… you have to reach for the screen. There’s an official Apple keyboard cover that turns your iPad into something resembling a laptop, but I don’t know how well it would stay steady on a lap on a train.

All that said, I’m absolutely stoked with the new iPad, and am seriously keen to see the software catch up to the capabilities of the hardware!

More fun with Yubikey: Signed Git commits and GPG agent forwarding

I’ve been on a “What other neat things can I do with my Yubikey” kick after my last post, and it turns out one of those neat things is to cryptographically sign Git commits. This allows you to prove that the owner of a particular GPG key is actually the person who committed the code. 

Setting up signed Git commits locally is very easy: run git config --global user.signingkey "<ID of your GPG signing subkey>" (mine is C65E91ED24C34F59, as shown in the screenshot below), then run your Git commit normally but with the added -S flag to sign it.
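In full, it looks something like this (the key ID is mine from above; the last line is optional, if you’d rather sign everything by default):

# Tell Git which GPG key to sign with
git config --global user.signingkey "C65E91ED24C34F59"

# Sign an individual commit
git commit -S -m "A cryptographically-signed commit"

# Or sign every commit by default from here on
git config --global commit.gpgsign true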

Bitbucket Cloud doesn’t currently support displaying signed Git commits in the UI, but GitHub does, and you get a shiny little “Verified” badge next to each one and this message when you click on it:

You can also show it locally with git log --show-signature.

This is all well and good, but what if you want to sign something on a remote server that you’re connected to via SSH? Enter GPG agent forwarding!

Just like you can do SSH agent forwarding to have your private SSH key securely forwarded to a machine you’re connecting to, you can do the same with the GPG agent that stores your GPG keys, and allow the remote machine to access your signing subkey. Setting up GPG agent forwarding is broadly straightforward, but make a note of which versions of GnuPG you’re running at each end. The “modern” version is 2.1 and higher; I’m running 2.2.x on my Macs, but my Linode runs CentOS 7, which only comes with GnuPG 2.0.x, and I wasn’t able to fully get agent forwarding working between it and 2.2.x on my Macs. I tested the latest Debian with 2.1 and that worked.
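A quick way to check which version you’ve got at each end:

# The first line of output shows the version, e.g. "gpg (GnuPG) 2.2.4"
gpg --version | head -1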

I followed this guide, but one extremely important note is that you can’t use a relative path for the local or remote sockets; they have to be full absolute paths. This becomes a pain when you’re connecting to and from different OSes or machines where your username differs. Thankfully, SSH has a Match exec option that lets you run a command to match different hosts and use different host definitions (and thus different paths for the sockets) depending on your local and remote machines.

Mine looks like this:

# Source machine is a personal Mac, connecting to another personal Mac on my local network; the local network all uses the .core domain internally
Match exec "hostname | grep -F .core" Host *.core
    RemoteForward /Users/virtualwolf/.gnupg/S.gpg-agent /Users/virtualwolf/.gnupg/S.gpg-agent.extra

# Source machine is a personal Mac, connecting to my Linux box
Match exec "hostname | grep -F .core" Host {name of the Host block for my Linode}
    RemoteForward /home/virtualwolf/.gnupg/S.gpg-agent /Users/virtualwolf/.gnupg/S.gpg-agent.extra

# Source machine is my work Mac, connecting to my Linux box
Match exec "hostname | grep -F {work machine hostname}" Host {name of the Host block for my Linode}
    RemoteForward /home/virtualwolf/.gnupg/S.gpg-agent /Users/{work username}/.gnupg/S.gpg-agent.extra

(Yes, technically this doesn’t work as I mentioned at the start, due to my Linode being on CentOS 7 and having GnuPG 2.0; the socket forwarding bit works, just not when I actually want to do anything with it. :P)

Nginx, PHP-FPM, and Cloudflare, oh my!

I use my Linode to host a number of things (this blog and Kristina’s, my website and Kristina’s, an IRC session via tmux and irssi for a friend and me, and probably another thing or two I’m forgetting). Kristina started up a travel blog a few months ago, which I’m also hosting on it, and shortly after that I found that maybe once every two weeks or so my website and our blogs would stop running. I looked into it, and it was being caused by Linux’s Out-Of-Memory Killer, which kicks in when the system is critically low on memory and needs to free some up; it was killing the Docker container that my website runs in, as well as MariaDB.
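If you want to check whether the OOM killer is striking on your own box, the kernel logs its kills; a quick sketch, assuming a systemd-based system:

# Kernel messages name the victim process when the OOM killer fires
journalctl -k | grep -i "out of memory"

# Or, without systemd
dmesg | grep -i "out of memory"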

The main cause was Apache and MariaDB using entirely too much memory for my little 1GB Linode; it was evidently just sitting on this side of stable with two WordPress blogs, and adding a third tipped it over the edge. The reason MariaDB and my website’s Docker container were the ones being killed is that although Apache was using a heap of memory, it was spread over a number of worker threads, so individually none of those ranked high, and MariaDB and my website were the largest processes on the list. There are lots of tweaks you can make, several of which I tried, but all they did was delay the inevitable rather than entirely resolve it. Apache is powerful, but low-resource-usage it ain’t. The primary low-resource-usage alternative to Apache is Nginx, so I figured this weekend I’d have a crack at moving over to that.

Overall it was pretty straightforward. This guide from Digital Ocean was a good starting point; the bits where it fell short were mostly just a case of looking up the equivalent directives for SSL, mapping to filesystem locations, etc. (I have ~15 years of hosted images I’ve posted on the Ars Technica forums and my old LiveJournal—which is now this blog—and wanted to make sure those links all kept working).

One difference is with getting WordPress going… WordPress is all PHP, and Apache by default runs PHP code inside the Apache process itself via mod_php, whereas with Nginx you have to use PHP-FPM or similar, which is an entirely separate process running on the server that Nginx talks to in order to process the PHP code. I mostly followed this guide, also from Digital Ocean, though there were a couple of extra gotchas I ran into when getting it fully going with Nginx for WordPress:

  • Edit /etc/nginx/fastcgi_params and add a new line with this content, or you’ll end up with nothing but a blank page: fastcgi_param PATH_TRANSLATED $document_root$fastcgi_script_name;
  • Remember to change the ownership of the WordPress installation directory to the nginx user instead of apache.
  • The default settings for PHP-FPM assume it’s running on a box with significantly more than 2GB of RAM; edit /etc/php-fpm.d/www.conf and change the line that says pm = dynamic to pm = ondemand (see the sketch below). With ondemand, PHP-FPM spins up worker processes as needed but kills off idle ones after ten seconds rather than leaving them around indefinitely.
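For reference, the relevant bits of /etc/php-fpm.d/www.conf end up looking something like this (the worker cap here is an assumption on my part; tune it to your RAM):

; Spawn workers only when requests actually arrive, instead of keeping a pool around
pm = ondemand

; Hard cap on simultaneous workers (required when pm = ondemand)
pm.max_children = 5

; Kill workers that have been idle for this long
pm.process_idle_timeout = 10s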

Additionally, Nginx doesn’t support .htaccess files, so if you’ve got WordPress set up to use any of the “pretty”-type permalinks, you’ll end up with 404s when you try to view an individual post. The fix is to put the following into the server block at the bottom:

location / {
  try_files $uri $uri/ /index.php?$args;
}

This passes the correct arguments to WordPress’ index.php file. You’ll also want to block access to any existing .htaccess files as well:

location ~ /\.ht {
  deny all;
}

The last thing I did with this setup was to put the entirety of my website, Kristina’s, and our respective blogs behind Cloudflare. I’d already had great success with their DNS-over-HTTPS service, and their original product is essentially a reverse proxy that caches static content (CSS, Javascript, images) at each of their points of presence around the world, so you load those from whichever server is geographically closest to you. For basic use it’s free and includes SSL; you just need to point your domain’s nameservers at the ones they provide. The only extra thing I needed to do was set up another DNS record so I could still SSH into my Linode, because the host virtualwolf.org now resolves to Cloudflare’s servers, which obviously don’t have any SSH running!
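In zone-file terms it’s something like the below (the hostname and IP are made up for illustration); in Cloudflare’s dashboard the record needs to be set to “DNS only” rather than proxied, so it resolves to the Linode’s real address:

; Hypothetical unproxied record pointing straight at the Linode, used just for SSH
ssh.virtualwolf.org.  300  IN  A  203.0.113.10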

Overall, the combination of Nginx + PHP-FPM + Cloudflare has resulted in remarkably faster page loads for our blogs, and thus far significantly reduced memory usage as well.

GPG and hardware-based two-factor authentication with YubiKey

As part of having an Ars Technica Pro++ subscription, they sent me a free YubiKey 4, which is a small hardware token that plugs into your USB port and allows for a bunch of extra security on your various accounts because you need the token physically plugged into your computer in order to authenticate. It does a number of neat things:

  • Generating one-time passwords (TOTP) as a second factor when logging in to websites;
  • Storing GPG keys;
  • Acting as a second factor with Duo;

And a bunch of other stuff as well, none of which I’m using (yet).

My password manager of choice is 1Password, and although it can save one-time passwords for websites, I wanted to lock access to the 1Password account itself down even further. Their cloud-based subscription already has strong protection by using a secret key in addition to your strong master password, but you can also set it up to require a one-time password the first time you log into it from a new device or browser, so I’m using the YubiKey for that.

I also generated myself GPG keys and saved them to the YubiKey. It was not the most user-friendly process in the world, though that’s a common complaint levelled at GPG. I found this guide that runs you through it all and, while long, it’s pretty straightforward. It’s all set up now, though: my public key is here, I can send and receive encrypted messages and cryptographically sign documents, and the master key is saved only on an encrypted USB stick. You can also have the GPG agent that runs on your machine and reads the keys from the YubiKey act as an SSH agent, so I’ve got that set up with my Linode.
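The SSH part of that is pleasantly small; a minimal sketch of what it looks like on the Mac side:

# Add this line to ~/.gnupg/gpg-agent.conf:
#   enable-ssh-support

# Then point SSH at gpg-agent's socket instead of ssh-agent's (e.g. in ~/.bash_profile)
export SSH_AUTH_SOCK="$(gpgconf --list-dirs agent-ssh-socket)"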

The last thing I’ve done is to set the YubiKey up as a hardware token with Duo and put my Linode’s SSH and this blog (and soon Kristina’s, though hers without the YubiKey) behind that. With the Duo Unix module even sudo access requires the YubiKey, and the way that’s set up is that you touch the button on the YubiKey itself and it generates a code and enters it for you.

It’s all pretty sweet and definitely adds a bunch of extra security around everything. I’m busily seeing what else I can lock down now!

Setting up DNS over HTTPS on macOS

Back in April, Cloudflare announced a privacy-focused DNS server running at 1.1.1.1 (and 1.0.0.1) that supports DNS over HTTPS. A lot of regular traffic goes over HTTPS these days, but DNS queries to look up the IP address of a domain are still unencrypted, so your ISP can still snoop on which servers you’re visiting even if they can’t see the actual content. We have a Mac mini that runs macOS Server and does DHCP and DNS for our home network, among other things, and with an upcoming version of macOS Server removing those functions in favour of regular non-UI tools, I figured now would be a good time to look into moving us over to Cloudflare’s shiny new DNS server at the same time.

Turns out it wasn’t that difficult!

Overview

  1. Install Homebrew.
  2. Install cloudflared and dnsmasq: brew install cloudflare/cloudflare/cloudflared dnsmasq
  3. Configure dnsmasq to use cloudflared as its upstream DNS resolver.
  4. Configure cloudflared to use DNS over HTTPS and run on port 54.
  5. Install both as services to run at system boot.

Configuring dnsmasq

Edit the configuration file located at /usr/local/etc/dnsmasq.conf: uncomment line 66 and change it from server=/localnet/192.168.0.1 to server=127.0.0.1#54, which tells it to pass DNS requests on to localhost on port 54, where cloudflared will be set up.
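That is, the line ends up as the below (note that dnsmasq uses # in a server= line to mean “port”; it’s not a comment there):

# Hand all DNS requests to cloudflared, listening on localhost port 54
server=127.0.0.1#54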

Configuring cloudflared

Create the directory /usr/local/etc/cloudflared and create a file inside that called config.yml with the following contents:

no-autoupdate: true
proxy-dns: true
proxy-dns-port: 54
proxy-dns-upstream:
  - https://1.1.1.1/dns-query
  - https://1.0.0.1/dns-query

Auto-update is disabled because that seems to break things when the update occurs, and the service doesn’t start back up correctly.

Configuring dnsmasq and cloudflared to start on system boot

dnsmasq: sudo brew services start dnsmasq will both start it immediately and also set it to start at system boot.

cloudflared: sudo cloudflared service install, which installs it for launchctl at /Library/LaunchDaemons/com.cloudflare.cloudflared.plist.

Updating your DNS servers

Now that dnsmasq and cloudflared are running, you need to actually tell your machines to use them as their DNS servers! Open up System Preferences > Network, hit Advanced, and in the DNS tab click the + button and put the local IP address of the machine running dnsmasq in. (You’ll want to make sure that machine has a static IP address, of course). Repeat the process for everything else on your local network to have all their DNS traffic sent through dnsmasq and cloudflared as well.
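If you’d rather do that from the Terminal, networksetup can do the same thing (the service name and IP here are assumptions; networksetup -listallnetworkservices will show yours):

# Point the Wi-Fi service's DNS at the machine running dnsmasq
sudo networksetup -setdnsservers "Wi-Fi" 192.168.0.2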

You can confirm that all your DNS traffic is going where it should be with dnsleaktest.
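You can also spot-check both halves locally on the Mac mini with dig, querying dnsmasq on the standard port 53 and cloudflared directly on port 54:

# Via dnsmasq (which forwards to cloudflared)
dig +short example.com @127.0.0.1

# Straight to cloudflared
dig +short example.com @127.0.0.1 -p 54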

And done!

I was surprised at how straightforward this was. I also didn’t realise until I was doing all of this that dnsmasq also does DHCP, so with the assistance of this blog post I’ve also replaced the built-in DHCP server on the Mac mini and continue to have full local hostname resolution as well!

Knights Ridge “eco retreat”

We went for a weekend away up to a place in the Hunter Valley billing itself as an “eco retreat” and it was pretty great! We were able to bring Beanie along too, which he loved. Kristina had the great idea to get one of those extendo-leads so he was able to roam up to five metres away and smell all the smells while still remaining technically on his lead.

The whole place is entirely off-grid: electricity comes from solar plus battery storage (though they also included instructions for how to start up the backup generator in case anything happened), the toilet is a composting one, and water is all captured from the rain and stored. They did have a somewhat anaemic ADSL2 connection though, so I don’t know if that still counts as entirely off-grid. 😛 There was zero mobile phone signal, though; the whole time we were there our phones said “No service”.

The place was decorated in quite the rustic style, with all sorts of old bits and bobs around, but it was all totally clean and dust-free.

Sitting
Untitled
Untitled

There was no electric kettle, only an old stovetop one, and you don’t realise how spoiled you are until you remember just how damn long it takes for a kettle to come to the boil on a gas stovetop!

Putting a cuppa on

The whole place was completely and totally silent in terms of any sort of human noise; the only sounds were from the trees and birds, and it was absolutely delightful. Sunset down the shallow valley that we were in was quite nice too.

Spotlight
Sunset

At dusk we saw a couple of wombats, which Beanie had to bark at, probably because they were low and had four legs and so looked somewhat dog-shaped. We also saw some kangaroos! Beanie was absolutely fascinated by them; we were watching from a goodly distance and he sat there absolutely laser-focused on them.

A distant kangaroo!
The kangaroo is the tiny little speck right in the middle, where the field starts turning into the side of the valley.
Intently watching the distant kangaroo

My only complaint with the place is the number of bugs that managed to come in at night! Only about half the windows have flyscreens on them, so we had to run around and mostly close the place up once dusk arrived. All in all it was extremely relaxing, though. A++ would relax again.

The full album of photos is on Flickr.

Book recommendations, 2018 edition

Fair warning, I’m rubbish at writing reviews so I’m just going to copy and paste the summary from Booktopia. 😛 But I’ve read all of these books and they’re all fantastic and highly recommended!

Vigil by Angela Slatter (Book 1 of the Verity Fassbinder series)

Urban fantasy set in Brisbane.

Verity Fassbinder has her feet in two worlds.

The daughter of one human and one Weyrd parent, she has very little power herself, but does claim unusual strength — and the ability to walk between us and the other — as a couple of her talents. As such a rarity, she is charged with keeping the peace between both races, and ensuring the Weyrd remain hidden from us.

But now Sirens are dying, illegal wine made from the tears of human children is for sale — and in the hands of those Weyrd who hold with the old ways — and someone has released an unknown and terrifyingly destructive force on the streets of Brisbane.

And Verity must investigate, or risk ancient forces carving our world apart. 

The Stars are Legion by Kameron Hurley

Science fiction, a review on Amazon described it as “biopunk” which I think is perfect. It’s one of those books where very little is explicitly spelled out and you gather more and more little nuggets about the world as the book goes on.

Somewhere on the outer rim of the universe, a mass of decaying worldships known as the Legion is traveling in the seams between the stars. For generations, a war for control of the Legion has been waged, with no clear resolution. As worlds continue to die, a desperate plan is put into motion.

Zan wakes with no memory, prisoner of a people who say they are her family. She is told she is their salvation — the only person capable of boarding the Mokshi, a world-ship with the power to leave the Legion. But Zan’s new family is not the only one desperate to gain control of the prized ship. Zan must choose sides in a genocidal campaign that will take her from the edges of the Legion’s gravity well to the very belly of the world. Zan will soon learn that she carries the seeds of the Legion’s destruction — and its possible salvation. 

The Collapsing Empire by John Scalzi (Book 1 of the Interdependency series)

Science fiction.

In the far future, humanity has left Earth to create a glorious empire. Now this interstellar network of worlds faces disaster — but can three individuals save their people?

The empire’s outposts are utterly dependent on each other for resources, a safeguard against war, and a way its rulers can exert control. This relies on extra-dimensional pathways between the stars, connecting worlds. But “The Flow” is changing course, which could plunge every colony into fatal isolation.

A scientist will risk his life to inform the empire’s ruler. A scion of a Merchant House stumbles upon conspirators seeking power. And the new Empress of the Interdependency must battle lies, rebellion and treason. Yet as they work to save a civilization on the brink of collapse, others have very different plans…

The Fifth Season by N.K. Jemisin (Book 1 of the Broken Earth trilogy)

Science-fiction/fantasy, it’s kind of hard to nail it down to a single genre.

Three terrible things happen in a single day.

Essun, masquerading as an ordinary schoolteacher in a quiet small town, comes home to find that her husband has brutally murdered their son and kidnapped their daughter. Mighty Sanze, the empire whose innovations have been civilization’s bedrock for a thousand years, collapses as its greatest city is destroyed by a madman’s vengeance. And worst of all, across the heartland of the world’s sole continent, a great red rift has been torn which spews ash enough to darken the sky for years. Or centuries.

But this is the Stillness, a land long familiar with struggle, and where orogenes — those who wield the power of the earth as a weapon — are feared far more than the long cold night. Essun has remembered herself, and she will have her daughter back.

She does not care if the world falls apart around her. Essun will break it herself, if she must, to save her daughter.

Leviathan Wakes by James S. A. Corey (Book 1 of The Expanse series)

Gritty science fiction. (It should be noted that there are currently seven books in this series so far, and they are all brilliant.)

Humanity has colonised the planets — interstellar travel is still beyond our reach, but the solar system has become a dense network of colonies. But there are tensions — the mineral-rich outer planets resent their dependence on Earth and Mars and the political and military clout they wield over the Belt and beyond.

Now, when Captain Jim Holden’s ice miner stumbles across a derelict, abandoned ship, he uncovers a secret that threatens to throw the entire system into war. Attacked by a stealth ship belonging to the Mars fleet, Holden must find a way to uncover the motives behind the attack, stop a war and find the truth behind a vast conspiracy that threatens the entire human race.

Bound by Alan Baxter (Book 1 of the Alex Caine trilogy)

Urban fantasy set (at least initially) in Sydney.

Alex Caine is a martial artist fighting in illegal cage matches. His powerful secret weapon is an unnatural vision that allows him to see his opponents’ moves before they know their intentions themselves.

After a fight one night, an enigmatic Englishman, Patrick Welby, claims to know Alex’s secret. Welby shows Alex how to unleash a breathtaking realm of magic and power, drawing him into a mind-bending adventure beyond his control. And control is something Alex values above all else. 

Changer by Matt Gemmell (Book 1 of the Kestrel series)

Urban fantasy-slash-science-fiction, set in Edinburgh, Scotland.

Jutland, Denmark: a billionaire industrialist seizes control of a top-secret project that the European Defence Agency calls Destiny, manipulating it for his own ends.

Edinburgh, Scotland: physicist Neil Aldridge’s life is saved by an elite EU special forces team, codenamed KESTREL, drawing him into a race against time to prevent a disaster that will claim millions of lives.

As the chase leads to London, Amsterdam and beyond, Aldridge and his allies must battle a ruthless adversary: a trained killer with an unnatural ability, who seeks to hasten the cataclysm.

With time running out, Aldridge discovers that he and his enemy share an astonishing secret, which may be the key to salvation — or cause death on an unprecedented scale…