Setting up DNS over HTTPS on macOS

Back in April, Cloudflare announced a privacy-focused DNS server running at 1.1.1.1 (and 1.0.0.1) that supports DNS over HTTPS. A lot of regular traffic goes over HTTPS these days, but DNS queries to look up the IP address of a domain are still unencrypted, so your ISP can still snoop on which servers you’re visiting even if they can’t see the actual content. We have a Mac mini that runs macOS Server and does DHCP and DNS for our home network, among other things, and with an upcoming version of macOS Server removing those functions in favour of regular non-UI tools, I figured now would be a good time to look into moving us over to Cloudflare’s shiny new DNS server at the same time.

Turns out it wasn’t that difficult!

Overview

  1. Install Homebrew.
  2. Install cloudflared and dnsmasq: brew install cloudflare/cloudflare/cloudflared dnsmasq
  3. Configure dnsmasq to point to cloudflared as its own DNS resolver.
  4. Configure cloudflared to use DNS over HTTPS and run on port 54.
  5. Install both as services to run at system boot.

Configuring dnsmasq

Edit the configuration file located at /usr/local/etc/dnsmasq.conf, uncomment the server line (line 66 at the time of writing), and change it from server=/localnet/192.168.0.1 to server=127.0.0.1#54. This tells dnsmasq to pass DNS requests on to localhost on port 54, which is where cloudflared will be set up.
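Once that’s done, the relevant part of dnsmasq.conf ends up looking something like this (the exact line number will vary between versions):

# Forward all DNS queries to cloudflared listening on localhost port 54
server=127.0.0.1#54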

Configuring cloudflared

Create the directory /usr/local/etc/cloudflared and create a file inside that called config.yml with the following contents:

port: 54
no-autoupdate: true
proxy-dns: true
proxy-dns-upstream:
  - https://1.1.1.1/dns-query
  - https://1.0.0.1/dns-query

Auto-update is disabled because that seems to break things when the update occurs, and the service doesn’t start back up correctly.

Configuring dnsmasq and cloudflared to start on system boot

dnsmasq is easy: simply run sudo brew services start dnsmasq, which will both start it immediately and set it to start at system boot.

Due to a bug that isn’t fixed as of writing, the port for cloudflared has to be set via a launchctl environment variable. Install it as a service with sudo cloudflared service install, then run sudo launchctl unload /Library/LaunchDaemons/com.cloudflare.cloudflared.plist to temporarily turn off the service. Next, run sudo launchctl setenv TUNNEL_DNS_PORT 54 to set the environment variable so the launch script will pick it up, and lastly run sudo launchctl load /Library/LaunchDaemons/com.cloudflare.cloudflared.plist to start the service up again.
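In full, the sequence of commands is:

sudo cloudflared service install
sudo launchctl unload /Library/LaunchDaemons/com.cloudflare.cloudflared.plist
sudo launchctl setenv TUNNEL_DNS_PORT 54
sudo launchctl load /Library/LaunchDaemons/com.cloudflare.cloudflared.plist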

This is the same thing as setting port: 54 in the configuration file above, but works around the aforementioned bug where that setting is ignored (and cloudflared tries to start on the default port 53, which fails because dnsmasq is already running there).

And done!

Apart from the time spent figuring out how to work around that cloudflared bug, I was surprised at how straightforward this was. I also didn’t realise until I was doing all of this that dnsmasq does DHCP as well, so with the assistance of this blog post I’ve replaced the built-in DHCP server on the Mac mini too and continue to have full local hostname resolution!

Knights Ridge “eco retreat”

The house

We went for a weekend away up to a place in the Hunter Valley billing itself as an “eco retreat” and it was pretty great! We were able to bring Beanie along too, which he loved. Kristina had the great idea to get one of those extendo-leads so he was able to roam up to five metres away and smell all the smells while still remaining technically on his lead.

The whole place is entirely off-grid: electricity comes from solar plus battery storage (though they also included instructions for how to start up the backup generator in case anything happened), the toilet is a composting one, and water is all captured from the rain and stored. They did have a somewhat anaemic ADSL2 connection though, so I don’t know if that counts as still being entirely off-grid. 😛 There was zero mobile phone signal, however; the whole time we were there our phones said “No service”.

The place was decorated in quite the rustic style, with all sorts of old bits and bobs around, but it was all totally clean and dust-free.

Sitting
Untitled
Untitled

There was no electric kettle, only an old stovetop one, and you don’t realise how spoiled you are until you remember just how damn long it takes for a kettle to come to a boil on a gas stovetop!

Putting a cuppa on

The whole place was completely silent in terms of any sort of human noise; the only sounds were from the trees and birds, and it was absolutely delightful. Sunset down the shallow valley that we were in was quite nice too.

Spotlight
Sunset

At dusk we saw a couple of wombats, which Beanie had to bark at, probably because they were low and had four legs and so looked somewhat dog-shaped. We also saw some kangaroos! Beanie was absolutely fascinated by the kangaroos; we were watching from a goodly distance and he sat there absolutely laser-focused on them.

A distant kangaroo!
The kangaroo is the tiny speck right in the middle, where the field starts turning into the side of the valley.
Intently watching the distant kangaroo

My only complaint with the place is the number of bugs that manage to come in at night! Only about half the windows have flyscreens on them, so we had to run around and mostly close the place up once dusk arrived. All in all it was extremely relaxing, though. A++ would relax again.

The full album of photos is on Flickr.

Book recommendations, 2018 edition

Fair warning, I’m rubbish at writing reviews so I’m just going to copy and paste the summary from Booktopia. 😛 But I’ve read all of these books and they’re all fantastic and highly recommended!

Vigil by Angela Slatter (Book 1 of the Verity Fassbinder series)

Urban fantasy set in Brisbane.

Verity Fassbinder has her feet in two worlds.

The daughter of one human and one Weyrd parent, she has very little power herself, but does claim unusual strength — and the ability to walk between us and the other — as a couple of her talents. As such a rarity, she is charged with keeping the peace between both races, and ensuring the Weyrd remain hidden from us.

But now Sirens are dying, illegal wine made from the tears of human children is for sale — and in the hands of those Weyrd who hold with the old ways — and someone has released an unknown and terrifyingly destructive force on the streets of Brisbane.

And Verity must investigate, or risk ancient forces carving our world apart. 

The Stars are Legion by Kameron Hurley

Science fiction, a review on Amazon described it as “biopunk” which I think is perfect. It’s one of those books where very little is explicitly spelled out and you gather more and more little nuggets about the world as the book goes on.

Somewhere on the outer rim of the universe, a mass of decaying worldships known as the Legion is traveling in the seams between the stars. For generations, a war for control of the Legion has been waged, with no clear resolution. As worlds continue to die, a desperate plan is put into motion.

Zan wakes with no memory, prisoner of a people who say they are her family. She is told she is their salvation — the only person capable of boarding the Mokshi, a world-ship with the power to leave the Legion. But Zan’s new family is not the only one desperate to gain control of the prized ship. Zan must choose sides in a genocidal campaign that will take her from the edges of the Legion’s gravity well to the very belly of the world. Zan will soon learn that she carries the seeds of the Legion’s destruction — and its possible salvation. 

The Collapsing Empire by John Scalzi (Book 1 of the Interdependency series)

Science fiction.

In the far future, humanity has left Earth to create a glorious empire. Now this interstellar network of worlds faces disaster — but can three individuals save their people?

The empire’s outposts are utterly dependent on each other for resources, a safeguard against war, and a way its rulers can exert control. This relies on extra-dimensional pathways between the stars, connecting worlds. But “The Flow” is changing course, which could plunge every colony into fatal isolation.

A scientist will risk his life to inform the empire’s ruler. A scion of a Merchant House stumbles upon conspirators seeking power. And the new Empress of the Interdependency must battle lies, rebellion and treason. Yet as they work to save a civilization on the brink of collapse, others have very different plans…

The Fifth Season by N.K. Jemisin (Book 1 of the Broken Earth trilogy)

Science-fiction/fantasy, it’s kind of hard to nail it down to a single genre.

Three terrible things happen in a single day.

Essun, masquerading as an ordinary schoolteacher in a quiet small town, comes home to find that her husband has brutally murdered their son and kidnapped their daughter. Mighty Sanze, the empire whose innovations have been civilization’s bedrock for a thousand years, collapses as its greatest city is destroyed by a madman’s vengeance. And worst of all, across the heartland of the world’s sole continent, a great red rift has been torn which spews ash enough to darken the sky for years. Or centuries.

But this is the Stillness, a land long familiar with struggle, and where orogenes — those who wield the power of the earth as a weapon — are feared far more than the long cold night. Essun has remembered herself, and she will have her daughter back.

She does not care if the world falls apart around her. Essun will break it herself, if she must, to save her daughter.

Leviathan Wakes by James S. A. Corey (Book 1 of The Expanse series)

Gritty science fiction. (It should be noted that there are currently seven books in this series so far, and they are all brilliant.)

Humanity has colonised the planets — interstellar travel is still beyond our reach, but the solar system has become a dense network of colonies. But there are tensions — the mineral-rich outer planets resent their dependence on Earth and Mars and the political and military clout they wield over the Belt and beyond.

Now, when Captain Jim Holden’s ice miner stumbles across a derelict, abandoned ship, he uncovers a secret that threatens to throw the entire system into war. Attacked by a stealth ship belonging to the Mars fleet, Holden must find a way to uncover the motives behind the attack, stop a war and find the truth behind a vast conspiracy that threatens the entire human race.

Bound by Alan Baxter (Book 1 of the Alex Caine trilogy)

Urban fantasy set (at least initially) in Sydney.

Alex Caine is a martial artist fighting in illegal cage matches. His powerful secret weapon is an unnatural vision that allows him to see his opponents’ moves before they know their intentions themselves.

After a fight one night, an enigmatic Englishman, Patrick Welby, claims to know Alex’s secret. Welby shows Alex how to unleash a breathtaking realm of magic and power, drawing him into a mind-bending adventure beyond his control. And control is something Alex values above all else. 

Changer by Matt Gemmell (Book 1 of the Kestrel series)

Urban fantasy-slash-science-fiction, set in Edinburgh, Scotland.

Jutland, Denmark: a billionaire industrialist seizes control of a top-secret project that the European Defence Agency calls Destiny, manipulating it for his own ends.

Edinburgh, Scotland: physicist Neil Aldridge’s life is saved by an elite EU special forces team, codenamed KESTREL, drawing him into a race against time to prevent a disaster that will claim millions of lives.

As the chase leads to London, Amsterdam and beyond, Aldridge and his allies must battle a ruthless adversary: a trained killer with an unnatural ability, who seeks to hasten the cataclysm.

With time running out, Aldridge discovers that he and his enemy share an astonishing secret, which may be the key to salvation — or cause death on an unprecedented scale…

The spiritual successor to SimCity, Cities: Skylines

I first played the original SimCity Classic back in the early 1990s on our old Macintosh LC II, and absolutely loved it. Laying out a city and watching it grow was extremely satisfying, and the sequel, SimCity 2000, was even more detailed. I played a bit of SimCity 4, which came out in 2003, but the latest entry in the series, titled just “SimCity”, by all accounts sucked: the maps were significantly smaller, and it required an internet connection and was multiplayer to boot.

It’s actually possible to play SimCity 2000 on modern machines and I definitely got stuck into it a few years ago. This is a screenshot of my most recent city!

Screenshot of SimCity 2000, zoomed out and showing as much of my city as possible.

If you’re wanting a proper modern SimCity 2000-esque experience though, Cities: Skylines is what you’re after. It came out in March of 2015 on desktop and was ported to Xbox One in April of 2017, and they did a damned good job of it; the controls are all perfectly suited to playing on a controller as opposed to with a mouse and keyboard.

The level of detail of the simulation is fantastic: you can zoom all the way in and follow individual people (called “cims”, as opposed to SimCity’s “sims”) or vehicles and see where they’re going. There’s a robust public transport system, and you can put in train lines (and buses, and trams, and a subway, and in the most recent expansion called Mass Transit, even monorails, blimps, and ferries!) and see the cims going to and from work, how many are waiting at each station, and so on.

We recently upgraded to the Xbox One X and a shiny new OLED 4K TV (quite the upgrade from our nine year-old 37″ giant-bezeled LCD TV!), and it makes for some very nice screenshots. These are from my largest city called Springdale, currently home to ~140k people!

Nostalgia and the Classic Mac OS

I’ve been a Mac user my entire life, originally just because my dad used them at his work and so bought them for home as well. My earliest memories are of him bringing his SE/30 home and playing around in MacPaint. We also had an Apple IIe that we got second-hand from my uncle that lived in my bedroom for a few years, though that doesn’t count as a Mac.

The first Mac my dad bought for us at home was the LC II in 1992 (I was 9!), and I can remember spending hours trawling through Microsoft Encarta being blown away at just how much information I could look up immediately. I also remember playing Shufflepuck Café and Battle Chess, and I’m sure plenty of others too that didn’t leave as large an impression. There was also an application that came with the computer called Mouse Practice that showed you how to use a mouse, and we had At Ease installed for a while as well until I outgrew it.

After the LC II we upgraded to the Power Macintosh 6200 in 1995, which among other things came with a disc full of demos on it including the original Star Wars: Dark Forces (which I absolutely begged my parents to get the full version of for Christmas, including promising to entirely delete Doom II which they were a bit disapproving of due to the high levels of gore), and Bungie’s Marathon 2: Durandal (which I originally didn’t even bother looking at for the first few months because I thought it was something to do with running!). Marathon 2 was where I first became a fan of Bungie’s games, and I spent many many hours playing it and the subsequent Marathon Infinity as well as a number of fan-made total conversions too (most notably Marathon:EVIL and Tempus Irae).

The period we owned the 6200 also marked the first time we had an internet connection (a whopping 28.8Kbps modem, no less!). The World Wide Web was just starting to take off around this time; I remember dialing into a couple of the local Mac BBSes, but at that point they were already dying out anyway and the WWW quickly took over. The community that sprang up around the Marathon trilogy was the first online community I was really a member of, and Hotline was used quite extensively for chatting. Marathon Infinity came with map-making tools which I eagerly jumped into, and I made a whole bunch of maps and put them online. I was even able to dig up the vast majority of them; there’s only a couple that I’ve not been able to find. I have a vivid memory of when Marathon:EVIL first came out: it was an absolutely massive 20MB, and I can recall leaving the download going at a blazing-fast 2.7KB/s for a good two or three hours, constantly coming back to it to make sure it hadn’t dropped out or otherwise stopped.

After the Marathon trilogy, Bungie developed the realtime strategy games Myth: The Fallen Lords and its sequel Myth II: Soulblighter, both of which I also played the hell out of and was a pretty active member of the community in.

After the 6200 we then had a second-gen iMac G3 then a “Sawtooth” Power Mac G4 just for me as my sister and I kept arguing about who should have time on the computer and the Internet. 😛 The G4 was quite a bit of money as you’d imagine, so I promised to pay it back to dad as soon as I got a job and started working.

macOS (formerly Mac OS X then OS X) is obviously a far more solid operating system, but I’ve always had a soft spot for the Classic Mac OS even with its cooperative multitasking and general fragility. We got rid of the old Power Mac G4 probably eight years ago now (which I regret doing), and I wanted to have some machine capable of running Mac OS 9 just for nostalgia’s sake. Mum and dad still had mum’s old PowerBook G3 and I was able to get a power adapter for it and boot it up to noodle around in, but it was a bit awkwardly-sized to fit on my desk and the battery was so dead that if the power cord wasn’t plugged in it wouldn’t boot at all.

There was a thread on Ars Technica a few months ago about old computers, and someone mentioned that if you were looking at something capable of running Mac OS 9 your best bet was to get the very last of the Power Mac G4s that could boot to it natively, the Mirrored Drive Doors model. I poked around on eBay and found a guy selling one in mint condition, and so bought it as a present to myself for my birthday.

Behold!

Power Mac G4 MDD

Dual 1.25GHz G4 processors, 1GB of RAM, 80GB of hard disk space, and a 64MB ATI Radeon 8500 graphics card. What a powerhouse. 😛

There’s a website, Macintosh Repository, where a bunch of enthusiasts are collecting old Mac software from yesteryear, so that’s been my main place to download all the old software and games that I remember from growing up. It’s been such a trip down memory lane, I love it!

A trip to Queenstown

Last week we finally got around to visiting New Zealand! We’d been meaning to go for a good couple of years now, but never actually did it. We started small and visited just Queenstown and surrounds, and were only there for three full days.

We flew in on Saturday night, and the descent was rather long and bumpy, which I guess is to be somewhat expected when the airport is surrounded by mountains. We went into Queenstown for dinner first, and had one of the best burgers I’ve ever had at The World Bar.

The place we were staying was about 15 minutes drive from Queenstown, but because it was night time all we could see was brilliant yellow leaves on the trees at the side of the road where the headlights were lighting them up. We woke up the next morning, and holy crap, the view!

The Remarkables, morning light

Untitled

We went up the Skyline Gondola which has a hell of a view over Queenstown itself.

The view from Skyline Queenstown

Next we drove up to Glenorchy, which is about an hour away. Lunch was a surprisingly delicious beef noodle soup from a Chinese restaurant there (there are a lot of Chinese tourists around).

On the road to Glenorchy

Kristina

Orange

Glenorchy Wharf

We spent the afternoon wandering around Queenstown Gardens and Queenstown itself. This was definitely a fantastic time to visit, the air was cool and crisp and all the leaves were changing and everything was bright yellow.

Sitting

Untitled

Secluded

Smokey

Dinner was whole baked flounder with shaved fennel and orange from Public Kitchen and it was absolutely magnificent, cooked to total perfection. The “whole fish” bit was slightly off-putting because it’s literally that, an entire fish, eyeballs and all, sitting on your plate staring up at you, but I pretty quickly got over it. 😛

The view the next morning was even better than before.

The Remarkables, morning light 2

We paid a brief visit to Arrowtown, though there wasn’t a whole lot there and it was mostly tourist shops.

Untitled

Untitled

We then drove up to Wanaka and took the obligatory photo of the tree there.

That Wanaka Tree

The drive itself had some great scenery along the way too.

Untitled

Winding road

Untitled

Dinner was at the Pig & Whistle pub, I got the dry-rubbed steak with veggies and red wine jus and Kristina had chicken and mushroom pasta, and they were both absolutely incredible.

The final full day we were there, we drove two and a half hours up to Lake Pukaki. As before, the drive itself was quite scenic too.

Stopping at Tarras

Lindis Pass

The lake itself is amazing, it’s this crazy neon-blue colour. The first two photos don’t really do it justice, but the third one is exactly how it looked even in person.

Lake Pukaki

Lake Pukaki

Lake Pukaki

We drove a little further north along the western edge of the lake to get a bit closer to Aoraki / Mount Cook, which was looking very dramatic with its peaks covered in clouds.

Looking towards Aoraki / Mount Cook

Overall it was a fantastic trip, we definitely want to go back again but we’re thinking we’ll fly into Christchurch next time and drive around further in the north of the South Island.

The full photosets are here:

More miniatures: Warhammer 40,000 edition

Warhammer 40,000 used to be quite the complicated affair: lots of rules, looking things up on different tables to check what dice roll you needed for different effects, and needing many hours to finish a game. The 8th Edition of the game came out last year, was apparently extremely streamlined and simplified, and seems to have been received very well. Since I’d been doing well with Shadespire, I decided to get the 8th Edition core box set as well, and had almost exactly enough in Amazon gift card balance for it! It comes with Space Marines, as always, but the opposing side is Chaos this time: seven Plague Marines, a few characters, a big vehicle, and about 20 undead daemon things. I decided to alternate between painting a handful of each side at once, so as not to get bored, and have gone with Space Wolves (big surprise, I know) as the paint scheme for the Imperial side.

Space Wolves Intercessor

There’s another five of these Space Marines but they’re all identical apart from the poses so I didn’t take photos of all of them.

The Plague Marines are all unique though, so I’ve been taking photos of each of them; my first batch was four of them.

Plague Marine 1

Plague Marine 2

Plague Marine 3

Plague Marine 4

My mobile painting table has been a great success, but after the first batch of Space Marines I realised I was getting a sore neck and back from hunching over towards the miniatures as I was painting them because everything was too low. Another trip to Bunnings, and lo and behold…

Painting table from the side, showing the two vertical blanks to give it some height

Problem solved!

I also realised the other day why I was enjoying painting my miniatures a lot more now than I used to… it’s thanks to being able to combine my hobbies of painting and also photography. 😛 I can paint the miniatures and be happy with my work, but then also take professional-looking photos of them and share them with the world!

More Raspberry Pi adventures: the Pi Zero W and PaPiRus ePaper display

I decided I wanted to have some sort of physical display in the house for the temperature sensors so we wouldn’t need to be taking out our phones to check the temperature on my website if we were already inside at home. After a bunch of searching around, I discovered the PaPiRus ePaper display. ePaper means it’s not going to have any bright glaring light at night, and it also uses very little power.

The Raspberry Pi is hidden away under a side table, and already has six wires attached to the header for the temperature sensors, so I decided to just get a separate Raspberry Pi Zero W — which is absurdly small — and the PaPiRus display.

Setting it up

I flashed the SD card with the Raspbian Stretch Lite image, then enabled SSH and automatic connection to our (2.4GHz; the Zero W doesn’t support 5GHz) wifi network by doing the following:

  1. Plug the flashed SD card back into the computer
  2. Go into the newly-mounted “boot” volume and create an empty file called “ssh” to turn on SSH at boot
  3. Also in the “boot” volume, create a file called “wpa_supplicant.conf” and paste the following into it:

    country=AU
    ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
    update_config=1
    network={
        ssid="WIFI_SSID"
        scan_ssid=1
        psk="WIFI_PASSWORD"
        key_mgmt=WPA-PSK
    }

  4. Unmount the card, pop it into the Pi, add power, and wait 60-90 seconds and it’ll connect to your network and be ready for SSH access! The default username on the Pi is “pi” and the password is “raspberry”.

(These instructions are all thanks to this blog post but I figured I’d put them here as well for posterity).

The PaPiRus display connection was dead easy, I just followed Pi Supply’s guide after soldering a header into the Pi Zero W. If you want to avoid soldering, they also offer the Zero W with a header pre-attached.

Getting the Python library for updating the display was mostly straightforward, I just followed the instructions in the GitHub repository to manually install the Python 3 version.

I wrote a simple Python script to grab the current temperature and humidity from my website’s REST endpoints, and everything works! This script uses the “arrow” and “requests” libraries, which can be installed with “sudo apt-get install python3-arrow python3-requests”.
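The script is only a dozen or so lines; a rough sketch of its shape is below (the endpoint URLs and JSON field names here are made up for illustration, they’re not my actual ones):

#!/usr/bin/env python3
# Sketch of the display-update script; endpoint URLs and JSON field
# names are hypothetical placeholders.

import arrow
import requests
from papirus import PapirusText

ENDPOINTS = {
    'Outside': 'https://example.org/weather/outdoor',
    'Inside': 'https://example.org/weather/indoor',
}

lines = []
for label, url in ENDPOINTS.items():
    data = requests.get(url, timeout=10).json()
    # Assumes each endpoint returns JSON like {"temperature": 21.5, "humidity": 58}
    lines.append('{0}: {1:.1f}C {2:.0f}%'.format(label, data['temperature'], data['humidity']))

lines.append('Updated ' + arrow.now().format('HH:mm'))

# Write it all to the PaPiRus ePaper display
PapirusText().write('   '.join(lines))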

Next step is to have the Pi 3 that has the sensors run a simple HTTP server that the Zero W can connect to, so even if we have no internet connection for whatever reason, the temperatures will still be available at home. I’ve updated my Pi Sensor Reader to add HTTP endpoints.

Another year of Node.js (now also featuring React)

I posted last year about my progress with Node.js, and the last sentence included “I’m very interested to revisit this in another year and see what’s changed”.

So here we are!

There’s been a fair bit less work on it this year compared to last:


$ git diff --stat 6b7c737 47c364b
[...]
77 files changed, 2862 insertions(+), 3315 deletions(-)

The biggest change was migrating to Node 8’s shiny new async/await, which means that the code reads exactly as if it was synchronous (see the difference in my sendUpdate() code compared to the version above it). It’s really very nice. I also significantly simplified my code for receiving temperature updates thanks to finally moving over to the Raspberry Pi over the Christmas break. Otherwise it’s just been minor bits and pieces, and moving from Bamboo to Bitbucket Pipelines for the testing and deployment pipeline.

I also did a brief bit of dabbling with React, which is a frontend framework for building single-page applications. I’d tried to fiddle with it a couple of years ago but there was something fundamental I wasn’t grasping, and ended up giving up. This time it took, though, and the result is virtualwolf.cloud! All it’s doing is pulling in data from my regular website, but it was still a good start.

There was a good chunk of time from about the middle of the year through to Christmas where I didn’t do any personal coding at all, because I was doing it at work instead! For my new job, the primary point of contact for users seeking help is via a room on Stride, and we needed a way to be able to categorise those contacts to see what users were contacting us about and why. A co-worker wrote an application in Ruby a few years ago to scrape the history of a HipChat room and apply tags to it in order to accomplish this, but it didn’t scale very well (it was essentially single-tenanted and required a separate deployment of the application to be able to have it installed in another room; understandable when you realise he wrote it entirely for himself and was the only one doing this for a good couple of years). I decided to rewrite it entirely from scratch to support Stride and multiple rooms, with the backend written in Node.js and the frontend in React. It really is a fully-fledged application, and it’s been installed into nearly 30 different rooms at work now, so different teams can keep track of their contact rate!

The backend periodically hits Stride’s API for each room it’s installed in, and saves the messages in that room into the database. There’s some logic around whether a message is marked as a contact or not (as in, it was someone asking for help), and there’s also a whitelist that the team who owns the room can add their team members to in order to never have their own messages marked as contacts. Once a message is marked as a contact, they can then add one or more user-defined tags to it, and there’s also a monthly report so you can see the number of contacts for each tag and the change from the previous month.

The backend is really just a bunch of REST endpoints that are called by the frontend, but that feels like I’m short-changing myself. 😛 I wrote up a diagram of the hierarchy of the frontend components a month or so ago, so you can see from this how complex it is:

And I’m in the middle of adding the ability to have a “group” of rooms, and have tags defined at the group level instead of the room level.

I find it funny how if I’m doing a bunch of coding at work, I have basically zero interest in doing it at home, but if I haven’t had a chance to do any there I’m happy to come home and code. I don’t think I have the brain capacity to do both at once though. 😛

Tea, five years later

I’d posted back in May of 2013, just before we moved into our house, that I was really enjoying my nighttime cuppa, and nearly five years later we’re now at the point where we have an entire shelf of our pantry that has nothing but tea on it!

Our selection of tea

We’ve got a handful of loose-leaf teas, but I tend to forget about them because they’re buried at the back and it’s a bit of a pain to clean up afterwards. A cup of herbal tea after dinner is glorious, especially when the weather has cooled down. The brand “Celestial Seasonings” has quite a few very tasty ones, and we’ll frequently pick up a new interesting-looking type when we’re out at the shops too (hence the whole shelf of tea that we now have).

I still don’t tend to drink a lot of black tea, mostly because it’s not caffeinated enough compared to coffee, but we discovered McVitie’s digestive biscuits thanks to a British documentary series (called “Inside the Factory”) about how they’re made, and ohmygod they’re so good dunked in a nice hot cup of black tea! 👌

Mobile painting

I mentioned in my last post that I’d brought all the paints and miniatures and everything inside because it was too hot in the back room to do any actual painting. Moving everything back and forth turned out to be a massive pain, so I decided to build myself a painting board that I could have everything sitting on, then just pick up and move back and forth as necessary.

After about $20 at Bunnings and some Liquid Nails as well as actual nails, it’s ready to go! The board ended up being somewhat larger than I was expecting, and it was a very tight squeeze with all the other stuff on the desk. Fortunately we still had the two shelves we’d originally put up in the office three and a half years ago and had since removed when we rearranged everything two years ago, so I put them up and moved basically everything that was on the desk onto them instead, and now everything is neat and tidy and organised!

Miniatures painting board

Board

Finally, some actual miniature painting

So despite having gotten the back room set up for miniature painting over three and a half years ago, I hadn’t actually done any of it since then. 😛 I also realised I hadn’t actually taken a photo of the setup.

I bought Games Workshop’s latest game Shadespire early last month. It does have miniatures to paint, but only eight in the core set, and it’s a board game where games last about half an hour or so, versus the multi-hour affairs that traditional Warhammer/Warhammer 40,000 games are. I figured that with the holidays around and time to kill, and not having the prospect of endless amounts of miniatures to paint, I’d give it a go. I’m pleased to say that I clearly still have the painting skills!

I’ve finished five of them so far, so only three to go, and took some proper photos of them with the full external flash/umbrella setup.

Blooded Saek

Angharad Brightshield

Targor

Karsus the Chained

Obryn the Bold

(I’ll admit that I cheated slightly and didn’t actually paint any of these in the back room, however… during the week and a bit that I was doing them, the weather was really hot and the dinky little air conditioning unit in the back room wasn’t remotely up to keeping things cool, so I ended up bringing all the paints and bits inside and did them at the dining table).

The game Shadespire itself is really neat as well. I’ve only played a handful of games, but rather than just “Kill the other team” you also have specific objectives to accomplish as well. Have a read of Ars Technica’s review of it, they’re a lot more thorough and eloquent than I could be. 😛

Temperature sensors: now powered by Raspberry Pi

The Weather section on my website is now powered by my Raspberry Pi, instead of my Ninja Block! \o/

Almost exactly three years ago, I started having my Ninja Block send its temperature data to my website (prior to that, I was manually pulling the data from the Ninja Blocks API and didn’t have any historical record of it). Ninja Blocks the company went bust in 2015, and there was some stuff in the Ninja Blocks software that relied on their cloud platform to work and I ended up with no weather data for a couple of days because the Ninja Block couldn’t talk to the cloud platform. I ended up hacking at it and the result was this very simple Node.js application as a replacement for their software. It always felt a bit crap, though, because if the hardware itself died I’d be stuck; yes, it was all built on “open hardware” but I didn’t know enough about it all to be able to recreate it. I’d ordered a Raspberry Pi 3 in June last year, intending on replacing the Ninja Block and its sometimes-unreliable wireless temperature sensors with something newer and simpler and hard-wired, but I found there was a frustrating lack of solid information regarding something that on the surface seemed quite simple.

I’ve finally gotten everything up and running, the Ninja Block has been shut down, and I’ve previously said I’d write up exactly what I did. So here we are!

Components needed

  • Raspberry Pi 3 Model B+
  • AM2302 wired temperature-humidity sensor (or two of them in my case)
  • Ethernet cable of the appropriate length to go from the Pi to the sensor
  • 6x “Dupont” female to either male or female wires (eBay was the best bet for these, just search for “dupont female”, and it only needs to be female on one end as the other end is going to be chopped off)
  • 1.5mm heatshrink tubing
  • Soldering iron and solder
  • Wire stripper (this one from Jaycar worked brilliantly, it automatically adjusts itself to diameter of the insulation)

Process

  1. Cut the connectors off one end of the dupont cables, leaving the female connector still there, and strip a couple of centimetres of insulation off.
  2. Strip the outermost insulation off both ends of the ethernet cable, leaving a couple of centimetres of the internal twisted pairs showing.
  3. Untwist three of the pairs and strip the insulation off them, then twist them back together again into their pairs.
  4. Chop off enough heatshrink tubing to cover the combined length of the exposed ethernet plus dupont wire, plus another couple of centimetres, and feed each individual dupont wire through the tubing (there should be three separate bits of tubing, one for each wire).
  5. Solder each dupont wire together with one of the twisted pairs of ethernet cable, then move the heatshrink tubing up over the soldered section and use a hairdryer or kitchen blowtorch to activate the tubing and have it shrink over the soldered portion to create a nice seal.
  6. Repeat this feed-heatshrink-tubing/solder-wire/activate-heatshrink process again but with the cables that come out of the temperature sensor (ideally you should be using the same red/yellow/black-coloured dupont cables to match the ones that come out of the sensor itself, to make it easier to remember which is which).
  7. Install Raspbian onto an SD card and boot and configure the Pi.
  8. Using this diagram as a reference, plug the red (power) cable from the sensor into Pin 2 (the 5V power), the yellow one into Pin 7 (GPIO 4, the data pin), and the black one into Pin 6 (the ground pin).

Adafruit has a Python library for reading data from the sensor; I’m using the node-dht-sensor library for Node.js myself. You can see the full code I’m using here (it’s a bit convoluted because I haven’t updated the API endpoint on my website yet and it’s still expecting the same data format as the Ninja Block was sending).
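For reference, reading the sensor with Adafruit’s Python library is only a few lines; this is a sketch, assuming the data line is on GPIO 4 as per the wiring above:

#!/usr/bin/env python
import Adafruit_DHT

# AM2302 sensor with its data line wired to GPIO 4 (physical pin 7)
SENSOR = Adafruit_DHT.AM2302
PIN = 4

# read_retry re-attempts the read a few times, as these sensors
# occasionally fail to respond on the first try
humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)

if humidity is not None and temperature is not None:
    print('Temperature: {0:.1f}C, humidity: {1:.1f}%'.format(temperature, humidity))
else:
    print('Failed to get a reading from the sensor')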

I’d found a bunch of stuff about needing a “pull-up” resistor when connecting temperature sensors, but the AM2302 page on adafruit.com says “There is a 5.1K resistor inside the sensor connecting VCC and DATA so you do not need any additional pullup resistors”, and indeed, everything is working a treat!

Christmas 2017

Christmas and Boxing Day this year were pretty great. My sister and her husband and kids were able to make it up from Nowra again, so we had the family Christmas at my parents’ place with everyone. My niece Arya, now three and a half, is GODDAMN ADORABLE. She’s totally happy to just go off and play by herself, and there wasn’t a single tantrum the whole time we were there either. Lily and Scarlett were happy to hang out with each other too and got along very well.

It was interesting to see that Scarlett’s reading is definitely not at the same level as Lily’s was at the same age… Lily was eight when I posted this but she was absolutely tearing through pretty much everything, whereas Scarlett was struggling a little to read the jokes inside the Christmas crackers.

Beanie was terrific; he mostly just wandered around keeping an eye on what everyone was doing and didn’t bark once. Arya was afraid of him to begin with, because he was so excited to see someone vaguely at his height that he kept jumping up on her and trying to lick her face! My sister’s dog at home isn’t one for jumping, so Arya wasn’t used to that. She got over it eventually, though, and was able to sit down and give him scritches.

Showing Scarlett how to Minecraft

Pennie, Mark, and Arya

AHHH A DINOSAUR BOOK!

Beanie amongst the Christmas paper

Snuggles with Aunty Kristina

Sitting on dad

I’ve been a bit stuck for what to get for Christmas and birthdays of late, so I’ve asked for mostly just books. This year’s haul:

So that should certainly keep me occupied for a while!

We went back over on Boxing Day for mum’s birthday, and went down to Collaroy Beach in the afternoon. It was completely overcast but the weather was otherwise glorious… temperature in the mid-20s and a lovely breeze. There were more photos, of course, and I gave the 135mm lens a good workout for once!

Waving from the kiddy pool

Posing

Strutting

Playing in the sand

Getting splashed #1

Even cheesier grin

Getting splashed #5

Photographical style

I never really thought of myself as having a particular “style” to my photos, but I was looking back at my old photos — originally specifically from Christmas and then just more generally all of them — and I’ve realised in the last year or two I’ve very much moved towards a lighter “high-key” style where I bring the shadows up and even bump the exposure of the whole photo by ⅓ to ⅔ of a stop or even more, to get a nice light airy feel to them.

Have a look just at my Christmas albums to see what I mean.

It’s even more obvious when going back and looking at all my photos as a whole! My photos from our trip to Boston in 2012 are a good example, they’re all super-saturated and high-contrast, with really dark shadows. I still have all the original RAW files from that trip, but I don’t want to go back and start re-editing photos lest I go down the path of George Lucas and just totally ruin everything with my meddling. 😛

A holiday in Perth

We went to Perth for a week last week, and it was damned lovely! A friend of mine, Mat, who I’ve known for over fifteen years and originally met through the now-mostly-defunct Everything2, lives over there and was able to offer some advice on places to eat at and suburbs to stay in.

We arrived on Saturday and stayed in a house in Highgate, which is about a 10-15 minute walk from the city itself. Less than a block away is Hyde Park, which is lovely, and so green (Perth has had like a month or two straight of rain, versus the next-to-none that Sydney’s had).

Path

Gazeebo

Departing

There’s a bunch of street art all around the place as well, and lots of interesting buildings to take photos of (full album is here).

Untitled

Untitled

Untitled

Five

High Grounds Coffee

On Sunday we visited Fremantle, to check out the markets there and hopefully get a view of a sunset over the ocean (something we’ve never seen given both Kristina and I grew up on the east coasts of our respective countries). It was indeed getting very nice, but sadly the clouds moved in right as the sun was getting low to the horizon.

Down the street

Ferris wheel

Untitled

Playing

Lighthouse

Untitled

Untitled

Fremantle is the main cargo port for Perth, so there were the giant cargo cranes there and also a massive submarine in dry-dock (have a look to the right of the second photo)!

Untitled

Cranes

Monday was spent first at the Western Australian Botanic Garden and then wandering Northbridge and the CBD itself.

The Garden is massive although there’s a lot of just regular bushland as well as flower beds.

Untitled

Untitled

Untitled

The Swan Brewery Co. Ltd.

Northbridge was neat, there’s a lot of laneways and little alleys, and most of them have art on the walls, often on a very large scale.

Untitled

Small steps

Dragon

Sugar glider

Goat

Untitled

Untitled

Those last two would have been probably four stories high!

We wandered through the CBD itself as well, had dinner at Durty Nelly’s Irish pub (highly recommended, the food was incredible), then continued wandering after night had fallen.

Spring

Untitled

Gothic windows

Untitled

Stairs

Untitled

On Tuesday we visited Rottnest Island! We made the mistake of taking a bus tour, which was filled with loud, obnoxious, racist boomers, and we only stopped to actually get off the bus twice. Otherwise we were driving past all this wonderful terrain and the occasional quokka, and everyone was snapping shitty photos out of the bus windows.

Thankfully that only lasted an hour and a half, and we were able to go visit a colony of quokkas that were all of about ten minutes from the main buildings on the island, and OH MY GOD they are adorable! They have no natural predators on the island so they were pretty well unafraid of people and we were able to get up super-close to them.

Smiling

Round

Mine!

Gnarled

The water around the island is crystal clear.

Untitled

Untitled

Tuesday night, we had dinner in Northbridge at a Mexican restaurant called La Cholita, and holy crap if you’re in Perth you need to visit it. The food is amazing.

Chopping

Kristina and Mat

Taco and sangria

Afterwards we went for another wander around the area and snapped some photos.

Mat's ridiculous icecream

Waiting

Meat Candy

Wednesday was spent briefly at the Araluen Botanic Park (briefly, because Kristina’s legs were massively hurting from crouching down and getting up constantly on Tuesday while we were visiting the quokkas and the Botanic Park was filled with lots of hills), and then a leisurely wander through East Perth.

The weather started turning a bit crap on Thursday, so we visited Mat’s sister and her boyfriend on their rural property and just hung out there with their horse Archie and their hilariously uncoordinated Maremma sheep dog Iorek, then went back and played some Diablo III.

Untitled

Iorek the Maremma sheep dog

I can’t believe how well-timed the trip was: we booked it back in July, the week before the trip was almost non-stop rain, and it’s now back to raining again for the next week! There would have been so much we wouldn’t have been able to see if the weather had been awful.

A vehicular upgrade

We bought a brand-new car today! \o/ It’s an extremely handsome-looking Kia Cerato hatchback in dark metallic grey.

Our brand-new MY18 Kia Cerato

We’d been toying with the idea for a little bit, then Kristina came across some videos of crash tests comparing somewhat older (from ~2000-ish, which is exactly what our current Corolla is) cars impacting with newer ones. They’re nothing like the utter crumpling of the cars from the 1970s, but still somewhat alarming. She did a bunch of research and found that the current Kias are extremely well-regarded and reliable, and it turns out there’s basically nothing else in the same price range that offers as much power and as many features; the equivalent cars like the Corolla and Civic and such were all several thousand dollars more, with less power and fewer included features. The Cerato has a 7-year warranty as well, which seemed to be longer than most other cars offer.

Speaking of power (112kW and 192Nm to be exact), we took one for a test drive on Sunday… I put my foot down and was accidentally doing 70km/h almost immediately! There’s a hell of a lot of room as well, it’s only marginally larger on the outside compared to our Corolla, but it’s so spacious inside. I’m excited about having a hatchback again too, there’s been a few situations where we were trying to put something into the boot and it just wouldn’t fit. Now we can go to town and put the back seats down and put EVERYTHING in it! *maniacal laughter*

We’re going to keep the old Corolla for a while just to go to and from the station in, because parking there means the car is permanently covered in dust thanks to all the construction going on, and when it rains everything turns to mud and the trucks going along the road just splash mud across everything.

A photographical upgrade

Last Wednesday, we upgraded from our trusty Canon EOS 7D to a brand-new Canon EOS 5D Mark IV! It’s a hell of an upgrade in terms of basically every single aspect… the 7D originally came out in 2009 while the 5D4 only came out last year and is full-frame to boot, it’s 18 megapixels versus 30, and the 5D4 also has the insanely awesome autofocus system from Canon’s flagship ~$8k-for-the-body-alone EOS 1DX.

I fairly obsessively tag my photos on Flickr so it’s easy to find things, and the final tally for photos taken with the 7D is 2641!

The very first photo taken was of this flower at my parents’ place, when I got our original 35mm f/2 lens for my birthday in 2010 (the camera and lens are not mine alone, Kristina and I share them equally, but getting the lens for my birthday was a handy way to not have to pay the entire cost of it ourselves :P).

Untitled

It’s difficult to pull only a handful of favourite photos out of twenty-six hundred, but these would definitely be amongst them, in many cases more for the memory than any particular quality of the photograph…

Kristina being nibbled by a horse on our first wedding anniversary—
Horsey Nibbles

Meerkats warming themselves at Taronga Zoo—
Warming glowing warming glow

Dan looking right at a well-placed “Look right” sign—
Dan is waiting for a bus

Kristina looking stunning with our ring-flash—
My beautiful wife

Lily writing her name—
Writing

The train tunnels at Wynyard—
Into the tunnels

Lily feeding the lorikeets—
Feeding the lorikeets

A toothy grin—
Toothy grin

Christmas excitement—
Excitement

The first photo taken in our new house—
Tedison's new home

Kristina being extremely nudged by a calf at Featherdale—
Untitled

My very first photo of Beanie when we got him—
Untitled

Nanny at Christmas hugging one of Lily’s presents—
Nanny hugging Lily's pillow pet

Beanie in the office—
In the office

One of the several actresses we got in at work one Halloween, who were done up as zombies and CREEPY AS FUCK—
Zombiegirl #3

Kristina cracking up at how ridiculous Beanie is—
Cracking up

Lily and Scarlett’s matching bears at Christmas—
New bears

Family photo—
Family photo!

The fantastically creepy decorations and lighting for the latest Halloween at work—
Untitled

Playing around with coloured gels on our flashes with Adam and Stacey—
Untitled

Beanie playing with his best friend Leo—
Untitled

The extraordinarily epic storm aftermath we had—
Untitled

A photo walk we did at work one lunch where we had some volunteers to do a pseudo-modelling shoot—
Marlene

Leo and Beanie zooming down the hall—
Untitled

Wandering around Barangaroo before going to the Maritime Museum—
Untitled

We sold the 7D to friends, so it’s definitely going to continue on in a good home. It was an absolute workhorse, I didn’t think to check the shutter count before we sold it but it never once gave any sort of trouble whatsoever. Meanwhile, we’ve already started taking new memories with the 5D Mark IV and I’d say we ought to get at least 10 years out of it if not more.

Bandcamp is brilliant

For those unaware, Bandcamp is essentially a more indie iTunes Music Store—they don’t have any of the huge music labels there—but with a twist… you can stream entire albums before buying them (as opposed to the 90-second previews you get in iTunes), and a significantly larger percentage of the money you pay to them goes directly to the artist (Bandcamp says around 75-80%).

I found out about it around the start of this year, and it has me discovering and buying way more new music than I had previously. From 2013 to 2016, I’d added 30 albums to iTunes from various sources… this year so far I’ve bought 34 on Bandcamp! They have an iOS app that lets you browse artists by tag (usually genre, like “black metal” for instance, but there’s things like “female-fronted metal” that spans different genres, and really whatever else users have tagged the artist with), so I’ll spend an hour here and there just going through listening to new artists and adding albums to my wishlist, then once or twice a month will go back and buy a few of them.

One of the best bits is that the artists themselves set the prices and you can pay more if you’d like, and some don’t even have a minimum price. The highest I’ve come across so far is US$9.99 (currently about AU$12.50), which is a good bit cheaper than the standard AU$16.99 price you see on iTunes, and a lot have been closer to US$5.

They’ve also done some fantastic things like donating all their proceeds for a day when Trump tried his so-called “Muslim ban” back in January, and more recently doing a similar thing with the proposed ban on transgender service members in the US military.

So basically, if you like music and supporting artists, stop buying music anywhere else and start buying it on Bandcamp!

Adventures with Docker

For a few years now, the new hotness in the software world has been Docker. It’s essentially a very-stripped-down virtual machine, where instead of each virtual machine needing to run an entire operating system as well as whatever application you’re running inside it, you have just your application and its direct dependencies and the underlying operating system handles everything else. This means you can package up your application along with whatever other crazy setup or specific versions of software is required, and as long as they have Docker installed, anyone in the world can run it on pretty much anything.

The process of converting something to run in Docker is called “Dockerising”, and I’d tried probably two or so years ago to Dockerise my website (which was at the time still in its Perl incarnation), but without success. Most of that was me not properly understanding Docker, but Docker’s terminology also wasn’t hugely clear, and information on Dockerising Perl applications was a bit thin on the ground at the time.

My new job involves quite a lot of Docker so I figured I should probably have another crack at it, so I sat down in June and managed to get my website running in a Docker container! The two-or-so-years between when I tried it last and now definitely helped, as did having had a little bit of experience with it in the new job.

I think the terminology was one of the bits that I struggled with most, so maybe this explanation will help someone… you have a Docker image, that’s basically a blueprint for a piece of software and all its associated dependencies. From that image (blueprint), you start up one or more containers which are the actual running form of the image. If one container dies (the application inside crashes or whatever), you don’t care and just start up another one and it’s identical each time. To build your own image, you start with a Dockerfile that tells Docker exactly how to construct your application and all the different parts that are required to support it (see my Lessn Archive’s Dockerfile for an example). There really wasn’t any substitute for actually going in and doing it; by struggling and failing I eventually got there in the end.
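As a concrete example, a Dockerfile for a simple Node.js application looks something like the following; this is just the general shape rather than my actual Lessn Archive one:

# Start from an existing image that already has Node.js installed
FROM node:8

# Copy the application in and install its dependencies
WORKDIR /app
COPY package.json ./
RUN npm install --production
COPY . .

# The port the app listens on, and the command to run when a
# container is started from this image
EXPOSE 3000
CMD ["node", "index.js"]

Building the image and starting a container from it is then just docker build -t my-app . followed by docker run -p 3000:3000 my-app (with my-app being whatever you want to call the image).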

Since my initial success with my website, I’ve gone on to put both my old site archive and my URL shortener in Docker containers as well! Next stop is Kristina’s website, but that’s still using Perl and Mojolicious and my initial attempts have not been successful. 😛

Internet history

On Twitter recently, Mark had downloaded the whole archive of his Twitter account’s history and had been poking through it and randomly retweeting amusing old tweets. I downloaded my own Twitter history and quickly realised that a lot of the old things I’d linked to weren’t accessible because I’d been using my own custom URL shortener (this was before the days of Twitter doing their own URL shortening) and it wasn’t running anymore. Fortunately I’d had the foresight to take a full copy of all of my data and databases from Dreamhost before I shut down my account, and one of those databases was the one that had been backing my URL shortener. A quick import to PostgreSQL and a hacky Node.js application later, it’s all up and running! I’m under no illusions that it’s almost ever going to be accessed by anyone except me, but it’s nice to have another part of my internet history working. I’ve been hosting my own website and images and whatnot (things like pictures I’ve posted on my blog née LiveJournal, or in threads on Ars Technica) in one form or another since about 2002, and the vast majority of those links and images still work!

Speaking of my website, about four years ago now I went and tried to collect all my old websites into a single archive so I could look back and see the progression. The majority of them I actually still had the original source code to, though my very first one or two have been totally lost. The earliest I still have is from March of 1998 when I was not quite fifteen years old! I started out with just HTML, then discovered CSS and Javascript rollover images, and then around 2001 I started using PHP. I had to go in and hack up some of the PHP-based sites in order to get them to work, and oh dear god 18-year-old me was a FUCKING AWFUL coder. One of the sites consisted of a bit over three thousand lines in a single file, with all sorts of duplication and terribleness, and every single one of the sites that was hooked into MySQL had SQL injection vulnerabilities. I’m very proud of just how much my code has improved over the years.

I went back this weekend and managed to recover another handful of sites, and also included exports of the Photoshop files where the original site source wasn’t available. I’ve packed them all up into a Docker container (I’ll write another post about my experiences with Docker at some point soon) and chucked them up on archive.virtualwolf.org for the entire Internet to marvel at how terrible they all were! There’s a little bit more background there, but it’s a lot of fun just looking back at what I did.

Better Raspberry Pi audio: the JustBoom DAC HAT

I decided that the sound output from the Pi’s built-in headphone jack wasn’t sufficient after all, and so went searching for a better option: a DAC (digital-to-analog converter).

The Raspberry Pi foundation created a specification called “HAT” (Hardware Attached on Top) a few years ago, which defines a standard way for a device attached to the Pi via its GPIO (General Purpose Input/Output) pins to be automatically identified and configured, drivers and all. There are a number of DACs now that conform to this standard, and the one I settled on is the JustBoom DAC HAT. JustBoom is a UK company, but you can buy their boards locally from Logicware (with $5 overnight shipping, no less).

The setup is incredibly simple: connect the plastic mounting plugs, attach the DAC to the Pi, then edit /boot/config.txt to comment out the default audio settings and add three new lines in, then reboot.
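
The /boot/config.txt change looks roughly like this (the exact lines come from JustBoom’s setup guide, so treat this as an illustration rather than gospel):

# Disable the Pi's built-in audio output:
#dtparam=audio=on

# Enable I2S and load the JustBoom DAC overlay:
dtparam=i2s=on
dtoverlay=justboom-dac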

To say that I’m impressed would be an understatement! I didn’t realise just how crappy the audio from the Pi’s built-in headphone jack was until I’d hooked up the new DAC and blasted some music out. I’m not an audiophile and it’s hard to articulate, but I’d compare it most closely to listening to really low-quality MP3s on cheap earbuds versus high-quality MP3s on a proper set of headphones.

If you’re going to be hooking your Pi into a good stereo system, I can’t recommend JustBoom’s DAC HAT enough!

Raspberry Pi project: AirPlay receiver

I bought a Raspberry Pi almost exactly a year ago, intending to eventually replace my Ninja Block and its sometimes-unreliable wireless sensors with hardwired ones (apart from the batteries needing occasional changing, something interferes with the signal on occasion and I just stop receiving updates from the sensor outside for several hours at a time, and then suddenly it starts working again). To do that, I need to physically run a cable from outside under the pergola to inside where the Raspberry Pi will live, and I don’t really want to go drilling holes through the house willy-nilly. I want to eventually get the electrician in to do some recabling so I’m going to get him to do that as well, but until then the Pi was just sitting there collecting dust. I figured I should find something useful to do with it, but having a Linode meant that any sort of generic “have a Linux box handy to run some sort of server on” itch was already well-scratched.

I did a bit of Googling, and discovered Shairport Sync! It lets you use the Raspberry Pi as an AirPlay receiver that you can stream music to from iTunes or iOS devices, a la an Apple TV or AirPort Express. We already have an Apple TV, but it’s plugged into the HDMI port on the Xbox One, which means that to simply stream audio to the stereo we have to have the Xbox One, TV, and Apple TV all turned on (the Apple TV is plugged into the Xbox’s HDMI input so we can say “Xbox, on” and the Xbox turns itself on as well as the TV and amplifier, then “Xbox, watch TV” and it goes to the Apple TV; it works very nicely but is a bit of overkill when all you want to do is listen to music in the lounge room).

Installing Shairport Sync was quite straightforward: I pretty much just followed the instructions in the readme, then connected a 3.5mm-to-RCA cable from the headphone jack on the Raspberry Pi to the RCA input on the stereo. It’s mentioned in the readme, but this issue contains details on how to use a newer audio driver for the Pi that significantly improves the audio output quality.
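
From memory the build itself looks roughly like the following; the readme lists the exact packages to install beforehand and is the authoritative source, so treat this as an outline only:

$ git clone https://github.com/mikebrady/shairport-sync.git
$ cd shairport-sync
$ autoreconf -i -f
$ ./configure --with-alsa --with-avahi --with-ssl=openssl --with-metadata --with-systemd
$ make
$ sudo make install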

The only stumbling block I ran into was the audio output being extremely quiet. Configuring audio in Linux is still an awful mess, but after a whole lot of googling I discovered the “alsamixer” tool (thanks to this blog post), which gives a “graphical” interface for setting the sound volume, and it turned out the output volume was only at 40%! I cranked it up to 100%, and while it’s still a bit quieter than what the Apple TV outputs, it only needs a small bump on the volume dial to compensate; there’s apparently no amplifier or anything on the Raspberry Pi, it’s straight line-level output. The quality isn’t quite as good as going via the Apple TV, but it gets the job done! I might eventually get a USB DAC or amplifier, but this works fine for the time being.

On macOS it’s possible to set the system audio output to an AirPlay device, so you can be watching a video but outputting the audio to AirPlay, and the system keeps the video and audio properly in sync. It works extremely well, but the problem we found with having the Apple TV hooked up to the Xbox One’s HDMI input is that there’s a small amount of lag from the connection. When the audio and video are both coming from the Apple TV there’s no problem, but watching video on a laptop while outputting the sound to the Apple TV meant that the audio was just slightly out of sync from the video. Having the Raspberry Pi as the AirPlay receiver solves that problem too!

UPDATE: Two further additions to this post. Firstly, and most importantly, make sure you have a 5-volt, 2.5-amp power supply for the Raspberry Pi. I’ve been running it off a spare iPhone charger which is 5V but only 1A, and the Pi will randomly reboot under load because it can’t draw enough power from the power supply.

Secondly, the volume changes done with the “alsamixer” tool are not saved between reboots. Once you’ve set the volume to your preferred level, you need to run “sudo alsactl store” to persist it (this was actually mentioned in the blog post I linked to above, but I managed to miss it).
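
For reference, the two commands involved are:

$ alsamixer            # interactive "graphical" mixer; raise the output volume here
$ sudo alsactl store   # persist the mixer levels so they survive a reboot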

New job!

No, I haven’t left Atlassian, but come Monday I’m starting in a new role!

We have the concept of a role called a “Shield” that’s essentially support but for helping our own internal developers and users (as opposed to external customers), and the idea is that it’s the first point of contact for developers who need help or have questions about the particular service/platform/whatever that the Shield is supporting, as well as being able to step back and look at the bigger picture in terms of pain points that those developers run into and what sort of things could be done to minimise that. The name “shield” comes from the fact that you’re essentially shielding the rest of the developers on the team from the distractions that come from other people constantly contacting them throughout the day, and letting them get on with what they do best (actual coding and improvements to the product). I’ll also have the opportunity to actually do some coding and make improvements too, though. \o/

The team I’m joining runs our internal microservices platform that an increasing number of our applications are being run on, and though the microservices themselves can be written in pretty much any language you’d like (it’s all Docker-based), the platform itself is written in Node.js, which ties in rather nicely with all the learning I’ve been doing on it over the past almost eighteen months.

I’ve been doing external customer-facing support for over thirteen and a half years so this is going to be a lovely change! It’s going to be really weird starting anew where I know next to nothing about the inner workings of the thing I’m supporting though. 😛 I expect my brain is going to be dribbling out my ears come the end of next week.

Back to Tasmania

We went back to Tasmania again last week, and it was pretty great!

Where last time we stayed in Hobart for the whole trip, this time we drove up to Bicheno first, which is about a two and a half hour drive north of Hobart. The accommodation itself (the “Diamond Island Retreat”) was not great: the house was built in probably the 1970s and had clearly had next to nothing done with it since. The kitchen was terrible, the two frying pans were both quite burnt and scratched up, and there was zero internet access (at least in terms of wifi; thankfully there was plenty of 4G reception). It was completely clean and tidy, at least.

That being said, the location was amazing. This was the view from the back deck –

Diamond Island

You could walk down the paddock and down to the beach, which had some of the whitest sand I’ve seen. The first sunset was pretty epic as well.

Sunset

Reflections

Walking

More patterns

They do “penguin tours” right near where we were staying; there’s a whole section of land that’s off-limits to the public, and lots of penguins live and breed there. We went during the decidedly off-season and so only saw a couple of penguins, but one of them waddled its way up the beach and right past us to its burrow! The other penguins we could only see in the distance down on the rocks near the beach. The tours are done after the sun has set, and the guides have special torches that emit really yellow light so as not to hurt the penguins’ eyes. During the breeding season you can apparently see upwards of a hundred penguins all coming ashore to feed their chicks.

There are a few other things to do around Bicheno as well; one is Freycinet National Park, which has some epic hiking trails through it (neither Kristina nor I are hikers, so we opted to just see what we could reach by car).

Untitled

Untitled

Cape Tourville Lighthouse

Then right up the road from where we were staying is Douglas-Apsley National Park, which is the same deal as Freycinet with the hiking, and requires a good couple of kilometres of dirt road to get to the carpark.

Untitled

Untitled

Untitled

Untitled

Untitled

(There are a few more photos from each in the photoset).

The last bit of Bicheno we saw was East Coast Natureworld, a big wildlife sanctuary and conservation area.

Esther the wombat

Emu

Lazing

Ostrich

Don’t ask me why there was an ostrich there, I don’t know. 😛 The baby wombat at the top is named Esther, and she was just sitting there in the keeper’s arms dozing while the keeper was talking. They also do conservation and breeding for Tasmanian devils there, and we got to see one of them being fed which was pretty neat!

Feeding Dennis the Tasmanian Devil

Nom nom nom

After that, we drove back down to Hobart and spent the rest of the trip just wandering around some more.

The side path

190

Up the hill

Mirror selfie

Lit from below

Aurora Australis

Grafitti

Happy doggo grafitti

Under construction

Docked

The huge orange ship is an icebreaker

Aurora Australis is an Australian icebreaker. Built by Carrington Slipways and launched in 1989, the vessel is owned by P&O Maritime Services, but is regularly chartered by the Australian Antarctic Division (AAD) for research cruises in Antarctic waters and to support Australian bases in Antarctica.

And it’s quite an impressive sight in person!

We also went up to Mount Nelson, which is the next-highest mountain in Hobart, but unfortunately they were doing hazard-reduction burns (basically controlled bushfires) so it was really smoky and you mostly couldn’t see anything. 🙁

Despite that, all in all it was an excellent trip.

iPhoneography

Both Kristina and I upgraded to the iPhone 7 last month. I’d heard the camera was good, but I took it out for a spin last week when I went on a lunchtime photo walk with some co-workers, and man, I can see why the point-and-shoot market is dying! I processed all these photos in Lightroom on my iMac so it wasn’t solely done on the iPhone, but even so, I’m incredibly impressed.

It’s not going to replace a full DSLR setup in low-light or shallow depth-of-field situations, but where I’d be wandering around during the day taking photos at f/8 anyway…

Untitled

Untitled

Untitled

Untitled

Untitled

Untitled

Untitled

Untitled

Untitled

A year of Node.js

Today marks one year exactly since switching my website from Perl to Javascript/Node.js! I posted back in March about having made the switch, but at that point my “production” website was still running on Perl. I switched over full-time to Node.js shortly after that post.

From the very first commit to the latest one:


$ git diff --stat 030430d 6b7c737
[...]
177 files changed, 11313 insertions(+), 2110 deletions(-)

Looking back on it, I’ve learnt a hell of a lot in that one single year! I have—

  • Written a HipChat add-on that hooks into my Ninja Block data (note the temperature in the right-hand column as well as the slash-commands; the button in the right-hand column can be clicked on to view the indoor and outdoor temperatures and the extremes for the day)
  • Refactored almost all of the code into a significantly more functional style, which has the bonus of making it a hell of a lot easier to read
  • Moved from callbacks to Promises, which also massively simplified things (see the progression of part of my Flickr– and HipChat-related code, and the sketch after this list)
  • Completely overhauled my database schema to accommodate the day I eventually replace my Ninja Block with my Raspberry Pi (the Ninja Block is still running though, so I needed a “translation layer” to take the data in the format that the Ninja Block sends and convert it to what can be inserted into the new database structure)
  • Added secure, signed, HTTP-only cookies when changing site settings
  • Included functionality to replace my old Twitter image hosting script, and also added a nice front-end to it to browse through old images
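
To make the callbacks-to-Promises point a bit more concrete, here’s a rough sketch of the kind of change involved (a hypothetical settings-file reader, not my actual code):

const fs = require('fs');

// Before: callback style, where error handling and the happy path are interleaved.
function readSettingsWithCallback(callback) {
    fs.readFile('settings.json', 'utf8', (err, contents) => {
        if (err) {
            return callback(err);
        }
        callback(null, JSON.parse(contents));
    });
}

// After: Promise style, where callers chain .then()/.catch() and errors flow to one place.
function readSettingsWithPromise() {
    return new Promise((resolve, reject) => {
        fs.readFile('settings.json', 'utf8', (err, contents) => {
            if (err) {
                return reject(err);
            }
            resolve(JSON.parse(contents));
        });
    });
}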

Along with all that, I’ve been reading a lot of software engineering books, which have helped a great deal with the refactoring I mentioned above (there was a lot of “Oh god, this code is actually quite awful” when going back through it with fresh eyes after reading them): Clean Code by Robert C. Martin, Code Complete by Steve McConnell, and The Art of Readable Code by Dustin Boswell and Trevor Foucher.

I have a nice backlog in JIRA of new things I want to do in future, so I’m very interested to revisit this in another year and see what’s changed!

Farewell Dreamhost

After 12 years of service, I’m shutting my Dreamhost account down (for those unaware, Dreamhost is a website and email hosting service).

My very first—extremely shitty—websites were hosted on whichever ISP we happened to be using at the time—Spin.net.au, Ozemail, Optus—with an extremely professional-looking URL along the lines of domain.com.au/~username. I registered virtualwolf.org at some point around 2001-2002 and had it hosted for free on a friend’s server for a few years, but in 2005 he shut it down so I had to go find some proper hosting, and that hosting was Dreamhost.

The biggest thing I found useful as I was dabbling in programming was that Dreamhost offered PHP and MySQL, so I was able to create dynamic sites rather than just static HTML. Of course, looking back at the code now is horrifying, especially the number of SQL injection vulnerabilities I had peppered my sites with.

Around the start of 2011, I started using source control—Subversion initially—and finally had a proper historical record of my code. I used PHP for the first year or so of it, then ended up outgrowing that and switched to a Perl web framework called Mojolicious. The only option for running a long-lived process on Dreamhost was to use FastCGI, which I never managed to get working with Mojolicious, but fortunately Mojolicious could also run as a regular CGI script so I was still able to use it with Dreamhost, albeit not at great speed.

At the same time I started using Subversion, I also signed up with Linode who offer an entire Linux virtual machine with which you can do almost anything you’d like as you have full root access. I originally used it mostly to run JIRA so I could keep track of what I wanted to do with my website and have the nifty Subversion/JIRA integration working to see my commits against each JIRA issue. I slowly started using the Linode for more and more things (and switched to Git instead of Subversion as well), until in 2014 I moved my entire website hosting over to the Linode.

At that point the only thing I was using Dreamhost for was hosting Kristina’s website and WordPress blog, and the email for our respective domains. Dreamhost’s email hosting wasn’t always the most reliable, and towards the end of 2015 they had more than their usual share of problems, so we started looking for alternatives. Kristina ended up moving to Gmail and I went with FastMail (who I am extremely happy with and would very highly recommend!), I moved her blog and my previously-LiveJournal-but-now-WordPress blog over to the Linode, and that was that!

Moving my website hosting to the Linode also allowed me to move over to Node.js, and I’ve been going full steam ahead ever since. Since that post I’ve moved over from callbacks to Promises (so much nicer), I wrote myself a HipChat add-on to keep an eye on the temperature that my Ninja Block is reporting, and I moved my dodgy Twitter image upload Perl script functionality into my site and added a nice front-end to it. Even looking back at my code from six months ago to now shows a marked increase in quality and readability.

So in summary, thanks for everything Dreamhost, but I outgrew you. 🙂