Ten years of house!

Today marks ten years since we moved into our house!

This is the second-longest I’ve lived in any one place. When I was born we were living in a very small house, and just before my 10th birthday we moved to where my parents still live to this day. After that, I moved out of home when I was 24, so only another four years here and it’ll be the longest I’ve lived anywhere!

It’s funny looking at the old photos and comparing them to now (apologies for the dreadful quality, these are upscaled from the really tiny real estate photos).

Before
Today
Before
Today

I blogged about the move a few days after we’d first moved in and how we turned the “office” that was the garage back into an actual garage, and since then we have:

We also want to redo the whole backyard and back room there, but that’s a massive project and is going to be a number of years down the track yet.

Phew!

Replacing the hard disk in a PowerBook G3 “Pismo”, and other fun with Mac OS 9


I posted nearly five years ago about my shiny new Power Mac G4 and how much I was enjoying the nostalgia. Unfortunately the power supply in it has since started to die, and the machine will randomly turn itself off after an increasingly short period of time. Additionally, I’d forgotten just how noisy those machines were, and how hot they ran! I’ve bought a replacement power supply for it, but fitting it involves rearranging the output pins from a standard ATX PSU to what the G4 needs, and that’s daunting enough that I still haven’t tackled it. So I decided to go back to the trusty old PowerBook G3; I’ve since gotten a new desk and computer setup that has much more room on it, and having a much more compact machine has been very helpful.

One thing I was a bit concerned about was the longevity of the hard disk in it, so I started investigating the possibility of putting a small SSD into it. Thankfully such a thing is eminently possible by way of a 128GB mSATA SSD and an mSATA to IDE adapter! I followed the iFixit guide — though steps 6 through 11 were entirely unnecessary — and now have a shiny new and nearly entirely silent PowerBook G3 (though it’s disconcerting just how quiet it is for such an old machine… I hadn’t realised how subconsciously used I was to hearing the clicking of the hard disk).

A photo of a black PowerBook G3 sitting on a desk, booted to the Mac OS 9 desktop. The machine is big and chunky, but also has subtle curves to it, and the trackpad is HILARIOUSLY tiny compared to modern Macs.

I even had the original install discs from the year 2000 when mum first bought this machine, and they worked perfectly (though a few years ago I’d had to replace the original DVD drive with a slot-loading one because it had died and stopped reading discs entirely).

Once I had it up and running, the next sticking point was actually getting files onto it. As I mentioned in my previous post, Macintosh Repository has a whole ton of old software, and if you load it up with a web browser from within Mac OS 9 it’ll load without HTTPS, but even so it’s pretty slow. Sometimes it’s nicer just to do all the searching and downloading from a fast modern machine and then transfer the resulting files over.

Mac OS 9 uses AFP for sharing files, and the AFP server that used to be built into Mac OS X was removed a few versions ago. Fortunately there’s an open-source implementation called Netatalk, and some kindly soul packed it all up into a Docker container.

I also stumbled across a project called Webone a while ago, which acts essentially as an SSL-stripping proxy that you run on a modern machine and point your old machine’s web browser to for its proxy setting. Old browsers are utterly unable to do anything with the modern web thanks to newer versions of encryption in HTTPS, but this lets you at least somewhat manage to view websites, even if they often don’t actually render properly.

Both Netatalk and Webone required a bit of configuration, and rather than setting them up and then forgetting how I did so, I’ve made a GitHub repository called Mac OS 9 Toolbox with docker-compose.yml files and setup instructions for both projects, plus a README so future-me knows what I’ve done and why. 😛 In particular, getting the Mac OS 9 machine write access to the machine running Netatalk was tricky.

I’ve also included a couple of other things in there, and will continue to expand on it as I go. One is how to convert the PICT-format screenshots from Mac OS 9 into PNG, since basically nothing will read PICTs anymore. It also includes a Mastodon client called Macstodon:

A screenshot of a multi-pane Mac OS 9 application showing the Mastodon Home and Local Timelines and Notifications at the top, and the details of a selected toot at the bottom.

And also the game Escape Velocity: Override (which I’m very excited to note is getting a modern remaster from the main guy who worked on the original):

A screenshot of a top-down 2D space trading/combat game with quite basic graphics. A planet is in the middle of the screen along with several starships of various sizes.

I mentioned both the Marathon and Myth games in my previous post, but those actually run quite happily on modern hardware since Bungie was nice enough to open-source them many years ago. Marathon lives on with Aleph One, and Myth via Project Magma.

More fun with temperature sensors: ESP32 microcontrollers and MicroPython


I’ve blogged previously about our temperature/humidity sensor setup and how the sensors are attached to my Raspberry Pis, and they’ve been absolutely rock-solid in the three-and-a-half years since then. A few months ago a colleague at work mentioned doing some stuff with an ESP32 microcontroller, and just recently I decided to actually look up what that was and what one can do with it, because it sounded like it might be a fun new project to play with!

From Wikipedia: ESP32 is a series of low-cost, low-power system on a chip microcontrollers with integrated Wi-Fi and dual-mode Bluetooth.

So it’s essentially a tiny single-purpose computer that you write code for and then flash onto the board, rather than like the Raspberry Pi, which has an entire Linux OS running on it. It runs at a blazing-fast 240MHz and has 320KB of RAM. The biggest draw for me was the built-in wifi, which means I could do networked stuff easily. There’s a ton of different boards and options and it was all a bit overwhelming, but I ended up getting two of Adafruit’s HUZZAH32s, which come with the headers already soldered on for attaching the temperature sensors we have. Additionally, they have 520KB of RAM and 4MB of storage.

Next up, I needed to find out how to actually program the thing. Ordinarily you’d write in C like with an Arduino, and I wasn’t too keen on that, but it turns out there’s a distribution of Python called MicroPython that’s written explicitly for embedded microcontrollers like the ESP32. I’ve never really done much with Python before, because the utter tyre fire that is its dependency/environment management always put me off (this xkcd comic is extremely relevant). However, with MicroPython on the ESP32 I wouldn’t have to deal with any of that; I’d just write the Python and upload it to the board! Additionally, it turns out MicroPython has built-in support for the DHT22 temperature/humidity sensor that I’ve already been using with the Raspberry Pis. Score!

There was a lot of searching over many different websites trying to find how to get all this going, so I’m including it all here in the hopes that maybe it’ll help somebody else in future.

Installing MicroPython

At least on macOS, first you need to install the USB to UART driver or your ESP32 won’t even be recognised. Grab it from Silicon Labs’ website and get it installed.

Once that’s done, follow the Getting Started page on the MicroPython website to flash the ESP32 with MicroPython, substituting /dev/ttyUSB0 in the commands for /dev/tty.SLAB_USBtoUART.

Using MicroPython

With MicroPython, there are two files that are always executed when the board starts up: boot.py, which is run once at boot time and is generally where you’d put your connect-to-the-wifi-network code, and main.py, which runs after boot.py and will generally be the entry point to your code. To get these files onto the board you can use a command-line tool called ampy, but it’s a bit clunky and no longer supported.
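As a rough illustration, a minimal boot.py along those lines might look like this (the SSID and password are placeholders, and the ImportError guard is only there so the file can also be loaded off-device, since MicroPython’s `network` module exists only on the board):

```python
# boot.py: a minimal sketch of the connect-to-the-wifi boot script.
# SSID and password are placeholders.
try:
    import network
except ImportError:
    network = None  # not running under MicroPython

def connect(ssid="your-ssid", password="your-password"):
    """Bring the wifi up in station mode and block until it's connected."""
    if network is None:
        return None  # off-device: nothing to do
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    if not wlan.isconnected():
        wlan.connect(ssid, password)
        while not wlan.isconnected():
            pass  # a real boot.py might want a timeout here instead of spinning
    return wlan.ifconfig()

connect()
```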

However, there is a better way!

Setting up the development environment

There are two additional tools that make writing your Python code in Visual Studio Code and uploading to the ESP32 an absolute breeze.

The first one is micropy-cli, which is a command-line tool to generate the skeleton of a VSCode project and set it up for full autocompletion and Intellisense of your MicroPython code. Make sure you add the ESP32 stubs first before creating a new micropy project.

The second is a VSCode extension called Pymakr. It gives you a terminal to connect directly to the board and run commands and read output, and also gives you a one-click button to upload your fresh code, and it’s smart enough not to re-upload files that haven’t changed.

There were a couple of issues I ran into when trying to get Pymakr to recognise the ESP32 though. To fix them, bring up the VSCode command palette with Cmd-Shift-P and find “Pymakr > Global Settings”. Update the address field from the default IP address to /dev/tty.SLAB_USBtoUART, and edit the autoconnect_comport_manufacturers array to add Silicon Labs.

Replacing the Raspberry Pis with ESP32s

After I had all of that set up and working, it was time to start coding! As I mentioned earlier, I’ve not really done any Python before, so it was quite the learning experience. It was a good few weeks of coding and learning and iterating, but in the end I fully replicated my Pi Sensor Reader setup with the ESP32s, with some additional bits besides.

One of the things my existing Pi Sensor Reader setup did was run a local webserver so I could periodically hit the Pi and display the data elsewhere. Under Node.js this is extremely easy to accomplish with Express, but with MicroPython the options were more limited. There are a number of little web frameworks that people have written for it, but they all seemed like overkill.

I decided to just use raw sockets and write my own, though one thing I didn’t appreciate until this point was how much Node.js’s everything-is-asynchronous-and-non-blocking model makes this kind of thing easy: you don’t have to worry about a long-running function causing everything else to grind to a halt while it finishes. Python has a thing called asyncio, but I was struggling to get my head around how to use it for the webserver part of things until I stumbled across this extremely helpful repository where someone had shown an example of how to do exactly that! (I even ended up making a pull request to fix an issue I discovered with it, which I’m pretty stoked about.)

One of the things I most wanted was some sort of log file accessible in case of errors. With the Raspberry Pi I can just SSH in and check the Docker logs, but once the ESP32s are plugged into power and running, there’s no easy equivalent. I ended up writing the webserver with several endpoints to read the log, clear it, reset the board, and view and clear the queue of failed updates.
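As a rough sketch of the shape this takes (it runs under plain CPython, and MicroPython’s asyncio is close enough that the same structure works on the board; `read_sensor()`, `LOG`, and the endpoint paths here are illustrative stand-ins, not the names from my actual repository):

```python
import asyncio

LOG = []  # in-memory stand-in for the real log

def read_sensor():
    # Stand-in for reading the DHT22 on the real board
    return {"temperature": 21.5, "humidity": 48.0}

def handle_request(path):
    """Map a request path to a full HTTP response string."""
    if path == "/":
        data = read_sensor()
        body = '{"temperature": %s, "humidity": %s}' % (
            data["temperature"], data["humidity"])
        return "HTTP/1.0 200 OK\r\nContent-Type: application/json\r\n\r\n" + body
    if path == "/log":
        return "HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n" + "\n".join(LOG)
    if path == "/log/clear":
        LOG.clear()
        return "HTTP/1.0 200 OK\r\n\r\ncleared"
    return "HTTP/1.0 404 Not Found\r\n\r\n"

async def serve(reader, writer):
    # The request line looks like "GET /path HTTP/1.1"
    request_line = await reader.readline()
    path = request_line.split()[1].decode()
    # Drain the remaining headers so the client isn't cut off mid-send
    while await reader.readline() not in (b"\r\n", b""):
        pass
    writer.write(handle_request(path).encode())
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(serve, "0.0.0.0", 80)
    await server.serve_forever()

# On the board, boot.py connects to the wifi and then main.py would run:
# asyncio.run(main())
```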

The whole thing has been uploaded to GitHub with a proper README of how it works, and they’ve been running connected to the actual indoor and outdoor temperature sensors and posting data to my website for just under a week now, and it’s been absolutely flawless!

Powering our house with a Tesla Powerwall 2 battery

I posted back in March about our shiny new solar panels and efforts to reduce our power usage, and as of two weeks ago our net electricity grid power usage is next to zero thanks to a fancy new Tesla Powerwall 2 battery!

A photo of a white Tesla Powerwall 2 battery and Backup Gateway mounted against a red brick wall inside our garage.
A side-on view of a white Tesla Powerwall 2 battery mounted against a red brick wall.

We originally weren’t planning on getting a battery back when we got our solar panels — and to be honest they still don’t make financial sense in terms of a return on investment — but we had nine months of power usage data, and I could see that for the most part the amount of energy the Powerwall can store would be enough for us to avoid drawing nearly anything whatsoever from the grid*.

* Technically this isn’t strictly true, keep reading to see why.

My thinking was, we’re producing stonking amounts of solar power and are feeding it back to the grid at 7c/kWh, but have to buy power from the grid after the sun goes down at 21c/kWh. Why not store as much as possible of that for use during the night?
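The back-of-the-envelope arithmetic is simple: every kilowatt-hour stored and used overnight is worth the purchase price minus the forgone feed-in tariff. A quick sketch (the 13.5kWh figure in the comment is the Powerwall 2’s nominal usable capacity, not a number from our own data):

```python
FEED_IN_CENTS_PER_KWH = 7
PURCHASE_CENTS_PER_KWH = 21

def daily_saving_cents(kwh_shifted):
    """Cents saved per day by self-consuming stored solar instead of exporting it."""
    return kwh_shifted * (PURCHASE_CENTS_PER_KWH - FEED_IN_CENTS_PER_KWH)

# Shifting a full 13.5kWh Powerwall's worth is 13.5 * 14 = 189 cents a day.
```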

The installation was done by the same people who did the solar panels, Penrith Solar Centre, and as before, I cannot recommend them highly enough. Everything was done amazingly neatly and tidily, it all works a treat, and they fully cleaned up after themselves when they were done.

We have 3-phase power and the solar panels are connected across all three phases (⅓ of the panels on each), while the Powerwall has only a single-phase inverter and so is connected to just one phase. The way it handles everything is quite clever, though: even though it can only discharge on one phase, it has current transformers attached to the other two phases so it can see how much is flowing through them, and it discharges on its own phase an amount equal to the power being drawn on the other two (up to its maximum output of 5kW, anyway) to balance out what’s being used. The end result is that the electricity company sees us feeding in the same amount as we’re drawing, and thanks to the magic of net-metering it all balances out to next to zero! This page on Solar Quotes is a good explanation of how it works.
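To make that concrete, here’s a toy model of the balancing (the function and figures are mine, with the battery assumed to be on phase A):

```python
POWERWALL_MAX_KW = 5.0

def net_grid_flows(phase_loads_kw, battery_limit_kw=POWERWALL_MAX_KW):
    """Given the loads on phases A, B, and C (battery on phase A), return the
    battery discharge and the per-phase grid flows. Positive flow means
    drawing from the grid, negative means feeding in."""
    a, b, c = phase_loads_kw
    # The battery sees all three phases via its current transformers and
    # discharges to match the total draw, capped at its maximum output.
    discharge = min(a + b + c, battery_limit_kw)
    # Its whole output appears on phase A, so A feeds in the surplus while
    # B and C still draw from the grid; net-metered, they cancel out.
    return discharge, (a - discharge, b, c)
```

When the total load is under 5kW, the per-phase flows sum to zero, which is exactly what the net meter sees.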

The other interesting side-effect is that when the sun is shining and the battery is charging, it’s actually pulling power from the grid to charge itself, but only as much as we’re producing from the solar panels. Because the Enphase monitoring system doesn’t know about the battery, it gives us some amusing-looking graphs whereby the morning shows exactly the same amount of consumption as production up until the battery is fully-charged!

We also have the Powerwall’s “Backup Gateway”, which is the smaller white box in the photos at the top of this post. In the event of a blackout, it’ll instantaneously switch over to powering us from the battery, so it’s essentially a UPS for the house! Again, 3-phase complicates this slightly and the Powerwall’s single-phase inverter means that we can only have a single phase backed up, but the lights and all the powerpoints in the house (which includes the fridge) are connected to the backed-up phase. The only things that aren’t backed up are the hot water system, air conditioning, oven, and stove, all of which draw stupendous amounts of power and will quickly drain a battery anyway.

We also can’t charge the battery off the solar panels during a blackout… it is possible to set it up like that, but there needs to be a backup power line running from a third of the solar panels back to the battery, which we didn’t get installed when we had the panels put in in February. There was an “Are you planning on getting a battery in the next six months?” question, which we said no to. 😛 If we’d said yes, they would have installed the backup line at the time; it’s still possible to install it now, but at the cost of several thousand dollars, because they need to come out and pull the panels up and physically add the wiring. Blackouts are not remotely a concern here anyway, so that’s fine.

In the post back in March, I included three screenshots of the heatmap of our power usage, and the post-solar-installation one had the middle of the day completely black. Spot the point in the graph where we had the battery installed!

We ran out of battery power on the 6th of November because the previous day had been extremely dark and cloudy and we weren’t able to fully charge the battery from the solar panels that day (it was cloudy enough that almost every scrap of solar power we generated went to just powering the house, with next to nothing left over to put into the battery), and the 16th and 17th were both days where it was hot enough that we had the aircon running the whole evening after the sun went down and all night as well.

Powershop’s average daily use graph is pretty funny now as well.

And even more so when you look all the way back to when we first had the smart meter installed, pre-solar!

For monitoring the Powerwall itself, you use Tesla’s very slick app where you can see the power flow in real time. When the battery is actively charging or discharging, there’s an additional line going to or from the Powerwall icon to wherever it’s charging or discharging to or from.

You can’t tell from a screenshot of course, but the dots on the lines connecting the Solar to the Home and Grid icons animate in the direction that the power is flowing.

It also includes some historical graph data, but unfortunately it’s not quite as nice as Enphase’s, and there isn’t even a website; you can only view it in the app. There’s a website called PVOutput that you can send your solar data to, and we’ve been doing that via Enphase since we got the solar panels installed, but the Powerwall also has its own local API you can hit to scrape the power usage and flows, and the battery charge percentage. I originally found this Python script to do exactly that, but a) I always struggle to get anything related to Python working, and b) the SQLite database that it saves its data into kept intermittently getting corrupted, and the only way I’d know about it was by checking PVOutput and seeing that we hadn’t had any updates for hours.

So, I wrote my own in TypeScript! It saves the data into PostgreSQL, it’s all self-contained in a Docker container, and so far it’s been working a treat. The graphs live here; to see the power consumption and grid and battery flow details, click on the right-most little square underneath the “Prev Day” and “Next Day” links under the graph. Eventually I’m going to send all this data to my website so I can store it there, but for the moment PVOutput is working well.

It also won’t shock anybody to know that I updated my little Raspberry Pi temperature/power display to also include the battery charge and whether it’s charging or discharging (charging has a green upwards arrow next to it, discharging has a red downwards arrow).

My only complaint with the local API is that it randomly becomes unavailable for periods of time, sometimes up to an hour. I have no idea why, but when this happens the data in the Tesla iPhone app itself is still updated properly. It’s not a big deal, and doesn’t actually affect anything with regard to the battery’s functionality.

Overall, we’re exceedingly happy with our purchase, and it’s definitely looking like batteries in general are going to be a significant part of the electrical grid as we move to higher and higher percentages of renewables!

Visualising Git repository histories with Gource and ffmpeg

First, a disclaimer: this is entirely based on a blog post from a co-worker on our internal Confluence instance and I didn’t come up with any of it. 😛

Gource is an extremely cool tool for visualising the history of a Git repository (and other source control tools) via commits, and it builds up an animated tree view. When combined with ffmpeg you can generate a video of that history!

On a Mac, install Gource and ffmpeg with Homebrew:

$ brew install gource ffmpeg

Then cd into the repository you’re visualising, and let ‘er rip!

$ gource -1280x720 \
    --stop-at-end \
    --seconds-per-day 0.2 \
    -a 1 \
    -o - \
    | ffmpeg -y \
    -r 60 \
    -f image2pipe \
    -vcodec ppm \
    -i - \
    -vcodec libx264 \
    -preset fast \
    -crf 18 \
    -threads 0 \
    -bf 0 \
    -pix_fmt yuv420p \
    -movflags \
    +faststart \
    output.mp4

Phew! The main options to fiddle with are the resolution from gource (1280x720 in this case), and the crf setting from ffmpeg (increase the number to decrease the quality and make a smaller file, or lower the number to increase the quality and make a larger file).

I ran it over my original website repository that started its life out as PHP in 2011 and was moved to Perl:

And then my Javascript website that I started in 2016 and subsequently moved to TypeScript:

How cool is that?!

I also ran it over the main codebase at work that powers the internal PaaS I support, and it’s even cooler because the history goes back to 2014 and there’s a ton of people working on it at any given time.

Fixing a Guitar Hero World Tour/Guitar Hero 5 guitar strum bar

Kristina and I had a date night last night in which we ate trashy food and then took the Xbox 360 out of storage and fired up Guitar Hero: Warriors of Rock. It was an excellent time except that my Guitar Hero World Tour guitar had stopped registering downward strums, and only upwards strums worked.

I figured I’d pull it apart today and see what was up, and thanks to this guide I figured it out, and am documenting it here for posterity (my problem wasn’t one of the ones described in that guide, but it was very handy to see how to disassemble the thing in the first place).

Tools needed for disassembly

  • Philips-head #0 and #1 screwdriver
  • Torx T9 screwdriver

Process

Firstly the neck needs to be removed, and the “Lock” button at the back towards the base of the guitar set to its unlocked position.

Next, the faceplate needs to be removed. This can be done by just getting a fingernail or a flathead screwdriver underneath either of the top bits of the body, pointed to with an arrow here, and gently prying it away from around the edges.

After that, there are twelve Torx T9 screws to remove, circled in red, and another four Philips-head #0 ones, marked in green.

Once they’re all out, you can gently separate the front of the guitar where all the electronics live from the back of it.

Next, there are four Philips-head #1 screws to remove to get the circuit board that contains the actual clicky switches away from the strum bar itself. Leave the middle two alone, as they attach the guides for the springs of the strum bar.

After this, it’s a bit of a choose-your-own-adventure, as what you do next really depends on what’s wrong with the strum bar. The switches are on the underside of the circuit board above; it’s definitely worth making sure they both click nice and solidly when you press on them directly. If they don’t, it’s apparently possible to source exact replacements (“275-016 SPDT Submini Lever Switch 5A at 125/250VAC”) and fix it with a bit of soldering, but thankfully this wasn’t necessary in my case.

In the next image, undoing the Philips-head #1 screws circled in blue will allow you to take the strum bar assembly itself out and re-lubricate it (don’t use WD40, use actual proper lubricating grease) so it rocks back and forth a bit more smoothly. Another improvement you can make is adding a couple of layers of electrical tape to the areas I’ve circled in red. They’re where the strum bar physically hits the inside of the case, and the electrical tape can dampen the noise a bit.

What the strum bar problem ultimately ended up being in my case was that the middle indented section, where the switch rests against the strum bar to register a downstroke, had actually worn away and could no longer press the switch in far enough to click. My solution, circled in green, was to chop a tiny piece of plastic from a Warhammer 40,000 miniature sprue and glue it—with the same plastic glue I use for assembling plastic miniatures—to the strum bar. Then I reassembled everything and it’s as good as new!

A bread update!

I blogged back in October last year about how I’d started making my own bread from scratch, and I haven’t stopped! I still haven’t graduated to sourdough yet (my one attempt was poorly timed in retrospect: it was right in the middle of everyone panic-buying flour around when coronavirus was really becoming a thing, and I didn’t want to waste a whole bunch of flour), but I’ve settled into a delicious routine with the “Saturday Overnight White Bread” recipe. I’m using 20% whole wheat flour, have bumped the water from 780mL to 790mL, and use a full quarter teaspoon of yeast (the recipe says “a scant quarter”). It’s at the point where I don’t even need to look at the recipe; I’ve got the timings and measurements totally memorised.

As well, Kristina bought me a pair of proofing baskets for Christmas last year, and MAN do they make a difference! They help keep the shape of the loaf and also wick moisture from the surface of it, which makes the crust come out even better. Feast your eyes on these beauties.

A photo of two golden brown loaves of bread sitting on a cooling rack.
A photo of two golden brown loaves of bread sitting on a cooling rack.
A photo of two golden brown loaves of bread sitting on a cooling rack.
A photo of two brown loaves of bread sitting on a cooling rack.

My associated lunches have been excellent as well, they’re primarily either cheese and tomato toasties with a recent addition of capsicum, or mushrooms cooked in a frying pan with garlic, sage, and rosemary, sitting on top of thickly-sliced tomato (also cooked in a frying pan) with crumbled up sharp cheddar cheese on top.

A photo of two cheese and tomato toasties sitting on a plate, with diced capsicum on top.
A photo of two slices of homemade bread sitting on a plate, topped with thick slices of tomato with mushrooms on them, and small pieces of crumbled up cheddar cheese atop it all.

Beanie has gotten to know the exact sound of a knife cutting through bread, because whenever I start he’s over in the kitchen like a shot, waiting for crumb fallout and also the tiny useless end piece we always give him.

Apple Watch, one year on: fitter and healthier than ever

I blogged last year about getting an Apple Watch and the 23rd of this month marked a full year of owning it! I’m extremely pleased to say that I’m now fitter, healthier, and stronger than I’ve ever been in my entire life. 💪🏻 I’ve also closed all three activity rings on every single day since I bought the Watch!

I mentioned in that post that I was down to 70kg, after that (but prior to COVID) I hit 69kg, which I haven’t been since I was in my very early 20s. After we started working from home full-time in mid-March, I settled into a really good exercise routine: during the week I finish work around 5:30pm, then on every day but Monday and Friday I’ll go for a burn on the elliptical while watching an episode of something on Netflix (Saturdays and Sundays are also elliptical time, but at whatever point I feel like doing it during the day). It’s amazing how quickly time passes when you’re not concentrating on the fact that you’re exercising. I’d started watching Star Trek: The Next Generation prior to COVID, but went through the episodes much more quickly after it, and finally finished it last month.

Fun side note, I thought I’d watched way more episodes than it turns out I have… as I was making my way through the episodes, I’d remembered seeing the pilot episode before, then a random smattering of maaaaybe five to ten episodes in the middle, and then the series finale, but that was it. It was pretty fun to watch the progression of the series, and I think at this point I enjoy Star Trek more than I do Star Wars. After I finished TNG, I took a brief Trek break to watch Warrior Nun (IMDB’s blurb says “After waking up in a morgue, an orphaned teen discovers she now possesses superpowers as the chosen Halo Bearer for a secret sect of demon-hunting nuns.” HIGHLY recommended, I loved it), and have now started watching Star Trek: Deep Space Nine. I’m only a few episodes in, but again, I’ve definitely seen the first episode before but none of the other four I’ve watched thus far.

The elliptical has its own calorie counter; I have no idea how accurate it is, because it doesn’t take into account your weight, and the Apple Watch doesn’t take into account the fact that I’ve got the incline on the elliptical at 14 degrees and the resistance set to 14 out of 22, but regardless it’s a hell of a workout. After a typical 45-minute episode of a TV show, the elliptical says I’ve burnt around 650-700 calories, and the Apple Watch says 425-450 calories.

On Mondays and Fridays, instead of using the elliptical I do some weight training. I bought a pair of neoprene-coated 4kg dumbbells and have been doing twelve of each of the exercises from this article, three times each. I really like that Women’s Health Magazine article because it’s not pretentious and just says “Hey if you want some toned arms, do these things.” Pretty much all the ones aimed at men that I found were all of the “RARRRR GET RIPPPPPED BROOOO” which I just… just, no. 😑

As a result of all of this, I’ve actually gained weight and am now 71kg, but it’s all muscle mass, aw yiss. I’ve never actually had properly visible muscles, so this is a new look for me. 😛

On the odd occasion that I’m feeling a bit pooped-out for whatever reason, rather than using the elliptical or doing weight training, I’ll just go for a brisk walk on the treadmill and listen to part of a podcast. (Speaking of podcasts, I’ve basically entirely stopped listening to them except for when I’m occasionally using the treadmill… my commute into the city was my podcast listening time, and with that gone, I just don’t really do it anymore).

My primary use for the Apple Watch is still definitely fitness, but I’ve also really enjoyed the little notification buzz you get on your wrist when you get an iMessage or SMS. It’s totally unobtrusive, and as someone who leaves all of their devices muted at all times, it means I don’t actually miss messages that need a timely reply, but if it’s not important I can just ignore it. The other handy thing I’ve found is using Siri to set custom timers. I don’t have any of the “always listening” Siri stuff enabled on any of my devices, but I have the Watch set to activate Siri when you hold the digital crown in for a couple of seconds. It’s super-handy when I’m timing my bread-making or barbecuing to just say “Set a timer for four minutes”, or whatever non-standard time it is, rather than having to futz around pressing things.

A year on, the Apple Watch is still one of my favourite gadgets and I’m keen to see what sort of additions Apple makes to it in the coming years!

Improvements in photographical skill

Back in June I blogged about the Flickr Memories functionality I’d added to my website, and finished the post off with this sentence:

I’m excited to see what forgotten gems from the past show up, and also being reminded of how terrible I was when I was first starting out taking photos. 

And oh boy, has it delivered on that second part in particular! I make a point of checking both my formerly-Tumblr-posts Memories page (where I post all my random iPhone photos) and the Flickr one each day. In August of both 2008 and 2012 we were in Boston (in 2008 it was the first time I visited Kristina overseas, back when she was still living there, and in 2012 we went back for her birthday), and I’d posted all the trip photos to Flickr.

Looking back at the 2008 photos (and all the photos I’d taken prior to that), they’re very much just happy-snaps… there was zero processing done on them, they were awkwardly-framed and weren’t straightened — crooked horizons ahoy! — and there was no culling, I’d just post them allll. I have a collection created for that trip where you can see the full horror (check out the sheer number of photos in the Mt. Auburn Cemetery album for example… why did I need to post that many?!).

In 2009 I borrowed an old Canon EOS 400D from a co-worker at Apple for a while, which was my first exposure to a proper DSLR and properly-shallow depth of field, as well as shooting in RAW, and doing actual post-processing. I had way too much fun adding heavy vignetting to everything, but I was starting to cull my photos rather than just dumping everything online!

In 2010 we bought a Canon EOS 7D with a 35mm f/2 lens, and my photography and processing skills definitely improved (that link is every single photo I took with the 7D, in order of date posted from oldest to newest).

Going back to Boston, some sample photos from Rockport and Boston Public Library on 2012’s trip show the improvement from 2008. For some reason I was pretty far into too-heavy-handed territory with the post-processing specifically for the Boston trip photos, but you can see the greatly-improved framing and composition.

From there on my skill has been on an upward path, and thankfully I’ve stopped doing the EVERYTHING MUST BE REALLY HIGH CONTRAST thing I was doing for a while there. 😛 For travel photos specifically, the easiest way to see the progression is the “Travel” category of posts I have set up here! Some choice examples:

A while back I’d toyed with the idea of reprocessing my older photos, in particular the 2012 Boston ones, but I realised that way lies madness and never-ending editing, and I wouldn’t have a nice historical record of my photographical endeavours like I do now.

Memories redux: Flickr

I posted back in December that I’d created my own version of Facebook’s “Memories” feature for my formerly-Tumblr-and-now-Mastodon media posts, and even at the time I’d had the thought of doing it for Flickr as well, since that’s where all my Serious Photography goes.

Well, now I have!

It wasn’t quite as straightforward as my Media memories functionality, because there I could just do a single database call, but for Flickr I’m having to make multiple API calls each time. Fortunately two of the search parameters that flickr.photos.search offers are min_taken_date and max_taken_date, so my approach is to run a query for whatever the current day of the year it happens to be for each year going back to 2007—this being when my account was created and when I first started posting photos to Flickr—with the min_taken_date set to 00:00 on that particular day, and max_taken_date set to 23:59 on that same day. It does mean that currently there’s 13 API calls each time the Memories page is loaded and this will increase by one with each year that goes past, but Flickr’s API docs say “If your application stays under 3600 queries per hour across the whole key (which means the aggregate of all the users of your integration), you’ll be fine”. That’s one query every single second for an entire hour, which absolutely isn’t going to be happening, so I ought to remain well under the limit.
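The per-year querying described above can be sketched roughly like this. The helper names and parameter plumbing here are my own for illustration; only the `min_taken_date`/`max_taken_date` parameters and the `flickr.photos.search` method name come from the actual Flickr API:

```python
from datetime import date

def memory_date_ranges(today: date, first_year: int = 2007):
    """One (min_taken_date, max_taken_date) pair per past year,
    covering 00:00 to 23:59 of today's day-of-year.
    (Note: this simple version doesn't handle a Feb 29 'today'.)"""
    ranges = []
    for year in range(first_year, today.year):
        day = today.replace(year=year)
        ranges.append((
            day.strftime("%Y-%m-%d 00:00:00"),
            day.strftime("%Y-%m-%d 23:59:59"),
        ))
    return ranges

def search_params(api_key: str, user_id: str, min_taken: str, max_taken: str):
    """Query parameters for one flickr.photos.search call (one per year)."""
    return {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "user_id": user_id,
        "min_taken_date": min_taken,
        "max_taken_date": max_taken,
        "format": "json",
        "nojsoncallback": 1,
    }
```

For a page load on, say, 11 April 2020, `memory_date_ranges(date(2020, 4, 11))` yields thirteen ranges (2007 through 2019), matching the thirteen API calls mentioned above.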

I’m excited to see what forgotten gems from the past show up, and also being reminded of how terrible I was when I was first starting out taking photos. 😛

And now for something completely different: Anzac biscuits!

A disclaimer up front: I can’t claim any credit whatsoever for this recipe, my co-worker Rachel posted about it at work and I asked if she minded if I re-posted the recipe here!

With Anzac Day just past, another thing from this time of year is Anzac biscuits:

The Anzac biscuit is a sweet biscuit, popular in Australia and New Zealand, made using rolled oats, flour, sugar, butter (or margarine), golden syrup, baking soda, boiling water, and (optionally) desiccated coconut. Anzac biscuits have long been associated with the Australian and New Zealand Army Corps (ANZAC) established in World War I.

As mentioned above, my co-worker Rachel posted her own recipe that’s based as closely as possible on the really early Anzac biscuit recipes, and I made them last night and they’re amazing.

There are two recipes: one is the pre-1920s version without coconut, and the other is with coconut and more sugar. I made the pre-1920s one as Kristina can’t eat coconut. The method is identical, just the ingredients differ.

I should also point out that they should go a lot flatter than in the picture above, but I cheated and melted the butter and golden syrup in the microwave rather than heating them on the stove, so the mixture didn’t stay warm and it didn’t end up spreading.

Ingredients

  • 1 cup plain flour (all purpose flour)
  • 1 cup rolled oats (not steel cut or quick oats)
  • 1 cup sugar (part white sugar, part soft brown sugar)
  • 3/4 cup desiccated coconut
  • 115g salted butter
  • 1 tablespoon golden syrup
  • 1.5 teaspoons bicarbonate of soda
  • 2 tablespoons of boiling water

Pre-1920s recipe without coconut

  • 1 cup plain flour
  • 2 cups rolled oats
  • 1/2 cup sugar
  • 115g salted butter
  • 2 tablespoons golden syrup
  • 1 teaspoon bicarbonate of soda
  • 2 tablespoons of boiling water

Method

  1. Preheat your oven to 170˚C (no fan).
  2. Combine dry ingredients in a bowl.
  3. Melt butter and golden syrup slowly in a small saucepan.
  4. In a jug, dissolve bicarb soda in boiling water.
  5. Pour boiling water and bicarb into the butter and syrup. It will foam up. Add it immediately to the dry ingredients.
  6. Mix everything together just enough to combine. The mixture should be sticky. Don’t let it get cold.
  7. Roll into teaspoon sized balls, and place on a baking sheet, spaced at least 10cm apart – they will spread a lot.
  8. Bake for 8-10 minutes. Your oven time may vary.
  9. Remove from oven, and cool on the baking sheet for a few minutes.
  10. Transfer to wire rack to cool.
  11. Store in an airtight container. They’ll keep for months.

Digital archeology: recovering ClarisWorks drawing files

Three years ago I posted about how I’d gone back and recovered all my old websites I’d published over the years and packed them up into a Docker image, and last year I’d idly mused that I should go back and recover the multitude of websites that I’d designed but never actually uploaded anywhere. I finally got around to doing that over the weekend, and they’re all up on archive.virtualwolf.org! Some are the original HTML source, some are just the Photoshop mockups, but that now contains almost the sum total of every single website I’d created (and there’s a lot of them). The only one missing is the very very first one… The Dire Marsh news updates are from early 1998, but I’d copied most of the layout from the previous site as evidenced by the (broken) visitor counter at the left that says “<number> half-crazed Myth fanatics have visited this site since 21/12/97”.

Prior to building way too many websites, I’d been introduced to the Warhammer 40,000 and Dune universes when I was 13 and had immediately proceeded to totally rip them off (er, get inspired) and write my own little fictional universe along the same lines. This was all in 1996 and very early 1997, and I even still have all the old files sitting in my home folder with the original creation dates and everything, but didn’t have anything that could open them as they were a combination of ancient Microsoft Word writings — old enough that Pages didn’t recognise them — and ClarisWorks drawing documents — ClarisWorks had a vector-based drawing component to it as well as word processing. I ended up going down quite the rabbit hole in getting set up to bring them forwards into a modern readable format, and figured I’d document it here in case it helps anyone in future.

Running Mac OS 9 with SheepShaver

The very first hurdle was getting access to Mac OS 9 to begin with. I originally started out with my Power Mac G4 that I’ve posted about previously but unfortunately it seems like the power supply is on the way out, and it kept shutting down (people have apparently had success resurrecting these machines using ATX power supplies but I haven’t had a chance to look into it yet). Fortunately, there’s a Mac OS 9 emulator called SheepShaver that came to the rescue.

  1. Download the latest SheepShaver and the “SheepShaver folder” zip file from the emaculation forums.
  2. You need an official “Mac OS ROM” file that’s come from a real Mac or been extracted from the installer. Download the full New World ROMs archive from Macintosh Repository, extract it, rename the 1998-07-21 - Mac OS ROM 1.1.rom file to Mac OS ROM and drop it into the SheepShaver folder.
  3. Download the Mac OS 9.0.4 installer image from Macintosh Repository (SheepShaver doesn’t work with anything newer).
  4. Follow the SheepShaver setup guide to install Mac OS 9 and set up a shared directory with your Mac. Notes:
    • It defaults to assigning 16MB of RAM to the created virtual machine, be sure to increase it to something more than 32MB.
    • Disable the “Update hard disk drivers” box in the Options sections of the Mac OS 9 installer or the installer will hang (this is mentioned in the setup guide but I managed to miss it the first time around).
    • When copying files from the shared directory, copy them onto the Macintosh HD inside Mac OS 9 directly, not just the Desktop, or StuffIt Expander will have problems decompressing files.

Recovering ClarisWorks files

This was the bulk of the rabbit hole, and if you’re running macOS 10.15 you’ve got some additional rabbit hole to crawl through, because the software needed to pull the ClarisWorks drawing documents into the modern era, EazyDraw Retro (scroll down to the bottom of the page to find the download link), is 32-bit only, which means it doesn’t run under 10.15, only 10.14 and earlier.

Step 1: Convert ClarisWorks files to AppleWorks 6

  1. Download the archive of QuickTime installers and install QuickTime 4.1.2, which is required to install AppleWorks 6.
  2. Download the AppleWorks 6 installer CD image (it has to be added in SheepShaver’s preferences as a CD-ROM device) and install it.
  3. Open each of the ClarisWorks documents in AppleWorks, you’ll get a prompt saying “This document was created by a previous version of AppleWorks. A copy will be created and ‘[v6.0]’ will be added to the filename”. Click OK and save the copy back onto the shared SheepShaver drive with a .cwk file extension.

Step 2: Install macOS 10.14 inside a virtual machine

This entire step can be skipped if you haven’t upgraded to macOS 10.15 yet as EazyDraw Retro can be run directly.

Installing 10.14 inside a virtual machine requires a bootable disk image of the installer, so that needs to be created first.

  1. Download DosDude1’s Mojave patcher and run it (you’ll likely need to right-click on the application and choose Open because Gatekeeper will complain that the file isn’t signed).
  2. Go into the Tools menu and choose “Download macOS Mojave” to download the installer package, save it into your Downloads folder.
  3. Open Terminal.app and create a bootable Mojave image with the following commands:
    1. hdiutil create -o ~/Downloads/Mojave -size 8g -layout SPUD -fs HFS+J -type SPARSE
    2. hdiutil attach ~/Downloads/Mojave.sparseimage -noverify -mountpoint /Volumes/install_build
    3. sudo ~/Downloads/Install\ macOS\ Mojave.app/Contents/Resources/createinstallmedia --volume /Volumes/install_build
    4. hdiutil detach /Volumes/Install\ macOS\ Mojave
    5. hdiutil convert ~/Downloads/Mojave.sparseimage -format UDTO -o ~/Downloads/Mojave\ Bootable\ Image
    6. mv ~/Downloads/Mojave\ Bootable\ Image.cdr ~/Downloads/Mojave\ Bootable\ Image.iso

Once you’ve got the disk image, fire up your favoured virtual machine software and install Mojave in it.

Step 3: Convert AppleWorks 6 files to a modern format

The final part to this whole saga is the software EazyDraw Retro which can be downloaded from their Support page. It has to be the Retro version because the current one doesn’t support opening AppleWorks documents (I’m guessing whatever library they’re using internally for this is 32-bit-only and can’t be updated to run on Catalina or newer OSes going forwards, so they dropped it in new versions of the software). It can export to a variety of formats, and has its own .eazydraw format that the non-Retro version can open.

Unfortunately EazyDraw isn’t free, but you can get a temporary nine-month license for US$20 (or pay full price for a non-expiring license if you’re going to be using it for anything else except this). It did work an absolute treat though, it was able to import every one of my converted AppleWorks 6 documents and I saved them all out as PDFs. There were a few minor tweaks required to some of the text boxes because the fonts were different between the original ClarisWorks document and the AppleWorks one and there were some overlaps between text and lines, but that was noticeable as soon as I’d opened them in AppleWorks and wasn’t the fault of EazyDraw’s conversions.

Converting Aldus SuperPaint files

There were only two of my illustration files that were done in anything but ClarisWorks, and they were from Aldus SuperPaint. Version 3.5 is available from Macintosh Repository and pleasingly it’s able to export straight to TIFF so I could convert them under current macOS from that straight to PNG. There were some minor tweaks required there as well, but it was otherwise quite straightforward.

Converting Microsoft Word files

All my non-illustration text documents were written with Microsoft Word 5.1 or 6, but the format they use is old enough that Pages under current macOS doesn’t recognise it. I wouldn’t be surprised if the current Word from Office 365 could open them, but I don’t have it so I went the route of downloading Word 6 from Macintosh Repository which can export directly out to RTF. TextEdit under macOS opens them fine and from there I saved them out as PDF.

History preserved!

Following the convoluted process above, I was able to convert all my old files to PDF and have chucked them into the Docker image at archive.virtualwolf.org as well (start at the What, even more rubbish? section), so you can marvel at my terrible fan fiction world-building skills!

I’m not deluding myself into thinking that this is any sort of valuable historical record, but it’s my record and as with the websites, it’s fun to look back on the things I’ve done from the past.

Ten years of Atlassian

Today marks ten years to the day that I started at Atlassian! I blogged (well, LiveJournaled) at the end of the first week back in 2010, but looking back on it, it didn’t quite capture the brain-dribbling-out-my-ears aspect of when I started. Jira was — and still is, really — a complicated beast, and attempting to wrap my head around how all the different schemes interrelate was something else, especially when everything was called a <something> scheme!

I started doing support for Jira Studio at the beginning of 2011 — where we would host the products ourselves, versus what I was doing when I first started, supporting Jira Server running on the customer’s own hardware — was promoted to senior support engineer in 2014, and then left the customer support wing of the company entirely nearly three years ago and started doing support for our internal PaaS (platform as a service)!

I’m still in that same “Shield” role, still doing a good amount of coding on the side, and have been rewriting vast swathes of our internal documentation which has been received extremely positively. (We have very clever developers at work, but writing clear and end-user-focused documentation is not their strong suit. 😛) The coding has been primarily on the internal tool I mentioned in this post — except we’re now using Slack instead of Stride — and there’s been an increasing number of teams adopting it internally, and I’m actually getting feature requests!

Granted I’ve worked at exactly three companies in my entire career, but I honestly can’t imagine being anywhere else. Here’s to another ten years!

HomePod, Docker on Raspberry Pi, and writing Homebridge plugins

Apple announced the HomePod “smart speaker” in 2017, and started shipping them in early 2018. I had zero interest in the smart speaker side of things — I’d never have Google or Amazon’s voice assistants listening to everything I say, and despite trusting Apple a lot more with privacy compared to those two companies, the same goes for Siri — but the praise for the sound quality definitely piqued my interest, especially having set up shairplay-sync on the Raspberry Pi as an AirPlay target and enjoying the ease of streaming music to a good set of speakers. For AU$499 though, I wasn’t going to bother, as the setup for the stereo system in our home office did a reasonable enough job. It consisted of an amplifier sitting next to my desk, going into an audio switchbox next to my computer that could be switched between the headphone cable attached to my computer and one that snaked across the floor to Kristina’s desk so she could plug in, with the speakers sitting on the bookshelves on opposite sides of the room (you can see how it looked in this post; the speakers are the black boxes visible on the bottom shelves closest to our desks).

Fast-forward to last week, and someone mentioned that JB Hi-Fi were having a big sale on the HomePod and it was only AU$299! The space behind my desk was already a rat’s nest of cables, and with the standing desk I’ve ordered from IKEA I was wanting to reduce the number of cables in use, so being able to get rid of a bunch of them and replace the lot with a HomePod sealed the deal (it’s possible to turn the “Listen for ‘Hey Siri’” functionality off entirely).

It arrived on Tuesday, and to say I’m impressed with the sound quality is a bit of an understatement, especially given how diminutive it is. It has no trouble filling the whole room with sound, the highs are crystal clear, and if the song is bassy enough you can feel it through the floor! It shows up just as another AirPlay target so it’s super-easy to play music to it from my phone or computer. I took a photo of our new setup and you can see the HomePod sitting on the half-height bookshelf right at the bottom-left of the frame (the severe distortion is because I took the photo on our 5D4 with the 8-15mm Fisheye I borrowed from a friend, which requires turning lens corrections on to avoid having bizarrely-curved vertical lines, which in turn distorts the edges of the image quite a bit).

The setup and configuration of the HomePod is done via Apple’s Home app, which uses a framework called HomeKit to do all sorts of home automation stuff, and the HomePod is one of the devices that can work as the primary “hub” for HomeKit. I have no interest in home automation as such, but a selling point of HomeKit is that it’s a lot more secure than random other automation platforms, and one of the things it supports is temperature sensors. Someone wrote a Node.js application called Homebridge that lets you run third-party plugins, and even write your own, that show up in HomeKit and can be interacted with, so I decided I’d see if I could hook up the temperature sensors that are attached to the Raspberry Pi(s)!

I’d ordered a 4GB Raspberry Pi 4B last month because I wanted to have a bit more grunt than the existing Pi 3B — which only has 1GB RAM — and to start using Docker with it, and it arrived on the 1st of this month. With that up and running inside in place of the original Pi 3B, I moved the 3B and the outside temperature sensor much further outside and attached it to our back room in the backyard. The previous position of the sensor, underneath the pergola and next to the bricks of the house, meant that in summer the outdoor temperatures would register hotter than the actual air temperature, and because the bricks absorb heat throughout the day, the temperatures would remain higher for longer too.

Installing and configuring Homebridge

Next step was to set up Homebridge, which I did by way of the oznu/docker-homebridge image, which in turn meant getting Docker — and learning about Docker Compose and how handy it is, and thus installing it too! — installed first:

  1. Install Docker — curl -sSL https://get.docker.com | sh
  2. Install Docker Compose — sudo apt-get install docker-compose
  3. Grab the latest docker-homebridge image for Raspberry Pi — sudo docker pull oznu/homebridge:raspberry-pi
  4. Create a location for your Homebridge configuration to be stored — mkdir -p ~/homebridge/config

Lastly, write yourself a docker-compose.yml file inside ~/homebridge:

version: '2'
services:
  homebridge:
    image: oznu/homebridge:raspberry-pi
    restart: always
    network_mode: host
    volumes:
      - ./config:/homebridge
    environment:
      - PGID=1000
      - PUID=1000
      - HOMEBRIDGE_CONFIG_UI=1
      - HOMEBRIDGE_CONFIG_UI_PORT=8080

Then bring the Homebridge container up by running sudo docker-compose up --detach from ~/homebridge. The UI is accessible at http://<address-of-your-pi>:8080 and logs can be viewed with sudo docker-compose logs -f.

The last step in getting Homebridge recognised within the Home app in iOS is to open the Home app, tap the plus icon in the top-right and choose “Add accessory”, then scan the QR code that the Homebridge UI displays.

Writing your own Homebridge plugins

Having Homebridge recognised within the Home app isn’t very useful without plugins, and there was a lot of trial and error involved here because I was writing my own custom plugin rather than just installing one that’s been published to NPM, and I didn’t find any single “This is a tutorial on how to write your own plugin” page.

Everything is configured inside ~/homebridge/config, which I’ll refer to as $CONFIG from now on.

Firstly, register your custom plugin so Homebridge knows about it by editing $CONFIG/package.json and editing the dependencies section to add your plugin. It has to be named homebridge-<something> to be picked up at all, I called mine homebridge-wolfhaus-temperature and so my $CONFIG/package.json looks like this:

{
  "private": true,
  "description": "This file keeps track of which plugins should be installed.",
  "dependencies": {
    "homebridge-dummy": "^0.4.0",
    "homebridge-wolfhaus-temperature": "*"
  }
}

The actual code for the plugin needs to go into $CONFIG/node_modules/homebridge-<your-plugin-name>/, which is itself a Node.js package and so needs its own package.json file located at $CONFIG/node_modules/homebridge-<your-plugin-name>/package.json. You can generate a skeleton one with npm init — assuming you have Node.js installed; if not, grab nvm and install it — but the key parts needed for a plugin to be recognised by Homebridge are the keywords and engines sections in your package.json:

{
  "name": "homebridge-wolfhaus-temperature",
  "version": "0.0.1",
  "main": "index.js",
  "keywords": [
    "homebridge-plugin"
  ],
  "engines": {
    "homebridge": ">=0.4.53"
  }
}

index.js is your actual plugin code that will be run when Homebridge calls it.

Once I got this out of the way, the last bit was a LOT of trial and error to actually get the plugin working with Homebridge and the Home app on my iPhone. The main sources of reference were these:

After several hours’ work, I had not the nicest code but working code (Update 2020-04-12 — moved to ES6 classes and it’s much cleaner), and I’ve uploaded it to GitHub.

The final bit of the puzzle is telling Homebridge about the accessories, which are the things that actually show inside the Home app on iOS. For this, you need to edit $CONFIG/config.json and edit the accessories section to include your new accessories, which will use the plugin that was just written:

{
    "bridge": {
        "name": "Traverse",
        [...]
    },
    "accessories": [
        {
            "accessory": "WolfhausTemperature",
            "name": "Outdoor Temperature",
            "url": "http://pi:3000/rest/outdoor"
        },
        {
            "accessory": "WolfhausTemperature",
            "name": "Indoor Temperature",
            "url": "http://fourbee:3000/rest/indoor"
        }
    ],
    "platforms": []
}

The url is the REST endpoint that my pi-sensor-reader runs for the indoor and outdoor sensors, and the name needs to be unique per accessory.

Homebridge needs restarting after all these changes, but once you’re done, you’ll have two new accessories showing in Home!

They initially appear in the “Default Room”, you can add an “Indoor” and “Outdoor” room to put them into by tapping on the Rooms icon in the bottom bar, then tapping the hamburger menu at the top-left, choosing Room Settings > Add Room, then long-pressing on the temperature accessory itself and tapping the settings cog at the bottom-right and selecting a different room for it to go into.

What’s next?

As part of doing all this, I moved all of my public Git repositories over to GitHub where they’re more likely to be actually seen by anybody and will hopefully help someone! I also updated my pi-sensor-reader to use docker-compose, and fully-updated the README to document all the various options.

Next on the Homebridge front is going to be tidying up the plugin code — including moving to async/await — and adding the humidity data to it!

A very DIY weekend

All this coronavirus business has meant that we’ve been doing a lot of not going anywhere for the past couple of weeks. Both Kristina and I are lucky enough to be able to work our regular jobs entirely from home which is fantastic, and the lack of commuting means that we’ve got lots more time in our days. I finally got around to putting together the Tau and Space Marines that come in the Kill Team box set I got back in September, and have started painting them. I’d noticed when I was painting the ungors from the Beastgrave box that the new arch lighting on my painting table still wasn’t quite sufficient, and with the additional painting I’m doing, decided I should get around to doing something about it.

All of my miniature stuff lives in the back room and I’ll bring it inside as necessary, and yesterday started with me being annoyed that the door handle on the outside of the room was nearly falling off because the holes that the screws go into were worn out and the screws didn’t actually hold anything in (and also that to lock the deadbolt you had to lift the door slightly because it’s out of alignment with the hole). I drilled out the holes in the actual metal door handle itself to fit newer and larger screws in, and also filed down the plate that the deadbolt goes into so the door doesn’t need lifting anymore when you lock it, so now the door is like a normal door and you don’t have to fight with it when locking and unlocking it.

Also yesterday, Kristina decided to trim the horrible trees that we have growing in the narrow garden bed next to the pergola, and I decided to follow suit by pulling out all the weeds and grasses that’d grown there and generally trying to make it tidier. We were going to get mulch from Bunnings to prevent the weeds from getting a foothold and generally to improve how the garden bed looks, but I got carried away and also ended up removing all the dead leaves that were sitting against the bottom of the back room walls, and so was totally exhausted. Today we did hit up Bunnings, and while we were there I figured I’d do something about the lighting for my painting table so I also picked up some more wood for the arch, plus another set of strip lights and a power board, as well as a circular saw because I’m sick of manually hacking at pieces of wood when I’m chopping them!

I’ve not used a circular saw before, it’s delightfully fast to chop the wood of course but it was a little tricky because the pieces of wood I use for the arch on my table are quite narrow and it’s difficult to work out specifically where the blade is going to be cutting; I’m sure some more practice will help. I also attached another piece of wood to the side of it so I could mount the power board there and have both the strip lights plugged into that, and just turn it on and off at the wall.

Behold!

A photo of a DIY wooden "table" sitting on a dining table, with a square arch over the top of it with LED strip lights all along the inner surface of it. There's lots of Games Workshop paint pots and miniatures on it.

I also generally tightened everything up and attached the two arches together so they’d stop being knocked forwards if I bumped into them. It’s very much even brighter than before, so much so that the ceiling light above the dining table doesn’t really bring anything to the party anymore — previously I’d found that I was still needing to use it if I didn’t have the miniature I was painting totally centred under the arch.

Reducing our power usage: Powershop, heat pump hot water, and solar panels

Ever since we moved into our house almost seven years ago, we’ve been slowly making the place more energy efficient and reducing our power usage. First were double-glazed windows, then a new roof, then replacing all the light globes with LED ones, and slowly but surely replacing our various appliances with newer ones (the fridge was replaced shortly after we had our kitchen redone and uses over a third less power than the old one, we got a dryer that uses a heap less power than trying to dry things in the combination washer/dryer, and a new air conditioner just before summer of 2018 to replace the ancient and increasingly-creaky one). Despite power bills going up at a fairly absurd rate over the past decade, we actually didn’t see much of an increase at all thanks to us being able to steadily use less power over time.

There’s an electricity company called Powershop that a few people at work use and are very happy with, and all their power is 100% carbon offset. We switched over to them in mid-September last year, and got our power meter upgraded — for free — to a smart meter in early October. The data from the meter is really fascinating, you can view it right on Powershop’s site and they give you a heat map of half-hour blocks throughout the whole day where you can see specifically when and how much power you’re using. A snapshot from part of October looked like this:

Brighter means more power being used, darker means less. You can clearly see the weekends during the day when we had the air conditioning on, and that much brighter section around 1:30-2:00am every night is the hot water system coming on. It was using 3-4 kilowatt-hours of power each and every night, and even though it was on the off-peak pricing and thus not costing us a heap, it was still pretty damn inefficient. I had a close look at the system and it had a manufacturing date of 2002, so we decided it was probably time to replace it anyway. I did a bunch of research and settled on a heat pump hot water system from a company called Sanden. Heat pumps work on the same general principle as refrigerators but in the exact opposite direction: they bring in the warm ambient air from outside to heat up the water. As a result, a heat pump hot water system can use 20% of the power of a regular hot water system. We got it installed in mid-December and it looks like a regular hot water system hooked up to an air conditioning unit!

A photo of our heat pump hot water system, which is comprised of a standard-looking hot water water tank with what looks like a small air conditioning compressor next to it.

(The scale is a bit off in this photo, I had to use the ultra-wide lens on my iPhone to get all of it in the shot as I was hard up against the fence. The photo was taken from about waist height, and the water tank comes up to about shoulder height or so.) One neat thing with this system is that it’s absurdly quiet; they quote only 37 decibels when it’s running.

You can very clearly see the drop in power usage after we got it installed: no more bright line! (The extended purple section in the wee hours of the morning is because that’s right when we had a spate of hot days and the air conditioner had to run more than normal overnight.)

I worked out that the new system was using between 1 and 1.5kWh of power a night depending on the outside temperature (remember that it uses ambient air to heat the water, so warmer ambient air means less power needed), which is a pretty nice improvement over the old system.
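As a rough back-of-the-envelope check, using the midpoints of the figures above (3.5kWh a night for the old system versus 1.25kWh for the new one), the saving works out to a bit over 800kWh a year:

```shell
# Midpoints of the measured figures: old ~3.5 kWh/night, new ~1.25 kWh/night.
awk 'BEGIN { print (3.5 - 1.25) * 365 " kWh saved per year" }'
```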

After all of this, we also decided to invest in a solar power system! There’s a fantastic website called Solar Quotes that has a ton of good info and will put you in touch with up to three installers who’ll come out and give you quotes. We ended up going with Penrith Solar Centre, who were fantastic; the system was installed on the 20th of last month, and it looks very handsome.

A photo of our backyard and house with solar panels on the roof.

(You can also see that we had our pergola redone, which was a bit of a shit-show and dragged on for way longer than it should have. You can see in the photo that we had it shortened a fair chunk as it was far larger than it needed to be, so now we have significantly more space for renovating our back yard when we eventually get around to doing that.)

The total solar setup is 22 Hanwha Q-Cells panels — sixteen on the north-facing expanse and six on the western side — plus Enphase IQ7+ micro-inverters, and we also got Enphase’s Envoy consumption monitoring setup so we can see in real-time how much power we’re using versus how much we’re exporting back to the grid! The monitoring goes down to the level of each individual panel, so if anything goes wrong with one of them it’s immediately obvious, and the solar installers can see exactly which one needs fixing or replacing.

The Envoy system took a bit of time to be activated but even before that we were able to get a sense of how much our usage had changed via Powershop’s usage heat map. You can clearly see the day the solar was installed because all the days after that have big black sections where we were using no power from the grid at all:

The day at the top there, the 1st of March, we were doing all of the clothes washing — so using the washing machine and the dryer — as well as having the oven on for lunch, the dishwasher running, and the air conditioner on for most of the day, and yet we only used 11.46kWh for the day, compared to what would normally be more like 30-35kWh.

Powershop also have the inverse of the usage heat map if you have solar, which is the feed-in heat map, showing how many kilowatt-hours you’ve sent back to the grid.

The especially bright yellow blocks in the middle on the 28th are just a bit under 3kWh! You can also see on the 26th of February when the clouds came over after lunch.

The Envoy system gives us essentially what Powershop’s heat maps do, but in real-time, plus consumption versus generation graphs in 15-minute increments. The blue is how much we’re producing from the solar panels and the orange is how much we’ve used. It took me a bit to work out what the circle at the top meant: in total we’ve consumed 16.84kWh all up, and the coloured section of the circle is how much of that came directly from the solar panels, whereas the grey portion is power imported from the grid.

Now that we’re generating a bunch of our own power, the other thing I wanted to do was have our hot water run off the solar panels in the middle of the day, instead of using off-peak power in the wee hours of the morning, especially since heat pump systems work best when the ambient temperature is warmer… not super useful in the middle of winter at 1am! Fortunately the Sanden system we have comes with a block-out timer so you can explicitly set when you want it to be running. I had the Penrith Solar people swap the hot water over to the regular non-off-peak meter, and I configured the block-out timer so the system only comes on in the middle of the day when we’re producing power from our solar panels (and conveniently when the ambient temperature is at its highest, so it needs even less power).

I’ll be very interested to see how this goes during winter. Our power usage is generally lower — we don’t use the air conditioner’s heating function because it dries everything out too much, and instead just use oil-filled radiant heaters — but from talking to people at work, the solar panels also generate only about 30% of the electricity they do at the height of summer, thanks to the shorter days and lower sun.

A trip to New Zealand’s North Island (Te Ika-a-Māui)


On Monday last week we woke up at arse o’clock in the morning to catch a flight to Auckland! We’d been to Queenstown two years ago but hadn’t seen the North Island yet.

We hired a car and stayed in Parnell for the first two nights which was quite lovely (it’s also apparently one of the most expensive suburbs in New Zealand which I’d 100% believe), and the first partial day we were there just involved wandering around the neighbourhood taking some photos.

Untitled
Untitled
Holy Trinity Cathedral
Untitled
33 St Stephens Avenue

The second day we drove out to Piha to see the black sand beach which was extremely cool. I couldn’t capture it in the photos but when you look at the sand against the sun it really sparkles thanks to its volcanic origins!

Untitled
Watching
Untitled

We also went to the Arataki Visitor Centre and tromped through the forest on a walking track. It was very neat but the only photos were a vista from the centre and the big totem pole there as well, because the forest itself wasn’t particularly photogenic. 😛

View from the Visitor Centre
Arataki Visitor Centre entrance

After that we walked from our AirBnB to the CBD, and quickly decided that that was a large mistake due to the sheer amount of construction going on and blocked-off streets. It made Sydney’s construction look positively pedestrian!

There’s a pizza chain in New Zealand called Sal’s that claims to do genuine New York-style pizza, so we walked to the closest one for dinner and gave it a go, and oh my god. Kristina said it’s about 95% of the quality of the actual New York-style pizza she had when she lived in New Jersey. She’d been talking about how good it was for years, and I now finally understand it!

Day 3 we drove to Rotorua, which is about three and a half hours drive, so we broke up the trip by stopping in at Hamilton and visiting Hamilton Gardens which were absolutely fantastic. It’s divided up into a bunch of incredibly well-designed and well-maintained gardens from throughout history — plus some whimsical ones — and I can’t recommend it highly enough.

Indian Char Bagh Garden
Indian Char Bagh Garden
Italian Renaissance Garden
Italian Renaissance Garden
Japanese Garden of Contemplation
Japanese Garden of Contemplation
English Flower Garden
English Flower Garden
Chinese Scholar’s Garden
Chinese Scholar’s Garden
Tropical Garden
Tropical Garden
Surrealist Garden
Surrealist Garden
Tudor Garden
Tudor Garden

After that we continued on through to Rotorua itself and went for a lovely walk through Whakarewarewa Forest which is a forest full of massive redwood trees, then visited some work colleagues of Kristina’s — and Colin, their miniature wire-haired dachshund — who live in Rotorua and went out for a delicious dinner at Macs Steakhouse on Rotorua’s “Eat Street”.

Looking up at the redwoods
Whakarewarewa Forest
Colin the miniature wire-haired dachshund
Eat Street in Rotorua

Unfortunately the B&B we were staying at (“Sandi’s B&B”) wasn’t good… it turns out it was on a major road that has large trucks going down it extremely loudly at all hours of the night, some of them loud enough that they’d actually vibrate the bed, and it had the world’s stupidest problem: a pear tree growing over the top of the cabin we were in, with the fruit ripe enough that pears were randomly dropping onto the roof with an extremely loud THUD. Sandi made us a full breakfast in the morning, which was delicious, but it didn’t make up for the ruined sleep. 😴

Rotorua is known for its geothermal springs so on day 4 we visited “Wai-O-Tapu Geothermal Wonderland” and it was very impressive. The sulphur smell was something else, especially since it came with the steam, so it was both humid and smelly!

Untitled
Champagne Pool
Untitled
Untitled
Lake Ngakoro

Our last full day we started with visiting a wildlife park called Paradise Valley Springs, after having another shitty night from trucks and pears thudding on the roof. We were there first thing in the morning so nobody else was around which was pretty sweet, but the farm animal section didn’t have very many animals in it and it seemed like they’d have liked some more company. We figured that it probably seems a lot more social when there are other people there.

Kea
Untitled
Feeding
Angora goat

The absolute best part of the day, however, was the visit to the Cornerstone Alpaca farm that broke up the drive from Rotorua back to Auckland! They had a whole presentation before we went out to see the alpacas, which was quite interesting, and the tour consisted just of Kristina and me. They’re so soft, and very pushy about getting food, haha.

Untitled
Untitled
Untitled
Untitled
Untitled

After that, our final night was spent in a hotel in Newmarket. We wandered around and took some photos which you can see in the album above. We made a second stop at Sal’s for dinner, and afterwards we were poking around the cable TV channels in the hotel to see what was on and ended up watching the first game of the England versus South Africa T20 series, and Kristina ended up quite enjoying it! I was explaining the rules as we went along and she’s pretty well got the hang of it now, to the point that we’ve at least temporarily subscribed to Foxtel Now to be able to watch more T20 games, hahah.

Our flight back to Sydney on the sixth day didn’t leave until 4pm so we went to the Auckland Museum to kill time after we checked out from the hotel, and it was actually really neat. The whole ground floor is a massive Māori and Pacific Islander exhibit with all sorts of cultural artefacts and details of their history, and on the second floor was a fascinating exhibit on volcanoes.

Next time we go back to New Zealand I reckon we’ll probably be hitting up the South Island (Te Waipounamu) again, but it was definitely a good trip — sans the terrible sleep in the middle at least, anyway.

Installing OpenWRT on a Netgear D7800 (Nighthawk X4S) router

I had blogged back in October of last year about setting up DNS over HTTPS, and it’s been very reliable, except for when I’ve had to run Software Update on the Mac mini to pick up security updates, because while it’s restarting all of our DNS resolution stops working! I’d come across OpenWRT a while back, an open-source and very extensible firmware for a whole variety of different routers, but after a bunch of searching I hadn’t found any reports of people successfully using it on our specific router, the Netgear D7800 (also known as the Nighthawk X4S), just people having various problems. One of the reasons I was interested in OpenWRT is that it’s Linux-based and extensible, so I’d be able to move the DHCP and DNS functionality off the Mac mini back onto the router where it belongs, and in theory bring the encrypted DNS over as well.

I finally bit the bullet and decided to give installing it a go today, and it was surprisingly easy. I figured I’d document it here for posterity and in the hopes that it’ll help someone else out in the same position as I was.

Important note: The DSL/VDSL modem in the X4S is not supported under OpenWRT!

Installation

  1. Download the firmware file from the “Firmware OpenWrt Install URL” (not the Upgrade URL) on the D7800’s entry on OpenWRT.org.
  2. Make sure you have a TFTP client, macOS comes with the built-in tftp command line tool. This is used to transfer the firmware image to the router.
  3. Unplug everything from the router except power and the ethernet cable for the machine you’ll be using to install OpenWRT from (this can’t be done wirelessly).
  4. Set your machine to have a static IP address in the range of 192.168.1.something. The router will be .1.
  5. Reset the router back to factory settings by holding the reset button on the back of it in until the light starts flashing.
  6. Once it’s fully started up, turn it off entirely, hold the reset button in again and while still holding the button in, turn the router back on.
  7. Keep the reset button held in until the power light starts flashing white.

Now the OpenWRT firmware file needs to be transferred to the router via TFTP. Run tftp -e 192.168.1.1 (-e turns on binary mode), then put <path to the firmware file>. It’ll transfer the file, then the router will install it and reboot; this will take several minutes.
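The whole TFTP exchange is just a couple of commands; a sketch of the session (substitute the actual path to the firmware image you downloaded in step 1):

```
$ tftp -e 192.168.1.1
tftp> put <path to the firmware file>
tftp> quit
```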

Once it’s up and running, the OpenWRT interface will be accessible at http://192.168.1.1, with a username of root and no password. Set a password then follow the quick-start guide to turn on and secure the wifi radios — they’re off by default.

Additional dnsmasq configuration and DNS-over-TLS

I mentioned in my DNS-over-HTTPS post that I’d also set up dnsmasq to do local machine name resolution. This is trivially set up in OpenWRT: go to Network > DHCP and DNS, put the MAC address, desired IP, and machine name under the Static Leases section, then hit Save & Apply.
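For reference, each static lease ends up as a host section in /etc/config/dhcp on the router; a sketch with placeholder values (the name, MAC, and IP here are made up):

```
config host
	option name 'macmini'
	option mac '00:11:22:33:44:55'
	option ip '192.168.1.10'
```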

The other part I wanted to replicate was having my DNS queries encrypted. In OpenWRT this isn’t easily possible with DNS-over-HTTPS, but is when using DNS-over-TLS, which gets you to the same end-state. It requires installing Stubby, a DNS stub resolver, that will forward DNS queries on to Cloudflare’s DNS.

  1. On the router, go to System > Software, install stubby.
  2. Go to System > Startup, ensure Stubby is listed as Enabled so it starts at boot.
  3. Go to Network > DHCP and DNS, under “DNS Forwardings” enter 127.0.0.1#5453 so dnsmasq will forward DNS queries on to Stubby, which in turn reaches out to Cloudflare; Cloudflare’s DNS servers are configured by default. Stubby’s configuration can be viewed at /etc/config/stubby.
  4. Under the “Resolv and Hosts Files” tab, tick the “Ignore resolve file” box.
  5. Click Save & Apply.
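If you prefer editing config files over the web interface, the forwarding from steps 3 and 4 ends up in the dnsmasq section of /etc/config/dhcp, roughly like this (a sketch; the other options in that section are omitted):

```
config dnsmasq
	list server '127.0.0.1#5453'
	option noresolv '1'
```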

Many thanks to Craig Andrews for his blog post on this subject!

Quality of Service (QoS)

The last thing I wanted to set up was QoS, which allows for prioritisation of traffic when your link is saturated. This was pretty straightforward as well, and just involved installing the luci-app-sqm package and following the official OpenWRT page to configure it!
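The resulting /etc/config/sqm looks roughly like this (a sketch: the interface name and speeds are placeholders, and the usual advice is to set download and upload to around 90% of your measured link speed, in kbit/s):

```
config queue
	option enabled '1'
	option interface 'eth0'
	option download '45000'
	option upload '9000'
	option qdisc 'cake'
	option script 'piece_of_cake.qos'
```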

Ongoing findings

I’ll update this section as I come across other little tweaks and changes I’ve needed to make.

Plex local access

We use Plex on the Xbox One as our media player (the Plex Media Server software runs on the Mac mini), and I found that after installing OpenWRT on the router, the Plex client on the Xbox couldn’t find the server anymore despite being on the same LAN. I found a fix on Plex’s forums, which is to go to Network > DHCP and DNS, and add the domain plex.direct to the “Domain whitelist” field for the Rebind Protection setting.

Xbox Live and Plex Remote Access (January 2020)

Xbox Live is quite picky about its NAT settings, and requires UPnP to be enabled or you can end up with issues with voice chat or gameplay in multiplayer, and similarly Plex’s Remote Access requires UPnP as well. This isn’t provided by default with OpenWRT but can be installed with the luci-app-upnp package, and the configuration shows up under Services > UPnP in the top navbar. It doesn’t start by default, so tick the “Start UPnP and NAT-PMP service” and “Enable UPnP” boxes, then click Save & Apply.

Upgrading to a new major release (February 2020)

When I originally wrote this post I was running OpenWRT 18.06, and now that 19.07 has come out I figured I’d upgrade, and it was surprisingly straightforward!

  1. Connect to the router via ethernet, make sure your network interface is set to use DHCP.
  2. Log into the OpenWRT interface and go to System > Backup/Flash Firmware and generate a backup of the configuration files.
  3. Go to the device page on openwrt.org and download the “Firmware OpenWrt Upgrade” image (not the “Firmware OpenWrt Install” one).
  4. Go back to System > Backup/Flash Firmware, choose “Flash image” and select your newly-downloaded image.
  5. In the next screen, make sure “Keep settings and retain the current configuration” is not ticked and continue.
  6. Wait for the router light to stop flashing, then renew your DHCP lease (assuming you’d set it up to be something other than 192.168.1.x like I did).
  7. Log back into the router at http://192.168.1.1 and re-set your root password.
  8. Go back to System > Backup/Flash Firmware and restore the backup of the settings you made (then renew your DHCP lease again if you’d changed the default range).

I had a couple of conflicts with files in /etc/config between my configuration and the new default file, so I SSHed in and manually checked through them to see how they differed and updated them as necessary. After that it was just a case of re-installing the luci-app-sqm, luci-app-upnp, and stubby packages, and I was back in business!

A painting table upgrade

I’ve posted previously about my mobile painting table setup (one, two) but one downside it’s always had is the table light I was using. It was nice and bright, but fairly harsh — at certain angles it’s very easy to be working in your own shadow — and because it was a halogen bulb it would get really hot, which in summer is decidedly unpleasant, even when inside with the air conditioning going.

I’d wanted to take advantage of LED strip lights for a while now to construct some sort of lighting system over the top of the table, and I decided that the Christmas break would be a good time to do it. We went to Bunnings today, and a bit of wood-cutting, drilling, screwing, and attaching later, I’m very pleased with the result!

The lights are Arlec attachable LEDs, they come in a 3-metre-long strip with adhesive tape on the back of them, and you can even cut them at specific points and use the included joiner cable to join the two pieces together. Unfortunately the connection is very finicky, and I wasn’t able to attach the ends of the joiner cable to the wood as I’d wanted, because at the angle I needed, the other half of the light strip cuts out. 🙁 Hence the dodgy cable tie that you can just see at the left of the photo above. As long as I don’t jostle or move it, it works fine though.

It casts a really nice even light, and while I think it might actually be a little bit dimmer than the old light, the lack of harsh shadows more than makes up for it — and if I decide in future that it’s too dim, I can just get some more LED lights and hook them up!

Coding my own personal version of Facebook’s Memories feature

I deleted my Facebook account way back somewhere around 2009, but the one thing that Kristina shows me that I think is a neat idea is the “memories” feature, where it shows posts from previous years on that day in particular. I realised I could very much code something up myself to accomplish the same thing, given I have Media posts going back to 2009.

And so I did! By default it’ll show all posts that were made on this exact same date in each previous year (if any), excluding today’s, and you can also pick an arbitrary date and view all posts on that date for each previous year as well.

I was originally going to have it send me an email each day, but I quickly realised I couldn’t be bothered dealing with HTML emails and so it ended up in its current state. It’s not perfect, I’m still wrestling with timezones: if you view the main Memories page before 11am Sydney time, you’ll get yesterday’s date, because 11am Sydney time is currently when the date ticks over in UTC. If I do specify a Sydney timezone in my code, the automated tests fail on Bitbucket Cloud because they all run in UTC. I’m sure it’s fixable, I just haven’t had the brain capacity to sit down and work it out. 😛 Between this and my tag browser, it’s been pretty fun seeing old posts I’d forgotten about.
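The underlying issue is easy to reproduce from the command line: Sydney runs ahead of UTC (UTC+11 during daylight saving), so between midnight and 11am Sydney time the two timezones disagree on what “today” is.

```shell
# Today's date as seen from Sydney versus UTC; before ~11am Sydney time
# (during daylight saving) these two print different calendar dates.
TZ=Australia/Sydney date +%F
TZ=UTC date +%F
```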

Update 21st December: I found these two posts about timezones in Postgres, and between them and firing up two Docker containers in UTC time for testing — one for Postgres and one for my code — I managed to get it fully working! 🎉

Another vehicular upgrade!

We bought a new car today! A shiny new 1.5-litre Toyota Yaris in white.

We’d been thinking of selling the old (2000 model!) Corolla and getting a new car for a while, and a friend of a friend of Kristina’s had an urgent need for a replacement car so we actually sold the Corolla a couple of months ago and have been managing with just the Cerato in the meantime.

We decided to go test drive the Yaris, and Kristina immediately loved it. The turning circle is hilariously tiny, and the visibility is fantastic, so we put down a deposit in early November, and went and picked it up today! It has a surprising amount of pep for a 1.5L engine, more than what the old Corolla’s 1.8L had — though granted that was also 19 years old. You definitely feel less like you’re sitting in the car compared to the Cerato… with that, you’re down in the seats whereas the Yaris has a much higher-feeling seating position. We’re getting a carport added to the front of the house so the Yaris isn’t sitting out in the open all the time (we’re still going to be parking at the station), but that won’t be until early January, so hopefully we don’t get any hail before then!

Of coding and a history of iPhone photo filter apps

I had last week off work, mostly due to being in desperate need of a holiday; I didn’t go anywhere, just chilled out at home. I did do a bunch of coding on my website though!

I’d been using Tumblr to post my random snaps from 2009 to about 2016 or so and cross-posting them to Twitter, before I found that Tweetbot had custom image posting functionality where you could post images to a URL that replied with a specific format and Tweetbot would use those image URLs in its tweets. I added functionality for that on my website and had been saving tweets and images directly since 2016.

Last year, it occurred to me that I should import my posts from Tumblr to my website in order to have everything in one place. I obsessively tag my Flickr photos and as a result am able to find almost anything I’ve taken a photo of very quickly, and while I hadn’t quite gone to those same levels of tagging with Tumblr, all my posts there had at least some basic tags on them that I wanted to preserve when bringing them in to my website. So I coded up a tags system for my Media page and a script to scrape the Tumblr API and suck the posts, images, and tags in. I also wrote a very simple little React app to be able to continue adding tags to new posts I’m making directly to my website.

The one thing that was missing was the ability to see all of the current tags, and to search by tag, so this past week I’ve been doing exactly that! I have a page that shows all the tags that exist with links to view just the posts tagged with a given tag, and on the front page the tags that a post has are clickable as well.

I realised I had mucked up the tagging on a few posts so was going through and re-tagging and updating them, and it struck me just how much I used to rely on those camera filter apps to hide how shit photos from old iPhones used to be. One of the ways I’d tagged my photos on Tumblr, and I’ve continued this even now with the custom direct-posting iOS shortcut that I’ve got set up on my iPhone, is with the name of the app I used to edit the photo. Going roughly chronologically, as I started using each app:

Instagram was only a very brief foray, and VSCOCam was by far my most-used app. Unfortunately it went downhill a couple of years ago and they Androidified it and now all of the icons are utterly inscrutable and you also can’t get RAW files taken from within the app back out again in anything but JPEG. Apparently there’s a thing called a VSCO Girl which I suspect is part of what happened there.

My most recent editing app prior to getting the iPhone 11 Pro has been Darkroom, it’s extremely slick and integrates directly with the regular photo library on your phone and offers a similar style of film-esque presets to VSCOCam, though fewer in number.

With the iPhone 11 Pro, however, the image quality is good enough that I don’t even feel the need to add obviously-film-looking presets to the images. I take the photo, hit the “Auto” button in Photos.app to add a bit of contrast, and usually use the “Vivid” preset to bring the colours up a bit, but otherwise they’re pretty natural-looking.

That said, I’ll probably end up heading back to Darkroom at some point as I do like my film aesthetic!

iPhone 11 Pro

I upgraded to the iPhone 7 nearly three years ago and was very impressed with the camera, and here I am again being impressed by the camera in a new iPhone! Or more specifically, three cameras, since the iPhone 11 Pro comes with 13mm full frame-equivalent, 26mm-equivalent, and 52mm-equivalent lenses.

I absolutely love the look of the iPhone 11 Pro, especially in Midnight Green as Apple has been using for a lot of its promo shots. Someone on Ars Technica described it thus and I think they really nailed it:

It’s a touch retro, a touch cyberpunk, with a hint of bounty hunter droid (somewhere between 4-LOM and IG-88)

The cameras themselves are extremely impressive as well. Sadly there’s (currently) no way to get RAW images out of the ultrawide 13mm-equivalent lens, so you’re stuck with JPEGs, but the images direct out of the phone are still quite impressive. I’ve gone for three walks on my lunch break so far, two with the ultrawide (1, 2) and one with the regular 26mm-equivalent shooting in RAW with Halide.

That last shot was taken with the telephoto lens on the way home from the train station.

I’ve also found myself sharing way more photos on Mastodon lately thanks to the combination of image quality, ease of taking photos even in dim lighting (Night Mode is extremely good), combined with our 40Mbps upstream NBN HFC connection and the custom iOS shortcut I have to post things to my website before sending them to Mastodon. It’s all very frictionless.

Another fantastic part of this whole setup is using Lightroom and my Apple Pencil on my iPad Pro to fully edit photos I’ve taken with the iPhone and then send them on to Flickr. My standard for images to go on Flickr is ones that are Proper Photography, if you will, rather than just snapping an image of something that’s happening or something that’s cute, and of good quality. The iPhone 11 Pro delivers that in spades, and not having to use a computer for Proper Photography is an absolute delight. There’s still some kinks — I obsessively tag my photos on Flickr and Lightroom Classic on my computer has a big hierarchical list of tags that all get uploaded along with the image, and sadly there’s no hierarchical tagging available at all in Lightroom Mobile or CC, so I have to manually add them all after the fact — but it’s incredibly freeing nonetheless.

The iPhone 11 Pro is just effortlessly fast, whatever I want to do just happens immediately without any sort of needing to think about it. It was an extremely worthy upgrade from my trusty old iPhone 7!

Cardiovascular health and a new shiny: Apple Watch Series 5

We bought a treadmill back at the start of 2014 and it came with a heart rate monitor that you wear around your chest, which is pretty cool. I gave the treadmill a pretty good go and was doing one of those Couch to 5K programs, but I kept having issues with my knees where running messes one of them up. We bought an elliptical in August last year — which apparently I didn’t post about here — and I’ve been thoroughly enjoying using it. The one we have has a tablet holder right at eye level so I’ve been watching TV shows on Netflix while using it, and it really helps pass the time.

The downside was that I had no heart rate monitor, as the one that came with the treadmill only works with the treadmill (it shows your current heart rate right alongside the distance and estimated calories burned and such). I’d been going pretty hard on it but had noticed that I was getting some heart palpitations, and had a couple of feeling-dizzy moments a while after I’d finished exercising. I went to the doctor and she suggested cutting down on caffeine to start with (I was on four coffees a day, admittedly only instant) and seeing if that improved things; if not, we could get an EKG done.

Quite conveniently timed, the Apple Watch Series 5 was announced on the 10th of September this year, and it comes with an always-on display. Prior models had their display totally black and would only light up when you’d either raise your wrist or tap on the screen. I’d been eyeing the Apple Watch off for a couple of years, and finally decided I’d jump on board because it’d be usable as a regular watch even when the screen isn’t fully lit up. I got the 40mm stainless steel with the black leather Modern Buckle band and it looks classy as hell.

(I also realised after my first workout that I needed to get one of the cheaper non-leather bands as well because man do I get sweaty wrists when I’m exercising 😛).

Apple has been leaning pretty hard into the health thing with the Apple Watch in recent years, and as well as the heart rate monitor — which takes your heart rate periodically throughout the day, and continuously once you start a workout — it comes with an app called “Activity” on the iPhone to help motivate you to keep moving. The way it works is that there are three “rings” you should try to close each day, called Move, Exercise, and Stand. Move is just generally getting up and about and not sitting on your arse, and is set to 1422kJ for me based on my height and weight. Exercise is 30 minutes of brisk movement — I walk fast enough that I get a few minutes counted towards it each time I’m walking to or from the station or taking the stairs at work. The Stand goal is standing up and moving for at least a minute during 12 separate hours of the day, and if you’ve been sitting around for 50 minutes in a given hour you get a little buzz on your wrist at ten minutes to the next hour that reminds you to stand up and move around a bit.

Apple must have done a whole lot of psychological research into what’s most satisfying in terms of motivation because god damn closing those rings feels good. You get a little round fireworks animation in the given ring’s colour when you fully complete one for the day, and one with all three colours when you’ve finished all of them. I bought the Watch on the 23rd of September and every single day since then I’ve closed all three rings! You get little badges called “Awards” when you complete certain goals, like getting a full week of closing all three rings, which has meant that when I’ve been working from home I’ve been jumping on the treadmill or elliptical for just a quick half hour to get my exercise goal done. I also downloaded an app for the Watch called HeartWatch that gives you a little speedometer of heart rate when you’re exercising and ensures you keep it in the correct zone — not too fast and not too slow — for what you’re trying to do, in my case just generally be fitter.

I completed October with every single day’s rings fully closed, which I’m pretty chuffed about!

A screenshot of the Activity app for Apple Watch showing every single day in October having all three rings closed

We’d also bought a set of smart scales last year that sync with the Health app on iOS, and I’ve been weighing myself each morning. As a result of all of this fitness I’m hovering around 70.2kg, a weight I don’t recall being for many years now; I was at 82kg a few years back. The heart palpitations have definitely decreased as well, and I haven’t had any dizziness since I’ve been monitoring what my heart rate is while exercising.

I don’t do much by way of outdoor exercising, but the Apple Watches all come with GPS as well so you can keep maps of the routes you’ve taken and see the speed you did during each section. Overall I’m wildly impressed with this bit of technology! I hadn’t worn a watch since about 2001 when I got a job and bought my first mobile phone, but now I feel naked without it, haha.

A new hobby: Making bread!

Back in July and August, Kristina had been on a bit of a bread-making spree. We have an old bread machine that Kristina bought from Vinnies back in 2009 when she first moved over here from the US, and she’d been using that with varying amounts of success. I was talking to some of my co-workers and one of them recommended a book called Flour Water Salt Yeast and said you absolutely cannot go past it. I bought that for Kristina’s birthday, as well as the thing it says to bake the bread in, a Dutch oven.

It’s a really interesting book: even the most basic recipes use only a tiny amount of yeast (2 grams, or ½ a teaspoon), there’s no kneading, and the shortest recipe has the dough rising for five hours and proofing for another hour. You can get a good idea of how it all goes from the man himself, Ken Forkish.

As it turns out Kristina doesn’t really have the patience for it, so the bread-making has become my thing, and OH MY GOD THE BREAD FROM THIS BOOK. It is absolutely epic, nice and chewy like sourdough, and the Dutch oven is the magic behind how the crust comes out so well.

I’ve been doing a batch of bread almost every weekend now, and there’s something really enjoyable about the whole process. I’ve been tooting my efforts, get a load of all of this damn bread! The first two photos were the “Saturday White Bread” recipe where it’s done in one day, the third was the Overnight 40% (actually 35% and rye) Wholemeal Bread, and the last was the Overnight White Bread (which I screwed the timing up on because I didn’t read when the recipe said to start it, so I had to put the dough in the fridge overnight so it didn’t over-rise, but it came out delicious anyway).

A magnificent-looking round brown crusty bread loaf.
A dark brown round loaf of bread covered in flour.
A brown absolutely delicious-looking round loaf of bread sitting on a cooling rack, with a light dusting of flour on it.
Two round loaves of bread on a cooling rack, the left one being noticeably browner than the right one.

An artistic update

I posted back in February about some of the stuff I’d been doing in Procreate on my iPad, and I’m overdue for another post! I haven’t been doing as much in the intervening months, as there’s been lots of other things taking up my time and I haven’t felt as inspired, but I still managed to do a few.

I’ve quite enjoyed using Procreate’s Acrylic brush; you can get some really nice layering and lighting effects with it, and I used only that brush for this one:

A painting of a window at night, from inside a room. There's sheer curtains over the window, a candle is on a small table at the right casting light, and there's a tall cupboard at the left in the shadows.
The Window

I don’t actually remember the brush I used for this next one, but I definitely took full advantage of Procreate’s symmetry guides so I could get it properly even:

A painting of a cybernetic woman, her eyes look like blue glass and she has green and very shiny "skin". She has a purple hood over the back of her head.
Cybernetic Woman

This next one is interesting: I intended the main structures that take up the top two-thirds of the image to look like a big craggy mountain range, but I showed it to Kristina and she can’t see it as anything but a tornado coming down!

A painting of a craggy grey mountain range in the top two-thirds of the image, with a river of fire making its way the whole way across the image, and a bunch of conifers at the bottom.
The River

I quite enjoy doing epic-looking landscapes, and this one ended up starting out in a very different place than it finished. It was much more brown, the feature in the middle was a river, and the sky was a sunset which I didn’t manage to get looking how I wanted. In the end it became very much inspired by the aesthetic of the Hive from Destiny!

A painting looking down a desolate grey rocky valley. A deep black rift runs down the middle with a sickly green glow at the bottom, at the left is a crystal embedded in the ground with the same green glow coming from it. At the right is a cave entrance in the valley wall with another glowing crystal. The sky is awash with stars, and the moon peeks from behind the valley peak at the far left.
The Emergence

The paintings above were all done from about March to half-way through May, then there was a bit of a break until July.

I decided to take advantage of Procreate’s drawing guides again, this time with the perspective guide. I was aiming for buildings in a futuristic city, but the thing I always struggle with is details and a sense of scale, so it didn’t turn out to be anything but big blocks. Still pleased with the shadows and sense of lighting though.

A very clean geometric painting of grey and blue city buildings. The sky is purple and the light is coming from the very right, the buildings casting shadows to the left.
City Buildings

This next one was a “speed-painting” that I did in about 45 minutes! It was a combination of the Acrylic brush and a palette knife brush from a big third-party brush pack I bought.

A painting of a volcano erupting atop a hill, the hill is surrounded by taller mountains all around, and the sky above is filled with striated dark orange clouds.
Volcano

Then lastly, this one was done in August, again with Procreate’s symmetry guide on! I was going to give her a witch’s hat but couldn’t get it looking right.

A head and shoulders portrait painting of a white woman with piercing green eyes, long red hair, and dark green lipstick. She’s wearing a dark purple top, and there’s a bright light shining behind her that’s lighting up her shoulders and the very edges of her hair.
The Witch

I also had a burst of inspiration and got some more miniature painting done! I’m still working my way through the Dark Imperium box set I got nearly two years ago, but the main impetus here was Games Workshop releasing their “Contrast” line of paints. They’re essentially a base coat plus wash combined into a single coat, and they’re seriously incredible. Dark Imperium comes with twenty poxwalkers (though only ten unique sculpts), which I was dreading having to paint, but the Contrast paints made them far quicker to deal with, and I’ve done half of them so far.

As part of doing this, I also discovered how much better the miniatures look when you apply a varnish to them! The Contrast paint specifically comes off a lot more easily than regular paint, so varnish is a necessity, but it also really makes the colours pop; they’re a lot more vibrant than without it.

Poxwalker 1
Poxwalker 2
Poxwalker 3
Poxwalker 4
Poxwalker 5

I also finally finished off the Plague Marine champion that’d been sitting there mostly-finished for months, and I’m really happy with the base I did. I had a bunch of really old Space Marines from a starter painting box that a friend had given me, so I sacrificed one of them and cut him up to adorn the base, and it looks absolutely fantastic.

Plague Marine Champion

It’s fascinating seeing the evolution of Games Workshop’s plastic miniatures: back when I started (*cough*24 years ago*cough*), plastic was the cheap and crappy option, and the pewter (or lead, as they were back then!) miniatures were much more detailed. Nowadays it’s very much the reverse: the plastic is INSANELY detailed — have a look at the full-size poxwalkers on Flickr and zoom all the way in — and the pewter ones are a bit shit by comparison.

There’s also a small-scale Warhammer 40,000 game called Kill Team that I’ve started playing at work with some people, and I bought the new box set that was released in September. It’s similar to Shadespire in that your squads only have a small number of miniatures, so it’s much more feasible to get them painted, but it also comes with a bunch of absolutely amazing-looking terrain. I put it together and took a couple of photos before painting it, just to get a sense of the scale and what the terrain looks like.

A photo of some Death Guard and Space Wolves miniatures on the new Kill Team starter box terrain. The terrain itself is unpainted grey plastic but is towering over the miniatures and has a very steampunk aesthetic to it.
A photo of some Death Guard and Space Wolves miniatures on the new Kill Team starter box terrain. The terrain itself is unpainted grey plastic but is towering over the miniatures and has a very steampunk aesthetic to it.

I’ve finished painting a couple of pieces of it, but it’s so big that I don’t have a large enough white backdrop that’ll fit the whole terrain piece! Photos will definitely be forthcoming once I do get said backdrop though.

More coding adventures: Migrating to TypeScript and Express.js

Three and a half years ago I blogged about learning Javascript and Node.js, and then again at the start of 2018 about my progress and also learning React, and I figured it was about time for another update! This time it’s been moving from Sails.js (a web framework based on Express.js) to using raw Express itself, and moving the language from Javascript to TypeScript (which is basically Javascript with type-checking added).

At work, we migrated the codebase of the server that runs our internal platform-as-a-service from Javascript to TypeScript, and it seemed like a neat thing to learn. TypeScript ultimately gets compiled down to Javascript, and I started by trying to write my Sails.js modules in TypeScript and have them compiled to Javascript in the locations Sails expected them to be, but this proved to be a fair bit of a pain, so I figured I’d go whole-hog and move to raw Express.js while I was at it.

I did a whole heap of reading, and ended up coming across this absolutely excellent series of blog posts that takes you through using Express and TypeScript step by step. It took about a month all up, and you can really see how much code was removed (this excludes Node’s package-lock.json file because it’s massive):

$ git diff --stat a95f378 47f7a56 -- . ':(exclude)package-lock.json'
[...]
 151 files changed, 2183 insertions(+), 4719 deletions(-)

My website looks absolutely no different in any way, shape, or form after all of this, but when writing code it’s quite nice having all of Visual Studio Code‘s smarts (things like complaining when you’ve missed a required parameter when calling a function, auto-completion, and so on).

Having moved to raw Express.js from Sails.js means I have a much better understanding of how it all works under the bonnet… Sails is great for getting up and running quickly, but there’s a lot of magic that happens in order to accomplish that, and more than once I’ve run into the boundaries of where the magic ends and have had to try to hack my way around it. Express by itself is a lot more widely-used than Sails too, so if I run into problems I generally have an easier time finding an answer to it!

Twenty years of VirtualWolf

Twenty years ago today marks the earliest point I can find of where I started going by “VirtualWolf” online! That’s over half my life. 😮

I had posted previously back in 2009 (when this blog was on LiveJournal) about being VirtualWolf online for around ten years at that point, but it was pretty vague in terms of dates, and I’ve since consolidated all my old websites and put them up online. Further digging reminded me that I have a whole bunch of other sites that I never actually finished — I should add those to archive.virtualwolf.org too, now that I think about it — and there was one called “DevlinSlayer’s Imperium” from February of 1999, so it was clearly after that.

The earliest mention of VirtualWolf I can find is from my Realm of the Wolf site from the 28th of June 1999, and the name of the Myth II map I’d created, “Realm of the VirtualWolf”. Unfortunately that file has been lost and I can’t for the life of me find the original anymore. I was able to recover all but one of my Marathon maps from various places, but I can’t even find the original pre-compiled image files for the Myth II map. The only related image is the overhead map view from the website I created after Realm of the Wolf.

Since that point, it’s been all VirtualWolf, all the time. I’ve owned the domain virtualwolf.org since the start of 2002, and have made a point of ensuring all the old links to images I’ve posted here and on Ars Technica still work even now.

Here’s to me making a post in twenty years saying “Forty years of VirtualWolf!”. 😛

Ten years of marriage!

Ten years ago today, on a very cold and windy but at least sunny day at Dee Why headlands, Kristina and I got married!

TEN. YEARS.

I have no idea how it’s been ten years, it doesn’t remotely feel like it’s been that long. I’ve mentioned on this blog before, and on LiveJournal before it, that everything is so effortless, and it still remains true!

We originally knew each other from Everything2, which is still around but very much dead compared to the old days, and has been for many years now (my registration date there is May 2000). There was a bit of a mass-migration from E2 over to LiveJournal a couple of years afterwards, and I have a happy birthday wish from Kristina on one of my LJ posts from 2003! She said I was always “That guy in Australia who likes metal”, but we got to chatting more towards the end of 2007, and then on a whim she decided to come visit in March of 2008, and the rest, as they say, is history!

With both of us being keen photographers we tend to be behind the camera instead of in front of it, but we’ve got a few photos together over the years!

Kristina and I
The first one of us together, in August 2008 in Boston the first time I visited (we don’t have any of the two of us from March when Kristina first visited Sydney). This was right before I trimmed my goatee entirely down because it’d started just triangulating outwards and getting all wispy.
Untitled
November 2008, we were up in the Blue Mountains when we got the call that the engagement ring was ready to be picked up!
Untitled
Then our wedding, of course. I love this photo so much!
December 2011, being all arty!
Selfie!
This was taken in May 2014 when we bought the Fujifilm X100S. Kristina looks so hilariously unimpressed.
Kristina and me
December 2015!
Untitled
Then an attempt in February 2017 at taking a photo with the two of us and Beanie. It didn’t go so well.
Untitled
And finally the most recent one of us together from December of 2017, taken with the flash and massive parabolic umbrella directly behind the camera.

In the time we’ve been married we’ve been to:

And that’s not counting the day trips or single night trips of which there have been plenty.

Here’s to many more years of adventures to come! ❤️

New bathroom!

The bathroom in our house was always a little bit crap; it’d clearly been done on the cheap many years ago, and we wanted to get a new one put in. Conveniently, my parents were going away on holidays for three weeks last month, so we decided this was a pretty perfect time to do it. We schlepped our stuff over to their place and stayed there while the old bathroom was ripped out and a brand new shiny one put in! (We also got the second toilet redone in the same style, so there were no toilets at all, hence the inability to stay at home while this was being done; Beanie would also have lost his mind with the tradies being over all the time.)

I borrowed a really wide-angle lens from a friend in order to take proper before and after photos; this is the old bathroom (the shower screen is filthy because we’d given up on cleaning it by this point).

A wide-angle view of a bathroom, the floor tiles are sickly grey and the grout looks dark just from being dirty. A cheap-looking white vanity with a single sink is in the middle of the photo, the shower screen around the shower is at the left and is totally soap scum-encrusted. The tiles in the shower are black with discoloured grout between them. A toilet roll holder is attached by suction cup to the outside of the shower screen next to the toilet, that's just visible at the bottom-left.
A wide-angle view of a bathroom, the floor tiles are sickly grey and the grout looks dark just from being dirty. A cheap-looking white vanity with a single sink is in the right of the photo, the shower screen around the shower is in the middle and is totally soap scum-encrusted. The tiles in the shower are black with discoloured grout between them. A toilet roll holder is attached by suction cup to the outside of the shower screen next to the toilet, the toilet itself is small and very plastic-looking and sits next to the shower. In the ceiling above the shower is an EXTREMELY yellowed plastic fan vent.
A wide-angle view of a bathroom looking towards the bath, the floor tiles are sickly grey and the grout looks dark just from being dirty. The bath is in the middle of the photo, with black tiles around it, the shower screen is at the right of the photo and the vanity is just visible at the bottom-left.

The grout on the floor was all dirty and discoloured, the tiles were thin, clearly cheap, and quite ugly, and the vanity felt really cheap too; we’d had to clean mould out of the inside of the cupboards more than once.

Kristina was interested in doing the bathroom in a much more modern and minimalist style than we’d done with the kitchen, to minimise the number of nooks and crannies that would need cleaning and to ensure it didn’t end up looking cluttered. We decided on white wall tiles and slate grey floor tiles, plus a wall-hung vanity and wall-mounted taps.

The guy who was overseeing the whole lot sent us daily updates on how it was going, and it was fascinating to see everything ripped out and just the bare frame and insulation after they’d finished on the first day.

The bathroom with no tiles or gyprock on the bottom half of the walls, just bare timber frame and insulation.

After that, each day’s updates were just more and more things being put in. We were able to come back on the Saturday of the third week, though the shower screen itself wasn’t in by that point. On the Friday of the following week the guy came to install it, and with that it was done, and it looks absolutely amazing!

The fantastic new bathroom, a white wall-hung double-basin vanity and wall-mounted chrome taps are in the middle of the photo, a shaving cabinet with mirror above, and two lovely in-ceiling LED lights above that. To the left is a waist-high thin wall tiled in white, and that side of the shower screen starts at the top of the wall. The wall tiles are white and the floor tiles are a dark slate grey.

There is a towel on the wall that's a dark pink, and on the right of the vanity is a small towel ring that has a hand towel of the same dark pink colour.
The fantastic new bathroom, a white wall-hung double-basin vanity and wall-mounted chrome taps are at the right of the photo. In the centre is a waist-high thin wall tiled in white, and that side of the shower screen starts at the top of the wall. Next to the wall sits a white ceramic toilet, and the toilet roll holder is a chrome metal one that's attached to the waist-height wall. The wall tiles are white and the floor tiles are a dark slate grey.
The fantastic new bathroom, a white wall-hung double-basin vanity and wall-mounted chrome taps are at the left of the photo. In the centre is the bath surrounded by white wall tiles, at the right is a waist-high thin wall tiled in white, and that side of the shower screen starts at the top of the wall. The wall tiles are white and the floor tiles are a dark slate grey.

A dark pink towel hangs on the wall over the left end of the bath.

There are lovely LED lights above the vanity, and the mirror is actually a shaving cabinet, so we can put stuff into it and not have to rummage around in a dark cupboard. As we discovered with the kitchen and the drawers in there, drawers are far superior to cupboards for anything but really shallow depths.

We didn’t even change the layout; everything is in the exact same position as it was before, but it just feels so much larger and more spacious. It’s wonderful!