Regarding things that I have done or made, ranging from the very minor to the more major. Output is the result of input and sometimes ephemera.

Screenshot of britishearways.com

Last month, I completed a major overhaul of the British Earways website. The design by Valerio di Lucente of Julia is almost entirely unchanged; the adjustments were largely performance-related and under the hood, geared towards modern browsers. Here’s a brief rundown of the changes:

  • Style the full-window player layouts using CSS Grid Layout + 100% height (not 100vh since that can lead to unexpected behaviour on mobile browsers), and use CSS Scroll Snap w/ polyfill for scroll behaviour
  • Achieve flexible typography and spacing with “CSS Locks”
  • On non-touch screens, implement invisible DragDealer instances so that each player’s scrubber can be dragged
  • On touch screens, add click event listeners that advance the relevant scrubber to the click target
  • Use styled HTML5 progress elements for each player since these are easily manipulated via their max and value attributes and don’t require adjustment if the window is resized (see the sketch after this list)
  • Use the Web Audio API to initialise each audio file and trigger the necessary state changes as the time updates
  • Switch the audio preload attribute from auto to metadata to reduce the size of the page when it initially loads
  • Update CMS to Kirby 3 (this was a joy, IMO the panel layout options make v3 much more client-friendly)
  • Adjust post_max_size, memory_limit, max_execution_time, and upload_max_filesize to allow upload of large (150MB+) audio files
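
To make the wiring above a little more concrete, here’s a minimal sketch of one player. It’s illustrative rather than a copy of the production code: the markup and selectors are assumed, and it leans on plain HTMLMediaElement events (loadedmetadata, timeupdate) where the site itself goes through the Web Audio API.

```js
// A minimal sketch of one player, assuming markup along these lines
// (names are illustrative, not the production code):
//
//   <audio src="mix.mp3" preload="metadata"></audio>
//   <progress class="scrubber" max="1" value="0"></progress>

const audio = document.querySelector('audio');
const scrubber = document.querySelector('.scrubber');

// Once the metadata has loaded, the duration is known and can seed the
// progress element's max attribute.
audio.addEventListener('loadedmetadata', () => {
  scrubber.max = audio.duration;
});

// Keep the progress element in sync as playback advances. Because max and
// value are resolution-independent, nothing needs adjusting on resize.
audio.addEventListener('timeupdate', () => {
  scrubber.value = audio.currentTime;
});

// Tap/click-to-seek: translate the horizontal position of the click into a
// playback position.
scrubber.addEventListener('click', (event) => {
  const rect = scrubber.getBoundingClientRect();
  const fraction = (event.clientX - rect.left) / rect.width;
  audio.currentTime = fraction * audio.duration;
});
```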

I ran into one issue that isn’t yet resolved. Kirby copies all uploaded media from the private /content folder to the publicly-accessible /media folder. This copying normally happens almost instantly, even with very large files. On the BE site, however, the copying is pretty slow. Since the site pulls the audio duration from the audio file itself via the Web Audio API, the displayed duration is incorrect until the file has finished copying. This is almost certainly related to some rate limiting by the shared hosting company, a legacy from the pre-existing site. It isn’t a huge deal since the copying always finishes eventually, but it isn’t the best behaviour. I’d like to raise the issue with the hosting company but don’t have high hopes; shared hosting providers use rate limiting for a reason.
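
To illustrate why the number goes stale, here’s a hypothetical version of the duration label wiring (simplified, with plain media-element events standing in for the Web Audio API call): the duration is read once, when the file’s metadata arrives, so whatever length the half-copied file reports at that moment is what gets rendered until a later page load re-reads it.

```js
// Hypothetical duration label (not the production code). The duration is read
// a single time, so a partially copied file yields a short, wrong figure until
// the page is loaded again after the copy has finished.
const audio = document.querySelector('audio');
const durationLabel = document.querySelector('.duration');

function formatDuration(totalSeconds) {
  const mins = Math.floor(totalSeconds / 60);
  const secs = String(Math.round(totalSeconds % 60)).padStart(2, '0');
  return `${mins}:${secs}`;
}

audio.addEventListener('loadedmetadata', () => {
  durationLabel.textContent = formatDuration(audio.duration);
});
```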

At any rate, I’m really looking forward to seeing how DB uses the site over the next year and listening to the new mixes.

Spread from book on Henry van de Velde by Richard Hollis

Richard Hollis’s Henry van de Velde: The Artist as Designer is out at long last. A lot of love, sweat, and tears have gone into that book. It is absolutely jam-packed, covering pretty much all of HvdV’s life with over 400 images. As part of Occasional Papers, I worked on the permissions, did a bit of editing, and compiled the index.

Read about how the index was compiled

The folks at Penguin Random House have recently been sending some particularly strong e-newsletters using the system Sam and I created a little while back. Links below.

I’ve been spending a bunch of time on the Host site recently and just wrote up some thoughts about working with Netlify CMS and GitHub Pages on SB-PH’s tucked-away blog.

TL;DR
Though it’s an unusual setup for a client site, I like the stack and would consider using it again for a similar project.

Read post

Edit 23 Jan 2019
I just deployed some small fixes (force curly quotes via the smartify filter, prevent Cards from showing if there’s no image), but the site doesn’t seem to be updating. The changes show up if I navigate to https://hostofleyton.com/index.html but not at https://hostofleyton.com. Kind of weird. This StackOverflow thread seems useful, as does GitHub’s own troubleshooting page.

My experience getting up and running with Homebase

I finally got round to exploring Homebase yesterday (jump straight to setup steps). My original intention was to get the SB-PH site on Dat + HTTPS à la this blog post by Tara Vancil. As far as I can tell though, without multi-writer support in Dat this setup would effectively lock Sam out of being able to quickly deploy changes. We’re interested in making that site a little bit more of a collaborative sandbox, so making deployment harder than it is currently is not the right step to take there.

So though I definitely want to get the SB-PH site on Dat eventually, we’re putting that on hold for now and I’m pivoting towards my site. This blog’s earliest incarnation was on Tumblr, and for a long while now it has been a pretty standard WordPress site. The big task in moving to Dat, besides figuring out Homebase, is converting my site from WordPress to a static site via Jekyll/Hugo/Eleventy/GatsbyJS or something similar. It’s taking a while; I didn’t realise quite how much content had accumulated (1000+ tags?!), and there are a few WordPress-y features that I definitely want to build in (“more” tags, descriptions for tags+categories, proper pagination, etc.). More on that in a separate note.
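
Nothing is settled yet, but to give a sense of the glue code involved, here’s a rough sketch of how that pile of tags might be gathered if Eleventy ends up being the pick (the collection name is an assumption, not anything that exists yet):

```js
// .eleventy.js — build a de-duplicated, sorted list of every tag used across
// the content, which a template could then paginate or pair with descriptions.
module.exports = function (eleventyConfig) {
  eleventyConfig.addCollection('tagList', (collectionApi) => {
    const tags = new Set();
    collectionApi.getAll().forEach((item) => {
      (item.data.tags || []).forEach((tag) => tags.add(tag));
    });
    return [...tags].sort();
  });
};
```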

So yesterday I put that aside and focused on getting Homebase up and running on a DigitalOcean droplet. Overall, setting up Homebase wasn’t too bad. The most involved part of the process was setting up the server. I kind of like tinkering with server stuff, so that’s cool. I 100% agree with the caveat at the top of the Homebase README: you should consider Homebase only if you’re comfortable with and interested in server administration. I would add that your interest should be *ongoing*. Servers take maintenance (related, see note on serverless setups). It’s your responsibility if a process stops running, or the software is out of date, or the Let’s Encrypt certificate doesn’t renew, etc. Hashbase looks like a great alternative for those who want the final result but don’t want to deal with the server configuration/maintenance.

The rest of this note is an outline of the steps I took to get Homebase working. Where good documentation exists elsewhere, I have linked to that instead of elaborating.

Read Homebase setup steps