Published

Working with Netlify CMS and GitHub Pages

This was originally posted on sb-ph.com. I’ve moved it to my blog since we’ve decided to remove the blog from that site.


Heads up: this post is kind of old! I’ve got mixed feelings about this approach now for a few different reasons, in no small part due to Uploadcare’s new pricing. No dig at them, they gotta do what they gotta do, but it puts this out of most of my clients’ budgets. If you’re interested in additional thoughts on JAMstack CMSs, you might want to take a look at this post.


We’ve recently been exploring a lightweight CMS setup for the Host site. This post summarises the thought process behind our decision to work with Netlify CMS and GitHub Pages.

TL;DR
Though it’s an unusual setup for a client site, I like the stack and would consider using it again for a similar project.

Evaluating Netlify CMS

There are two potential downsides that would rule out Netlify CMS for most of our client projects. One is that user account setup isn’t super straightforward; the authentication method makes this a little more complicated than normal. The other is that the hosting “ownership” is tied to a GitHub / GitLab / Bitbucket repository. All in all, this approach demands a slightly higher level of technical know-how from the client than most of the other CMSs we regularly use (Craft, Kirby, WordPress, etc.).

The upside is the relatively low cost to the client, both in the short and long term. Netlify CMS is free, as are a few static hosting providers (Netlify and GitHub Pages spring immediately to mind). The login authentication is the only step that requires a server, since the keys have to be kept secret. Previously, that server would have been the big ongoing time and money sink, but certain platforms, such as Netlify, now offer selected webtasks / microservices / cloud functions for free.

When evaluating these pros and cons against the Host website requirements, it became apparent that this would be a pretty good fit. The original Host static site was built with Jekyll and jekyll-seo-tag, so we knew it would work nicely with GitHub Pages and Netlify CMS. GitHub recently introduced free private repositories for up to three collaborators, so we felt less concerned about the site hosting being tied to a repository that we own; we could transfer it to Host at some point without feeling pressure to make the repository public. And the Host folks are a pretty tech-savvy bunch, so I wasn’t worried about them being daunted by things like repositories or GitHub-linked authentication.

Working with Netlify CMS widgets and media

So I got started with Netlify CMS. The first thing I dove into was the widget (field) configuration, and I was impressed: the widget configuration options are more fully featured than I would have expected from a roughly 1.5-year-old CMS. While configuring the widgets I also gave the custom editor Preview components a try; ultimately, though, I abandoned that experiment and disabled the editor Preview. It’s out of the scope of this project, and maintaining those React components alongside a non-React site seems a little dicey. Something to explore separately at a later date, perhaps.
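To give a flavour of that configuration, a collection in config.yml maps fields to widgets. The collection and field names below are purely illustrative, not the actual Host setup:

```yaml
# admin/config.yml — a minimal, illustrative collection definition
collections:
  - name: news
    label: News
    folder: _posts   # where Netlify CMS commits the entries
    create: true     # allow editors to create new entries
    fields:
      - { name: title, label: Title, widget: string }
      - { name: date, label: Date, widget: datetime }
      - { name: body, label: Body, widget: markdown }
```

Each widget maps to an editor control and a value type in the committed file’s front matter or body.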

Media management is a concern on static sites, and this one is no exception. We could have gotten away with hosting the images in a directory in the GitHub repo, but we wouldn’t have had any nice image transforms for faster page loads, and we would have had to ask the client to do all of the heavy lifting with image resizing and optimisation. I would also have worried about the repository size spiralling out of control with overly large image files. Netlify CMS v2.1.0 does, however, offer the Uploadcare media widget by default. Uploadcare’s free tier seemed to be a very good fit for the Host website, so we got that set up as well. The implementation was pleasingly straightforward. As someone who has spent a lot of time debugging image transformations in more traditional CMSs, Uploadcare’s URL-based system is refreshing.
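For reference, switching the media library over to Uploadcare is a short config.yml addition; the public key below is a placeholder, not a real project key:

```yaml
# admin/config.yml — use Uploadcare instead of the default media handling.
# publicKey is a placeholder; the real key comes from the Uploadcare dashboard.
media_library:
  name: uploadcare
  config:
    publicKey: YOUR_UPLOADCARE_PUBLIC_KEY
```

Transforms are then appended to the CDN URL itself, e.g. something like https://ucarecdn.com/FILE_UUID/-/resize/800x/ to resize on the fly, rather than being configured server-side.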

Configuring GitHub authentication for Netlify CMS + GitHub Pages

The final critical step was the authentication configuration. Netlify CMS offers a number of different methods. We knew we wanted to use GitHub Pages for hosting, so the two most likely options were Git Gateway with Netlify Identity or GitHub with Netlify. We went for the latter. It does require that all users have a GitHub account (unlike the former); however, this ultimately feels like the right approach, since much of the site documentation lives in the repo’s README and it gives the client a bit more ownership over, and awareness of, the inner workings of their site.

The authentication was a little more complicated than the rest of the Netlify CMS setup. I followed the GitHub backend instructions, which are relatively minimal and geared more towards a site hosted on Netlify. I found this article more helpful, since it addressed exactly what we were after: a Netlify CMS-powered site hosted on GitHub Pages. One potential “gotcha” in these steps is that the OAuth application should be registered on whichever GitHub account owns the repo. I initially added it to my user account, which was incorrect, since our organisation account owns the repo.
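For context, the backend section of config.yml for this arrangement ends up looking something like the sketch below. The repo and branch are placeholders, and base_url points the login flow at Netlify’s OAuth service rather than a self-hosted one:

```yaml
# admin/config.yml — GitHub backend, with Netlify brokering the OAuth handshake
backend:
  name: github
  repo: your-org/your-repo   # placeholder — must be the account that owns the repo
  branch: master
  base_url: https://api.netlify.com
```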

After following the GitHub backend instructions, I only ran into two hiccups. I’m documenting them below since I didn’t find much related information elsewhere.

One problem was a post-login “dead end”. After clicking “Login with GitHub” and signing in successfully in the pop-up window, the user should be redirected to the Netlify CMS dashboard automatically. Instead, it just said “Authorized” and did nothing. This was because I hadn’t added the website URL to the Netlify project. I had assumed I didn’t need to add the URL since Netlify is not hosting the site; however, the Netlify authorisation callback script checks the host URL against the project’s custom domains. Once I added the URL to the project in Netlify, the issue resolved itself. Note that I did not repoint the DNS to Netlify since we still want to host the site on GitHub Pages, so Netlify shows a little error / warning regarding the URL’s DNS records. This is not a problem.

So I was now being redirected successfully to the dashboard, but the dashboard was just a white screen. The console indicated an error with loading netlify-cms.js from Unpkg, specifically:

Refused to execute as script because "X-Content-Type-Options: nosniff" was given and its Content-Type is not a script MIME type

This seemed possibly related to some security-related GitHub Pages headers, so I grabbed the JS from the Unpkg URL and committed it to the repo, no longer loading the script from a CDN. This immediately fixed the problem.
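Concretely, the change was to download the bundle once and point the admin page’s script tag at the committed copy. The paths and version below are illustrative, not the exact ones from the project:

```html
<!-- admin/index.html — before: bundle loaded from the Unpkg CDN -->
<script src="https://unpkg.com/netlify-cms@^2.0.0/dist/netlify-cms.js"></script>

<!-- after: a local copy committed to the repo -->
<script src="/admin/netlify-cms.js"></script>
```

The trade-off is that updating the CMS now means re-downloading and re-committing the file rather than picking up new versions from the CDN automatically.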

Final thoughts

I’d like to do a bit more exploration of Netlify CMS moving forward since it could be appropriate for more use cases depending upon the developers’ roadmap. I will probably look at the Git Gateway with Netlify Identity route for authorisation as well. Besides Netlify CMS though, we’re really interested in exploring Kirby 3. Kirby 2 has been excellent for a number of small-ish websites (<5 stakeholders, not a ton of relational data), so we’re excited to see where they’ve taken the newest version and if it could work with projects of a slightly larger scale.

Published

Saturday at Mozfest 2018

SB and I went to Mozfest for the first time last Saturday. What a lovely day! I took some haphazard notes throughout; see below for a dump of notes/links related to the sessions I attended. The bits in brackets are mostly thoughts that bounced around my head while taking notes during talks. All quotes are paraphrased.

Read more

Published

My experience getting up and running with Homebase

I finally got round to exploring Homebase yesterday (jump straight to setup steps). My original intention was to get the SB-PH site on Dat + HTTPS à la this blog post by Tara Vancil. As far as I can tell though, without multi-writer support in Dat this setup would effectively lock Sam out of being able to quickly deploy changes. We’re interested in making that site a little bit more of a collaborative sandbox, so making deployment harder than it is currently is not the right step to take there.

So though I definitely want to get the SB-PH site on Dat eventually, we’re putting that on hold for now and I’m pivoting towards my site. In this blog’s earliest incarnation it was on Tumblr, and for a long while now it has been a pretty standard WordPress site. The big task in moving to Dat, besides figuring out Homebase, is converting my site from WordPress to a static site via Jekyll/Hugo/Eleventy/GatsbyJS or something similar. It’s taking a while; I hadn’t realised quite how much content has accumulated (1000+ tags?!), and there are a few WordPress-y features that I definitely want to build in (“more” tags, descriptions for tags + categories, proper pagination, etc.). More on that in a separate note.

So yesterday I put that aside and focused on getting Homebase up and running on a DigitalOcean droplet. Overall, setting up Homebase wasn’t too bad. The most involved part of the process was setting up the server. I kind of like tinkering with server stuff, so that’s cool. I 100% agree with the caveat at the top of the Homebase README: you should consider Homebase only if you’re comfortable with, and interested in, server administration. I would add that your interest should be *ongoing*. Servers take maintenance (related, see my note on serverless setups). It’s your responsibility if a process stops running, or the software is out of date, or the Let’s Encrypt certificate doesn’t renew, etc. Hashbase looks like a great alternative for those who want the final result but don’t want to deal with the server configuration/maintenance.

The rest of this note is an outline of the steps I took to get Homebase working. Where good documentation exists elsewhere, I have linked to that instead of elaborating.

Read Homebase setup steps

Published

Agorama #2: exploring Scuttlebutt

A wall in Rebecca’s Flat at Raven Row

This past Thursday 18 October was the second Server Co-op meetup in Rebecca’s Flat at Raven Row. See all Server Co-op notes.

I didn’t take as many notes this time; I wasn’t feeling fantastic. Very sketchy notes below.


click public button twice if the Patchwork feed seems stuck after first install

how to have Scuttlebutt on multiple devices?
eh, maybe not worth the hassle, just use one device
“sameAs” is currently being worked on by devs in Scuttlebutt community

identity = private + public + network key combo
lib sodium

back up private key and gossip.json

dark crystal for backing up private key using social network

“shamir’s secrets” algorithm
kind of like horcruxes!

with Scuttlebutt, your friends are your cloud/datacentre

nothing is ever deleted (same as Dat)

could technically have multiple identities, but functionality isn’t implemented currently. Would have to swap .ssb directories

Published

Agorama #1: outstanding home decor + P2P

The rug in Rebecca’s Flat at Raven Row

Last night I went to the first Server Co-op meetup hosted by Agorama in Rebecca’s Flat. It’s a more-is-more space, and then some. It was a lovely evening. Notes:

Check out infocivics.com by Paul Frazee. “Computing networks are social and political systems. We should attempt to answer how the technical design of a network will influence the internal politics.”

There *is* a mobile Dat browser, but apparently it’s a bit… buggy. See Bunsen for Android (nada for iOS). Still, kudos to them for taking a stab at it. Apparently the project of making a Dat browser sort of hits a brick wall due to node.js, but a bunch of devs have taken it upon themselves to make a Rust implementation of Dat. TBH I don’t understand the ins-and-outs well enough to be able to describe how that lowers the barrier, but it sounds like the future of mobile Dat might be brighter for it.

I haven’t dug into Scuttlebutt yet, and it sounds like it’s about time. It’s an offline-first protocol, described by KG as a database/social network/community. See also Patchwork. I feel like I heard HL say that it came about after the 2011 Christchurch earthquake, due to the difficulties at the time with having any sort of connectivity, but that might be wrong?

And crucially, are there ethical conversations around P2P tech that we’re failing to have, or happily skating past? I’m thinking about when Facebook and similar now-giants were in their nascent stages: surely some of the current nastiness could have been avoided if the making had been accompanied by a little more thinking, more extrospection? How do you wrap your head around the potential ethical implications of something that doesn’t yet exist? I found KB’s anecdote interesting: a few fascistic idiots attempted to hijack Scuttlebutt but were almost immediately, organically, blocked from having any meaningful impact. It feels great, but who’s to say they’re not off in their own node somewhere trolling away? It feels awful to think that Scuttlebutt might be harbouring some sort of extreme-right cell, and yet maybe so be it; should it be a decentralised network’s responsibility to police that? How on earth would that work anyway?


Separate: I got my hair cut by Dean last week and am very pleased. When it’s styled it’s a bit Josie Packard (fabulous) and when not styled, it’s very Shawn Hunter (not totally a bad thing).

Published

Resolving Craft 3 Setup Wizard error

I keep encountering issues when running Craft’s setup command locally. Note that I use MAMP Pro for this sort of thing. I entered all the database creds correctly, and then got a SQLSTATE[HY000] [2002] No such file or directory error. This StackExchange answer sorted it for me. Add 'unixSocket' => getenv('DB_SOCKET') to /config/db.php and DB_SOCKET="/Applications/MAMP/tmp/mysql/mysql.sock" to .env.
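For reference, the resulting /config/db.php looks roughly like the sketch below; the 'unixSocket' line is the fix, and the other keys are Craft’s standard ones filled from .env:

```php
<?php
// config/db.php — route Craft through MAMP's MySQL socket.
// DB_SOCKET lives in .env, e.g. "/Applications/MAMP/tmp/mysql/mysql.sock"
return [
    'driver' => getenv('DB_DRIVER'),
    'server' => getenv('DB_SERVER'),
    'user' => getenv('DB_USER'),
    'password' => getenv('DB_PASSWORD'),
    'database' => getenv('DB_DATABASE'),
    'unixSocket' => getenv('DB_SOCKET'),
];
```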

Still encountering database connection issues on staging for one site currently under development. All of the credentials are set correctly in .env, but getenv() in /config/db.php retrieves the wrong DB_USER value. I ended up explicitly adding the problematic value to the /config/db.php file as a quick workaround, but it’s not ideal.

Published

Exploring the use cases for serverless website architecture

Last Saturday, Sam introduced me to Chris Coyier’s talk on serverless-ness, The All-Powerful Front-End Developer. Pretty interesting and useful. I’m glad he leads with breaking down the problematic nature of the word “serverless”! The following day was spent in agorama’s p2p workshop at Furtherfield. Coincidentally, there is a lot of overlap between these topics.

I’ve spent the past few days wrapping my head around all of this, contextualising it against the sorts of concerns and projects we work with. Though I desperately want to get going with Dat, I’m starting with serverless because it may solve an urgent need in my day-to-day work. Right now, I’m spending much more time than I realistically can maintaining CMSs and hosting environments for older websites.

All of the below is a thought dump on the topic, an attempt to pick apart the meaning of and the use cases for a serverless website architecture.

Read more

Published

cURL + Airtable + ./jq = squeaky clean JSON

We’re working on a new site for SB-PH at the moment, and we’re using Airtable to get our project documentation together. It’s also a good opportunity to test the platform a little (+ I’m a fan of tables). To grab tidy JSON for use with data-friendly design software like Sketch, we’re using the Airtable API with cURL and ./jq.

Simple example that dumps table records into a JSON file for use with the Sketch Data Populator plugin:

$ curl https://api.airtable.com/v0/YOUR_BASE_KEY/YOUR_TABLE_NAME -H "Authorization: Bearer YOUR_API_KEY" | jq '.records' > records.json
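The .records filter above keeps Airtable’s full record wrapper ({id, fields, createdTime}); if you only want the field values themselves, jq can flatten that too. The inline sample below stands in for a real API response:

```shell
# Airtable wraps each row in {id, fields, createdTime}; keep just the fields.
echo '{"records":[{"id":"rec1","fields":{"Name":"Alpha"},"createdTime":"2018-10-01T00:00:00.000Z"}]}' \
  | jq -c '[.records[].fields]'
# → [{"Name":"Alpha"}]
```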

Published

Research involving NAS, backups, storage, etc.

Aside: Thumbs up to Katie Floyd’s Policies info. Super clear.

Edit: See well-timed Guardian article “Ask Jack: Should I buy a NAS drive to back up my laptop?”

Edit 15 March 2019: Katie Floyd seems to have taken her site offline, and her post about NAS usage isn’t archived in the Wayback Machine. 🙁

Published

Chinese web font research

Did some research on Chinese web font best practices a while back when working on Memory Machine for Tyler Coburn + Asia Art Archive with Luke Gould. It was an interesting challenge. This was my overall takeaway from the research:

  • Self-hosted fonts are out; the font files are prohibitively enormous due to the number of characters
  • The Great Firewall can cause issues with most font services, so no Google Fonts or Typekit
  • If you need to render a mixture of Latin and Chinese characters and want them to use different fonts, the font stack structure and naming is critical (see article by Kendra Schaefer for more info)
  • Bold and italic should never be used for emphasis on Chinese characters since it distorts their meaning
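On the font stack point, the usual trick is to list the Latin face first so Latin glyphs never fall through to the Chinese font, with quoted Chinese system-font names as the fallback. The families below are illustrative, not the stack actually used on Memory Machine:

```css
/* Latin serif first; Chinese glyphs fall through to the system fonts.
   "Songti SC" is a macOS Chinese serif; "SimSun" is its common Windows counterpart. */
body {
  font-family: Georgia, "Songti SC", "SimSun", serif;
}
```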

Read more