Published

Agorama ~#5: distributed web, quantum, crypto, and a dash of CS history

’70s wallpaper in Rebecca’s Flat at Raven Row, London

Last night was my third Agorama Server Co-op meetup in Rebecca’s Flat, a delightfully dilapidated space at Raven Row. I think it was actually the fifth though; I missed the last two due to illness, which was a real bummer. The weekend jam sounded particularly great.

This particular meetup was more informal and a little smaller than usual. It ended up being a really nice, wandering conversation on the multifaceted possibilities of the distributed web and what it could look like.

The notes below are a sort of prompt dump, snippets I wrote down at the time because I didn’t want to forget them or wanted to look into them more. See all Server Co-op write-ups here.


Dark Crystal is now up and running on Patchbay (ssb client). Got Samsung funding, woohoo! Possible to create bot that receives shard? Think they’re trying to avoid that, the human element is kind of critical.

What about physical crypto? Microdots are worth checking out. Microdot tattoos?

Asked what ppl think about the potential threat of quantum computing to modern cryptography methods, and the response wasn’t quite what I expected (this is why I come to these things!). Personally I’ve been feeling a little tin-foil-hat-y, but the general consensus from the other voices in the room seemed to be pretty ambivalent, since the theory currently far outstrips the practicalities. Which is true, but it also just feels like an arms race (particularly since it involves hardware / infrastructure). Whoever cracks it first wins the golden goose unless we can come up with cryptography that works against it. GP then mentioned the NIST post-quantum crypto contest due to end pretty soon, which looks promising. I didn’t realise there was that much going on with quantum-resistant algorithm research, so that makes me feel a bit better. My concern is still there though, to a large degree. Banks, for example, run on notoriously crappy tech that is rarely overhauled. What of them, and the other institutions we rely on? Oh lord, and voting tech…

Got talking about what I’d been up to (not much, see first para…) and mentioned that I ultimately decided not to move my site onto Dat, partly due to scale issues w/ static site generators (read more on this) but more because I think I’d rather use Dat for something new and neato, rather than just repurpose something that already exists and is doing ok in its current form. Then we started talking about static site generators more generally and someone mentioned Pelican, which I hadn’t come across before. It’s written in Python and was originally released in 2010 (!), so it’s up there with Jekyll as one of the earlier static site generators.

HL demoed his mother-of-all-apps for us, and it looks *so great*! Absolutely something I would use. Really excited to see where he takes it. I need to look into Hypercore and Expo a bit more. The first I’d heard of, the second not so much. Apparently Expo is a cross-platform app framework built around React Native. Ppl could not say enough good things about it and honestly, it does look fantastic. Particularly as a tool to dip your toe into app waters, so to speak.

Towards the end of the demo, the conversation wound through lots of different topics. Blockchain, platforms vs aggregators, a bunch of CS history (need to read more about that…), the sustainability of open source, etc. The rest of this note details snippets from this part of the conversation that I need to look into more.

Services / apps / platforms I’d like to look into a bit:

  • Mapeo, an “open source, offline-first map editor”
  • Manyverse, kind of Scuttlebutt for your phone but better (shouldn’t suck the life out of your phone trying to sync)
  • Node.js for mobile apps
  • Webrecorder, like a personal Wayback Machine; also, did you know you can sometimes find YouTube vids that have been taken down archived on the Wayback Machine?
  • TMYK

A reading list. (Some of these links are painful to open; some orgs really need to cool their jets on the pop-ups and trackers):

Some soundbites. These are paraphrased points made by others that I found super-relevant. Bits in square brackets are added by me for clarity:

  • “Ordering is the toughest thing to sort out” [when it comes to ledgers / append-only logs]
  • “Biggest problem with blockchain is the definition of consensus, and how to establish consensus”
  • “Article 13 [aka the “upload filter” provision] is forcing people’s hand, we’re going to see a lot more of this.”
  • “So much of this bullshit has come from chasing the technology and not the needs.” Related: “But seriously… does it need to be an app?”
  • “The future of the web will be much more about interoperability than a black-and-white, decentralised vs centralised approach.”
  • “Porn is a canary in the coal mine for whether a piece of tech is ready for primetime.” [Is someone using it for porn? Ok, it’s going to gain traction.]
  • “Could we ever have another Xerox PARC?” “Probably not, research now is just too results-driven. A report every week, and sometimes the funder has already indicated what they’d prefer your results to be.”

So many distributed / decentralised web conversations get quasi-evangelical about how this or that tech will save the world. Why does it have to be winner-takes-all? Different needs require different technologies.

We recognise biodiversity as a fundamental requirement of a healthy, thriving biosphere. Why don’t we champion technodiversity in the same way? Embrace the chaos.

Published

Q&A related to privacy-first messaging apps

I rely heavily on messaging services since many of my friends and family (probably the majority) live outside of the UK, as do some critical professional contacts. I mainly use WhatsApp for encrypted messaging but have wanted to move away from it for some time due to concerns about Facebook. The recent news regarding the integration of WhatsApp, Instagram messages, and Facebook Messenger has been the catalyst for actual change within my group of peers.

The Q&A below is an amalgamation of many different conversations I am having at the moment about moving to a more privacy-first messaging app. I have focused on Signal and Telegram for the time being since they seem to be the most likely candidates.

I’ve done my best to pull together this information in a fairly short time, and some of it is new to me. If any of it seems incorrect, let me know.


I have nothing to hide, and I have no fear of my data being used against me by a private company or the government. Why should I make data privacy a priority when I’m choosing a messaging app?

There are many ideological arguments against the “I have nothing to hide” viewpoint, most of which I agree with. That said, it can be near-impossible to agree 100% on ideology, so perhaps it is better to consider the practical.

When your messages are not encrypted, their contents are visible to anyone who has access to them. In an ideal world that would only be you, the recipient, and whatever app you use to manage your messages. Unfortunately, the reality is more complicated and there are many weak points that can be exploited. For example, if the WiFi network you’re on is insecure, your messages can be exposed to unintended prying eyes. Think of the last time you connected to WiFi in an airport, hotel, or cafe. Was it always password protected? Was it clear who supplied the network?

You may not be worried even if your messages were compromised; surely there is nothing in them that could be of consequence. But what about the photos of your adorable 4-year-old niece from your sister? The online banking details you sent to your partner when the rent payment failed and they needed to sort it out? The message to your worried mother about your blood test results? The company Twitter password you sent to a co-worker who urgently needed access?

There are some things that are best kept private, and encryption lets you do just that.

I’m concerned about the privacy of my data, but why should I switch when WhatsApp already has end-to-end encryption? Isn’t that enough?

It is certainly a great step in the right direction, but whether it’s enough depends upon how much you trust Facebook and how you feel about Facebook’s role in the spread of misinformation.

As things currently stand, WhatsApp’s privacy policy allows limited data sharing with Facebook even though messages are encrypted end-to-end. Since the integration between WhatsApp and Facebook is only being strengthened, I feel it is reasonable to think that the data sharing will continue or possibly grow.

I don’t personally have much confidence in Facebook regarding their use of my data, no matter how minimal, so WhatsApp is not my first choice for encrypted messaging.

Oh man, another app… I really don’t want another app

I’m with you! It’s frustrating. I don’t have a good answer for this, except that personally I’m going to try to cultivate a little more patience for multiple apps. The WhatsApp / Facebook “monopoly” is kind of what led us here in the first place.

Besides that, the best advice I can give is to frequently Kondo apps and micromanage your notifications. Smartphones give you great, granular control over notifications nowadays, so take full advantage. Turn off the chimes, turn off the lock screen notifications, turn off the message previews. It makes managing multiple messaging apps (and your sanity) a lot easier.

And finally, if you feel like one particular app is a really great fit, then advocate for it! If you’re enthusiastic about it and get your friends / family on board, you may find you have fewer apps to juggle.

My phone is ancient! What privacy-focused messaging app would offer support for my device?

It depends upon the limitations of your specific device.

Signal currently supports Android and iOS. You can find more information about Signal’s operating system requirements in their documentation. Telegram currently supports Android, iOS, and Windows Phone. You can find more information about Telegram’s operating system requirements in their FAQs.

I am not sure about the memory or disk space usage for the different apps though; this is something I would have to look into further.

I’m very up for switching to a privacy-first messaging app, but the actual switch will involve convincing my contacts to leave too. I wouldn’t mind bringing this up, but it feels like a political decision. Political discussion is not welcome in my field / organisation / family / friend group. How can I approach this?

This is a very understandable and tricky concern. How best to approach this depends completely on your specific circumstances and relationships. It is impossible to give general advice, but I’ll give it a go.

You could delay the conversation; however, I would say that even if you do not have the “should we make the switch” conversation with your contacts now, it will likely come up at some point given the current trajectory of WhatsApp. When you do broach the subject, perhaps consider focusing on the practical upsides of switching to an encrypted messaging app (see the answer to the first question above for more on this).

If you feel you simply can’t bring this up, then of course you could always continue to use WhatsApp for certain conversations and use a different app for others. Though every app provider would probably prefer you believe otherwise, there is no rule against using multiple apps!

On a more general note, the misuse of personal data has led to previously unimaginable consequences and turbulence in recent years. As such, every decision related to the transmission of personal data, even something as mundane as choosing a messaging app, is unavoidably political. So though we cannot avoid the political nature of the choice, we can control how we treat that choice. We can be passive, or deliberate.

What is preventing these privacy-focused messaging apps from being acquired by some tech giant and the cycle happening all over again?

If the messaging service is already controlled by private investors, perhaps not much. Here is a very brief summary of how Telegram and Signal are structured as organisations. Note that much of the information that follows has been gleaned from the Signal and Telegram articles on Wikipedia.

Telegram is owned by Telegram Messenger LLP and has been funded by Digital Fortress LLC. They have stated that they are not for profit but are not structured as a nonprofit, possibly due to the overhead involved in setting up an official nonprofit. The sustainability of their business model is unclear, however they did put together an Initial Coin Offering (ICO) to fund a new blockchain platform and cryptocurrency. Activity around this seems to have halted in early 2018.

Signal is owned by Signal Messenger LLC which is funded by the Signal Foundation, a 501(c) nonprofit organisation whose mission is to make “private communication accessible and ubiquitous”. Much of the funding ($50 million) used to create this nonprofit came from Brian Acton, a WhatsApp co-founder. Acton left Facebook in late 2017 and is now the foundation’s Executive Chairman. Signal’s open source Signal Protocol is said to be used by a number of large entities (including WhatsApp) for encryption. Part of Signal’s ongoing business model may be to offer services in relation to their protocol, though that is just speculation.

Because of Signal’s nonprofit status, I feel more confident in Signal’s longevity as an independent entity.

Regardless, there will always be churn in this sector, so I would expect to switch again some day. I look at switching messaging apps in a similar way to how I look at switching banks. It is a big hassle to switch, but eventually the arguments for leaving outweigh the reasons to stay. So I switch, and then I keep tabs on it to ensure it remains the best of the options that are open to me.

I really rely on [insert very specific feature]. Would another privacy-focused messaging app support the features I need?

Perhaps! The best place to find out is the app’s own website; they’re jumping at the chance to tell you all of the great things their app can do. Another place that might be worth checking is Slant.

Personally, I am most concerned about conversation backups and mute / unmute capabilities.

I want to have some way of backing up my conversations in case I ever lose my phone. But with convenience comes a cost. Backups are notoriously tricky with encrypted messaging since they introduce another potential weak point: the server that stores the backup. With Signal, you can back up on Android but not iOS (though iOS backups do seem to be on their roadmap). Telegram seems to allow backups of some sort, but it is unclear what this means for encryption. The only easily available information I could find was their related FAQ “Why not just make all chats ‘secret’?” and their founder’s blog post “Why Isn’t Telegram End-to-End Encrypted by Default?”

Both Telegram and Signal seem to support conversation muting according to various documentation and articles I found online. The muting duration and other functionality offered by each service will likely be slightly different from WhatsApp.

If I’m going to switch to a more privacy-focused messaging app, which app should I choose?

The three biggest factors in choosing a messaging app are probably the user base, features, and data privacy.

From a data privacy perspective, Signal is likely the best choice. Signal is fully open source, meaning that the security of every aspect of the service can be reviewed and is publicly verifiable. Though Telegram has an open API and protocol, the backend software is not open source, so the security cannot be fully evaluated by a third party.

From a features perspective, it is probably safe to say that WhatsApp is the most fully-featured encrypted messaging app out there currently. It is hard to tell how those features might change over time in light of Facebook’s plan to integrate it with Facebook Messenger and Instagram. Telegram used to be more fully featured than Signal, but at the moment it seems about neck-and-neck.

In terms of user base, it seems impossible to get very accurate numbers. The better thing to do, perhaps, is to just ask around. See what your friends and family are already using. There is a very good chance that certain circles will prefer one to the other. Personally I have more friends on Signal than Telegram, but that may relate to the sector that I work in.

But as a final point, maybe just don’t choose. There is nothing wrong with using multiple messaging apps. I use FaceTime and iMessage with my family because they all happen to have iPhones (though Apple’s not perfect!). I use Signal with lots of friends. I’ll probably hang on to WhatsApp ultimately as well, for a little while at least, since certain contacts are going to struggle to switch to a different app for one reason or another.


A closing thought. Though I’ve focused on Telegram and Signal here, there are a lot of other encrypted messaging apps out there to explore.

For mobile, take a look at Viber, Line, Threema. For business-y stuff, maybe take a look at Wire or Keybase. If you’re just talking desktop and are interested in getting a little techy, check out Freenode and Scuttlebutt.

This is a conversation worth continuing.

Published

Working with Netlify CMS and GitHub Pages

This was originally posted on sb-ph.com. I’ve moved it to my blog since we’ve decided to remove the blog from that site.

 

Heads up: this post is kind of old! I’ve got mixed feelings about this approach now for a few different reasons, in no small part due to Uploadcare’s new pricing. No dig at them, they gotta do what they gotta do, but it puts this out of most of my clients’ budgets. If you’re interested in additional thoughts on JAMstack CMSs, you might want to take a look at this post.

Working with Netlify CMS and GitHub Pages

We’ve recently been exploring a lightweight CMS setup for the Host site. This post summarises the thought process behind our decision to work with Netlify CMS and GitHub Pages.

TL;DR
Though it’s an unusual setup for a client site, I like the stack and would consider using it again for a similar project.

Evaluating Netlify CMS

There are two potential downsides that would rule out Netlify CMS for most of our client projects. One is that user account setup isn’t super straightforward; the authentication method makes it a little more complicated than usual. The other is that the hosting “ownership” is tied to a GitHub / GitLab / Bitbucket repository. All in all, this approach demands a slightly higher level of technical know-how from the client than most other CMSs we regularly use (Craft, Kirby, WordPress, etc.).

The upside is the relatively low cost to the client, both in the short and long term. Netlify CMS is free, as are a few selected static hosting providers (Netlify and GitHub Pages spring immediately to mind). The login authentication is the only step that requires a server, since the keys have to be kept secret. Previously, that server would have been the big ongoing time and money sink; certain platforms, however, such as Netlify, offer selected webtasks / microservices / cloud functions for free.

When evaluating these pros and cons against the Host website requirements, it became apparent that the stack would be a pretty good fit. The original Host static site was built with Jekyll and jekyll-seo-tag, so we knew it would work nicely with GitHub Pages and Netlify CMS. GitHub recently introduced free private repositories for up to three users, so we felt less concerned about the site hosting being tied to a repository that we own; we can transfer it to Host at some point without feeling pressure to make the repository public. And the Host folks are a pretty tech-savvy bunch, so I wasn’t worried about them being daunted by things like repositories or GitHub-linked authentication.

Working with Netlify CMS widgets and media

So I got started with Netlify CMS. The first thing I dove into was the widget (field) configuration, and I was impressed. The widget configuration options are more fully-featured than I would have expected for a ~1.5-year-old CMS. While configuring the widgets I also gave the custom editor Preview components a try, but ultimately I abandoned that experiment and disabled the editor Preview. It’s out of the scope of this project, and maintaining those React components alongside a non-React site seems a little dicey. Something to explore separately at a later date perhaps.
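For a rough idea of what that configuration looks like, here’s a minimal config.yml sketch along the lines of what we set up. The collection and field names are made up for illustration, and the backend section is omitted (see the authentication notes further down); the real site’s config differs.

media_library:
  name: uploadcare
  config:
    publicKey: YOUR_UPLOADCARE_PUBLIC_KEY # covered in the media section just below

collections:
  - name: "projects" # hypothetical collection
    label: "Projects"
    folder: "_projects" # the directory Jekyll builds these entries from
    create: true
    fields:
      - { label: "Title", name: "title", widget: "string" }
      - { label: "Date", name: "date", widget: "datetime" }
      - { label: "Hero image", name: "hero", widget: "image", required: false }
      - { label: "Body", name: "body", widget: "markdown" }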

Media management is a concern on static sites, and this one is no exception. We could have gotten away with hosting the images in a directory in the GitHub repo, but we wouldn’t have any nice image transforms for faster page speed and would have had to ask the client to do all of the heavy lifting with image resizing and optimisation. I would have also worried about the repository size spiralling out of control with overly-large image files. As of v2.1.0, though, Netlify CMS offers the Uploadcare media widget by default. Uploadcare’s free tier seemed to be a very good fit for the Host website, so we got that set up as well. The implementation was pleasingly straightforward. As someone who has spent a lot of time debugging image transformations in more traditional CMSs, Uploadcare’s URL-based system is refreshing.
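As a rough illustration of that URL-based approach (the file UUID and the exact operations here are placeholders, not lifted from the Host site), a resized, optimised version of an image is requested just by chaining operations onto the CDN URL:

https://ucarecdn.com/FILE-UUID/-/resize/800x/-/quality/smart/-/format/auto/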

Configuring GitHub authentication for Netlify CMS + GitHub Pages

The final critical step was the authentication configuration. Netlify CMS offers a number of different methods. We knew we wanted to use GitHub Pages for hosting, so the two most likely options were Git Gateway with Netlify Identity or GitHub with Netlify. We went for the latter. It does require that all users have a GitHub account (unlike the former), but this ultimately feels like the right approach since much of the site documentation lives in the repo’s Readme file, and we feel it gives the client a bit more ownership over and awareness of the inner workings of their site.

The authentication was a little more complicated than the rest of the Netlify CMS setup. I followed the GitHub Backend instructions, which are relatively minimal and geared more towards a site hosted on Netlify. I found this article more helpful since it addressed what we were after: a Netlify CMS-powered site hosted on GitHub Pages. One potential “gotcha” in these steps is that the OAuth application should be registered on whichever GitHub account owns the repo. I initially added it to my user account, which was incorrect, since our organisation account owns the repo.
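For reference, the backend portion of config.yml ends up being fairly small. This is a sketch with a placeholder repo name; the OAuth client ID and secret never live in this file, they are registered with GitHub and configured in the Netlify project mentioned below.

backend:
  name: github
  repo: our-org/host-site # placeholder; the repo owned by the account that registered the OAuth app
  branch: master
  # extra auth-related settings for an off-Netlify host are covered in the article linked above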

After following the GitHub backend instructions, I only ran into two hiccups. I’m documenting them below since I didn’t find much related information elsewhere.

One problem was a post-login “dead end”. After clicking “Login with GitHub” and signing in successfully in the pop-up window, the user should be redirected to the Netlify CMS dashboard automatically. Instead, it just said “Authorized” and did nothing. This was because I hadn’t added the website URL to the Netlify project. I had assumed I didn’t need to add the URL since Netlify is not hosting the site, however the Netlify authorisation callback script checks the host URL against the custom domains. Once I added the URL to the project in Netlify, the issue resolved itself. Note that I did not repoint the DNS to Netlify since we still want to host the site on GitHub Pages, so Netlify shows a little error / warning regarding the URL’s DNS records. This is not a problem.

So I was now being redirected successfully to the dashboard, but the dashboard was just a white screen. The console indicated an error loading netlify-cms.js from unpkg, specifically:

Refused to execute as script because "X-Content-Type-Options: nosniff" was given and its Content-Type is not a script MIME type

This seemed possibly related to some security-related GitHub Pages headers, so I decided to grab the JS from the unpkg URL, commit it to the repo, and stop using a CDN for the script. This immediately fixed the problem.
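In practice this was a one-line change in the admin page. The exact paths below are illustrative rather than copied from the repo:

<!-- before: loading the CMS bundle from the CDN -->
<script src="https://unpkg.com/netlify-cms@^2.0.0/dist/netlify-cms.js"></script>
<!-- after: the same file downloaded from unpkg and committed next to the admin page -->
<script src="/admin/netlify-cms.js"></script>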

Final thoughts

I’d like to do a bit more exploration of Netlify CMS moving forward since it could be appropriate for more use cases depending upon the developers’ roadmap. I will probably look at the Git Gateway with Netlify Identity route for authorisation as well. Besides Netlify CMS though, we’re really interested in exploring Kirby 3. Kirby 2 has been excellent for a number of small-ish websites (<5 stakeholders, not a ton of relational data), so we’re excited to see where they’ve taken the newest version and if it could work with projects of a slightly larger scale.

Published

Saturday at Mozfest 2018

SB and I went to Mozfest for the first time last Saturday. What a lovely day! Took some haphazard notes throughout; see below for a dump of notes/links related to the sessions I attended. The bits in brackets are mostly thoughts that bounced around my head while taking notes during talks. All quotes are paraphrased.

Read more

Published

My experience getting up and running with Homebase

I finally got round to exploring Homebase yesterday (jump straight to setup steps). My original intention was to get the SB-PH site on Dat + HTTPS à la this blog post by Tara Vancil. As far as I can tell though, without multi-writer support in Dat this setup would effectively lock Sam out of being able to quickly deploy changes. We’re interested in making that site a little bit more of a collaborative sandbox, so making deployment harder than it is currently is not the right step to take there.

So though I definitely want to get the SB-PH site on Dat eventually, we’re putting that on hold for now and I’m pivoting towards my site. In this blog’s earliest incarnation it was on Tumblr, and for a long while now it has been a pretty standard WordPress site. The big task in moving to Dat, besides figuring out Homebase, is converting my site from WordPress to a static site via Jekyll/Hugo/Eleventy/GatsbyJS or something similar. It’s taking a while; I didn’t realise quite how much content had accumulated (1000+ tags?!) and there are a few WordPress-y features that I definitely want to build in (“more” tags, descriptions for tags+categories, proper pagination, etc.). More on that in a separate note.

So yesterday I put that aside and focused on getting Homebase up and running on a DigitalOcean droplet. Overall, setting up Homebase wasn’t too bad. The most involved part of the process was setting up the server. I kind of like tinkering with server stuff, so that’s cool. I 100% agree with the caveat at the top of the Homebase README: you should consider Homebase only if you’re comfortable with and interested in server administration. I would add that your interest should be *ongoing*. Servers take maintenance (related, see note on serverless setups). It’s your responsibility if a process stops running, or the software is out of date, or the Let’s Encrypt certificate doesn’t renew, etc. Hashbase looks like a great alternative for those who want the final result but don’t want to deal with the server configuration/maintenance.

The rest of this note is an outline of the steps I took to get Homebase working. Where good documentation exists elsewhere, I have linked to that instead of elaborating.

Read Homebase setup steps

Published

Agorama #2: exploring Scuttlebutt

A wall in Rebecca’s Flat at Raven Row

This past Thursday 18 October was the second Server Co-op meetup in Rebecca’s Flat at Raven Row. See all Server Co-op notes.

I didn’t take as many notes this time, wasn’t feeling fantastic. Very sketchy notes below.


click public button twice if the Patchwork feed seems stuck after first install

how to have Scuttlebutt on multiple devices?
eh, maybe not worth the hassle, just use one device
“sameAs” is currently being worked on by devs in Scuttlebutt community

identity = private + public + network key combo
libsodium

back up private key and gossip.json

dark crystal for backing up private key using social network

Shamir’s Secret Sharing algorithm
kind of like horcruxes!

with Scuttlebutt, your friends are your cloud/datacentre

nothing is ever deleted (same as Dat)

could technically have multiple identities, but functionality isn’t implemented currently. Would have to swap .ssb directories

Published

Agorama #1: outstanding home decor + P2P

The rug in Rebecca’s Flat at Raven Row

Last night I went to the first Server Co-op meetup hosted by Agorama in Rebecca’s Flat. It’s a more-is-more space, and then some. It was a lovely evening. Notes:

Check out infocivics.com by Paul Frazee. “Computing networks are social and political systems. We should attempt to answer how the technical design of a network will influence the internal politics.”

There *is* a mobile Dat browser, but apparently it’s a bit… buggy. See Bunsen for Android (nada for iOS). Still, kudos to them for taking a stab at it. Apparently the project of making a Dat browser sort of hits a brick wall due to node.js, but a bunch of devs have taken it upon themselves to make a Rust implementation of Dat. TBH I don’t understand the ins-and-outs well enough to be able to describe how that lowers the barrier, but it sounds like the future of mobile Dat might be brighter for it.

I haven’t dug into Scuttlebutt yet, and it sounds like it’s about time. An offline-first protocol, described by KG as a database/social network/community. See also Patchwork. I feel like I heard HL say that it came about after the 2011 Christchurch earthquake due to the difficulties at the time with having any sort of connectivity, but that might be wrong?

And crucially, are there ethical conversations around P2P tech that we’re failing to have, or happily skating past? I’m thinking about when Facebook and similar now-giants were in their nascent stages: surely some of the current nastiness could have been avoided if the making had been accompanied by a little more thinking, more extrospection? How do you wrap your head around the potential ethical implications of something that doesn’t yet exist? I found KB’s anecdote interesting: a few fascistic idiots attempted to hijack Scuttlebutt but were almost immediately, organically, blocked from having any meaningful impact. It feels great, but who’s to say they’re not off in their own node somewhere trolling away? Feels awful to think that Scuttlebutt might be harbouring some sort of extreme-right cell, and yet maybe so be it; should it be a decentralised network’s responsibility to police that? How on earth would that work anyway?


Separate: I got my hair cut by Dean last week and am very pleased. When it’s styled it’s a bit Josie Packard (fabulous) and when not styled, it’s very Shawn Hunter (not totally a bad thing).

Published

Resolving Craft 3 Setup Wizard error

I keep encountering issues when running Craft’s setup command locally. Note that I use MAMP Pro for this sort of thing. I entered all the database creds correctly, and then got a SQLSTATE[HY000] [2002] No such file or directory error. This StackExchange answer sorted it for me. Add 'unixSocket' => getenv('DB_SOCKET') to /config/db.php and DB_SOCKET="/Applications/MAMP/tmp/mysql/mysql.sock" to .env.
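For context, here’s roughly what that leaves config/db.php looking like. This is a sketch of a fairly standard Craft 3 db config with the unixSocket line added; the other keys are the usual getenv() lookups and may differ from project to project.

<?php
// config/db.php: standard Craft 3 settings, plus the unixSocket line from the fix above
return [
    'driver' => getenv('DB_DRIVER'),
    'server' => getenv('DB_SERVER'),
    'user' => getenv('DB_USER'),
    'password' => getenv('DB_PASSWORD'),
    'database' => getenv('DB_DATABASE'),
    'schema' => getenv('DB_SCHEMA'),
    'tablePrefix' => getenv('DB_TABLE_PREFIX'),
    'port' => getenv('DB_PORT'),
    // MAMP's MySQL listens on a socket, so point Craft at it via the .env value
    'unixSocket' => getenv('DB_SOCKET'),
];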

Still encountering database connection issues on staging for one site currently under development. All of the credentials are set correctly in .env, but getenv() in /config/db.php retrieves the wrong DB_USER value. Ended up explicitly adding the problematic value to the /config/db.php file as a quick workaround, but it’s not ideal.

Published

Exploring the use cases for serverless website architecture

Last Saturday, Sam introduced me to Chris Coyier’s talk on serverless-ness, The All-Powerful Front-End Developer. Pretty interesting and useful. I’m glad he opens it by breaking down the problematic nature of the word “serverless”! The following day was spent in Agorama’s P2P workshop at Furtherfield. Coincidentally, there is a lot of overlap in these topics.

I’ve spent the past few days wrapping my head around all of this, contextualising it against the sorts of concerns and projects we work with. Though I desperately want to get going with Dat, I’m starting with serverless because it may solve an urgent need in my day-to-day work. Right now, I’m spending much more time than I can realistically afford maintaining CMSs and hosting environments for older websites.

All of the below is a thought dump on the topic, an attempt to pick apart the meaning of and the use cases for a serverless website architecture.

Read more

Published

cURL + Airtable + ./jq = squeaky clean JSON

We’re working on a new site for SB-PH at the moment, and we’re using Airtable to get our project documentation together. It’s also a good opportunity to test the platform a little (+ I’m a fan of tables). To grab tidy JSON for use with data-friendly design software like Sketch, we’re using the Airtable API with cURL and ./jq.

Simple example that dumps table records into a JSON file for use with the Sketch Data Populator plugin:

$ curl https://api.airtable.com/v0/YOUR_BASE_KEY/YOUR_TABLE_NAME -H "Authorization: Bearer YOUR_API_KEY" | jq '.records' > records.json
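If you only want the field values, rather than Airtable’s full record objects (each record also carries an id and createdTime), a slightly deeper jq filter flattens things further. Same placeholder keys as above:

$ curl https://api.airtable.com/v0/YOUR_BASE_KEY/YOUR_TABLE_NAME -H "Authorization: Bearer YOUR_API_KEY" | jq '[.records[] | .fields]' > records.json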