Blogging in 2024


I’ve had this website for a quarter of a century. I started writing content in FrontPage Express. Then I learned enough basic HTML to hand-craft some posts. Then I discovered WordPress. Then I helped start Habari. Then I found Hugo. Each of these has been good in some ways, and frustrating in others. Somewhere along the way I also discovered the IndieWeb movement, which really resonated with me in terms of owning my own content.

As I recently explained to a friend, I’m not a particularly good developer. I am, however, a pretty good integrator. I’ve made a successful career cobbling together other people’s solutions in useful ways. I contributed a lot of stuff to WP – the original backup plugin, the original cron implementation, who knows how many bug fixes – and just as much to Habari, but it was always a struggle for me. I was working uphill, way outside of my comfort zone. It was a great learning experience, and definitely expanded my overall capacity, but mostly I just want a low-friction means to create and publish content which I can manage and own.

After 30 years as a professional systems administrator, it should come as no surprise that I hate computers. There was a time when I loved to manage my own services (a hand-tuned Postfix allowlist, for example), but those days are long gone. I get no personal satisfaction out of managing services, applying updates, or optimizing configurations for maximum performance. What I want now is fewer moving parts, less stuff to maintain, less stuff that can break.

Which brings me back to my blog. My current workflow is to compose Markdown in a text editor and commit the file to a local git repository. Then I push the branch to GitHub and open a PR. Then I merge that PR, which kicks off a GitHub Action that runs Hugo to build my site and then rsyncs the result to my server. All of this works fine, but GitHub Actions is a moving target, with constantly updating versions of the bits on which I depend. Because this is my personal site, I don’t track those updates closely, so I run the risk of missing a breaking change, or a security vulnerability, or simply a change in GitHub’s service offering. I don’t want to carry any of that cognitive load.

I could run Hugo on my server. But my server is on the smaller side of the available options, and while Hugo is super performant, it’s still pretty slow on this little system. I don’t know if it’s actually faster for me to merge a PR, have the GitHub Action run, and rsync the content back down, but it feels like it might be. Really, my beef with Hugo is that it re-renders my entire site even though I’ve only added one new file. It seems so … wasteful? to do that. If I could more easily target the set of files to update, I might stay with Hugo. Oh, except that it means I need access to a device which can ssh / rsync to my server, which is a tremendously limiting factor.

I’ve been admiring micro.blog for a long time, and I’ve half-heartedly tried to use them. I’m not averse to paying for a reliable product (I pay for server hosting and email with no complaints); but after two decades of fiddling in the blogging space there’s just something about micro.blog that falls short for me. Superficially, it’s a dumb complaint about the URL structure of posts and how inflexible it is. But more realistically, it’s probably the fact that I’m giving up control of something I know I can do on my own. I used to run all of my own services on my own hardware on my residential cable modem connection. I don’t miss running email and anti-spam; but for some reason ceding my blog to a third party is a step too far.

When I used WordPress or Habari, it was easy to compose a post from anywhere. The web UI for post composition was available from any browser. That was a real draw. Writing now means I need something that can commit to GitHub. I could use a GitHub client on my phone or tablet, but in my experience those are way more cumbersome than I’d like. Anything that’s not supremely simple is something that’s going to be an excuse for me to not create a post on this site. Which got me thinking recently …

The one thing I always have available is email. I can create an email from my phone, or my tablet, or my personal laptop, or even from my work laptop. Email composition offers an easy way to save drafts, which I could then pick up and continue from one of my other devices. As a happy Fastmail user (seriously, I’m super happy with them, you should use them too!), I’ve been aware for a while that they’re pushing this technology called JMAP. Until now, I mostly didn’t care, as all of that Just Worked and didn’t require anything from me. Nor did I think it offered me anything, but I’m starting to think I was wrong.

See, email is hard. It’s an ancient technology, with a lot of crufty bits. Having a computer – say, my web server – contact a mail server, collect new emails, and do something with them is hard. There are arcane programs like fetchmail and procmail and others that do everything I want to do, but using them requires significant dedication to a kind of computing that is very quickly fading away. Building workflows using email is hard, error-prone, and fragile. The way most computers talk to each other today is via HTTPS. JMAP provides a way to do email stuff over HTTPS. It’s not pretty, by any means, but it seems like something I might be able to use to make a useful integration for myself.

My original thought process was to email myself (or an alias, perhaps), have a Fastmail rule send those messages to a dedicated mailbox, and then run a cron job on my server to query for new messages in that mailbox. JMAP looks to make this pretty easy, and using the JMAP Python samples I was able to prove to myself that I could do this. Great! I could write Markdown in an email and send it along, fetch it, and write that Markdown onto my server. I’d still need to run Hugo, though. And I might.
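For the curious, here’s roughly what that proof of concept looks like. This is a minimal sketch in the spirit of the JMAP samples, not their actual code; the token and mailbox id environment variables are my own invention, and details like the session URL and error handling are things you’d want to confirm against Fastmail’s docs.

```python
#!/usr/bin/env python3
"""Rough sketch: pull new posts out of a dedicated JMAP mailbox.

Assumed setup (not gospel): a Fastmail API token in JMAP_TOKEN, and the
id of the dedicated mailbox in BLOG_MAILBOX_ID (findable once via
Mailbox/query).
"""
import os
import requests

SESSION_URL = "https://api.fastmail.com/jmap/session"  # Fastmail's session resource, per their docs
TOKEN = os.environ["JMAP_TOKEN"]
MAILBOX_ID = os.environ["BLOG_MAILBOX_ID"]
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# The session tells us where to send API calls and which account to use.
session = requests.get(SESSION_URL, headers=HEADERS).json()
api_url = session["apiUrl"]
account_id = session["primaryAccounts"]["urn:ietf:params:jmap:mail"]

# One HTTPS request, two chained method calls: find message ids in the
# mailbox, then fetch subject, date, and the plain-text body for each.
request = {
    "using": ["urn:ietf:params:jmap:core", "urn:ietf:params:jmap:mail"],
    "methodCalls": [
        ["Email/query", {
            "accountId": account_id,
            "filter": {"inMailbox": MAILBOX_ID},
            "sort": [{"property": "receivedAt", "isAscending": False}],
        }, "q"],
        ["Email/get", {
            "accountId": account_id,
            "#ids": {"resultOf": "q", "name": "Email/query", "path": "/ids"},
            "properties": ["subject", "receivedAt", "bodyValues"],
            "fetchTextBodyValues": True,
        }, "g"],
    ],
}
response = requests.post(api_url, json=request, headers=HEADERS).json()

# The Markdown I wrote lives in the text body part(s) of each message.
for email in response["methodResponses"][1][1]["list"]:
    body = "\n".join(part["value"] for part in email["bodyValues"].values())
    print(email["subject"], email["receivedAt"])
    print(body)
```

Two chained method calls in one HTTPS request is basically the whole trick, and it’s the part that makes this feel tractable compared to fetchmail and friends.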

But I use Caddy as my web server, and it supports templates. Indeed, they show how the Caddy site itself uses Markdown content rendered on the fly as HTML using their template system. So rather than run Hugo to build my complete site every time, maybe I could use Caddy to serve my Markdown? It’s definitely possible, though it’ll take some work to revise my Hugo templates for use with Caddy, and there will be no shortage of edge cases to address.

So the thinking is: run a cron job to check for new email every 5-10 minutes; download that email to my website content directory; probably have that cron job update the list of “new posts” on my home page as well as update the RSS feed; and then that’s it. Is it fewer moving parts than what I have now? Not particularly. It actually works against my goal, adding more stuff to manage and maintain: a new cron job, a new script which I wrote, dealing with duplicate file names or filesystem errors or who knows what. Just thinking about it all makes me want to turn my laptop off and watch paint dry for a while. But it does remove GitHub from the equation; it may allow me to keep a history of all content in that mailbox; and it lets me do what I feel like I’m good at, which is cobbling together marginally useful things from building blocks provided by others.
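To make the “new script which I wrote” part concrete, here’s a minimal sketch of the file-writing half of that cron job, picking up where the fetch above leaves off. The content path, filename scheme, and front matter are assumptions about my own layout, not anything Hugo or Caddy requires.

```python
import re
from pathlib import Path

CONTENT_DIR = Path("/var/www/site/content/posts")  # assumed layout, not my real path

def slugify(subject: str) -> str:
    """Reduce an email subject to a hyphen-separated filename slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", subject.lower()).strip("-")
    return slug or "untitled"

def write_post(subject: str, received_at: str, body_markdown: str) -> Path:
    """Write one fetched email out as a Markdown file, skipping duplicates."""
    CONTENT_DIR.mkdir(parents=True, exist_ok=True)
    date = received_at[:10]  # JMAP's receivedAt is an ISO 8601 UTC date-time
    path = CONTENT_DIR / f"{date}-{slugify(subject)}.md"
    if path.exists():
        # Same subject on the same day: treat it as a duplicate and leave the original alone.
        return path
    front_matter = f'---\ntitle: "{subject}"\ndate: {received_at}\n---\n\n'
    path.write_text(front_matter + body_markdown, encoding="utf-8")
    return path
```

Updating the “new posts” list and the RSS feed would hang off the same loop, whichever of Hugo or Caddy ends up doing the rendering.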

I realize this is a rambling, disjointed post. I’ve been noodling on this stuff for a while, and writing this stuff down is my first attempt to create some structure for myself. It may also be the case that someone else has done all this already, and can share their magic or dissuade me from wasting my time. Or maybe someone will get excited to lend a hand and we can collaborate! Or maybe I’ll just realize how much effort this will be, for little measurable benefit, and I’ll find some wet paint to watch.

