Jeremy Keith

Making websites. Writing books. Hosting a podcast. Speaking at events. Living in Brighton. Working at Clearleft. Playing music. Taking photos. Answering email.

Thursday, April 23rd, 2026

It’s Not AI. It’s FOMOnetization.

FOMO is a feeling. But it’s also a business model—and increasingly, one of the more successful ones. Fear, in general, makes people much easier to separate from their money. It’s perfectly suited to this moment of ubiquitous grift, where everything feels like a lottery ticket or a multi-level marketing scheme.

It’s even more perfectly suited for “the age of AI,” which squeezes economic FOMO from both sides. AI could make you wildly rich (the first person to start a billion-dollar company with zero employees!) or leave you hopelessly destitute (part of the looming “permanent underclass”). Which one do you want to be? Smash that like button, sign up for my online course, and use my new AI-powered business platform!

Summary punishment

In the latest issue of Matthias’s excellent Own Your Web series, he describes the recent betrayal by Google:

The search engine no longer says “here, go read what this person wrote.” It now says “here, I’ve already read it for you.” The contract is broken.

He’s absolutely right.

But…

Have you ever clicked on a result from a search engine? Unless you’re lucky enough to land on a nice personal website, you’re more than likely to be confronted with pop-ups asking to allow tracking, or a desperate plea to subscribe to a newsletter, or just rubbish ads, all accompanied by a slow-loading page.

Don’t get me wrong. I’m not saying that what Google is doing is okay. But let’s not pretend that everything indexed by Google is just fine and dandy for people to visit.

And of course the main reason why websites are so terrible is because they’ve tied their business model to heaps of behavioral advertising driven by invasive tracking courtesy of …Google.

This reminds me of AMP. Remember Google AMP? It was a terrible solution to a real problem. Web pages were (and still are) bloated and slow. The correct solution would be to encourage people to fix that, but instead Google mandated a proprietary format for your content that had to be hosted on their servers.

AMP was a disaster, both in practical terms and in the reputational damage it did to Google’s developer relations.

Now they’re doing it again, powerwashing away any goodwill they ever had with site owners. Now Google doesn’t even send search engine traffic to the websites that host the ads that Google encouraged people to put on every page.

It’s almost as if Google is a company so large and with so many competing interests that it now suffers from an incurable split personality disorder.

Personally I think they’re missing a trick. They should be using “AI” summaries as a stick.

If your site is slow, or filled with user-hostile annoyances then it should be cockblocked by a hallucinated summary. But a nice fast respectful website? Send the traffic their way! Everyone wins—users, site owners, Google, the World Wide Web.

Could you imagine how quickly this would revolutionise the world of search engine optimisation? They’ve always told us that we should make websites for humans in order to get good Google juice. This would be a way of making it come true, without any of the over-engineered woefulness of AMP.

It’ll never happen of course. But I can dream.

Tuesday, April 21st, 2026

Expansion artifacts || Matt Ström-Awn, designer-leader

Compression made the information age possible by stripping things down to fit the pipes. Expansion made the AI age possible by blowing data back up again. Both operations leave marks; we’ve learned to spot compression artifacts, but we’ve only just begun to reckon with expansion artifacts. Until we do, there’s a lot of risk to manage.

Never Lose Form Progress Again :: Aaron Gustafson

Here’s an excellent progressive web component from Aaron—wrap a custom element around your existing form and you’re good to go:

At its core, form-saver is a small web component that wraps a form, keeps an eye on it, stores values in localStorage, and restores them when the page loads again. Better yet, it clears out saved data after a successful submission so you’re not accidentally resurrecting stale information the next time someone stops by.
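The quoted description can be sketched in code. To be clear, this is my own guess at the shape of the thing, not Aaron’s actual implementation—the helper names, the storage key format, and the element name are all assumptions, and it only handles plain string values:

```javascript
// Pure helpers: turn a list of [name, value] pairs into JSON and back.
function entriesToJSON(entries) {
  return JSON.stringify(Object.fromEntries(entries));
}
function jsonToEntries(json) {
  return Object.entries(JSON.parse(json));
}

// The custom element itself only makes sense in a browser.
if (typeof HTMLElement !== "undefined" && typeof customElements !== "undefined") {
  class FormSaver extends HTMLElement {
    connectedCallback() {
      this.form = this.querySelector("form");
      if (!this.form) return;
      // One storage key per form (my own key scheme, for illustration).
      this.key = "form-saver:" + (this.form.id || location.pathname);
      this.restore();
      // Save on every input; clear on successful submission.
      this.form.addEventListener("input", () => this.save());
      this.form.addEventListener("submit", () => localStorage.removeItem(this.key));
    }
    save() {
      const entries = [...new FormData(this.form).entries()]
        .filter(([, value]) => typeof value === "string");
      localStorage.setItem(this.key, entriesToJSON(entries));
    }
    restore() {
      const json = localStorage.getItem(this.key);
      if (!json) return;
      for (const [name, value] of jsonToEntries(json)) {
        const field = this.form.elements[name];
        if (field && "value" in field) field.value = value;
      }
    }
  }
  customElements.define("form-saver", FormSaver);
}
```

Usage would be as simple as wrapping: `<form-saver><form id="contact">…</form></form-saver>`. That’s the appeal of the pattern—progressive enhancement means the form still works perfectly well if the custom element never upgrades.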

Monday, April 20th, 2026

Dilation

Nothing can travel faster than light. And if you manage to travel close to the speed of light, things get weird.

Technically, we all experience time differently depending on how fast or slow we’re moving. But the differences are so imperceptible as to be non-existent. That’s how we can describe events as being “simultaneous”, even though according to Einstein’s theory of relativity, there’s no such thing.

GPS has to correct for these small relativistic effects in order to work at all. But when you approach the speed of light—or get close to something very massive—then the large-scale relativistic effects kick in.

If you travel close to the speed of light for a short time, it will seem like a much longer time to everyone you left behind. This is the twin paradox, which isn’t really a paradox at all, just time dilation in action.
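The arithmetic behind the twin paradox is the standard Lorentz factor from special relativity, γ = 1/√(1 − (v/c)²); the function names here are mine, purely for illustration:

```javascript
// Time dilation: a traveller's clock runs slow by the Lorentz factor.
function lorentzFactor(vOverC) {
  return 1 / Math.sqrt(1 - vOverC ** 2);
}

// Years that pass at home while the traveller experiences properYears.
function earthYears(properYears, vOverC) {
  return properYears * lorentzFactor(vOverC);
}

// The speed (as a fraction of c) needed for a given dilation factor.
function speedForGamma(gamma) {
  return Math.sqrt(1 - 1 / gamma ** 2);
}
```

For a hundredfold dilation—three years away, three hundred at home—you’d need to travel at about 99.995% of the speed of light.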

There are some coincidental parallels to this kind of time dilation in old folk tales.

The Japanese tale of Urashima Tarō tells of a fisherman who rescues a sea turtle and is rewarded with a relaxing few days in an underwater kingdom, only to find that when he returns home to his village, 300 years have passed.

The Irish tale of Oisín describes the warrior’s journey to Tir na nÓg, the land of youth. He spends three years there but when he returns to Éire to see his old fighting comrades from the Fianna, 300 years have passed.

This story gives us a wonderfully poetic turn of phrase that’s still used today. The closest English equivalent is “Billy no mates”, a rather cruel term to describe someone with no friends. In Irish, we say:

Mar Oisín i ndiaidh na Fianna

Like Oisín after the Fianna.

Sunday, April 19th, 2026

Finn Mac Cool by Morgan Llywelyn

After reading The Morrigan I was hungry for more retellings of Irish myths and legends. I tracked down the 1994 novel Finn Mac Cool by Morgan Llywelyn.

When I was devouring modern retellings of Greek myths, I commented on an interesting difference in the tellings:

The biggest difference I’ve noticed is the presence or absence of supernatural intervention. Some of these writers tell their stories with gods and goddesses front and centre. Others tell the very same stories as realistic accounts without any magic.

The Morrigan was dripping in magic. Finn Mac Cool is a more down-to-earth affair.

That’s not to say that magic doesn’t matter. For the characters in this book, their belief in magic is as real as their belief in the weather. But there are no supernatural powers here. If anything, Finn’s superpower is his ability to tell—and believe—tall tales involving supernatural intervention.

All the usual accounts of Finn Mac Cool’s prowess are retold as deeds that may have a basis in reality but then get exaggerated almost immediately.

It’s a framing device that works well. It’s all too easy to believe in the rise to power of a charismatic man skilled in controlling the narrative.

There’s plenty of Machiavellian politics at play. There are no outright villains, or even heroes. There’s a pleasing messiness to the forces at work.

Sometimes the author’s research shows a bit too much. There are digressions into explanations of Brehon law that threaten to derail the narrative.

Overall though, this is an engaging and vivid retelling that just makes me want to spend more time in this world.

Thursday, April 16th, 2026

Threat models

People talk about the effectiveness (or lack thereof) of large language models as though all tasks are comparable. But it strikes me that there are three broad categories of work that large language models are applied to:

  1. Compression.
  2. Transformation.
  3. Expansion.

Compression is when you feed a large language model something big that you want to make small. Summarise this book. Give me the gist of this meeting. Large language models are generally pretty good at this, which makes sense given that they themselves are kind of like compressed artifacts.

Transformation is when large language models convert from one format into another. Turn this audio into text. Turn this jumble of data into structured JSON. A large language model can handle these tasks pretty well. There’ll probably be a few errors so make sure that’s not a deal-breaker.

Expansion is when you give a large language model a prompt to generate something from scratch. An image. A presentation. An email. A poem. This is where slop lives. The output inevitably betrays its origins, glistening with a sheen of mediocrity.

Laurie spotted this three-way split a while back:

Is what you’re doing taking a large amount of text and asking the LLM to convert it into a smaller amount of text? Then it’s probably going to be great at it. If you’re asking it to convert into a roughly equal amount of text it will be so-so. If you’re asking it to create more text than you gave it, forget about it.
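Laurie’s rule of thumb could be restated as a toy classifier—the thresholds and the function name here are mine, not Laurie’s, and are purely illustrative:

```javascript
// Compare the size of what you give an LLM with the size of what
// you ask back, and guess which category of task it is.
function classifyTask(inputWords, outputWords) {
  const ratio = outputWords / inputWords;
  if (ratio < 0.5) return "compression";   // "probably great at it"
  if (ratio <= 2) return "transformation"; // "so-so"
  return "expansion";                      // "forget about it"
}

classifyTask(80000, 500); // → "compression" (summarise a book)
classifyTask(1000, 1100); // → "transformation" (audio to text)
classifyTask(20, 2000);   // → "expansion" (a prompt into a presentation)
```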

I hope that when the bubble finally bursts, we’ll see the surviving large language models put to work on the first two categories. The boring stuff. The work that’s tedious for humans.

But tedious is as tedious does. Something I consider drudgery might be the very thing that gives you life. Like Giles says:

I have a feeling that everyone likes using AI tools to try doing someone else’s profession. They’re much less keen when someone else uses it for their profession.

The big exception seems to be programming. Apparently there are plenty of coders who never before expressed an interest in being managers who are now happily hanging up their coding spurs in favour of being the overseer of non-human workers.

It’s a reasonable outlook. It could even be considered a user-centred approach. Users don’t care about the elegance of your code; they care about accomplishing their tasks.

Programming is something of an exception to the efficacy of large language models in general. Instead of relying on the subjectivity of painting, poetry, or prose, programming can be objectively tested. Throw enough money at the worst people in the world and they’ll give you tokens you can use to get the machines to test their own output. So you can get a large language model to create something reasonably good from scratch as long as that something is code.

If you had asked me about the threat model of large language models two years ago, I probably would’ve been worried for artists, writers, and musicians. I thought that software had enough inherent complexity to be relatively safe.

Now my opinion has completely reversed. Software is almost certainly the killer app for large language models.

I think the artists, writers, and musicians will be okay, or at least as okay as they ever were. It turns out that humans like things made by other humans.

And y’know what? If I had to choose which endeavour I’d rather see automated away—programming or art—it’s no competition.

Don’t get me wrong—it would be nice if everyone got paid for doing what they enjoy. It’s just that I’m okay with software engineers not being at the front of that line.

I remember when I first started getting paid money to make websites. “Really?” I thought, “Someone is willing to pay me to do something I’d do anyway?” I kept waiting for the jig to be up. Instead I saw my profession grow and expand.

Perhaps there’s a long-overdue compression happening.

Or maybe it’s more like a transformation.

Tuesday, April 14th, 2026

No-stack web development – David Bushell – Web Dev (UK)

A stack is also technical debt, non-transferable knowledge, accelerated obsolescence, and vendor lock-in. That means fragility and overall unnecessary complication. Popular stacks inevitably turn into cargo cults that build in spite of the web, not for it.

The web platform does not require build toolchains. Always default to, and regress to, the fundamentals of CSS, HTML, and JavaScript. Those core standards are the web stack.

Just heard the sad news about Moya Brennan passing. Some of the first records I ever bought were by Clannad—Macalla, Magical Ring, Legend.

Ar dheis Dé go raibh a hanam.

May her soul be at God’s right hand.

Saturday, April 11th, 2026

The bots are swarming hard today, laying waste to the meadows of the open web. This is what powers vibe-coding.

The slop must flow.
