Overhead
A Raspberry Pi–powered LED matrix that listens to live aircraft transponders and narrates what's flying over our flat — in the voice of David Attenborough.
The idea
My girlfriend loves planes. Every time we're out and about I'll notice her looking at her phone, then up at the sky. And this can only mean one thing — she's on Flight Radar, looking at what's going overhead. It's an adorable special interest, and something I've started taking more interest in myself.
I wanted to make her something. A way to see what's flying over our flat, at a glance, the moment it happens, without reaching for her phone. Back in university I'd built a spectrum analyser on a 64x64 Adafruit LED matrix — 4,096 tiny LEDs, each one capable of any colour, about the size of your palm. So the seed of the idea was there: could I get plane data onto one of these boards?
My first attempt was just a Mac app hooked up to a free API that sent a push notification whenever a plane was nearby. It worked. It was also really annoying. But it proved the concept, and it made me think — what if instead of borrowing someone else's data over the internet, we could listen to the planes ourselves?
Every commercial aircraft constantly broadcasts its position, altitude, speed, and heading via its transponder (a system called ADS-B). A USB receiver called an RTL-SDR dongle, paired with a 1090 MHz antenna, can pick all of it up. Real signals, from real planes, received live from a flat in North London. I had no idea if this would actually be feasible. Turns out it was shockingly easy.
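In practice the hard radio work is handled by a decoder like dump1090, which takes the raw 1090 MHz samples from the dongle and republishes parsed messages as CSV over TCP port 30003, in the BaseStation (SBS-1) format. A minimal sketch of pulling the useful fields out of that feed — the host and port are the decoder's defaults, and the field positions follow the SBS-1 layout:

```python
import socket

SBS_FIELDS = 22  # BaseStation (SBS-1) CSV messages carry 22 fields


def parse_sbs_line(line: str):
    """Parse one BaseStation CSV line into the fields we care about.

    Positions follow the SBS-1 format dump1090 emits on port 30003:
    hex ICAO address, callsign, altitude, ground speed, lat/lon.
    """
    parts = line.strip().split(",")
    if len(parts) < SBS_FIELDS or parts[0] != "MSG":
        return None
    return {
        "icao": parts[4],
        "callsign": parts[10].strip() or None,
        "altitude_ft": int(parts[11]) if parts[11] else None,
        "speed_kt": float(parts[12]) if parts[12] else None,
        "lat": float(parts[14]) if parts[14] else None,
        "lon": float(parts[15]) if parts[15] else None,
    }


def stream_aircraft(host: str = "localhost", port: int = 30003):
    """Yield parsed messages from a running dump1090 instance."""
    with socket.create_connection((host, port)) as sock:
        buffer = b""
        while True:
            buffer += sock.recv(4096)
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                msg = parse_sbs_line(line.decode("ascii", "replace"))
                if msg:
                    yield msg
```

Everything downstream — the lookups, the narration, the display — consumes this stream of plain dictionaries, which also makes it trivial to substitute a recorded file when testing off-hardware.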
I ordered everything — a Raspberry Pi 5, the LED matrix, the antenna, a bonnet board to wire it all together — and spent the week before it arrived building the software pipeline on my Mac against a simulated version of the matrix. The moment hardware started showing up, I was locked in.





Getting it working
The first few days were pure problem-solving. The Pi wouldn't connect to Wi-Fi because macOS had put garbage on the clipboard in place of my actual password — the Pi accepted it without complaint and failed silently. The LED matrix library didn't support the Pi 5's chip architecture, so I had to swap it out and write a translation layer. The matrix and Pi kept crashing because they were sharing a power supply and bright pixels drew too much current. Each problem felt like a wall until it didn't. Reflash the card. Switch the library. Separate the power supplies.

Within a couple of days, the antenna was decoding live aircraft signals, the Pi was processing them, and the matrix was lighting up. The pipeline worked. But what it was showing was ugly.

Making it feel right
Getting data onto a screen is an engineering problem. Making it feel like something you'd actually want to look at is a design problem. And it's much harder.
The raw transponder signal gives you almost nothing useful — a hex code, a cryptic callsign like "BAW226", altitude in feet, speed in knots. No airline name, no aircraft type, no route. So I built layers of translation: callsign prefixes mapped to airlines, type codes decoded into aircraft names, and flight databases cross-referenced for origin and destination. Even then there was a London-specific problem — "going to London" is useless when you live here. Which airport? That needed its own lookup.
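Those translation layers are really just lookup tables stacked on top of one another. A toy sketch of the idea — the entries here are illustrative fragments, not the real tables, which are much larger:

```python
# Illustrative fragments of the lookup layers, not the full tables.
AIRLINE_BY_PREFIX = {
    "BAW": "British Airways",
    "RYR": "Ryanair",
    "EZY": "easyJet",
}

AIRCRAFT_BY_TYPE_CODE = {
    "A388": "Airbus A380",
    "B744": "Boeing 747-400",
    "B738": "Boeing 737-800",
}

# "Going to London" is useless when you live there:
# resolve the destination down to a specific airport.
LONDON_AIRPORTS = {
    "LHR": "Heathrow",
    "LGW": "Gatwick",
    "STN": "Stansted",
    "LTN": "Luton",
    "LCY": "London City",
}


def describe(callsign: str, type_code: str, dest_iata: str) -> str:
    """Turn raw decoded fields into something a human would say."""
    airline = AIRLINE_BY_PREFIX.get(callsign[:3], "an unknown carrier")
    aircraft = AIRCRAFT_BY_TYPE_CODE.get(type_code, type_code)
    dest = LONDON_AIRPORTS.get(dest_iata, dest_iata)
    return f"{airline} {aircraft}, inbound to {dest}"
```

So the cryptic "BAW226 / B744 / LHR" becomes "British Airways Boeing 747-400, inbound to Heathrow" — and each `.get()` falls back gracefully when a code isn't in the table.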
I laid it all out on the board like a departures board — callsign, carrier, route, altitude, speed. Rows and columns, crossfades between pages. It worked. It also felt like a spreadsheet with animations.
The fonts were bothering me too. The defaults looked terrible at this size — inconsistent widths, poor spacing, no personality. I happened to be playing Pokemon Fire Red on the Switch, and noticed how lovely the in-game font was — designed for the Game Boy Advance's tiny 240x160 screen, every pixel carefully considered for legibility at low resolution. I got a TrueType version rendering on the matrix with anti-aliased edges. On a normal screen you wouldn't notice, but on an LED matrix where every pixel is physically visible, those soft edges are transformative. Suddenly the board had a visual identity.

But the layout was still the problem. And then it clicked: what if I stopped trying to arrange data and started writing sentences?

The personality
With sentences now typing themselves across the board, the question became: what should it sound like?
The first personality I tried was Matty Matheson — loud, unhinged chef energy. "HOLY CRAP a British Airways A350 just TORE out of Heathrow!" It was funny for about a day, then it got exhausting. Every sentence was turned up to 11. The structure got repetitive.
So I switched to David Attenborough. And everything fell into place. Planes became wildlife. Routes became migration paths. Airlines became species. A 747 is "a noble elder, increasingly rare in these parts." A Ryanair 737 is "the hardy sparrow, thriving where others dare not venture." An A380 is "the apex predator of these skies." Same data, completely different character.
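The persona lives in the prompt, not the data. A sketch of how the decoded fields might be framed for the LLM — the wording and field names here are my guesses, not the actual prompt:

```python
# Hypothetical persona prompt; the real one is the author's own.
PERSONA = (
    "You are David Attenborough narrating aircraft as though they were "
    "wildlife. Treat routes as migration paths and airlines as species. "
    "Reply with one short sentence."
)


def build_prompt(airline: str, aircraft: str, origin: str, dest: str,
                 altitude_ft: int, seen_before: bool) -> str:
    """Assemble the narration request from the decoded transponder data."""
    familiarity = ("a familiar returning visitor" if seen_before
                   else "a first-time sighting")
    return (
        f"{PERSONA}\n"
        f"Sighting: {airline} {aircraft}, {origin} to {dest}, "
        f"cruising at {altitude_ft:,} ft. This is {familiarity}."
    )
```

Swapping characters is then a one-line change to `PERSONA` — which is exactly why trying Matheson first and replacing him later was cheap.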

When a new plane enters range, the board doesn't just update — it does a full-screen colour splash inspired by Pokemon wild encounters. The screen fills with the airline's brand colour and a word like "EYES UP" or "SPOTTED" appears knocked out in black. There are five animation styles that cycle randomly. It turns each sighting into an event — a moment that makes you look up.
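One plausible way to cycle the five styles randomly without showing the same animation twice in a row — the style names here are hypothetical stand-ins:

```python
import random

# Hypothetical names for the five splash animation styles.
SPLASH_STYLES = ["wipe", "flood", "shutter", "spiral", "flash"]
ENCOUNTER_WORDS = ["EYES UP", "SPOTTED"]


def pick_splash(last_style=None):
    """Choose a splash animation and knockout word for a new sighting,
    never repeating the previous animation style back-to-back."""
    choices = [s for s in SPLASH_STYLES if s != last_style]
    return random.choice(choices), random.choice(ENCOUNTER_WORDS)
```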


The board remembers what it's seen, too. Every aircraft gets logged in a local database. When a repeat visitor shows up, the LLM knows and reacts warmly. First-timers get introduced. At set times through the day it shows wrap-up summaries — a morning report, an afternoon update, an evening wind-down — each narrated by Attenborough against the real stats.
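The memory doesn't need to be fancy — a single SQLite table is enough to log sightings and answer "have we seen this airframe before?". A minimal sketch, assuming the ICAO hex address is used as the airframe's identity:

```python
import sqlite3


def open_log(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the local sightings database."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS sightings (
               icao    TEXT NOT NULL,
               callsign TEXT,
               seen_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn


def log_sighting(conn: sqlite3.Connection, icao: str, callsign: str) -> bool:
    """Record a sighting; return True if this airframe has visited before."""
    seen_before = conn.execute(
        "SELECT COUNT(*) FROM sightings WHERE icao = ?", (icao,)
    ).fetchone()[0] > 0
    conn.execute(
        "INSERT INTO sightings (icao, callsign) VALUES (?, ?)",
        (icao, callsign),
    )
    conn.commit()
    return seen_before
```

The returned flag feeds straight into the narration prompt, and the daily wrap-ups are just aggregate queries over the same table.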

After 10pm, the personality shifts. He's still there, but whispering. Conspiratorial. Cargo flights become "nocturnal hunters." It's funny because nobody's watching.
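The mode switch is a clock check before the prompt is built. The post only pins down the 10pm start, so the morning cutoff below is an assumption:

```python
from datetime import time

NIGHT_START = time(22, 0)  # from the post: the shift happens after 10pm
NIGHT_END = time(6, 0)     # assumed morning cutoff, not stated in the post


def persona_mode(now: time) -> str:
    """Return 'night' for the whispering, conspiratorial narrator."""
    if now >= NIGHT_START or now < NIGHT_END:
        return "night"
    return "day"
```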

Two Claudes
One workflow detail worth calling out: I had Claude running on both my Mac and on the Pi simultaneously. Mac Claude was the architect — designing systems, writing the renderer, managing code. Pi Claude was the field engineer — testing against real hardware, introspecting live APIs, catching things that only break on ARM. When one hit a wall, I'd relay the problem to the other. When Mac Claude pushed a fix, Pi Claude would pull and test. I was the bridge between two AI instances troubleshooting the same problem from different angles. It cut what would've been hours of manual hardware debugging into minutes, and it became a genuinely interesting creative rhythm.

Building the frame
The software was done. Now it needed to stop looking like a science project.
The original plan was a handmade oak enclosure — mitre joints, Danish oil, the works. But I wanted something precise, and I wanted it quickly. So I went parametric. I designed the frame in OpenSCAD, writing the CAD model through Claude Code the same way I'd write any other code — iterating on dimensions, tolerances, and assembly logic in a text editor, previewing renders, and fixing problems as they came up.




The design is a 4-piece T-frame — two side pieces with built-in feet, and top and bottom bars that slot between them. The sides own the rounded corners. Eight M3 screws hold it all together. Inside, two parallel grooves run the full perimeter — the front groove holds a 3mm smoked acrylic panel, the rear groove holds a white opal diffuser. The matrix panel sits behind both, held in place by retaining lips. An M5 bolt through the top bar provides a clean mount point for the antenna.
I sent the STL files to JLCPCB's 3D printing service. Four pieces, SLA 9600 resin, matte white — intended as a prototype before ordering the final version in black. About £60 all-in, with shipping costing more than the parts themselves. They arrived about a week later, and the first surprise was the weight. SLA resin is dense — the pieces felt like ceramic, not plastic. Substantial in a way that made the whole thing feel more like a real object than a prototype.

Assembly was mostly straightforward — the acrylic and diffuser slid into their grooves, and the screws pulled the corners tight. One snag: the retaining lips that hold the matrix were about 4mm too long where the pieces met, causing overlap at the joints. But the resin filed down easily, and after a few minutes with a hand file everything sat flush. The 0.4mm-per-side tolerance on the main opening worked perfectly — the matrix dropped in snug, no forcing.
The matte white was never supposed to be the final colour. But once I saw the black M3 screw heads against the white resin — eight tiny black dots at the corners — I loved the contrast too much to reorder. The prototype became the product.
The result


The smoked acrylic does exactly what I hoped — it hides the 4,096 individual LEDs when they're off, so the face reads as a single dark panel. When pixels light up, the colour passes through cleanly, and the white opal diffuser behind softens each dot into a glow. The dipole antenna screws directly into the top bar via the M5 bolt, and a single USB-C cable runs out the back for power. Everything else — the Pi, the bonnet, the PSU, the SDR dongle, all the wiring — is hidden inside.
It weighs more than it looks. People pick it up expecting plastic and get something closer to a ceramic bookend. That weight, plus the matte finish and the visible screw hardware, gives it a presence that a 3D print has no right to have. It doesn't feel like a prototype. It feels like a product.

Laura loves it. It hasn't curbed her time on Flight Radar — the board has become a supplementary tool instead: a heads-up that something's going over the flat, before she reaches for her phone to find out more.