
DIY plane spotting at home with a Raspberry Pi

📅 2025-01-24


I live quite close to an airport and see planes flying overhead all the time. One day on the terrace with a couple of friends, I was surprised to count five or six planes passing over us in about 30 minutes. It was a Friday, so I built a small plane spotting setup at home using a Raspberry Pi and an old Nokia Android phone.

This project is heavily inspired by skybot.cam (RIP), but tries to be a bit more DIY, without an ADS-B receiver or a cloud vision API.

[Image: my plane spotting setup]

How it works

In a previous iteration I tried to use a USB webcam, but the quality was not great (especially in low light). I'd love some feedback or suggestions on how to improve this setup.

Overview

I confess it's a fragile setup; I make no promises about how long it will last, but I'm quite happy with it. I would've liked to use an ADS-B receiver, but I don't have one yet, and they're rather expensive in India.

You can view the entire script that runs on my Raspberry Pi here.

Note: uv’s support for inline script dependencies as described in PEP 723 is sooo good. I found out about it from Simon Willison’s blog post.
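With inline script metadata, the dependencies live in a comment block at the top of the script and uv installs them on run. It looks something like this (the exact dependency list here is my assumption, not copied from my script):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "ultralytics",
#     "opencv-python",
# ]
# ///
```

Then `uv run script.py` sets up an environment with those packages automatically.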

The Script

On the Pi, I ensure that v4l2loopback is loaded:

sudo modprobe -r v4l2loopback && sudo modprobe v4l2loopback devices=1 video_nr=0 card_label="scrcpy" exclusive_caps=1

The actual script runs as a systemd service. It begins by starting the scrcpy server on the phone:

scrcpy --v4l2-sink=/dev/video0 --no-playback --no-audio --video-source=camera --camera-id=0 --camera-size=1920x1080 --max-fps=1

This mirrors the phone's back camera to /dev/video0. The Python script then reads frames from this device and runs detection on them. I capped it at 1 FPS because the Pi can't handle inference any faster.
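Inside the script, starting scrcpy amounts to spawning it as a background process with the flags above. A sketch, where the function name and I/O handling are mine, not lifted from the original script:

```python
import subprocess

# Same flags as the command above: stream the back camera
# to /dev/video0 at 1 FPS with no local playback or audio.
SCRCPY_CMD = [
    "scrcpy",
    "--v4l2-sink=/dev/video0",
    "--no-playback",
    "--no-audio",
    "--video-source=camera",
    "--camera-id=0",
    "--camera-size=1920x1080",
    "--max-fps=1",
]

def start_scrcpy() -> subprocess.Popen:
    # scrcpy keeps streaming to /dev/video0 while the main loop
    # reads frames; we keep the handle so it can be stopped later.
    return subprocess.Popen(
        SCRCPY_CMD, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
    )
```

Running it under systemd then just means the service starts the Python script, which in turn owns the scrcpy process.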

For the model itself, I'm using the nano-sized YOLO11 model. Ultralytics recommends exporting it to NCNN for better performance on ARM devices:

from ultralytics import YOLO

# Export the PyTorch weights to NCNN (one-time step)
model = YOLO("yolo11n.pt")
model.export(format="ncnn")  # writes yolo11n_ncnn_model/

# Load the exported model for inference
ncnn_model = YOLO("yolo11n_ncnn_model")

The script checks whether the NCNN version already exists; if not, it exports it. In my limited testing, NCNN sped up inference by almost 2x.
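The check itself is just a directory test. A minimal sketch, assuming Ultralytics' default export location (a <stem>_ncnn_model directory next to the weights); the helper names are mine:

```python
from pathlib import Path

def ncnn_export_dir(weights: str) -> Path:
    # Ultralytics writes the NCNN export to <stem>_ncnn_model
    # in the same directory as the .pt weights.
    p = Path(weights)
    return p.with_name(p.stem + "_ncnn_model")

def needs_export(weights: str) -> bool:
    # Export once; afterwards the directory exists and the
    # NCNN model is loaded directly.
    return not ncnn_export_dir(weights).is_dir()
```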

For checking if there’s an airplane in the image:

results = self.model(frame, verbose=False)

for result in results:
    for box in result.boxes:
        cls = int(box.cls[0])
        label = result.names[cls]
        if label == "airplane":
            # plane spotted: save the frame and record the sighting
            ...

It works really well. The model is around 5 MB and I'm really happy with the results.

The rest of the script saves the captured frame and logs the sighting to a SQLite database.

Pilane Web

It all works pretty well! The code for the site is available here. It’s a straightforward Astro site that fetches data from the SQLite database.
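The handoff between the Pi and the site is just rows in that SQLite database. A sketch of what the table might look like (the actual schema isn't shown in this post, so the column names are my guess):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the real script uses a file on disk

# Hypothetical schema: one row per captured sighting.
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS sightings (
        id INTEGER PRIMARY KEY,
        captured_at TEXT NOT NULL,
        image_path TEXT NOT NULL
    )
    """
)
conn.execute(
    "INSERT INTO sightings (captured_at, image_path) VALUES (?, ?)",
    ("2025-01-24T10:30:00", "captures/plane_001.jpg"),
)
conn.commit()

rows = conn.execute("SELECT image_path FROM sightings").fetchall()
```

The Astro site then only has to read this table at build or request time.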

Why did I build this?

Because it’s fun!

One time during testing, I was losing my mind trying to figure out why it wasn't capturing any images. I went on Flightradar and found out that all flights in Delhi were grounded for the Republic Day rehearsals:

[Image: Flightradar showing grounded flights in Delhi]

What’s next?

[Image: YOLO detecting a bird]

Thank you for reading!