Pseudonymous maker "DeVayu" is back with an update to an ongoing project that uses a Raspberry Pi Pico and a secondary system to drive a 2,048-LED RGB matrix over a Wi-Fi connection: he's more than doubled the refresh rate, to 70 frames per second, and added gesture controls.
DeVayu unveiled the first-generation version of the project late last year, linking a Raspberry Pi Pico and its RP2040 dual-core microcontroller to a 64×32 RGB LED matrix. Rather than stopping there, though, DeVayu added an Espressif ESP8266, which allowed compressed updates to be streamed to the display over Wi-Fi — hitting an impressive 28 frames per second all-in.
"It suffered a lot of problems," DeVayu admits of that first implementation. "One was stability, the other one was frame rate — the frame rate in the post was only achievable with very simple images due to ZSTD compression. I only noticed this later on."
In attempting to resolve the issue, DeVayu built a second variant using an Arduino RP2040 Connect with its on-board Wi-Fi. "Integrating with the Arduino toolchain was annoying," the maker writes, "but I got it working — until I destroyed my Arduino RP2040 Connect." A third version, designed to solve lag issues with the second, used an ESP32 — but ran into a range of problems from lag to "ghost red pixels floating around."
Enter the fourth, and most functional, version of the project, which uses a Raspberry Pi Pico linked to a Raspberry Pi Zero WH single-board computer. "[It] works by sending the frames PNG-compressed over MQTT to a Raspberry Pi Zero WH which then decompresses them and sends them over SPI to Pico at around 5Mbit using Python," DeVayu explains.
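As described, the Zero acts as an MQTT-to-SPI bridge. A minimal sketch of that bridge in Python might look like the following, assuming a 64×32 RGB888 frame, a hypothetical "matrix/frame" topic, and the common paho-mqtt, Pillow, and spidev libraries (none of which are confirmed by the post):

```python
# Sketch of the Pi Zero bridge: receive PNG frames over MQTT, decompress
# them, and forward raw pixels to the Pico over SPI. Topic name, frame
# format, and SPI wiring are assumptions for illustration.
import io

FRAME_W, FRAME_H = 64, 32
FRAME_BYTES = FRAME_W * FRAME_H * 3  # RGB888, one byte per channel

def chunk(data, size=4096):
    """Split a frame into SPI-sized pieces; spidev limits a single
    transfer to the kernel's bufsiz (commonly 4096 bytes)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def main(broker="localhost"):
    # Hardware-facing part: requires paho-mqtt, Pillow, and spidev,
    # so the imports live here rather than at module level.
    import spidev
    import paho.mqtt.client as mqtt
    from PIL import Image

    spi = spidev.SpiDev()
    spi.open(0, 0)
    spi.max_speed_hz = 5_000_000  # ~5 Mbit/s, matching the post

    def on_message(client, userdata, msg):
        # msg.payload is a complete PNG; Pillow decompresses it.
        img = Image.open(io.BytesIO(msg.payload)).convert("RGB")
        for part in chunk(img.tobytes()):
            spi.writebytes2(part)

    client = mqtt.Client()  # paho-mqtt 1.x style constructor
    client.on_message = on_message
    client.connect(broker)
    client.subscribe("matrix/frame")  # hypothetical topic name
    client.loop_forever()

# On the Zero, run main(); the chunking helper is pure and testable on its own.
```

Chunking matters here because a 6KB frame exceeds the default single-transfer limit of the Linux spidev driver, so each frame has to go out as a handful of back-to-back writes.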
"[This] in turn uses one thread to receive these images and the other thread to display the current one at around 500Hz. This high refresh rate is necessary to avoid headaches as the panel uses software PWM."
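On the Pico itself, that two-thread split maps naturally onto MicroPython's `_thread` module, which runs the second function on the RP2040's other core: one core fills a back buffer from SPI while the other scans out the front buffer, with brightness coming from software PWM. The sketch below is illustrative only; the buffer layout and 8-bit PWM depth are assumptions, and the actual panel-driving code is omitted:

```python
# Pico-side sketch (MicroPython): core 1 receives frames over SPI into a
# back buffer while core 0 redraws the front buffer at a high rate. The
# 8-bit software-PWM depth is an assumption.

def pwm_on(value, step):
    """Software PWM: a channel with brightness `value` (0-255) is lit
    during PWM step `step` (0-254) iff value > step. Averaged over all
    steps, the duty cycle is proportional to the brightness."""
    return value > step

def main():
    import _thread
    import machine

    spi = machine.SPI(0, baudrate=5_000_000)
    buffers = [bytearray(64 * 32 * 3), bytearray(64 * 32 * 3)]
    state = {"front": 0}  # index of the buffer currently being shown

    def receiver():
        while True:
            back = 1 - state["front"]
            spi.readinto(buffers[back])  # blocking read of one full frame
            state["front"] = back        # swap once the frame is complete

    _thread.start_new_thread(receiver, ())  # runs on the second core

    step = 0
    while True:
        frame = buffers[state["front"]]
        # ... clock out one scan of the panel here, lighting each channel
        # when pwm_on(frame[i], step) is true ...
        step = (step + 1) % 255

# On the Pico, run main(); only the pure pwm_on helper is exercised here.
```

The double buffer is what keeps the 500Hz scan loop from ever waiting on Wi-Fi: the display core always has a complete frame to show, and tearing is avoided because the swap happens only after a full frame has arrived.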
"The fourth version also has a gesture sensor at the top for controlling it. The gesture sensor is connected to the Zero which sends all detected gestures over MQTT to my animation script. I plan on making a fifth (and final?) version at some point. The fifth version would probably just use a USB-to-SPI adapter to send the frames directly from my server (where they are rendered) to the Pico."
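The gesture path can be sketched the same way: the Zero polls the sensor and republishes each recognized gesture for the animation script. The sensor model isn't named in the post, so the driver call, the gesture codes, and the "matrix/gesture" topic below are all placeholders:

```python
# Zero-side gesture publisher sketch. The sensor driver, gesture codes,
# and topic name are hypothetical stand-ins.

GESTURES = {1: "up", 2: "down", 3: "left", 4: "right"}  # illustrative codes

def gesture_name(code):
    """Map a raw sensor code to a readable gesture name."""
    return GESTURES.get(code, "unknown")

def read_gesture():
    """Placeholder for the hardware driver; substitute the real sensor's
    poll/read call here."""
    raise NotImplementedError("replace with the actual sensor driver")

def main(broker="localhost"):
    import paho.mqtt.client as mqtt

    client = mqtt.Client()  # paho-mqtt 1.x style constructor
    client.connect(broker)
    while True:
        code = read_gesture()
        if code in GESTURES:
            # The animation script subscribes to this (hypothetical) topic.
            client.publish("matrix/gesture", gesture_name(code))

# On the Zero, run main(); only the gesture_name mapping is exercised here.
```

Publishing gestures over MQTT rather than handling them locally keeps the control logic in one place: the same broker that carries the frames carries the input events back to wherever the animations are rendered.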
More details are available on the project's Reddit thread, while DeVayu has pledged to release source code "within the next few days."