After releasing the physical button project for n8n I noticed enough interest to take the idea one step further. That’s when I came up with a small, inexpensive ESP32-based device that not only adds physical buttons but can also receive information from n8n and display it on a screen.
Sure, you can start n8n workflows or handle human-in-the-loop approvals with a computer or a phone app, but in some situations a dedicated device with buttons, sounds, sensors, a camera, lights, and a screen can be much more practical.
Example use case
Imagine an e-commerce company where an employee currently uses a computer to check a Google Sheet to decide the next delivery. With this device, the employee simply presses a button. That triggers an n8n workflow, an AI agent connected to an LLM reads the Google Sheet, and selects the next delivery—not only based on order date but also considering customer notes (for example, prioritizing anxious customers). The AI agent then sends the label info directly to the device’s small screen, along with visual and audio cues. By pressing a second button, a QR code is displayed, ready to be scanned.
Sounds complicated? That’s exactly how my demo works—and it can easily be adapted for other scenarios and even more complex projects.
Parts Required
This project is based on the Unihiker K10. If you are a long-time user and a big fan of the classic Unihiker (M10) like me, be aware that the M10 and K10 are different boards:
- M10 → Unix-based, programmed in Python.
- K10 → ESP32-based, programmed with the Arduino IDE.
The K10 comes loaded with features: 240x320 screen, onboard buttons, RGB LED, microSD slot, camera, speaker, microphone, light sensor, temperature sensor, and accelerometer. It also has Gravity connectors for easily adding external buttons.
Reference: K10 Arduino IDE Code Documentation
Circuit
No real circuit work is needed, thanks to the Gravity connectors and onboard features. Just plug the two external buttons into the board. The code also supports the onboard buttons.
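If you want to see how external buttons can be read in code, here is a minimal, illustrative sketch using plain Arduino digital I/O. The GPIO numbers and the active-low wiring are assumptions for this example; check the Gravity port pinout of your K10 and the pins actually used in n8nTerminal.ino.

// Minimal button-reading sketch (illustrative only).
// BUTTON_NEXT_PIN / BUTTON_QR_PIN are hypothetical GPIO numbers;
// replace them with the pins exposed by the K10 Gravity ports.
const int BUTTON_NEXT_PIN = 1;  // "fetch next delivery" button (assumed pin)
const int BUTTON_QR_PIN   = 2;  // "show QR code" button (assumed pin)

void setup() {
  Serial.begin(115200);
  // Gravity button modules typically pull the line low when pressed,
  // so enable the internal pull-up and treat LOW as "pressed".
  pinMode(BUTTON_NEXT_PIN, INPUT_PULLUP);
  pinMode(BUTTON_QR_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(BUTTON_NEXT_PIN) == LOW) {
    Serial.println("Next-delivery button pressed");
    delay(300);  // crude debounce
  }
  if (digitalRead(BUTTON_QR_PIN) == LOW) {
    Serial.println("QR button pressed");
    delay(300);  // crude debounce
  }
}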
n8n Setup
If you don’t have an account, you can create one at n8n.io and get $5 of credits for your first project. n8n is open source, so you can also host it yourself or use a hosting service that offers a ready-made n8n template, such as Railway.
- Open my workflow template to get up and running quickly
- Upload the Excel sheet and configure Google Sheets and Google Gemini credentials (Google Sheets API is free; Gemini has a free tier).
- Get the webhook URL; this will go into the .ino code.
IMPORTANT: As of writing, the K10 requires Arduino IDE v1.8.19 or earlier for programming. You can install multiple Arduino versions side by side.
Steps
1. Download Arduino IDE 1.8.19 (not the latest version) and install it.
2. Go to Preferences → Additional Board URLs and add:
https://downloadcd.dfrobot.com.cn/UNIHIKER/package_unihiker_index.json
3. Select K10 as the board and the correct port.
4. Download n8nTerminal.ino, img.h, and the .wav sound file from GitHub.
5. Edit n8nTerminal.ino to include your Wi-Fi credentials and the webhook URL. Adjust the JSON fields if needed; my demo uses idShipping, name, address, and notes, but you can customize them. (A minimal sketch of how these settings feed the webhook call is shown after these steps.)
// Settings
const char* ssid = "";         // your Wi-Fi network name
const char* password = "";     // your Wi-Fi password
const char* webhook_url = "";  // the n8n webhook URL from the previous section
6. Connect the board via USB Type-C and click Upload. (For some reason, my upload progress sometimes repeats three times—be prepared.)
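For reference, here is a minimal sketch of how the settings above can be used to call the webhook once Wi-Fi is connected, using the standard WiFi and HTTPClient libraries that ship with the ESP32 Arduino core. It only outlines the request flow; the request body field is an illustrative placeholder, and the real n8nTerminal.ino also drives the screen, buttons, and audio.

#include <WiFi.h>
#include <HTTPClient.h>

const char* ssid = "";         // Wi-Fi network name
const char* password = "";     // Wi-Fi password
const char* webhook_url = "";  // n8n webhook URL

// Call the n8n webhook and return the raw JSON response.
String callWebhook() {
  HTTPClient http;
  http.begin(webhook_url);
  http.addHeader("Content-Type", "application/json");
  // A simple body telling the workflow which button was pressed
  // (the field name is illustrative; adjust it to your workflow).
  int status = http.POST(String("{\"action\":\"next_delivery\"}"));
  String body = (status > 0) ? http.getString() : String("");
  http.end();
  return body;
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);  // wait for the Wi-Fi connection
  }
  Serial.println(callWebhook());
}

void loop() {}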
I’m using the microSD just to play a startup voice. Copy the .wav file onto the card, then insert it into the K10.
I designed a 3D-printed enclosure for the K10 and its buttons. It leaves the lights visible, the speaker unobstructed, and includes:
- A back hole for wall mounting.
- Cutouts for front sensors.
You can get the enclosure on Cults.
You will also need one M3 screw and four 3 mm screws.
Demo
I created a simple demo for a fictional e-commerce company. At the delivery warehouse, an employee presses a button on the n8n Terminal to fetch the next package to deliver.
The Terminal triggers an n8n workflow, which calls an AI agent powered by Gemini Flash 2.5 — although ChatGPT-5 or any other LLM could be used — to retrieve pending shipments from a Google Sheet. The agent checks the current date and time, then follows system rules to determine the next delivery.
In essence, it selects any row where the "Sent" cell is false, prioritizing the oldest shipping date. However, it also takes the "Notes" column into account, so if a customer is marked as anxious, their order may be prioritized even if it was placed later. The resulting data is sent to the n8n Terminal’s screen, and the employee can press a second button to generate a QR code with the same information. Finally, the "Sent" cell in the sheet is updated.
Communication between n8n and the n8n Terminal is done via JSON. Example:
[{"output":{"idShipping":"1","name":"Juan Perez","address":"San Juan 8888","notes":"Solo queria decir que entreguen por la tarde"}}]
Note: there seems to be a small bug in the Unihiker K10 libraries that prevents the screen from being cleared after the QR code is displayed. It should be fixed in a future update.
Final notes
Inexpensive, dedicated devices that operate automations powered by AI agents—combining data and procedures from different providers and platforms—are well worth exploring. They represent a low-cost yet powerful way to bridge the gap between physical interfaces and cloud intelligence. Beyond the current functionality, there is substantial room for improvement and experimentation. For example, the Unihiker K10 could forward additional data it is capable of producing—such as light sensor readings, camera images, or recorded audio—to enrich the on-screen presentation or trigger more context-aware workflows. Likewise, it could receive AI-generated responses not only as text but also as synthesized speech or even dynamic visualizations, turning it into a richer, multimodal interface between automation processes and the human operator.
Interested in other projects with n8n?
View the AI Retro Console (it turns old notebooks into an n8n-powered, distraction-free terminal, delivering news, emails & commands without dopamine traps).
Contact
Feedback or questions about this project? https://www.linkedin.com/in/ronibandini/