A real-time, camera-powered Rock-Paper-Scissors game built with Edge Impulse and the Arduino UNO Q.
The camera watches your hand gestures using an object detection model, while the Arduino randomly picks its move. Think you can win the machine?
It's a super fun way to explore Edge AI on the new Arduino UNO Q with the latest Arduino App Lab — a perfect project for makers, students, and anyone who wants to see machine learning in action!
What You’ll Need

Hardware
- Arduino UNO Q board
- USB webcam (any standard USB webcam should work)
Software
- Arduino App Lab (latest version)
- Free Edge Impulse account
Recommended Starting Model
We have prepared a public Edge Impulse project for you: Rock Paper Scissors – Public Project
Clone it and retrain it with your own hand photos for even better accuracy under your camera, lighting and background conditions.
Step 1: Copy the Rock Paper Scissors Application

- Clone the repository:

git clone https://github.com/edgeimpulse/example-rock-paper-scissors-Arduino-UNO-Q.git

- In Arduino App Lab, go to `My Apps` → click `Create new App` (or use “Copy and edit app” on the example).
Alternatively, transfer the app to your UNO Q (via SSH or App Lab import):

scp -r example-rock-paper-scissors-Arduino-UNO-Q/ arduino@<device-ip>:/home/arduino/ArduinoApps/RPS-game

Step 2: Re-Train or Deploy Your ML Model

- Open your new app in Arduino App Lab.
- Click on the "Video Object Detection" brick (top-left).
- Go to the `AI models` tab and click "Train new AI model".
- Log in with your Arduino account (this also connects to Edge Impulse).
Choose one of two options:
- Use the public Rock Paper Scissors project as-is, or
- Clone it and retrain it with your own hand gesture photos.
- Once retrained, go to "Deploy" and select the model for Arduino UNO Q (or Linux aarch64). Your custom model will automatically appear in the app's `AI models` list in Arduino App Lab.
- In the Arduino App Lab, select your model and click Install. It will be added to the `app.yaml` file automagically.
- It's time to play! In Arduino App Lab, open your Rock Paper Scissors app and click `Run` (the button in the top-right corner).
- Or start it via SSH:
arduino-app-cli app start user:rock-paper-scissors-game

- Open your browser and go to: http://<your-uno-q-ip>:5001
Good luck!
- To start, show your hand (paper ✋). Once it's detected, you'll be able to change gestures.
- Show your hand gesture (rock ✊, paper ✋, or scissors ✌️) to the camera.
- Remember that rock beats scissors, paper beats rock, and scissors beats paper.
- Watch the left panel — the Edge Impulse model detects your move in real time.
- Click `Play Round` to lock in your choice.
- The Arduino UNO Q picks its move… and the winner is revealed!
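The win/lose rules above can be sketched as a tiny function. This is a hypothetical Python illustration of the round logic, not the app's actual code — names like `play_round` and `BEATS` are made up for this sketch:

```python
import random

# Hypothetical sketch of the round rules described above -- not the app's actual code.
# Each key beats the move it maps to.
BEATS = {
    "rock": "scissors",   # rock beats scissors
    "paper": "rock",      # paper beats rock
    "scissors": "paper",  # scissors beats paper
}

def play_round(player: str, arduino: str) -> str:
    """Return 'player', 'arduino', or 'draw' for one round."""
    if player == arduino:
        return "draw"
    return "player" if BEATS[player] == arduino else "arduino"

# The board's random move can be simulated with a random pick:
arduino_move = random.choice(list(BEATS))
```

For example, `play_round("rock", "scissors")` returns `"player"`, while `play_round("rock", "paper")` returns `"arduino"`.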
Tips to Improve Accuracy

- Add more training images with different lighting, angles, and backgrounds.
- Try FOMO for faster inference if you want snappier responses.
Need help? Post a detailed message on the Edge Impulse forum.
Disclaimer

This project is for educational and entertainment purposes only. Have fun — but don’t take it too seriously… the Arduino might be better at Rock-Paper-Scissors than you think (or not)! 😉