This project demonstrates fundamental artificial intelligence concepts using a classic XOR (exclusive OR) logic gate implemented with a neural network on Raspberry Pi. Through hands-on interaction with physical buttons and LEDs, you'll learn how neural networks solve non-linear problems and how to deploy AI models on embedded systems.
💡 Introduction to AI and Neural Networks
Artificial Neural Networks (ANNs) are computer systems designed to mimic how the human brain processes information. Just as the brain uses neurons to process data and make decisions, ANNs use artificial neurons to analyze data, identify patterns, and make predictions. These networks consist of layers of interconnected neurons that work together to solve complex problems. The key idea is that ANNs can "learn" from the data they process, just as our brain learns from experience.
What Is XOR and Why Does It Matter?
The XOR (exclusive OR) gate is a fundamental logic gate that outputs true only when its inputs differ. It's a classic problem in AI because it is not linearly separable, meaning a single-layer perceptron cannot solve it. This makes it perfect for demonstrating why multi-layer neural networks with hidden layers are needed.
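The "not linearly separable" claim can be made concrete with a brute-force check (an illustrative sketch, not part of the project code): scan a grid of candidate weights for a single threshold neuron and observe that none of them reproduces XOR.

```python
import itertools

# Brute-force demo: no single linear threshold neuron
# step(w1*A + w2*B + bias) reproduces XOR, because XOR is
# not linearly separable.
cases = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def step(z):
    return 1 if z > 0 else 0

grid = [x / 2 for x in range(-10, 11)]  # candidate values in [-5, 5]
solutions = [
    (w1, w2, bias)
    for w1, w2, bias in itertools.product(grid, repeat=3)
    if all(step(w1 * a + w2 * b + bias) == y for (a, b), y in cases)
]
print(len(solutions))  # 0: no linear solution exists
```

The grid search finds nothing, and the proof generalizes: the constraints for (0,1) and (1,0) force w1 + w2 + 2·bias > 0, which contradicts the constraints for (0,0) and (1,1).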
XOR is more than a clever math puzzle; the same either-or logic shows up in millions of real-world decisions. Look at these three situations where XOR logic comes into play.
- Smart lighting systems: a lamp turns on only when either motion is detected or it's dark, but not both.
- Security alerts: a system triggers an alert if either the door is open or the alarm is armed, but not both.
- Game mechanics: in puzzle games, a door opens only when one of two switches is pressed, not both.
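All three situations boil down to the same two-input truth table, which Python's `^` operator already computes:

```python
# XOR truth table: true only when the two inputs differ.
def xor(a: int, b: int) -> int:
    return a ^ b

for a in (0, 1):
    for b in (0, 1):
        print(f"XOR({a}, {b}) = {xor(a, b)}")
# XOR(0, 0) = 0
# XOR(0, 1) = 1
# XOR(1, 0) = 1
# XOR(1, 1) = 0
```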
Step 1: Setting Up Raspberry Pi 🚀
- Flash Raspberry Pi OS to microSD card using Raspberry Pi Imager
- Enable SSH and configure WiFi during setup
- Update the system:
sudo apt update && sudo apt upgrade -y
- Install required packages:
sudo apt install python3-pip python3-venv git -y
Step 2: Create Python Virtual Environment 🧪
python3 -m venv ai_env
source ai_env/bin/activate
pip install numpy RPi.GPIO
Step 3: Training the Neural Network 🧠
Run the training script on your computer or Raspberry Pi
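The training script itself isn't reproduced in this section; the sketch below is a hypothetical stand-in showing what such a script might look like, a 2-2-1 network trained on XOR with plain NumPy and full-batch gradient descent. With luck the weights converge toward values like the "magic numbers" analyzed later, though gradient descent on XOR can also settle in a local minimum.

```python
import numpy as np

# Illustrative training sketch (not the project's actual script).
rng = np.random.default_rng(42)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 2))  # input -> hidden weights
b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1))  # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: mean-squared-error gradients
    # (constant factors absorbed into the learning rate)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("final loss:", round(losses[-1], 4))
print("predictions:", np.round(preds, 2).ravel())
```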
Step 4: Running on Raspberry Pi 📦
Connect all components according to the circuit diagram, then run the program.
Raspberry Pi 3 (PULL_DOWN wiring):
  GPIO17 → Button A → 3.3V
  GPIO27 → Button B → 3.3V
  GPIO18 → Green LED (Output) → GND
  GPIO22 → Red LED (Status) → GND
Raspberry Pi 3 (PULL_UP wiring):
  GPIO17 → Button A → GND
  GPIO27 → Button B → GND
  GPIO18 → Green LED (Output) → GND
  GPIO22 → Red LED (Status) → GND
- Press button combinations to test XOR:
- Both buttons not pressed: LED off
- Button 1 pressed: LED on
- Button 2 pressed: LED on
- Both buttons pressed: LED off
- The console shows real-time predictions with confidence scores
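The project's runtime script isn't reproduced here; the following is a minimal sketch of what the inference loop might look like, hard-coding the network weights analyzed below and assuming the PULL_DOWN wiring (pin numbers taken from the diagram above). The forward pass is kept as a pure function so it can run anywhere; the GPIO loop only executes on a Pi.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(a, b):
    """Forward pass of the 2-2-1 XOR network; returns a confidence in [0, 1]."""
    h1 = sigmoid(5.0 * a - 5.0 * b - 2.0)   # fires for A=1, B=0
    h2 = sigmoid(-5.0 * a + 5.0 * b - 2.0)  # fires for A=0, B=1
    return sigmoid(5.0 * h1 + 5.0 * h2 - 2.5)

def main():
    # Hardware loop: only runs on a Raspberry Pi with RPi.GPIO installed.
    import time
    import RPi.GPIO as GPIO
    BUTTON_A, BUTTON_B = 17, 27
    GREEN_LED, RED_LED = 18, 22
    GPIO.setmode(GPIO.BCM)
    GPIO.setup([BUTTON_A, BUTTON_B], GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
    GPIO.setup([GREEN_LED, RED_LED], GPIO.OUT)
    try:
        while True:
            a, b = GPIO.input(BUTTON_A), GPIO.input(BUTTON_B)
            conf = predict(a, b)
            GPIO.output(GREEN_LED, conf > 0.5)   # XOR result
            GPIO.output(RED_LED, conf <= 0.5)    # status indicator
            print(f"A={a} B={b} -> XOR={int(conf > 0.5)} (confidence {conf:.2f})")
            time.sleep(0.1)
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    main()
```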
💡 Neural Network Weights and Algorithm Explanation
The XOR problem requires a 2-2-1 neural network:
- Input layer: 2 neurons (for inputs A and B)
- Hidden layer: 2 neurons (for non-linear transformation)
- Output layer: 1 neuron (for XOR result)
Network Architecture:
  Input Layer (2) → Hidden Layer (2) → Output Layer (1)
      A, B              h1, h2              XOR
The Magic Numbers:
weights_input_hidden = [[5.0, -5.0], [-5.0, 5.0]]
weights_hidden_output = [[5.0], [5.0]]
bias_hidden = [-2.0, -2.0]
bias_output = [-2.5]
Weight Derivation Formula:
For XOR(A, B), the network implements:
h1 = sigmoid(5*A - 5*B - 2) # Detects A=1, B=0
h2 = sigmoid(-5*A + 5*B - 2) # Detects A=0, B=1
output = sigmoid(5*h1 + 5*h2 - 2.5)
Hidden Neuron 1: "Activate when A=1 AND B=0"
h1 = sigmoid(5*A - 5*B - 2)
Case analysis:
- A=0, B=0: 5*0 - 5*0 - 2 = -2 → sigmoid(-2) ≈ 0.12
- A=0, B=1: 5*0 - 5*1 - 2 = -7 → sigmoid(-7) ≈ 0.001
- A=1, B=0: 5*1 - 5*0 - 2 = 3 → sigmoid(3) ≈ 0.95
- A=1, B=1: 5*1 - 5*1 - 2 = -2 → sigmoid(-2) ≈ 0.12
So h1 ≈ 1 only when A=1, B=0
Hidden Neuron 2: "Activate when A=0 AND B=1"
h2 = sigmoid(-5*A + 5*B - 2)
Case analysis:
- A=0, B=0: -5*0 + 5*0 - 2 = -2 → sigmoid(-2) ≈ 0.12
- A=0, B=1: -5*0 + 5*1 - 2 = 3 → sigmoid(3) ≈ 0.95
- A=1, B=0: -5*1 + 5*0 - 2 = -7 → sigmoid(-7) ≈ 0.001
- A=1, B=1: -5*1 + 5*1 - 2 = -2 → sigmoid(-2) ≈ 0.12
So h2 ≈ 1 only when A=0, B=1
Output Neuron: "Activate when h1 OR h2 is active"
output = sigmoid(5*h1 + 5*h2 - 2.5)
Case analysis:
1. A=0, B=0: h1≈0.12, h2≈0.12
   output = sigmoid(5*0.12 + 5*0.12 - 2.5)
          = sigmoid(0.6 + 0.6 - 2.5) = sigmoid(-1.3) ≈ 0.21 → 0 ✓
2. A=0, B=1: h1≈0.001, h2≈0.95
   output = sigmoid(5*0.001 + 5*0.95 - 2.5)
          = sigmoid(0.005 + 4.75 - 2.5) = sigmoid(2.255) ≈ 0.90 → 1 ✓
3. A=1, B=0: h1≈0.95, h2≈0.001
   output = sigmoid(5*0.95 + 5*0.001 - 2.5)
          = sigmoid(4.75 + 0.005 - 2.5) = sigmoid(2.255) ≈ 0.90 → 1 ✓
4. A=1, B=1: h1≈0.12, h2≈0.12
   output = sigmoid(5*0.12 + 5*0.12 - 2.5)
          = sigmoid(0.6 + 0.6 - 2.5) = sigmoid(-1.3) ≈ 0.21 → 0 ✓
How These Weights Were Discovered?
Through Training (Backpropagation):
def backpropagation_learning():
"""
How neural networks learn XOR through training:
1. Start with random weights
2. For each training example (A,B, expected_output):
a. Forward pass: compute current prediction
b. Calculate error = expected - predicted
c. Backward pass: adjust weights to reduce error
d. Repeat thousands of times
After training, the network converges to weights like:
W1 = [[ 5, -5], [-5, 5]]
W2 = [[5], [5]]
"""Training Process Visualization:
Epoch 0:     Random weights  → Output error ≈ 0.5
Epoch 100:   Weights adjust  → Output error ≈ 0.3
Epoch 1000:  Weights adjust  → Output error ≈ 0.1
Epoch 10000: Converged       → Output error ≈ 0.001
💡 The Result
This project successfully demonstrates how to:
- Implement a basic neural network from scratch
- Train it to solve the XOR problem
- Deploy the trained model on Raspberry Pi
- Create an interactive physical computing interface
- Visualize and understand the learning process
The XOR problem serves as a perfect introduction to neural networks because it illustrates why single-layer perceptrons are insufficient for certain problems and how adding hidden layers enables solutions to non-linear problems.
Next Steps π
- Implement a neural network for image recognition with Raspberry Pi camera
- Create a web dashboard to monitor neural network performance