In most computer vision systems, a human is merely a detected object—a bounding box, a set of coordinates, a stream of data. The system knows “you are here,” but it never truly sees you.
I began to feel that this kind of expression was too cold.
What if we changed the perspective? Instead of treating a person as something to be framed and labeled, what if we saw them as a field that influences the surrounding space?
I created this AI-generated concept image of a human energy field—hopefully, one day, we’ll be able to build something like this ourselves.

Presence creates ripples. Movement reshapes structure.
And when multiple people appear, relationships begin to emerge between them.
So I started to deconstruct and reconstruct the output of visual recognition—no longer displaying detection boxes, but translating count, position, and scale into another language: particles.
Data was no longer just data — it began to feel alive.
When a second person enters the scene, connections, interference, and flow start to form between these particle-based figures. At that point, the system is no longer just recognizing people—it is expressing something more abstract:
👉 the relationship between humans and space
👉 the invisible yet ever-present influences between people.
I call this project "Human Energy Field."
It is both a real-time AI vision system and an exploration of expression—an attempt to see whether, when humans are transformed from objects into energy, the world might begin to look different.
System Architecture

The system can be divided into three layers:
1. Vision Recognition Layer

The XIAO ESP32S3 Sense captures images from the camera and performs AI-based human detection, reporting the number of people and each person’s size and position as bounding boxes. In this system, the XIAO ESP32S3 Sense therefore acts as a “black box” for AI vision recognition, functioning more like a “sensor” than a “brain.”
2. Data Transmission Layer

The detection data is transmitted via UART to the XIAO ESP32C3.
⚠ Important note (hard-earned lesson): you cannot read data from the S3 Sense directly in Processing.

This is because the XIAO ESP32S3 Sense runs a packaged AI firmware and cannot execute custom Arduino logic at the same time. (This is where I made a mistake: I initially thought I could use Processing to read and parse data from the S3 Sense directly, which turned out to be impossible and cost me a lot of time. If you’re smarter than me, feel free to skip this part.)
👉 Therefore, a XIAO ESP32C3 must be introduced as an intermediate layer to parse and restructure the recognition results and then transmit them to Processing, forming a complete pipeline from data acquisition to visual expression.
3. Particle Generation Layer

After receiving the data, Processing generates real-time particle-based human silhouettes based on each person’s center point, width, height, number of people, and relative positions, while assigning different colors and dynamic behaviors to each individual.
When only one person is present, the system generates a complete human-shaped particle field; when multiple people appear, their particle systems coexist and produce more complex interaction effects based on their spatial relationships. (You can also design more interesting visual effects in Processing yourself.)
Detailed Build Process

Step 1: Complete AI Recognition and Human Detection

First, we need to deploy the AI vision model on the XIAO ESP32S3 Sense so that it can reliably detect human targets in front of the camera and continuously output detection results. Open the SenseCraft AI platform, select the Personnel Detection model, and make sure to choose the version built specifically for the XIAO ESP32S3 Sense.
If you need a detailed guide, you can refer to the official SenseCraft AI wiki provided by Seeed Studio.

After successful deployment, you will see detection data being output continuously. Pay close attention to the data format and think about what each value represents. This step is especially important if you plan to customize your own particle effects later.
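To get a feel for that format, here is a minimal, self-contained Java sketch that parses one detection line. The line content below is hypothetical but follows the same `Box[i] ... x=..., y=..., w=..., h=...` shape as the firmware's output, and the regex mirrors the one used in the Processing sketch later in this guide:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BoxParser {
    // Same regex shape the Processing sketch in this guide uses.
    static final Pattern BOX = Pattern.compile(
            "Box\\[(\\d+)\\].*x=(\\d+), y=(\\d+), w=(\\d+), h=(\\d+)");

    // Returns {index, x, y, w, h}, or null for lines that are not Box lines.
    static int[] parse(String line) {
        Matcher m = BOX.matcher(line);
        if (!m.find()) return null;
        int[] out = new int[5];
        for (int i = 0; i < 5; i++) out[i] = Integer.parseInt(m.group(i + 1));
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical detection line in the firmware's output shape.
        int[] box = parse("Box[0] target=0, score=92, x=120, y=80, w=64, h=128");
        // Center of the bounding box: x + w/2, y + h/2.
        System.out.println((box[1] + box[3] / 2.0) + ", " + (box[2] + box[4] / 2.0)); // prints "152.0, 144.0"
    }
}
```

The center point computed here is exactly what the particle layer will later anchor each human figure to.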
Step 2: Set Up the Arduino IDE

If this is your first time using the Arduino IDE, it is recommended to follow a tutorial to install and configure the environment, as well as to download the required libraries.

Step 3: Hardware Connection
- Prepare two USB cables and connect the XIAO ESP32S3 Sense and XIAO ESP32C3 to your computer.
- Use three female-to-female jumper wires to connect the XIAO ESP32S3 Sense and XIAO ESP32C3. Here we use UART communication. The wiring method is shown below:
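If the wiring image is hard to read: UART is a crossed connection (each board's TX goes to the other's RX), plus a shared ground. As a sketch, assuming the default UART pins are the ones labeled TX/RX on both boards:

```
XIAO ESP32S3 Sense TX  →  XIAO ESP32C3 RX
XIAO ESP32S3 Sense RX  →  XIAO ESP32C3 TX
XIAO ESP32S3 Sense GND →  XIAO ESP32C3 GND
```

Double-check the pin labels on your own boards before powering up.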
To make the data easier for Processing to handle, use the Arduino IDE to upload code to the XIAO ESP32C3.
Select the correct board and serial port, then upload the following code:
#include <Seeed_Arduino_SSCMA.h>
#ifdef ESP32
#include <HardwareSerial.h>
// Define two Serial devices mapped to the two internal UARTs
HardwareSerial atSerial(0);
#else
#define atSerial Serial1
#endif
SSCMA AI;
void setup()
{
Serial.begin(115200); // must match the BAUD value used in the Processing sketch
AI.begin(&atSerial);
}
void loop()
{
if (!AI.invoke(1,false,true))
{
Serial.println("invoke success");
Serial.print("perf: prepocess=");
Serial.print(AI.perf().prepocess);
Serial.print(", inference=");
Serial.print(AI.perf().inference);
Serial.print(", postpocess=");
Serial.println(AI.perf().postprocess);
for (int i = 0; i < AI.boxes().size(); i++)
{
Serial.print("Box[");
Serial.print(i);
Serial.print("] target=");
Serial.print(AI.boxes()[i].target);
Serial.print(", score=");
Serial.print(AI.boxes()[i].score);
Serial.print(", x=");
Serial.print(AI.boxes()[i].x);
Serial.print(", y=");
Serial.print(AI.boxes()[i].y);
Serial.print(", w=");
Serial.print(AI.boxes()[i].w);
Serial.print(", h=");
Serial.println(AI.boxes()[i].h);
}
for (int i = 0; i < AI.classes().size(); i++)
{
Serial.print("Class[");
Serial.print(i);
Serial.print("] target=");
Serial.print(AI.classes()[i].target);
Serial.print(", score=");
Serial.println(AI.classes()[i].score);
}
for (int i = 0; i < AI.points().size(); i++)
{
Serial.print("Point[");
Serial.print(i);
Serial.print("]: target=");
Serial.print(AI.points()[i].target);
Serial.print(", score=");
Serial.print(AI.points()[i].score);
Serial.print(", x=");
Serial.print(AI.points()[i].x);
Serial.print(", y=");
Serial.println(AI.points()[i].y);
}
for (int i = 0; i < AI.keypoints().size(); i++)
{
Serial.print("keypoint[");
Serial.print(i);
Serial.print("] target=");
Serial.print(AI.keypoints()[i].box.target);
Serial.print(", score=");
Serial.print(AI.keypoints()[i].box.score);
Serial.print(", box:[x=");
Serial.print(AI.keypoints()[i].box.x);
Serial.print(", y=");
Serial.print(AI.keypoints()[i].box.y);
Serial.print(", w=");
Serial.print(AI.keypoints()[i].box.w);
Serial.print(", h=");
Serial.print(AI.keypoints()[i].box.h);
Serial.print("], points:[");
for (int j = 0; j < AI.keypoints()[i].points.size(); j++)
{
Serial.print("[");
Serial.print(AI.keypoints()[i].points[j].x);
Serial.print(",");
Serial.print(AI.keypoints()[i].points[j].y);
Serial.print("],");
}
Serial.println("]");
}
if(!AI.last_image().isEmpty())
{
Serial.print("Last image:");
Serial.println(AI.last_image().c_str());
}
}
}

If everything works correctly, you will see real-time serial data in the Serial Monitor:
Before using Processing to receive data, make sure to completely close Arduino IDE so that the serial port is not occupied. Processing will receive serial data from the C3 and convert it into structured information for the particle system. The key points here include:
- handling single and multiple targets
- mapping coordinates to screen space (if the animation window appears incomplete, this step is likely the issue)
- scaling particle size based on bounding-box dimensions
- assigning different colors to different individuals
- maintaining frame rate and real-time performance
- optimizing memory usage to prevent crashes
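The coordinate-mapping step is plain linear rescaling from the 320×240 camera frame to the sketch window; Processing's built-in map() does exactly this. A minimal Java sketch of the same arithmetic, using the 1280×720 window size from the Processing code in this guide:

```java
public class CamToScreen {
    static final float CAM_W = 320, CAM_H = 240;   // camera frame from the S3 Sense
    static final float SCR_W = 1280, SCR_H = 720;  // Processing window size

    // Linear rescale; equivalent to Processing's map(v, inMin, inMax, outMin, outMax).
    static float mapRange(float v, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        // A detection centered at (160, 120) in camera space lands mid-window.
        System.out.println(mapRange(160, 0, CAM_W, 0, SCR_W)); // prints 640.0
        System.out.println(mapRange(120, 0, CAM_H, 0, SCR_H)); // prints 360.0
    }
}
```

If your window size differs, only the output range changes; getting this mapping wrong is the usual cause of figures drawn half off-screen.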
Run the following code in Processing:
import processing.serial.*;
import java.util.*;
import java.util.regex.*;
Serial myPort;
final int BAUD = 115200;
final float CAM_W = 320.0;
final float CAM_H = 240.0;
final int MAX_PARTICLES = 1200;
final int MAX_DETECTIONS = 50;
ArrayList<Detection> detections = new ArrayList<Detection>();
ArrayList<Track> tracks = new ArrayList<Track>();
Particle[] pool = new Particle[MAX_PARTICLES];
int nextTrackId = 0;
Pattern boxPattern = Pattern.compile(
"Box\\[(\\d+)\\].*x=(\\d+), y=(\\d+), w=(\\d+), h=(\\d+)"
);
void setup() {
size(1280, 720);
frameRate(60);
background(0);
colorMode(HSB, 360, 100, 100, 100);
for (int i = 0; i < pool.length; i++) {
pool[i] = new Particle();
}
println(Serial.list());
myPort = new Serial(this, Serial.list()[1], BAUD); // adjust the [1] index to match your ESP32C3's port
myPort.bufferUntil('\n');
}
void draw() {
fill(0, 0, 0, 25);
rect(0, 0, width, height);
drawHumans();
updateParticles();
drawConnections();
}
void serialEvent(Serial p) {
String line = p.readStringUntil('\n');
if (line == null) return;
line = trim(line);
if (line.length() > 200) return;
if (line.startsWith("Box")) {
Matcher m = boxPattern.matcher(line);
if (m.find()) {
if (detections.size() >= MAX_DETECTIONS) return;
Detection d = new Detection();
d.cx = float(m.group(2)) + float(m.group(4)) / 2.0;
d.cy = float(m.group(3)) + float(m.group(5)) / 2.0;
d.w = float(m.group(4));
d.h = float(m.group(5));
d.sx = map(d.cx, 0, CAM_W, 0, width);
d.sy = map(d.cy, 0, CAM_H, 0, height);
d.sw = map(d.w, 0, CAM_W, 0, width);
d.sh = map(d.h, 0, CAM_H, 0, height);
detections.add(d);
}
}
if (line.contains("invoke success")) {
while (tracks.size() < detections.size()) {
Track t = new Track();
t.id = nextTrackId++;
tracks.add(t);
}
for (int i = 0; i < detections.size(); i++) {
tracks.get(i).update(detections.get(i));
}
for (int i = detections.size(); i < tracks.size(); i++) {
tracks.get(i).active = false;
}
detections.clear();
}
}
void drawHumans() {
blendMode(ADD);
ArrayList<Track> activeTracks = new ArrayList<Track>();
ArrayList<Integer> spawnCounts = new ArrayList<Integer>();
for (Track t : tracks) {
if (!t.active) continue;
activeTracks.add(t);
drawHumanShape(t);
int count = int(t.w * t.h / 1500.0);
count = constrain(count, 10, 40);
spawnCounts.add(count);
}
int maxSpawn = 0;
for (int c : spawnCounts) maxSpawn = max(maxSpawn, c);
for (int s = 0; s < maxSpawn; s++) {
for (int i = 0; i < activeTracks.size(); i++) {
if (s < spawnCounts.get(i)) {
activateParticle(activeTracks.get(i));
}
}
}
blendMode(BLEND);
}
void drawHumanShape(Track t) {
float cx = t.x;
float cy = t.y;
float w = t.w * 1.2;
float h = t.h * 1.5;
stroke(t.c, 80);
noFill();
ellipse(cx, cy - h * 0.3, w * 0.2, w * 0.2);
line(cx, cy - h * 0.2, cx, cy + h * 0.3);
line(cx, cy, cx - w * 0.3, cy + h * 0.1);
line(cx, cy, cx + w * 0.3, cy + h * 0.1);
line(cx, cy + h * 0.3, cx - w * 0.2, cy + h * 0.6);
line(cx, cy + h * 0.3, cx + w * 0.2, cy + h * 0.6);
}
boolean activateParticle(Track t) {
for (Particle p : pool) {
if (!p.active) {
p.init(t);
return true;
}
}
return false;
}
void updateParticles() {
blendMode(ADD);
for (Particle p : pool) {
if (p.active) {
p.update();
p.draw();
}
}
blendMode(BLEND);
}
void drawConnections() {
if (tracks.size() < 2) return;
blendMode(ADD);
for (int i = 0; i < tracks.size(); i++) {
Track a = tracks.get(i);
if (!a.active) continue;
for (int j = i + 1; j < tracks.size(); j++) {
Track b = tracks.get(j);
if (!b.active) continue;
float d = dist(a.x, a.y, b.x, b.y);
if (d < 300) {
stroke(lerpColor(a.c, b.c, 0.5), 60);
beginShape();
for (int k = 0; k < 20; k++) {
float tt = k / 20.0;
float x = lerp(a.x, b.x, tt);
float y = lerp(a.y, b.y, tt);
float offset = (noise(tt * 5, frameCount * 0.02) - 0.5) * 40;
y += offset;
vertex(x, y);
}
endShape();
}
}
}
blendMode(BLEND);
}
class Detection {
float cx, cy, w, h;
float sx, sy, sw, sh;
}
class Track {
int id = -1;
float x, y, w, h;
color c;
boolean active = true;
void update(Detection d) {
x = d.sx;
y = d.sy;
w = d.sw;
h = d.sh;
active = true;
float seed = id * 37.0 + d.cx * 0.13 + d.cy * 0.17;
float hue = (noise(seed) * 360.0) % 360.0;
c = color(hue, 80, 100);
}
}
class Particle {
boolean active = false;
Track t;
float x, y;
float vx, vy;
float life;
color c;
void init(Track t_) {
t = t_;
active = true;
c = t.c;
float cx = t.x;
float cy = t.y;
float w = t.w * 1.2;
float h = t.h * 1.5;
int type = int(random(5));
if (type == 0) {
float angle = random(TWO_PI);
float r = w * 0.1;
x = cx + cos(angle) * r;
y = cy - h * 0.3 + sin(angle) * r;
vx = cos(angle) * random(1, 3);
vy = sin(angle) * random(1, 3);
} else if (type == 1) {
float tLine = random(1);
x = cx;
y = lerp(cy - h * 0.2, cy + h * 0.3, tLine);
vx = random(-1.5, 1.5);
vy = random(0.5, 2);
} else if (type == 2) {
float tLine = random(1);
x = lerp(cx, cx - w * 0.3, tLine);
y = lerp(cy, cy + h * 0.1, tLine);
vx = random(-2, -0.5);
vy = random(-0.5, 1.5);
} else if (type == 3) {
float tLine = random(1);
x = lerp(cx, cx + w * 0.3, tLine);
y = lerp(cy, cy + h * 0.1, tLine);
vx = random(0.5, 2);
vy = random(-0.5, 1.5);
} else {
float side = random(1) < 0.5 ? -1 : 1;
float tLine = random(1);
x = lerp(cx, cx + side * w * 0.2, tLine);
y = lerp(cy + h * 0.3, cy + h * 0.6, tLine);
vx = side * random(0.5, 2);
vy = random(1, 3);
}
life = random(60, 120);
}
void update() {
if (t == null || !t.active) {
active = false;
return;
}
float dx = t.x - x;
float dy = t.y - y;
vx += dx * 0.0005;
vy += dy * 0.0005;
x += vx;
y += vy;
life--;
if (life <= 0) active = false;
}
void draw() {
fill(c, 85);
noStroke();
ellipse(x, y, 3, 3);
}
}

It is worth noting that Processing does not provide a dropdown menu for serial port selection; you need to specify the port manually in the code. When running the program for the first time, set the index to 0. The available ports will be printed in the console, and if there are multiple ports you will need to choose the correct one manually.
For example, if the console shows:
COM5
COM7

and your ESP32C3 is connected to COM7, then you should set the index to 1 in the code.
After completing all the steps above, you will see a real-time particle visualization in the Processing window. Each detected person is represented as a particle-based human figure with a unique color. When multiple people are detected, dynamic interactions—like flowing energy or electric connections—will appear between them. This is what I interpret as the natural connection and interaction between people.
Conclusion

Working on this project made me rethink something: in a world where technology is becoming increasingly powerful, can we still express things in a softer, more intuitive, and more human way? I hope that through this project, you can feel this idea with me—not only are we constantly adapting ourselves to the environment, but the environment is also quietly adapting itself to us. There is a kind of silent warmth and care in that. Humans, space, and relationships have always been connected. You are never truly alone.
Try it yourself! And if you create more interesting particle effects, feel free to share them with me—I’d love to see what you build.











