What if a cute stuffed animal could talk to you when you leave home and when you return? You might just get a warm and fuzzy feeling!
I'd like to introduce a project I created that warms the heart and lifts your spirits: a little companion that does its best to see you off with a "Have a good day!" and welcome you back with a "Welcome home!", all while moving its small body.
I had three main motivations for this project.
- I wanted to build a compact device that combined a speaker, microphone, and servo functions, and the M5ATOM S3, ATOMIC ECHO BASE, and ATOMIC MOTION seemed like the perfect components to achieve this.
- After upgrading my smart home with SwitchBot devices, I felt that just having the lights turn on automatically when I got home wasn't enough. I wanted something more... comforting.
- I discovered that SwitchBot offers an API that makes it easy to retrieve sensor data. It suddenly struck me that I could use this to create something that would give my Quality of Life a huge boost.
These ideas led me to start building a charming robot that would greet me when I came home. Now, actually seeing it in action truly warms my heart.
What Kind of Robot Is It, and What Does It Do?
I created an adorable penguin robot. This penguin sits at my front door, greeting anyone coming or going with its voice and cute movements of its head and arms.
It says, "Have a good day!" when you leave and "Welcome home!" when you return.
- Itterassyai (いってらっしゃい, "Have a good day!")
- Okaeri (おかえり, "Welcome home!")
I've programmed several variations of these phrases, so it says different things at different times.
- Self-introduction
The system for this penguin robot consists of three main hardware components.
- SwitchBot Contact Sensor: This is used to detect when the door opens and closes, as well as to sense motion. A Raspberry Pi monitors the status of this sensor by polling the SwitchBot API.
- Raspberry Pi: When the sensor state changes to indicate someone is leaving (motion detected + door opens) or arriving (no motion detected + door opens), the Raspberry Pi sends the appropriate command to the robot over serial (the rule is sketched just after this list).
- M5Stack Modules: An M5ATOMS3 receives the command. It then plays audio through the speaker on the ATOMIC ECHO BASE and controls the servos for the neck and arms, which are connected to the ATOMIC MOTION.
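The leave/arrive rule itself is small. Here it is, condensed from the switchbot_controller.py script shown later in this article (openState and moveDetected are fields returned by the SwitchBot API for the contact sensor):

def decide_event(current, last):
    """current/last are the latest and previous contact-sensor readings."""
    if current["openState"] == "open" and last["openState"] == "close":
        # Motion indoors just before the door opened means someone is heading out.
        return "GO_OUT" if last["moveDetected"] else "COME_HOME"
    if current["openState"] == "close" and last["openState"] == "open":
        return "RESET"  # Door closed again: return the robot to its home pose.
    return None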
1. Compact Robot Control Hardware
The M5Stack ATOM S3, ATOMIC ECHO BASE, and ATOMIC MOTION are already quite small. However, to fit them inside the confined space of a stuffed animal, I needed them to be even smaller. If the components are even slightly too large, they can snag on the fabric and prevent smooth movement.
Typically, you would need a separate ATOM S3 for each of the ATOMIC ECHO BASE and the ATOMIC MOTION. This would require extra cables to connect the two microcontrollers, taking up more space. It also complicates the programming, requiring separate code for each microcontroller and making code management and communication tedious, not to mention more expensive.
The back of the ATOMIC ECHO BASE has pins exposed for the ATOM S3's 4-pin connector, providing 5V, GND, and two I2C lines. Unfortunately, simply connecting these four pins to the ATOMIC MOTION isn't enough to make it work. While it receives 5V, it doesn't get the 3.3V needed to power its control chip.
To solve this, I soldered a wire directly from the 3.3V line on the ATOMIC ECHO BASE to the 3.3V line on the ATOMIC MOTION.
For this project, I used a pin header to make the connection. For an even more compact design, one could replace the pin header with a custom-printed circuit board (PCB) or a thinner connection method.
2. A Shape Designed for Moving a Stuffed Animal
Because a stuffed animal is made of soft, deformable fabric, the internal servos would often just slide against the cloth without actually moving the head or arms effectively.
To overcome this, I designed a part for the head with a special shape that catches on a seam inside the plushie's fabric, allowing it to turn the head properly. For the arms, I created L-shaped parts that push the fabric outwards more effectively. These refinements make the penguin's movements much more reliable.
3. Head-Turning and Arm-Lifting Mechanism
I integrated three servo motors inside the penguin. To hold them securely, I designed custom parts in 3D CAD and manufactured them with a 3D printer. I minimized the space between components to make the entire mechanism as compact as possible.
This allowed me to fit the entire mechanism inside the small stuffed animal, enabling it to move its head and arms briskly.
This was my first time creating a "cute" project using a stuffed animal.
This project uses the state from a SwitchBot sensor to trigger several pre-set phrases and movements. There's a lot of potential for expansion, such as using an LLM to generate more dynamic and non-repetitive greetings.
I also learned about the challenges of animating a soft, non-rigid object like a plushie. It was often frustrating when the servos moved, but the fabric didn't follow as expected, leading to a lot of trial and error with positioning to get it right. With some sewing skills, I imagine I could take it a step further by stitching the mechanism directly to the fabric.
In the introduction video, you can see a slight delay between when the door opens and when the robot starts moving. This is because my current setup uses polling, checking the SwitchBot API for the sensor's status every 5 seconds. A webhook would be a much better solution for real-time responsiveness.
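As a rough idea of what that could look like, here is a minimal sketch of a webhook receiver, assuming Flask is available and the webhook URL has already been registered via the SwitchBot API. The payload field names (context, openState) are assumptions to verify against the SwitchBot webhook documentation, and you would still need to track the most recent motion reading to tell leaving apart from arriving, which this sketch glosses over.

# Minimal webhook receiver sketch (assumptions: Flask, a pre-registered webhook URL,
# and the payload field names shown below).
from flask import Flask, request
import serial

app = Flask(__name__)
ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
last_motion = False  # would need to be updated from the motion readings as well

@app.route("/switchbot", methods=["POST"])
def on_switchbot_event():
    event = request.get_json(silent=True) or {}
    context = event.get("context", {})  # assumed payload structure
    open_state = context.get("openState")
    if open_state == "open":
        # Same rule as the polling version: motion beforehand means "leaving".
        ser.write(b"GO_OUT\n" if last_motion else b"COME_HOME\n")
    elif open_state == "close":
        ser.write(b"RESET\n")
    return "", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)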
I completed this project in about two days. I made extensive use of Google's Gemini to support the software development. My impression is that while it's not yet perfect for embedded systems code—it rarely works on the first try—it can generate a solid framework in a very short amount of time if you define your requirements clearly. It was significantly faster than coding everything from scratch.
Even the simplest movements make a stuffed animal incredibly cute. Just watching it flap its arms can bring a smile to your face. It's a great source of comfort, and I'd love to apply these ideas to another project in the future.
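Main Arduino sketch for the M5ATOMS3. At startup, button A selects one of three modes: Greeting (waits for serial commands from the Raspberry Pi), Dance, or Calibrate.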
#include <M5Unified.h>
#include <ArduinoFFT.h>
#include <vector>
#include <numeric>
#include <algorithm>
#include <cmath>
#include "M5AtomicMotion.h"
#include <M5EchoBase.h>
#include "Calibrator.h"
#include "RobotActionController.h"
// ===============================================================
// Dance program: configuration
// ===============================================================
#define I2S_BCLK_PIN 8
#define I2S_LRCK_PIN 6
#define I2S_DATA_PIN 7
#define I2S_DOUT_PIN 5
#define I2C_SDA_PIN 38
#define I2C_SCL_PIN 39
#define SAMPLING_FREQUENCY 16000
#define FFT_SAMPLES 1024
#define BASS_FREQ_MIN 60
#define BASS_FREQ_MAX 120
#define SILENCE_THRESHOLD 150000.0
#define BEAT_SENSITIVITY 1.5
#define MUSIC_STOP_TIMEOUT 2500
#define BEAT_HISTORY_SIZE 16
#define MIN_BPM 70
#define MAX_BPM 180
#define BPM_SMOOTHING_FACTOR 0.5
#define BEAT_RESET_TIMEOUT 2000
#define DECAY_RATE 60
#define NECK_CENTER_ANGLE 90
#define NECK_SWING_ANGLE 0//25
#define ARM_DOWN_ANGLE 0
#define ARM_UP_ANGLE 60
// ===============================================================
// Object creation / global variables
// ===============================================================
Calibrator myCalibrator;
RobotActionController robotController;
M5AtomicMotion AtomicMotion;
M5EchoBase echobase(I2S_NUM_0);
ArduinoFFT<double> FFT = ArduinoFFT<double>();
LGFX_Sprite lcdSprite(&M5.Display);
double vReal[FFT_SAMPLES];
double vImag[FFT_SAMPLES];
int16_t i2s_buffer[FFT_SAMPLES];
int bass_freq_start_index;
int bass_freq_end_index;
int current_brightness = 0;
long last_beat_detected_time = 0;
double detected_bpm = 120.0;
long last_flash_time = 0;
bool is_neck_left = true;
bool are_arms_up = false;
bool is_music_playing = false;
bool is_stopping = false;
unsigned long last_sound_detected_time = 0;
double bass_history_array[BEAT_HISTORY_SIZE] = {0.0};
int bass_history_index = 0;
bool bass_history_filled = false;
long beat_intervals_array[BEAT_HISTORY_SIZE] = {0};
int beat_intervals_index = 0;
bool beat_intervals_filled = false;
// ===============================================================
// Functions for the dance program
// ===============================================================
void initI2S_Driver_Dance() {
i2s_config_t i2s_config = {
.mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_RX),
.sample_rate = SAMPLING_FREQUENCY,
.bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT,
.channel_format = I2S_CHANNEL_FMT_ONLY_LEFT,
.communication_format = I2S_COMM_FORMAT_STAND_I2S,
.intr_alloc_flags = ESP_INTR_FLAG_LEVEL1,
.dma_buf_count = 4,
.dma_buf_len = 512,
};
i2s_driver_install(I2S_NUM_0, &i2s_config, 0, NULL);
}
void resetBeatIntervals() {
M5.Log.println("Resetting beat intervals history.");
for(int i=0; i<BEAT_HISTORY_SIZE; ++i) beat_intervals_array[i] = 0;
beat_intervals_index = 0;
beat_intervals_filled = false;
}
void resetState() {
M5.Log.println("Resetting detection state.");
for(int i=0; i<BEAT_HISTORY_SIZE; ++i) bass_history_array[i] = 0.0;
bass_history_index = 0;
bass_history_filled = false;
resetBeatIntervals();
last_beat_detected_time = 0;
detected_bpm = 120.0;
last_flash_time = millis();
}
double calculateMedian(long arr[], int size) {
if (size == 0) return 0.0;
std::vector<long> temp_vec;
for(int i = 0; i < size; ++i) temp_vec.push_back(arr[i]);
std::sort(temp_vec.begin(), temp_vec.end());
if (size % 2 == 0) return (temp_vec[size / 2 - 1] + temp_vec[size / 2]) / 2.0;
else return temp_vec[size / 2];
}
void detectTempo(double current_bass_magnitude) {
bass_history_array[bass_history_index] = current_bass_magnitude;
bass_history_index = (bass_history_index + 1) % BEAT_HISTORY_SIZE;
if (bass_history_index == 0) bass_history_filled = true;
if (!bass_history_filled) return;
double sum = 0.0;
for(int i=0; i<BEAT_HISTORY_SIZE; i++) sum += bass_history_array[i];
double avg_bass = sum / BEAT_HISTORY_SIZE;
double threshold = avg_bass * BEAT_SENSITIVITY;
int recent_val_index = (bass_history_index - 1 + BEAT_HISTORY_SIZE) % BEAT_HISTORY_SIZE;
int peak_candidate_index = (bass_history_index - 2 + BEAT_HISTORY_SIZE) % BEAT_HISTORY_SIZE;
int prev_to_peak_index = (bass_history_index - 3 + BEAT_HISTORY_SIZE) % BEAT_HISTORY_SIZE;
bool is_peak = (bass_history_array[peak_candidate_index] > bass_history_array[prev_to_peak_index]) && (bass_history_array[peak_candidate_index] > bass_history_array[recent_val_index]);
long min_interval_for_debounce = 60000.0 / MAX_BPM;
bool new_beat_interval_added = false;
if (bass_history_array[peak_candidate_index] > threshold && is_peak && millis() - last_beat_detected_time > min_interval_for_debounce) {
long current_time = millis();
if (last_beat_detected_time > 0) {
long interval = current_time - last_beat_detected_time;
long min_interval = 60000.0 / MAX_BPM;
long max_interval = 60000.0 / MIN_BPM;
if (interval >= min_interval && interval <= max_interval) {
beat_intervals_array[beat_intervals_index] = interval;
beat_intervals_index = (beat_intervals_index + 1) % BEAT_HISTORY_SIZE;
if (beat_intervals_index == 0) beat_intervals_filled = true;
new_beat_interval_added = true;
}
}
last_beat_detected_time = current_time;
}
if (millis() - last_beat_detected_time > BEAT_RESET_TIMEOUT) {
if (beat_intervals_index != 0 || beat_intervals_filled) resetBeatIntervals();
}
if (new_beat_interval_added) {
int current_interval_size = beat_intervals_filled ? BEAT_HISTORY_SIZE : beat_intervals_index;
if (current_interval_size > BEAT_HISTORY_SIZE / 4) {
double median_interval = calculateMedian(beat_intervals_array, current_interval_size);
if (median_interval > 0) {
double current_bpm = 60000.0 / median_interval;
detected_bpm = (detected_bpm * (1.0 - BPM_SMOOTHING_FACTOR)) + (current_bpm * BPM_SMOOTHING_FACTOR);
}
}
}
}
void updateDisplayAndMotion() {
if (is_music_playing) {
long beat_period_ms = 60000.0 / detected_bpm;
if (beat_period_ms <= 0) beat_period_ms = 500;
if (millis() - last_flash_time > beat_period_ms) {
current_brightness = 255;
last_flash_time = millis();
M5.Log.printf("Beat! BPM: %.1f\n", detected_bpm);
if (is_neck_left) AtomicMotion.setServoAngle(0, NECK_CENTER_ANGLE + NECK_SWING_ANGLE + NECK_OFFSET);
else AtomicMotion.setServoAngle(0, NECK_CENTER_ANGLE - NECK_SWING_ANGLE + NECK_OFFSET);
is_neck_left = !is_neck_left;
if (are_arms_up) {
AtomicMotion.setServoAngle(1, ARM_DOWN_ANGLE + RIGHT_ARM_OFFSET);
AtomicMotion.setServoAngle(2, 180 - (ARM_DOWN_ANGLE + LEFT_ARM_OFFSET));
} else {
AtomicMotion.setServoAngle(1, ARM_UP_ANGLE + RIGHT_ARM_OFFSET);
AtomicMotion.setServoAngle(2, 180 - (ARM_UP_ANGLE + LEFT_ARM_OFFSET));
}
are_arms_up = !are_arms_up;
}
} else {
if (is_stopping) {
M5.Log.println("Returning to home position.");
AtomicMotion.setServoAngle(0, NECK_CENTER_ANGLE + NECK_OFFSET);
AtomicMotion.setServoAngle(1, ARM_DOWN_ANGLE + RIGHT_ARM_OFFSET);
AtomicMotion.setServoAngle(2, 180 - (ARM_DOWN_ANGLE + LEFT_ARM_OFFSET));
is_neck_left = true;
are_arms_up = false;
is_stopping = false;
}
}
if (current_brightness > 0) {
int decay = is_music_playing ? DECAY_RATE : 5;
current_brightness -= decay;
if (current_brightness < 0) current_brightness = 0;
}
lcdSprite.fillScreen(lcdSprite.color565(current_brightness, current_brightness, current_brightness));
lcdSprite.pushSprite(0, 0);
}
void runDanceProgram() {
M5.Log.println("Robot Dance Program Started!");
echobase.init(SAMPLING_FREQUENCY, I2C_SDA_PIN, I2C_SCL_PIN, I2S_DATA_PIN, I2S_LRCK_PIN, I2S_DOUT_PIN, I2S_BCLK_PIN, Wire);
echobase.setMicGain(ES8311_MIC_GAIN_24DB);
M5.Log.println("ATOMIC ECHO BASE Initialized.");
while (!AtomicMotion.begin(&Wire, M5_ATOMIC_MOTION_I2C_ADDR, I2C_SDA_PIN, I2C_SCL_PIN, 100000)) {
M5.Log.println("Atomic Motion begin failed");
M5.Display.clear();
M5.Display.drawString("Motion Init Fail", M5.Display.width() / 2, M5.Display.height() / 2);
delay(1000);
}
M5.Log.println("Atomic Motion Initialized.");
initI2S_Driver_Dance();
lcdSprite.createSprite(M5.Display.width(), M5.Display.height());
M5.Display.fillScreen(TFT_BLACK);
double freq_resolution = (double)SAMPLING_FREQUENCY / FFT_SAMPLES;
bass_freq_start_index = (int)ceil(BASS_FREQ_MIN / freq_resolution);
bass_freq_end_index = (int)floor(BASS_FREQ_MAX / freq_resolution);
AtomicMotion.setServoAngle(0, NECK_CENTER_ANGLE + NECK_OFFSET);
AtomicMotion.setServoAngle(1, ARM_DOWN_ANGLE + RIGHT_ARM_OFFSET);
AtomicMotion.setServoAngle(2, 180 - (ARM_DOWN_ANGLE + LEFT_ARM_OFFSET));
delay(500);
while (true) {
M5.update();
size_t bytes_read = 0;
esp_err_t result = i2s_read(I2S_NUM_0, i2s_buffer, sizeof(i2s_buffer), &bytes_read, portMAX_DELAY);
if (result == ESP_OK && bytes_read > 0) {
long long buffer_sum = 0;
for (int i = 0; i < FFT_SAMPLES; i++) {
buffer_sum += abs(i2s_buffer[i]);
}
if (buffer_sum == 0) {
delay(10);
continue;
}
long total = 0;
for (int i = 0; i < FFT_SAMPLES; i++) total += i2s_buffer[i];
double average = (double)total / FFT_SAMPLES;
for (int i = 0; i < FFT_SAMPLES; i++) {
vReal[i] = (double)i2s_buffer[i] - average;
vImag[i] = 0;
}
FFT.windowing(vReal, FFT_SAMPLES, FFT_WIN_TYP_HAMMING, FFT_FORWARD);
FFT.compute(vReal, vImag, FFT_SAMPLES, FFT_FORWARD);
FFT.complexToMagnitude(vReal, vImag, FFT_SAMPLES);
double bass_magnitude = 0;
int freq_range_count = bass_freq_end_index - bass_freq_start_index + 1;
if (freq_range_count > 0) {
for (int i = bass_freq_start_index; i <= bass_freq_end_index; i++) bass_magnitude += vReal[i];
bass_magnitude /= freq_range_count;
}
if (bass_magnitude > SILENCE_THRESHOLD) {
if (!is_music_playing) {
is_music_playing = true;
resetState();
M5.Log.println("Music detected. Let's dance!");
}
last_sound_detected_time = millis();
} else {
if (is_music_playing && (millis() - last_sound_detected_time > MUSIC_STOP_TIMEOUT)) {
is_music_playing = false;
is_stopping = true;
M5.Log.println("Music stopped (timeout). Initiating stop sequence...");
}
}
if (is_music_playing) {
detectTempo(bass_magnitude);
}
}
updateDisplayAndMotion();
delay(1);
}
}
// ===============================================================
// Greeting mode implementation
// ===============================================================
void runGreetingMode() {
M5.Display.fillScreen(TFT_NAVY);
M5.Display.setTextDatum(MC_DATUM);
M5.Display.setTextSize(2);
M5.Display.drawString("Greeting Mode", M5.Display.width()/2, M5.Display.height()/2 - 15);
M5.Display.drawString("Ready", M5.Display.width()/2, M5.Display.height()/2 + 15);
// Initialize the required modules
echobase.init(SAMPLING_FREQUENCY, I2C_SDA_PIN, I2C_SCL_PIN, I2S_DATA_PIN, I2S_LRCK_PIN, I2S_DOUT_PIN, I2S_BCLK_PIN, Wire);
M5.Log.println("ATOMIC ECHO BASE Initialized for Greeting Mode.");
echobase.setSpeakerVolume(80); // 0-100
while (!AtomicMotion.begin(&Wire, M5_ATOMIC_MOTION_I2C_ADDR, I2C_SDA_PIN, I2C_SCL_PIN, 100000)) {
M5.Log.println("Atomic Motion begin failed");
M5.Display.clear();
M5.Display.drawString("Motion Init Fail", M5.Display.width() / 2, M5.Display.height() / 2);
delay(1000);
}
M5.Log.println("Atomic Motion Initialized for Greeting Mode.");
// Initialize the robot controller
robotController.begin(&AtomicMotion, &echobase);
while(true) {
M5.update();
robotController.update();
delay(10);
}
}
void setup() {
auto cfg = M5.config();
M5.begin(cfg);
M5.Display.setTextSize(2);
M5.Display.setTextDatum(MC_DATUM);
int mode_selection = 0;
unsigned long start_time = millis();
const int selection_timeout = 4000;
while(millis() - start_time < selection_timeout) {
M5.update();
M5.Display.fillScreen(TFT_BLACK);
M5.Display.drawString("Select Mode:", M5.Display.width() / 2, 20);
switch(mode_selection) {
case 0: M5.Display.drawString("-> Greeting", M5.Display.width() / 2, 60); break;
case 1: M5.Display.drawString("-> Dance", M5.Display.width() / 2, 60); break;
case 2: M5.Display.drawString("-> Calibrate", M5.Display.width() / 2, 60); break;
}
int time_left = (selection_timeout - (millis() - start_time)) / 1000;
M5.Display.drawString("Starts in " + String(time_left + 1) + "s", M5.Display.width() / 2, 100);
if (M5.BtnA.wasPressed()) {
mode_selection = (mode_selection + 1) % 3;
start_time = millis();
}
delay(50);
}
switch(mode_selection) {
case 0:
runGreetingMode();
break;
case 1:
runDanceProgram();
break;
case 2:
myCalibrator.begin();
myCalibrator.run();
break;
}
}
void loop() {
// Nothing to do in the main loop
}
sounds.h
C header file. Declarations for the sound data (the WAV binary data itself is stored in sounds.cpp, which is not reproduced here).
#ifndef SOUNDS_H
#define SOUNDS_H
#include <cstddef>
// --- "Itterassyai" ("Have a good day!") sound data (declarations) ---
extern const unsigned char itterassyai_wav_0[];
extern const unsigned char itterassyai_wav_1[];
extern const unsigned char itterassyai_wav_2[];
extern const unsigned char itterassyai_wav_3[];
// --- "Okaeri" ("Welcome home!") sound data (declarations) ---
extern const unsigned char okaeri_wav_0[];
extern const unsigned char okaeri_wav_1[];
extern const unsigned char okaeri_wav_2[];
extern const unsigned char okaeri_wav_3[];
// Added: self-introduction sound data (declaration)
extern const unsigned char explanation_wav[];
extern const size_t explanation_wav_len;
// --- Arrays that make the sound data easier to manage (declarations) ---
// Number of "Itterassyai" variations
const int GO_OUT_SOUND_COUNT = 4;
// Array of pointers to each sound clip
extern const unsigned char* go_out_sounds[GO_OUT_SOUND_COUNT];
// Array of sizes for each sound clip
extern const size_t go_out_sound_lens[GO_OUT_SOUND_COUNT];
// --- Same setup for "Okaeri" ---
const int COME_HOME_SOUND_COUNT = 4;
extern const unsigned char* come_home_sounds[COME_HOME_SOUND_COUNT];
extern const size_t come_home_sound_lens[COME_HOME_SOUND_COUNT];
#endif // SOUNDS_H
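RobotActionController.cpp
C++ source file. Implements the serial-command handler and the GO_OUT, COME_HOME, RESET, and EXPLAIN action sequences.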
#include "RobotActionController.h"
#include "sounds.h" // 音声データヘッダーをインクルード
// コンストラクタで再生インデックスを初期化
RobotActionController::RobotActionController()
: _motion(nullptr),
_echo(nullptr),
_currentState(State::IDLE),
_goOutSoundIndex(0),
_comeHomeSoundIndex(0)
{}
void RobotActionController::begin(M5AtomicMotion* motion, M5EchoBase* echo) {
_motion = motion;
_echo = echo;
Serial.begin(115200);
M5.Log.println("RobotActionController initialized. Waiting for commands...");
setAllServosToHome();
delay(500);
}
void RobotActionController::update() {
if (_currentState == State::IDLE) {
processSerialCommand();
}
}
void RobotActionController::processSerialCommand() {
if (Serial.available() > 0) {
String command = Serial.readStringUntil('\n');
command.trim();
M5.Log.printf("Received command: %s\n", command.c_str());
_currentState = State::PERFORMING_ACTION;
if (command == "GO_OUT") {
performGoOutSequence();
} else if (command == "COME_HOME") {
performComeHomeSequence();
} else if (command == "RESET") {
resetToHomePosition();
} else if (command == "TEST_GO_OUT") {
performGoOutSequence();
delay(3000);
resetToHomePosition();
} else if (command == "TEST_COME_HOME") {
performComeHomeSequence();
delay(3000);
resetToHomePosition();
}
// Added: handling for the EXPLAIN command
else if (command == "EXPLAIN") {
performExplanationSequence();
}
else {
M5.Log.printf("Unknown command: %s\n", command.c_str());
}
_currentState = State::IDLE;
M5.Log.println("Action finished. Ready for next command.");
}
}
void RobotActionController::performGoOutSequence() {
M5.Log.println("Executing GO_OUT sequence...");
_echo->setMute(false);
_echo->play((uint8_t*)go_out_sounds[_goOutSoundIndex], go_out_sound_lens[_goOutSoundIndex]);
_goOutSoundIndex = (_goOutSoundIndex + 1) % GO_OUT_SOUND_COUNT;
_motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE + 10 + NECK_OFFSET);
waveHand(LEFT_ARM_SERVO_CH, LEFT_ARM_HOME_ANGLE, 60, 5, true);
M5.Log.println("Waiting for door to close...");
}
void RobotActionController::performComeHomeSequence() {
M5.Log.println("Executing COME_HOME sequence...");
_echo->setMute(false);
_echo->play((uint8_t*)come_home_sounds[_comeHomeSoundIndex], come_home_sound_lens[_comeHomeSoundIndex]);
_comeHomeSoundIndex = (_comeHomeSoundIndex + 1) % COME_HOME_SOUND_COUNT;
_motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE + NECK_OFFSET);
xTaskCreate([](void* pvParameters){
((RobotActionController*)pvParameters)->waveHand(RIGHT_ARM_SERVO_CH, RIGHT_ARM_HOME_ANGLE, 60, 5, false);
vTaskDelete(NULL);
}, "waveRight", 2048, this, 1, NULL);
xTaskCreate([](void* pvParameters){
((RobotActionController*)pvParameters)->waveHand(LEFT_ARM_SERVO_CH, LEFT_ARM_HOME_ANGLE, 60, 5, true);
vTaskDelete(NULL);
}, "waveLeft", 2048, this, 1, NULL);
delay(2500);
M5.Log.println("Waving finished. Waiting for reset command.");
}
// Added: implementation of the self-introduction sequence
void RobotActionController::performExplanationSequence() {
M5.Log.println("Executing EXPLANATION sequence...");
_echo->setMute(false);
_echo->play((uint8_t*)explanation_wav, explanation_wav_len);
    // Simple movements during the self-introduction
    delay(500); // Wait briefly for the audio to start
    // Look left
    _motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE + 20 + NECK_OFFSET);
    delay(1500);
    // Look right
    _motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE - 20 + NECK_OFFSET);
    delay(1500);
    // Back to the front
    _motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE + NECK_OFFSET);
    delay(1000);
    // Pose with both arms slightly spread
    _motion->setServoAngle(RIGHT_ARM_SERVO_CH, RIGHT_ARM_HOME_ANGLE + 45 + RIGHT_ARM_OFFSET);
    _motion->setServoAngle(LEFT_ARM_SERVO_CH, LEFT_ARM_HOME_ANGLE - 45 - LEFT_ARM_OFFSET);
    // Wait for the rest of the audio (if the clip is about 5 s, tune this together with the delays above)
    delay(2000);
    // Finally, return to the home position
setAllServosToHome();
}
void RobotActionController::resetToHomePosition() {
M5.Log.println("Returning to home position...");
setAllServosToHome();
}
void RobotActionController::waveHand(int channel, int homeAngle, int waveAngle, int times, bool isLeftHand) {
int upAngle, downAngle;
for (int i = 0; i < times; ++i) {
upAngle = isLeftHand ? homeAngle - waveAngle - LEFT_ARM_OFFSET : homeAngle + waveAngle + RIGHT_ARM_OFFSET;
downAngle = isLeftHand ? homeAngle - LEFT_ARM_OFFSET : homeAngle + RIGHT_ARM_OFFSET;
_motion->setServoAngle(channel, upAngle);
delay(200);
_motion->setServoAngle(channel, downAngle);
delay(200);
}
upAngle = isLeftHand ? homeAngle - waveAngle - LEFT_ARM_OFFSET : homeAngle + waveAngle + RIGHT_ARM_OFFSET;
_motion->setServoAngle(channel, upAngle);
}
void RobotActionController::setAllServosToHome() {
_motion->setServoAngle(NECK_SERVO_CH, NECK_HOME_ANGLE + NECK_OFFSET);
_motion->setServoAngle(RIGHT_ARM_SERVO_CH, RIGHT_ARM_HOME_ANGLE + RIGHT_ARM_OFFSET);
_motion->setServoAngle(LEFT_ARM_SERVO_CH, LEFT_ARM_HOME_ANGLE - LEFT_ARM_OFFSET);
}
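RobotActionController.h
C++ header file. Class declaration plus the servo channel, offset, and home-angle definitions.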
#ifndef ROBOT_ACTION_CONTROLLER_H
#define ROBOT_ACTION_CONTROLLER_H
#include <M5Unified.h>
#include "M5AtomicMotion.h"
#include <M5EchoBase.h>
#include "sounds.h"
// Servo channel definitions
#define NECK_SERVO_CH 0
#define RIGHT_ARM_SERVO_CH 1
#define LEFT_ARM_SERVO_CH 2
// Servo angle offsets
#define NECK_OFFSET (5)
#define RIGHT_ARM_OFFSET (0)
#define LEFT_ARM_OFFSET (2)
// Servo home angles
#define NECK_HOME_ANGLE (90)
#define RIGHT_ARM_HOME_ANGLE (0)
#define LEFT_ARM_HOME_ANGLE (180)
class RobotActionController {
public:
RobotActionController();
void begin(M5AtomicMotion* motion, M5EchoBase* echo);
void update();
private:
M5AtomicMotion* _motion;
M5EchoBase* _echo;
    // Member variables tracking which sound variation plays next
    int _goOutSoundIndex;
    int _comeHomeSoundIndex;
    // Action sequences
    void performGoOutSequence();
    void performComeHomeSequence();
    void resetToHomePosition();
    void performExplanationSequence(); // Added
    // Basic motions
    void waveHand(int channel, int homeAngle, int waveAngle, int times, bool isLeftHand);
    void setAllServosToHome();
    // State management
enum class State {
IDLE,
PERFORMING_ACTION
};
State _currentState;
void processSerialCommand();
};
#endif // ROBOT_ACTION_CONTROLLER_H
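Calibrator.h
C++ header file. Class declaration for the calibration tool, which has three modes: ambient noise measurement, beat sensitivity observation, and servo initialization.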
#ifndef CALIBRATOR_H
#define CALIBRATOR_H
#include <M5Unified.h>
// Class that bundles the calibration functions
class Calibrator {
public:
    // Constructor
    Calibrator();
    // Initialization
    void begin();
    // Run the main calibration loop
    void run();
private:
    // Data buffers and state variables are kept inside the class
double vReal[1024]; // FFT_SAMPLES
double vImag[1024]; // FFT_SAMPLES
int16_t i2s_buffer[1024]; // FFT_SAMPLES
int bass_freq_start_index;
int bass_freq_end_index;
// Note: SERVO_INIT mode added
enum Mode {
MEASURE_NOISE,
OBSERVE_BEAT,
SERVO_INIT
};
Mode current_mode;
double max_noise;
double total_noise;
int noise_sample_count;
unsigned long measurement_start_time;
bool measurement_done;
double bass_history_array[16]; // BEAT_HISTORY_SIZE
int bass_history_index;
bool bass_history_filled;
    // Added: flag indicating whether the servo initialization has completed
    bool servo_init_done;
    // Internal helper functions
    void initI2S_Driver();
    void switchMode();
    void runMeasureNoiseMode(double current_magnitude);
    void runObserveBeatMode(double current_magnitude);
    // Added: function for the servo initialization mode
void runServoInitMode();
};
#endif // CALIBRATOR_H
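Calibrator.cpp
C++ source file. Implementation of the three calibration modes.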
#include "Calibrator.h"
#include <M5EchoBase.h>
#include <ArduinoFFT.h>
#include "M5AtomicMotion.h"
extern M5EchoBase echobase;
extern ArduinoFFT<double> FFT;
extern LGFX_Sprite lcdSprite;
extern M5AtomicMotion AtomicMotion;
// ===============================================================
// Configuration
// ===============================================================
#define I2S_BCLK_PIN 8
#define I2S_LRCK_PIN 6
#define I2S_DATA_PIN 7
#define I2S_DOUT_PIN 5
#define I2C_SDA_PIN 38
#define I2C_SCL_PIN 39
#define SAMPLING_FREQUENCY 16000
#define FFT_SAMPLES 1024
#define BASS_FREQ_MIN 60
#define BASS_FREQ_MAX 120
#define BEAT_HISTORY_SIZE 16
const int NOISE_MEASUREMENT_DURATION_MS = 15000;
// ===============================================================
Calibrator::Calibrator() {
current_mode = MEASURE_NOISE;
max_noise = 0.0;
total_noise = 0.0;
noise_sample_count = 0;
measurement_done = false;
bass_history_index = 0;
bass_history_filled = false;
    servo_init_done = false; // Added: initialize the flag
}
// Initialization
void Calibrator::begin() {
M5.Log.println("Calibration Program Started.");
M5.Log.println("Mode 1: Measuring Ambient Noise...");
echobase.init(SAMPLING_FREQUENCY, I2C_SDA_PIN, I2C_SCL_PIN, I2S_DATA_PIN, I2S_LRCK_PIN, I2S_DOUT_PIN, I2S_BCLK_PIN, Wire);
echobase.setMicGain(ES8311_MIC_GAIN_24DB);
initI2S_Driver();
M5.Log.println("Microphone initialized.");
lcdSprite.createSprite(M5.Display.width(), M5.Display.height());
lcdSprite.setTextSize(2);
lcdSprite.setTextColor(TFT_WHITE, TFT_BLACK);
lcdSprite.setTextDatum(MC_DATUM);
double freq_resolution = (double)SAMPLING_FREQUENCY / FFT_SAMPLES;
bass_freq_start_index = (int)ceil(BASS_FREQ_MIN / freq_resolution);
bass_freq_end_index = (int)floor(BASS_FREQ_MAX / freq_resolution);
    measurement_start_time = millis(); // Start the measurement timer
}
// Initialize the I2S driver
void Calibrator::initI2S_Driver() {
i2s_config_t i2s_config = {
.mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_RX),
.sample_rate = SAMPLING_FREQUENCY,
.bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT,
.channel_format = I2S_CHANNEL_FMT_ONLY_LEFT,
.communication_format = I2S_COMM_FORMAT_STAND_I2S,
.intr_alloc_flags = ESP_INTR_FLAG_LEVEL1,
.dma_buf_count = 4,
.dma_buf_len = 512,
};
i2s_driver_install(I2S_NUM_0, &i2s_config, 0, NULL);
}
// Mode switching
void Calibrator::switchMode() {
    // Note: cycle through the three modes in order
if (current_mode == MEASURE_NOISE) {
current_mode = OBSERVE_BEAT;
M5.Log.println("\nSwitched to Mode 2: Observe Beat Sensitivity");
M5.Log.println("Play music and observe the 'Ratio' in the log.");
} else if (current_mode == OBSERVE_BEAT) {
current_mode = SERVO_INIT;
        servo_init_done = false; // Reset the servo initialization flag
M5.Log.println("\nSwitched to Mode 3: Servo Initialization");
} else { // current_mode == SERVO_INIT
current_mode = MEASURE_NOISE;
M5.Log.println("\nSwitched to Mode 1: Measuring Ambient Noise...");
max_noise = 0.0;
total_noise = 0.0;
noise_sample_count = 0;
measurement_done = false;
        measurement_start_time = millis(); // Reset the timer when switching modes
}
}
// Main loop
void Calibrator::run() {
while (true) {
M5.update();
if (M5.BtnA.wasPressed()) {
switchMode();
}
        // Note: skip the microphone processing in servo initialization mode
if (current_mode == SERVO_INIT) {
runServoInitMode();
} else {
size_t bytes_read = 0;
esp_err_t result = i2s_read(I2S_NUM_0, i2s_buffer, sizeof(i2s_buffer), &bytes_read, portMAX_DELAY);
if (result != ESP_OK || bytes_read == 0) {
continue;
}
long long buffer_sum = 0;
for (int i = 0; i < FFT_SAMPLES; i++) {
buffer_sum += abs(i2s_buffer[i]);
}
if (buffer_sum == 0) {
M5.Log.println("Skipping empty buffer...");
delay(10);
continue;
}
long total = 0;
for (int i = 0; i < FFT_SAMPLES; i++) total += i2s_buffer[i];
double average = (double)total / FFT_SAMPLES;
for (int i = 0; i < FFT_SAMPLES; i++) {
vReal[i] = (double)i2s_buffer[i] - average;
vImag[i] = 0;
}
FFT.windowing(vReal, FFT_SAMPLES, FFT_WIN_TYP_HAMMING, FFT_FORWARD);
FFT.compute(vReal, vImag, FFT_SAMPLES, FFT_FORWARD);
FFT.complexToMagnitude(vReal, vImag, FFT_SAMPLES);
double bass_magnitude = 0;
int freq_range_count = bass_freq_end_index - bass_freq_start_index + 1;
if (freq_range_count > 0) {
for (int i = bass_freq_start_index; i <= bass_freq_end_index; i++) {
bass_magnitude += vReal[i];
}
bass_magnitude /= freq_range_count;
}
if (current_mode == MEASURE_NOISE) {
runMeasureNoiseMode(bass_magnitude);
} else { // OBSERVE_BEAT
runObserveBeatMode(bass_magnitude);
}
}
lcdSprite.pushSprite(0, 0);
delay(1);
}
}
void Calibrator::runServoInitMode() {
    // Run only once, the first time this mode is entered
    if (!servo_init_done) {
        M5.Log.println("Setting servos to initial position...");
        // Initialize AtomicMotion (harmless even if it is already initialized)
        while (!AtomicMotion.begin(&Wire, M5_ATOMIC_MOTION_I2C_ADDR, I2C_SDA_PIN, I2C_SCL_PIN, 100000)) {
            M5.Log.println("Atomic Motion begin failed");
            lcdSprite.fillScreen(TFT_RED);
            lcdSprite.drawString("Motion FAIL", M5.Display.width() / 2, M5.Display.height() / 2);
            lcdSprite.pushSprite(0, 0);
            delay(1000);
        }
        // Set the servos to their initial positions
        AtomicMotion.setServoAngle(0, 90);   // Neck: 90°
        AtomicMotion.setServoAngle(1, 0);    // Right arm (provisional): 0°
        AtomicMotion.setServoAngle(2, 180);  // Left arm (provisional): 180°
        M5.Log.println("Servos set. Neck: 90, Right Arm: 0, Left Arm: 180");
        servo_init_done = true;
    }
    // Keep showing the status information on the screen
lcdSprite.fillScreen(TFT_DARKGREEN);
lcdSprite.drawString("Mode 3", M5.Display.width() / 2, 20);
lcdSprite.drawString("Servo Init", M5.Display.width() / 2, 50);
lcdSprite.drawString("Neck: 90", M5.Display.width() / 2, 85);
lcdSprite.drawString("L:180 R:0", M5.Display.width() / 2, 115);
}
// Mode 1: ambient noise measurement
void Calibrator::runMeasureNoiseMode(double current_magnitude) {
if (!measurement_done) {
if (millis() - measurement_start_time < NOISE_MEASUREMENT_DURATION_MS) {
lcdSprite.fillScreen(TFT_BLACK);
lcdSprite.drawString("Mode 1", M5.Display.width() / 2, 20);
lcdSprite.drawString("Measuring", M5.Display.width() / 2, 50);
lcdSprite.drawString("Noise...", M5.Display.width() / 2, 80);
M5.Log.printf("Measuring... Current Magnitude: %.0f\n", current_magnitude);
if (current_magnitude > max_noise) max_noise = current_magnitude;
total_noise += current_magnitude;
noise_sample_count++;
} else {
measurement_done = true;
double avg_noise = (noise_sample_count > 0) ? total_noise / noise_sample_count : 0.0;
M5.Log.println("--- Ambient Noise Measurement Result ---");
M5.Log.printf("Max Noise Magnitude: %.0f\n", max_noise);
M5.Log.printf("Avg Noise Magnitude: %.0f\n", avg_noise);
M5.Log.println("----------------------------------------");
M5.Log.println("Recommendation: Set SILENCE_THRESHOLD slightly higher than Max Noise.");
lcdSprite.fillScreen(TFT_BLUE);
lcdSprite.drawString("Done!", M5.Display.width() / 2, 20);
lcdSprite.drawString("Max:", M5.Display.width() / 2, 50);
lcdSprite.drawFloat(max_noise, 0, M5.Display.width() / 2, 80);
}
}
}
// Mode 2: beat sensitivity observation
void Calibrator::runObserveBeatMode(double current_magnitude) {
bass_history_array[bass_history_index] = current_magnitude;
bass_history_index = (bass_history_index + 1) % BEAT_HISTORY_SIZE;
if (bass_history_index == 0) bass_history_filled = true;
double avg_bass = 0.0;
if (bass_history_filled) {
double sum = 0.0;
for(int i=0; i<BEAT_HISTORY_SIZE; i++) sum += bass_history_array[i];
avg_bass = sum / BEAT_HISTORY_SIZE;
}
lcdSprite.fillScreen(TFT_BLACK);
lcdSprite.drawString("Mode 2", M5.Display.width()/2, 20);
lcdSprite.drawString("Current:", M5.Display.width()/2, 50);
lcdSprite.drawFloat(current_magnitude, 0, M5.Display.width()/2, 70);
lcdSprite.drawString("Avg:", M5.Display.width()/2, 100);
lcdSprite.drawFloat(avg_bass, 0, M5.Display.width()/2, 120);
if(avg_bass > 0) {
M5.Log.printf("Current: %.0f, Avg: %.0f, Ratio: %.2f\n", current_magnitude, avg_bass, current_magnitude / avg_bass);
}
}
switchbot_controller.py
Python script. Monitors the SwitchBot sensor status and sends behavior commands to the M5AtomS3 over serial.
# Usage:
#   python switchbot_controller.py
# For testing, you can type TEST_GO_OUT, TEST_COME_HOME, or RESET to send that command to the M5AtomS3.
import requests
import json
import time
import hashlib
import hmac
import base64
import uuid
import serial
import threading
import sys
# --- Settings ---
# Token and secret obtained from the SwitchBot app
SWITCHBOT_TOKEN = "YOUR_SWITCHBOT_TOKEN"
SWITCHBOT_SECRET = "YOUR_SWITCHBOT_SECRET"
# Device ID of the contact sensor identified above
CONTACT_SENSOR_ID = "YOUR_CONTACT_SENSOR_ID"
# Serial port of the M5AtomS3
SERIAL_PORT = "/dev/ttyACM0"
BAUD_RATE = 115200
# Parameters for the detection logic (seconds)
POLLING_INTERVAL = 5  # Interval for polling the SwitchBot API
# --- Global variables ---
ser = None
last_status = {
    "openState": "close",
    "moveDetected": False
}
# Variables for demo mode
demo_thread = None
demo_running = False
def build_switchbot_headers():
"""SwitchBot API用のヘッダーを生成する"""
t = int(round(time.time() * 1000))
nonce = str(uuid.uuid4())
string_to_sign = f"{SWITCHBOT_TOKEN}{t}{nonce}"
sign = base64.b64encode(
hmac.new(
SWITCHBOT_SECRET.encode('utf-8'),
msg=string_to_sign.encode('utf-8'),
digestmod=hashlib.sha256
).digest()
).decode('utf-8')
return {
"Authorization": SWITCHBOT_TOKEN,
"Content-Type": "application/json; charset=utf-8",
"t": str(t),
"sign": sign,
"nonce": nonce
}
def get_device_status(device_id):
"""指定したデバイスの状態を取得する"""
api_url = f"https://api.switch-bot.com/v1.1/devices/{device_id}/status"
headers = build_switchbot_headers()
try:
response = requests.get(api_url, headers=headers, timeout=10)
response.raise_for_status()
return response.json()
except requests.exceptions.RequestException as e:
print(f"Error getting device status: {e}")
return None
def send_command_to_m5(command):
"""M5AtomS3にシリアルコマンドを送信する"""
if ser and ser.is_open:
try:
print(f"Sending command to M5: {command}")
ser.write(f"{command}\n".encode('utf-8'))
except serial.SerialException as e:
print(f"Error writing to serial port: {e}")
else:
print("Serial port is not open.")
def monitor_contact_sensor():
"""開閉センサーの状態を監視し、ロジックを実行する"""
global last_status
print("Starting contact sensor monitoring...")
while True:
status_data = get_device_status(CONTACT_SENSOR_ID)
if status_data and status_data.get("statusCode") == 100:
body = status_data["body"]
current_open_state = body.get("openState", "close")
current_move_detected = body.get("moveDetected", False)
if current_open_state == "open" and last_status["openState"] == "close":
print(f"Door opened. Checking motion status...")
if last_status["moveDetected"]:
print("--- Event: GO_OUT (Motion was detected before door opened) ---")
send_command_to_m5("GO_OUT")
else:
print("--- Event: COME_HOME (No motion detected before door opened) ---")
send_command_to_m5("COME_HOME")
elif current_open_state == "close" and last_status["openState"] == "open":
print("--- Event: DOOR CLOSED, resetting robot ---")
send_command_to_m5("RESET")
last_status["openState"] = current_open_state
last_status["moveDetected"] = current_move_detected
else:
print("Failed to get valid status from API.")
time.sleep(POLLING_INTERVAL)
# Function that runs the demo mode including the self-introduction
def run_explain_demo():
    """Repeat a motion pattern that includes the self-introduction."""
global demo_running
print("\n--- Starting Explain Demo Mode ---")
print("Demo will loop until another command is entered.")
while demo_running:
print("Demo: Running EXPLAIN")
send_command_to_m5("EXPLAIN")
for _ in range(15):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("Demo: Running TEST_GO_OUT")
send_command_to_m5("TEST_GO_OUT")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("Demo: Running TEST_COME_HOME")
send_command_to_m5("TEST_COME_HOME")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("Demo: Running TEST_GO_OUT")
send_command_to_m5("TEST_GO_OUT")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("Demo: Running TEST_COME_HOME")
send_command_to_m5("TEST_COME_HOME")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("--- Explain Demo Mode Stopped ---")
# Added: function that runs the looping demo mode
def run_loop_demo():
    """Alternate between the send-off and welcome-home actions."""
global demo_running
print("\n--- Starting Loop Demo Mode ---")
print("Demo will loop until another command is entered.")
while demo_running:
print("Loop Demo: Running TEST_GO_OUT")
send_command_to_m5("TEST_GO_OUT")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("Loop Demo: Running TEST_COME_HOME")
send_command_to_m5("TEST_COME_HOME")
for _ in range(10):
if not demo_running: break
time.sleep(1)
if not demo_running: break
print("--- Loop Demo Mode Stopped ---")
def test_command_interface():
"""テスト用のコマンドをキーボードから受け付ける"""
global demo_thread, demo_running
print("\n--- Test Command Interface ---")
print("Enter command (TEST_GO_OUT, TEST_COME_HOME, RESET, EXPLAIN, DEMO_EXPLAIN, DEMO_LOOP) and press Enter.")
print("Type 'exit' to quit.")
valid_commands = ["TEST_GO_OUT", "TEST_COME_HOME", "RESET", "GO_OUT", "COME_HOME", "EXPLAIN", "DEMO_EXPLAIN", "DEMO_LOOP"]
while True:
try:
cmd = input("> ")
            # If a demo is already running, stop it first
if demo_running:
print("Stopping current demo...")
demo_running = False
if demo_thread and demo_thread.is_alive():
                    demo_thread.join()  # Wait for the thread to finish completely
demo_thread = None
if cmd.lower() == 'exit':
break
if cmd == "DEMO_EXPLAIN":
if not (demo_thread and demo_thread.is_alive()):
demo_running = True
demo_thread = threading.Thread(target=run_explain_demo, daemon=True)
demo_thread.start()
elif cmd == "DEMO_LOOP":
if not (demo_thread and demo_thread.is_alive()):
demo_running = True
demo_thread = threading.Thread(target=run_loop_demo, daemon=True)
demo_thread.start()
elif cmd in valid_commands:
send_command_to_m5(cmd)
else:
print("Unknown command.")
except (EOFError, KeyboardInterrupt):
if demo_running:
demo_running = False
break
if demo_running:
demo_running = False
print("Exiting test interface.")
sys.exit(0)
def main():
"""メイン処理"""
global ser
try:
ser = serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1)
print(f"Opened serial port: {SERIAL_PORT}")
except serial.SerialException as e:
print(f"Error opening serial port {SERIAL_PORT}: {e}")
return
print("Attempting to connect to SwitchBot API to get initial status...")
initial_status = get_device_status(CONTACT_SENSOR_ID)
if initial_status and initial_status.get("statusCode") == 100:
print("Successfully connected to SwitchBot API.")
last_status["openState"] = initial_status["body"].get("openState", "close")
last_status["moveDetected"] = initial_status["body"].get("moveDetected", False)
print(f"Initial status: openState={last_status['openState']}, moveDetected={last_status['moveDetected']}")
print("Starting automatic sensor monitoring in the background.")
monitor_thread = threading.Thread(target=monitor_contact_sensor, daemon=True)
monitor_thread.start()
else:
print("\n" + "="*50)
print("WARNING: Could not connect to SwitchBot API.")
print("Automatic sensor monitoring is DISABLED.")
print("You can still use the manual test commands.")
print("="*50 + "\n")
test_command_interface()
if ser and ser.is_open:
ser.close()
print("Closed serial port.")
if __name__ == "__main__":
main()