Welcome to the world of LEGO MINDSTORMS EV3 and voice with Alexa! These instructions are part of a series created to show you how to connect your EV3 Brick to Alexa, and build custom voice-based interactions that incorporate EV3 motors and sensors. With this knowledge, you can build and submit your own creation to the LEGO MINDSTORMS Voice Challenge – Powered by Alexa. Don’t be shy bringing in other LEGO elements — let your imagination run wild!
Here is what is included in this series:
- Setup: Get your EV3 development environment set up.
- Mission 1: Get your EV3 Brick connected to a compatible Echo device, and reacting to the wake word.
- Mission 2: Build EV3RSTORM's legs, and react to music played from your Echo device.
(you are here)
- Mission 3: Add arms and a cannon to EV3RSTORM, and react to voice commands using an Alexa skill you create.
- Mission 4: Give EV3RSTORM eyes (IR Sensor), and make your Alexa skill react anytime an intruder is detected.
Helpful resources
Throughout these missions, several resources will be referenced along the way. These references will come in handy as you learn more about the different ways you can create your own LEGO MINDSTORMS EV3 creation that works with Alexa:
- Alexa Gadgets Toolkit Overview
- Alexa Skills Kit Documentation
- EV3 Python API Documentation
- EV3Dev Python Robot Examples
At this point, you should have completed the Setup Guide, and Mission 1, where you connected your EV3 Brick to your Echo device in order to get it to react when you say the Alexa wake word. You're on a roll!
For this mission, you will explore another capability that can be added to your gadget: reacting to music. You will:
- Build EV3RSTORM’s legs
- Add reacting to music as a new capability to your gadget
- Make EV3RSTORM dance when music is playing from the connected Echo device
In the end, EV3RSTORM’s legs should react like this:
The code for this mission can be found in the /alexa-gadgets-mindstorms/mission-02 folder in your VS Code workspace. Let’s get building!
In order to get EV3RSTORM to dance, you will need to have followed the EV3RSTORM build instructions up to page 51. At that point, EV3RSTORM’s legs are built and he is ready to dance. The build should look like this:
Using the same VS Code workspace from Mission 1, open the /alexa-gadgets-mindstorms/mission-02 folder. You should see a familiar pair of files: an INI file and a Python file. This will be a common pattern for all the missions you work through.
As with Mission 1, the first thing you need to do to get your EV3 Brick to react to music is to add your Amazon ID and Alexa Gadget Secret, and declare a new capability that gives your EV3 code access to the tempo of the music being played from the connected Echo device. Within the /alexa-gadgets-mindstorms/mission-02 folder, open the mission-02.ini file, which should look like this:
```ini
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET

[GadgetCapabilities]
Alexa.Gadget.MusicData = 1.0 - tempo
```
The above should look familiar. Replace YOUR_GADGET_AMAZON_ID and YOUR_GADGET_SECRET with the Amazon ID and Alexa Gadget Secret tied to the Alexa Gadget you registered in the Amazon Developer Console.
With those changes, your EV3 Brick will be able to connect to your Echo device as an Alexa Gadget, and you’ll notice that a different capability has been specified. With the tempo capability of MusicData specified, you will now be able to access the tempo data you need to make your EV3RSTORM build dance.
There are more capabilities you can explore in the documentation.
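For example, the wake word reaction from Mission 1 comes from the Alexa Gadgets Toolkit's StateListener interface, and a gadget can declare several capabilities side by side in the same INI file. A sketch of such a combined declaration (check the Toolkit documentation for the current list of interfaces and their versions):

```ini
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET

[GadgetCapabilities]
# React to music tempo (this mission)
Alexa.Gadget.MusicData = 1.0 - tempo
# React to the wake word, as in Mission 1
Alexa.Gadget.StateListener = 1.0 - wakeword
```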
Getting the music tempo data

Within the mission-02 folder, open the mission-02.py file. To get EV3RSTORM to dance, you first need to obtain the tempo of the music playing from the connected Echo device. To do this, you can override the on_alexa_gadget_musicdata_tempo callback method. Adding this method to the MindstormsGadget class gives us access to the tempo directive, from which we can get the average beats per minute of the song currently playing. Here is how we obtain this data:
```python
def on_alexa_gadget_musicdata_tempo(self, directive):
    """
    Provides the music tempo of the song currently playing.
    :param directive: the music data directive containing the beats per minute value
    """
    tempo_data = directive.payload.tempoData
    for tempo in tempo_data:
        print("tempo value: {}".format(tempo.value))
        if tempo.value > 0:
            # Music is playing
            pass
        elif tempo.value == 0:
            # Music is not playing
            pass
```
Dancing to the music

Before EV3RSTORM can dance, you will need to initialize the motors that control the legs. Based on the building instructions, you should have large motors connected to the following ports:
- Left motor connected to Port B
- Right motor connected to Port C
To control the motors, you can use the ev3dev2.motor API within ev3dev. From that module, you need to import the port names and the LargeMotor class. Then, you can initialize the left and right motors using the LargeMotor class and the corresponding port names. The following code within the MindstormsGadget class performs the motor initialization:
```python
import time
import random
import threading

from agt import AlexaGadget
from ev3dev2.led import Leds
from ev3dev2.sound import Sound
from ev3dev2.motor import OUTPUT_B, OUTPUT_C, LargeMotor


class MindstormsGadget(AlexaGadget):
    """
    A Mindstorms gadget that performs movement in sync with music tempo.
    """

    def __init__(self):
        """
        Performs Alexa Gadget initialization routines and ev3dev resource allocation.
        """
        super().__init__()

        # ev3dev initialization
        self.leds = Leds()
        self.sound = Sound()
        self.left_motor = LargeMotor(OUTPUT_B)
        self.right_motor = LargeMotor(OUTPUT_C)
```
With the music tempo data and the motor controls set up, you can orchestrate the robot’s movement to perform a dance. There are many ways to do this; feel free to implement your favorite dance moves. In this example, there are two parts to the dance move: the initial pose and a dance loop. When a new song starts, the robot will strike a pose:
```python
def on_alexa_gadget_musicdata_tempo(self, directive):
    """
    Provides the music tempo of the song currently playing on the Echo device.
    :param directive: the music data directive containing the beats per minute value
    """
    tempo_data = directive.payload.tempoData
    for tempo in tempo_data:
        print("tempo value: {}".format(tempo.value))

        if tempo.value > 0:
            # Dance pose
            self.right_motor.run_timed(speed_sp=750, time_sp=2500)
            self.left_motor.run_timed(speed_sp=-750, time_sp=2500)
            self.leds.set_color("LEFT", "GREEN")
            self.leds.set_color("RIGHT", "GREEN")
            time.sleep(3)

            # Start the dance loop
            self.trigger_bpm = "on"
            threading.Thread(target=self._dance_loop, args=(tempo.value,)).start()

        elif tempo.value == 0:
            # Stop the dance loop
            self.trigger_bpm = "off"
            self.leds.set_color("LEFT", "BLACK")
            self.leds.set_color("RIGHT", "BLACK")
```
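Note how the callback hands the loop off to a background thread, so the gadget can keep receiving directives while dancing; the trigger_bpm flag is how the tempo.value == 0 branch later tells that thread to exit. The same pattern in isolation, using a hypothetical Worker class that is not part of the mission code:

```python
import threading
import time

class Worker:
    """Hypothetical stand-in for the gadget: a flag-controlled background loop."""
    def __init__(self):
        self.trigger = "off"   # mirrors self.trigger_bpm
        self.beats = 0

    def start(self):
        self.trigger = "on"
        self.thread = threading.Thread(target=self._loop)
        self.thread.start()

    def stop(self):
        self.trigger = "off"
        self.thread.join()     # wait for the loop to notice the flag and exit

    def _loop(self):
        # Keeps running until the flag flips, like _dance_loop checking trigger_bpm
        while self.trigger == "on":
            self.beats += 1
            time.sleep(0.01)

w = Worker()
w.start()
time.sleep(0.05)
w.stop()
print(w.beats > 0)  # True
```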
After that, it will go into a dance loop until the song ends. The movement speed in the loop is based on the music tempo:
```python
def _dance_loop(self, bpm):
    """
    Performs motor movement in sync with the beats per minute value from the tempo data.
    :param bpm: beats per minute from the Alexa Gadgets Toolkit
    """
    color_list = ["GREEN", "RED", "AMBER", "YELLOW"]
    led_color = random.choice(color_list)
    motor_speed = 400
    milli_per_beat = min(1000, (round(60000 / bpm)) * 0.65)
    print("Adjusted milli_per_beat: {}".format(milli_per_beat))

    while self.trigger_bpm == "on":
        # Alternate LED color and motor direction
        led_color = "BLACK" if led_color != "BLACK" else random.choice(color_list)
        motor_speed = -motor_speed
        self.leds.set_color("LEFT", led_color)
        self.leds.set_color("RIGHT", led_color)
        self.right_motor.run_timed(speed_sp=motor_speed, time_sp=150)
        self.left_motor.run_timed(speed_sp=-motor_speed, time_sp=150)
        time.sleep(milli_per_beat / 1000)

    print("Exiting BPM process.")
```
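The milli_per_beat expression converts BPM into a delay between moves: 60000 ms per minute divided by beats per minute gives the duration of one beat, scaled by 0.65 (presumably to keep the moves snappy) and capped at one second for slow songs. Pulling the expression into a helper function (our own framing, not part of the mission code) makes the behavior easy to check:

```python
def milli_per_beat(bpm):
    # 60000 ms / bpm = duration of one beat; scale by 0.65, cap at 1000 ms
    return min(1000, round(60000 / bpm) * 0.65)

print(milli_per_beat(120))  # 325.0  (500 ms per beat * 0.65)
print(milli_per_beat(60))   # 650.0
print(milli_per_beat(30))   # 1000   (2000 * 0.65 = 1300.0, capped)
```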
Running the EV3 code

After reviewing the code, you can run it and see how your EV3 Brick reacts. As noted above, the code will make EV3RSTORM strike a pose and then dance to the tempo of the music played from the Echo device your EV3 Brick is connected to.
To see this in action, follow similar steps from Mission 1:
1. Make sure VS Code is connected to your EV3 Brick. You should see a green dot next to your EV3 Brick’s name in the EV3DEV DEVICE BROWSER.
2. Copy the missions folder from your computer workspace to your EV3 Brick: click the Send workspace to device button that appears next to the EV3DEV DEVICE BROWSER text when you hover over it. When you click this button, the status of the files being copied is shown in the bottom-right of VS Code, and the /alexa-gadgets-mindstorms/mission-02 folder on your EV3 Brick is updated with the code you modified.
3. Once the files have been copied to your EV3 Brick, you can run the Python code by navigating to the Python file in the EV3DEV DEVICE BROWSER, right-clicking it, and selecting Run:
4. Once the program starts, you should see a prompt in the debug console when your EV3 Brick connects to your Echo device.
You will also see a message on the screen of your EV3 Brick.
Now comes the fun part! Try the following to see how your EV3 Brick reacts:
“Alexa, play music”
When music starts playing, EV3RSTORM’s legs should strike a pose, and then start dancing along to the music.
“Alexa, next song”
Try changing the song to see how the changing tempo data affects how EV3RSTORM dances.
“Alexa, stop”
When the music stops, so should EV3RSTORM.
You should also see messages printed to the debug console that look like this:
You did it! EV3RSTORM has learned some new dance moves. You can always stop the program by clicking the Stop button at the top of the screen in Debug mode, or pressing the back button on your EV3 Brick.
Note: If you run into connectivity issues, you can try unpairing your EV3 Brick from your Echo device by following the Unpairing your EV3 Brick instructions in Mission 1. You may also want to try forgetting your EV3 Brick in the Bluetooth settings of your Echo device before pairing again.

Other things to explore
For this mission, try making EV3RSTORM dance differently based on different tempo data, change the dance moves, or use LEDs and Sound too. Reacting to tempo data is one of the things that you can do with your creation by leveraging the Alexa Gadgets Toolkit.
You can also do things like react to notifications, timers, the wake word, and more. Learn more by reviewing the documentation, or by referencing the INI and Python files within the Alexa Gadgets Kitchen Sink example for Raspberry Pi.
You can also explore other capabilities of ev3dev by referring to the documentation.
What to do next

In Mission 3, you’ll create an Alexa skill that allows you to control EV3RSTORM’s motors with your voice!