"Auxi" is an abbreviation for the Latin word auxilium, which means "assistance." Auxi, as her name implies, is a helper. It assists people in carrying items with its small computer brain and computer vision. For autonomous movement, TensorFlow-based machine learning on a Raspberry Pi is used. The supervised learning technique known as behavioural cloning is used in this project.
Auxi was recently showcased at the AI Fest expo. It was created to be used as a shopping companion, so the development process and machine learning training are centred on that goal.
The objective was to move while avoiding obstacles (humans) and carrying items to a predetermined location, freeing people from the burden of carrying everything in their hands. The build is still in its early stages; so far, ML-based driving has been implemented.
Gathering Things
I'll go through the construction process at a high level, clearing up some of the confusing parts and adding workarounds for problems I encountered. If you have any questions, please visit the official DonkeyCar website.
Hardware Part
We chose the DonkeyCar platform because it includes all of the necessary components, code, and tutorials to begin developing autonomous vehicles. To buy one, visit https://www.donkeycar.com/.
The DonkeyCar kit that we will need for our build includes the following items:
- RC car
- Servo driver board
- Top mount parts (can be 3D-printed or laser-cut; the 3D model is available on Thingiverse)
- Screws and connectors
- Jumper wires
- Power bank
Go here for more information on purchasing the appropriate items.
DonkeyCar Simulator
Their software, which is based on the Unity game engine, can simulate a virtual autonomous car driving through a virtual environment, with which we can interact just as we would in real life. Go to this page to learn how, or skip this section and continue the build in the real world.
If the battery connector is not of the XT60 type, replace it with a new XT60 (trust me, this simple swap can save you a lot of trouble). If the battery supplied with your RC car is not a LiPo battery, replace it with one (link in the product description area), because LiPo batteries last much longer and are easy to charge with an appropriate charger. Don't forget to change the battery jumper on top of the ESC while swapping batteries.
Using the provided screws, attach the 3D-printed/laser-cut parts and top mounts. Make sure the wires from the servo and ESC pass through the top mount.
Software Part
The software part has two subparts:
- Setting up software on Raspberry Pi.
- Software installation on the host PC (in my case, I'll be using a Linux PC because the entire workflow is much easier to set up on that platform).
A computer is required to install the operating system on the Raspberry Pi.
- Download Raspbian lite from the link provided in the product description
- To burn the OS image onto your SD card, you will need Etcher by balena.io.
- Download Etcher from here https://www.balena.io/etcher/
- Connect the SD card using an adapter and then run the Etcher program.
On Etcher:
- Select the downloaded Raspbian lite file
- Select the connected SD card
- Then click on Flash
Allow the program to run until you receive a success message. Depending on the write speed of your SD card, this process could take 5 minutes or more.
- Once it's done, remove the SD card and then reinsert it.
A boot drive should appear.
Let's make some changes to the burned image; these changes will later allow us to use the RPi in a headless mode, i.e. without the need for a display and input devices directly connected to the RPi.
Mod 1:
- Open up your preferred editor
- Paste the content below, replacing <your network name> with your WiFi network name/SSID and <your password> with your WiFi password.
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
network={
ssid="<your network name>"
psk="<your password>"
}
- Save this file to the boot partition of your flashed SD card with the file name
wpa_supplicant.conf
Mod 2:
Enabling SSH before the first boot allows you to access the RPi's terminal without requiring a connected display. SSH is disabled by default on Raspbian; this simple step will fix that.
The easiest way is to open Notepad++ and save an empty file named ssh to the boot partition, without typing anything into it. Make sure the file has no extension.
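If you are doing this from the Linux host PC instead of Windows, the same thing can be done from a terminal. The mount point below is an assumption; adjust it to wherever the boot partition shows up on your machine.
touch /media/$USER/boot/ssh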
Eject the SD card and insert it into the RPi. Start the Pi (make sure you have at least a 5V 2A power supply).
If your Pi does not automatically connect to your WiFi, look for troubleshooting tips using keywords like wpa supplicant.conf headless raspberry pi
Remote Connection:
The next step is to connect to the Pi remotely, for which we will need your Pi's IP address. https://www.raspberrypi.org/documentation/remote-access/ip-address.md
Alternatively, you can connect your Raspberry Pi to a monitor via HDMI and use a keyboard and mouse as input devices.
The default Pi login and password are pi and raspberry, respectively.
Then launch a terminal and enter the command.
ifconfig
This should give you the Pi's IP address.
Now to remotely access the terminal use
- PuTTY or any other alternative SSH client.
- Type in the hostname/IP address; when the window asks for the username and password, type in the default credentials. Agree to connect via SSH.
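Alternatively, if you prefer the terminal over PuTTY, the stock OpenSSH client on a Linux host works just as well (use your Pi's actual IP address):
ssh pi@<your_pi_ip_address>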
You will be presented with the Raspberry Pi's terminal window on your remote PC/smartphone.
As a word of advice, it's always good to update your OS once you boot it up.
sudo apt-get update
sudo apt-get upgrade
Once this is done type in
sudo raspi-config
A new configuration menu will pop up:
- Change the username and password (optional)
- Enable I2C and the camera from the Interfacing Options
- Select Expand Filesystem from the Advanced Options
Hit Enter and reboot.
Installing Files:
- Dependencies
sudo apt-get install build-essential python3 python3-dev python3-pip python3-virtualenv python3-numpy python3-picamera python3-pandas python3-rpi.gpio i2c-tools avahi-utils joystick libopenjp2-7-dev libtiff5-dev gfortran libatlas-base-dev libopenblas-dev libhdf5-serial-dev git
sudo apt-get install libilmbase-dev libopenexr-dev libgstreamer1.0-dev libjasper-dev libwebp-dev libatlas-base-dev
- Virtual env
python3 -m virtualenv -p python3 env --system-site-packages
echo "source env/bin/activate" >> ~/.bashrc
source ~/.bashrc
- Install Donkey Car
mkdir projects
cd projects
git clone https://github.com/autorope/donkeycar
cd donkeycar
git checkout master
pip install -e .[pi]
pip install tensorflow==1.13.1
Validate TensorFlow install
python -c "import tensorflow"
On Host PC (Linux Machine):
- Using the terminal, install Miniconda Python
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash ./Miniconda3-latest-Linux-x86_64.sh
- Create a folder with name projects and change to that folder
mkdir projects
cd projects
- Clone DonkeyCar Github repo
git clone https://github.com/autorope/donkeycar
cd donkeycar
git checkout master
- To create a python anaconda environment
conda update -n base -c defaults conda
conda env remove -n donkey
conda env create -f install/envs/ubuntu.yml
conda activate donkey
pip install -e .[pc]
- Optional: Install TensorFlow GPU (for a host PC with an Nvidia-based GPU)
conda install tensorflow-gpu==1.13.1
- Create a local folder
donkey createcar --path ~/mycar
Note: After closing the terminal, when you open it again you will need to type
conda activate donkey
to re-enable the mappings to the donkey-specific Python libraries.
Adding Brains to the Hardware
It's time to combine the Pi and the car. Since the Pi is not capable of controlling the motors on its own, a motor driver board is used as an intermediary to establish communication between the two. This motor driver uses the I2C protocol to communicate.
Connections From RPi to Motor Driver
- Connect VCC of the motor driver to any 3.3V pin of RPI.
- Connect GND of the motor driver to any GND pin of RPI.
- Connect SDA and SCL of the motor driver to the SDA and SCL pins on the RPi.
Refer to the pinout diagram of the Raspberry Pi.
Connection From RC Car to the Motor Driver
Two cables coming from the RC car are connected to channels 0 and 1 of the motor driver, ensuring the polarity matches:
- The throttle cable (cable from the ESC) runs to channel 0.
- The steering cable (cable from the front servo) runs to channel 1.
Please make sure the cables are aligned so that the red wire from the car touches the red connector on the driver and the black wire from the car goes to the black connector. Similarly, the data wire (in my case white) goes to the yellow/orange part of the connector.
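Once wired, it's worth checking that the Pi actually sees the motor driver on the I2C bus before moving on. With I2C enabled and i2c-tools installed (done earlier), you can scan the bus; a PCA9685-style driver board typically shows up at address 0x40, although your board may use a different address.
sudo i2cdetect -y 1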
Connection From Raspi to the Camera Module
Connecting the camera module is easy; follow this video from the Raspberry Pi Foundation.
Calibrating RC Car
Now we need to configure the speed and turning angle of the RC car.
SSH to Pi:
nano ~/mycar/myconfig.py
The car's settings are stored inside this file. To edit the file, run the command given above.
For calibration steps visit http://docs.donkeycar.com/guide/calibrate/
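Calibration ultimately produces a handful of PWM values that end up in myconfig.py. The snippet below only illustrates what that section looks like; the variable names come from the DonkeyCar config template, but the numbers are placeholders, so substitute the values you find during your own calibration.
STEERING_CHANNEL = 1            # channel 1 on the motor driver (steering servo)
THROTTLE_CHANNEL = 0            # channel 0 on the motor driver (ESC)
STEERING_LEFT_PWM = 460         # example value only
STEERING_RIGHT_PWM = 290        # example value only
THROTTLE_FORWARD_PWM = 500      # example value only
THROTTLE_STOPPED_PWM = 370      # example value only
THROTTLE_REVERSE_PWM = 220      # example value only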
Driving
Once calibration is done, SSH into your Pi using your smartphone or PC and type in these commands to initiate driving mode.
cd ~/mycar
python manage.py drive
Now open up a browser on another device connected to the same network as the Pi and navigate to
<your car's IP address>:8887
You will be presented with a web-controller like this.
Features
- Recording - Press record data to start recording images, steering angles and throttle values.
- Throttle mode - Option to set the throttle as constant. This is used in races if you have a pilot that will steer but doesn't control throttle.
- Pilot mode - Choose this if the pilot should control the angle and/or throttle.
- Max throttle - Select the maximum throttle.
Some Modes of Controlling the Car:
Using Smartphone
On your phone, you can now press start to set your phone's current tilt to be zero throttle and steering. Now tilting your phone forward will increase throttle and tilting it side to side will turn the steering.
Using Keyboard
- space: stop car and stop recording
- r: toggle recording
- i: increase throttle
- k: decrease throttle
- j: turn left
- l: turn right
Using PS/XBOX Game Controller
There are two ways to use a game controller.
1. Directly connect to the Raspberry Pi using a Bluetooth controller. Link
2. Connect to the host PC and then control the car through the web controller.
In my case, I found using a game controller with the host PC much easier.
Don't forget to change the Control Mode according to the mode you chose.
Try driving the car around and getting used to your preferred control mode.
Collecting Data
To collect good data, practice driving around the track a couple of times.
- Restart the python manage.py process
python manage.py drive
to create a new session when you are ready to record data.
- Press Start Recording if using the web controller. The joystick will auto-record with any non-zero throttle.
- If you crash or run off the track, press Stop Car immediately to stop recording. If you are using a joystick, tap the Triangle button to erase the last 5 seconds of records.
- After you've collected 10-20 laps of good data (5-20k images), you can stop your car with Ctrl-c in the SSH session for your car.
- The data you've collected is in the data folder, in the most recent tub folder.
Repeat this procedure until you get satisfactory results.
Training ML Model
It's not viable to train ML models on the RPi itself, since the RPi lacks the power needed to train such a model. We need to transfer the data from the RPi to a capable PC to perform the training.
The best way is to use a host Linux machine connected to the same network and rsync the data to the host.
rsync -r pi@<your_pi_ip_address>:~/mycar/data/ ~/mycar/data/
Due to some unfortunate events, I couldn't do that on my Linux machine, so I came up with a way to transfer the files to a Windows machine.
The solution was to use an application called WinSCP, which SSHes into the Pi and transfers data. The UI of the software was easy to navigate; you can upload and download content to and from the RPi and the Windows PC. Once again there was a problem: the download speed from the RPi was so slow that it took almost an hour to transfer the data from each lap. After a closer inspection, I finally knew what was wrong. The data collection drive produced more than 25K individual pictures and corresponding turn angles, and sending each file one by one proved to be the problem. After a few more searches, I came up with a solution: zip the files inside the RPi, transfer that zipped file to the host computer, and then extract the compressed file to get the dataset. It worked flawlessly.
The Alternate Steps I Took:
To Download the Files
- Compress in Raspi:
tar -cvzf filename.tar.gz ~/mycar/tub
- Transfer the zip file using WinSCP
- Uncompress in Ubuntu:
tar -xvzf ~/documents/filename.tar.gz
Move this dataset (tub folder) to the ~/mycar/data/ folder.
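For example, assuming the archive was extracted so that the tub folder ends up at ~/documents/tub (the exact path depends on how tar stored it), something like this moves it into place:
mv ~/documents/tub ~/mycar/data/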
Training Data
- Restart the virtual env
virtualenv env -p python3
source env/bin/activate
- Move to donkeycar folder
cd donkeycar
- Start Training
python ~/mycar/manage.py train --tub ~/mycar/tub --model ~/mycar/models/mypilot.h5
Training should finish within a few epochs.
Once it is done, fetch the trained model from the ~/mycar/models/ folder.
Rsync it to the ~/mycar/models/ folder inside the RPi. If rsync is not working, use WinSCP.
Once the trained model is transferred, start the car to drive in autonomous mode.
python manage.py drive --model ~/mycar/models/mypilot.h5
Go to the web controller at <your car's IP address>:8887.
Change Mode & Pilot to any one of these to experience autonomous driving.
a. User: As you guessed, this is where you are in control of both the steering and the throttle.
b. Local Angle: Not too obvious, but this is where the trained model (mypilot from above) controls the steering. "Local" refers to the trained model being locally hosted on the Raspberry Pi.
c. Local Pilot: This is where the trained model (mypilot) assumes control of both the steering and the throttle. As of now, it's not very reliable.
Repeat the training process until you have a good dataset and model.