To build a really good line follower, you can't avoid video processing. Installing OpenCV or even TensorFlow on a Raspberry Pi seems to be a must... or you first get familiar with the topic by simply clicking together a video processing pipeline in the browser. You may not end up with a perfect LineFollower, but you will understand for the first time how all the video filters work and what you can achieve with them. So here we go.
All of the coding is done in the browser: you connect different video filters via virtual wires and send the calculated steering angle to the servo control of the PiCar.
Demo video pipeline is available here: https://www.brainbox-demo.de/circuit/?global=/video/WebCamTensorflow.brain
Prepare your Raspberry Pi

Software to use
For the whole magic to work, you need to install some software on the Raspberry Pi. I have used BrainBox for this. It is free and open source, and it gives you the possibility to control the camera of the Raspberry Pi and the servos of the SunFounder PiCar with a browser-based drag & drop interface.
For the video stream I use mjpg-streamer. It is a command line application that copies JPEG frames from one or more input plugins (e.g. your camera) to multiple output plugins (e.g. HTTP). It can be used to stream JPEG files over an IP-based network from a webcam to various types of viewers such as Chrome or Firefox.
Installation
Open a console on your Raspberry Pi and install the Node.js app:
git clone https://github.com/freegroup/brainbox.git
cd brainbox
npm install

Now the app is installed and ready to run. You can start brainbox for the first time to see if everything is fine.
Because brainbox uses the full functionality of the GPIO port (PWM on all ports), starting with "sudo" is necessary: the underlying library "pigpio" requires it.
sudo node ./app/backend/index.js

Now you can open Chrome or Firefox and go to the brainbox UI. On the console you can find the IP address and port to use. Normally it is http://localhost:7400
After installing mjpg-streamer, you can start it with the camera you want to use.
cd <path_of_mjpg-streamer-binary>
export LD_LIBRARY_PATH="$(pwd)"
./mjpg_streamer -i "input_uvc.so -d /dev/video0"

You can check whether the video stream is available by opening a browser with your Raspberry Pi's IP address:
http://<pi-address>:8080/?action=stream
You should now see a live stream from your Raspberry Pi camera.
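If you later want to process the stream in your own code instead of the browser, you can read the MJPEG stream and cut it into single JPEG frames. The sketch below only scans for the JPEG start (FFD8) and end (FFD9) markers; it is a simplified assumption about the stream format, not a full HTTP multipart client:

```python
def extract_jpeg_frames(buf: bytes):
    """Extract complete JPEG frames from an MJPEG byte stream by scanning
    for the JPEG start-of-image and end-of-image markers."""
    frames = []
    start = buf.find(b"\xff\xd8")          # start-of-image marker
    while start != -1:
        end = buf.find(b"\xff\xd9", start + 2)  # end-of-image marker
        if end == -1:
            break                          # frame still incomplete
        frames.append(buf[start:end + 2])
        start = buf.find(b"\xff\xd8", end + 2)
    return frames
```

In practice you would open http://&lt;pi-address&gt;:8080/?action=stream with urllib.request.urlopen and feed the received chunks into such a parser, then hand each frame to your own filters.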
That's it. Now you can open the sample document with the SunFounder car video pipeline in brainbox, or play with the video filters that are part of your brainbox installation.
Final Comment

I agree that this is not the perfect setup for a realtime video and motor control application. But I thought it was great to take the first steps in video processing, filters, and object recognition with ML without the huge burden of installing OpenCV and TensorFlow. The advantage is that you can see the effect of each filter immediately. The knowledge you have gained with brainbox can be transferred to your own coding later on, if you want to continue working on the topic.