- Independent Operation
- Obstacle & Collision Avoidance
- AI Recognition
- Raman SERS Covid19 Detection System
- Long-Range 802.11ah on-board Wi-Fi and remote base station
- High-speed Ethernet connection from FMUK66 to Navq
VADS is designed around the NXP Hovergames Navq Arm Linux system, which is mounted on the Hovergames drone and paired over a two-wire Ethernet interface with the realtime NXP FMUK66 MCU that controls the drone electronics and motors. The VADS system is built around the Navq embedded NXP Linux computer with a 64-bit ARM i.MX8M Mini CPU, a high-def Google Coral camera, an Ethernet communications port, 802.11ah long-range Wi-Fi, 802.11ac 5 GHz Wi-Fi, an Intel D455 RealSense 3D imaging system, and a Hamamatsu C13560 Raman SERS spectroscopy molecular detection system.
The molecular detection system is able to detect Covid19 viral particles in sampled air while hovering above target groups of people. Realtime Covid19 detection allows first responders to isolate and treat Covid19 infections before they spread to nearby persons. The detection system is designed to fly above outdoor gatherings of people and rapidly detect the Covid19 virus.
The AI recognition system allows the drone to detect and focus on groups of human beings while saving power by keeping the Raman detection system offline except when needed. When a group is detected, the AI recognition powers the Raman SERS detection hardware, which samples the overhead air and checks for the presence of the Covid19 virus.
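The power-gating logic is simple enough to sketch in Python. This is an illustrative sketch only, not the actual VADS code; the `RamanSERS` class and `gate_raman` function are hypothetical names standing in for the real USB power control:

```python
# Illustrative sketch of the power-gating described above: the Raman SERS
# hardware stays off until the AI vision model reports a group of people.
# Class and function names here are assumptions, not the VADS codebase.

class RamanSERS:
    """Stand-in for the USB-powered Hamamatsu C13560 interface."""
    def __init__(self):
        self.powered = False

    def power_on(self):
        self.powered = True   # real code would enable the USB device here

    def power_off(self):
        self.powered = False

def gate_raman(raman, people_detected):
    """Power the SERS hardware only while people are in frame."""
    if people_detected and not raman.powered:
        raman.power_on()
    elif not people_detected and raman.powered:
        raman.power_off()
    return raman.powered
```

In operation, `gate_raman` would be called once per recognition-loop iteration with the latest detection result, keeping the spectroscope dark between encounters.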
I like to work with an OS that can be updated live from the Internet and that parallels my x86_64 development system. So I created an Ubuntu 64-bit ARM userspace OS and mated it with an NXP Linux 5.4.47 kernel. I had to custom compile several components in the userspace graphics pipeline to get accelerated graphics working with the i.MX8 Mini Vivante GPU. These are included in the OS image along with a Wayland/Weston GUI and Hovergames2 background image running over the Navq’s HDMI interface. If you connect a mouse and keyboard you can use the Navq just like a desktop Linux system.
Please see my forum post for the SD card image I used for the VADS project work
https://www.hackster.io/contests/hovergames2/discussion/posts/7905#challengeNav
This SD card image includes the latest versions of ROS2, Mavlink and RTSP, compiled on the Navq itself. These services are configured to run by default using systemd and to communicate with the FMUK66 PX4 system over the two-wire Ethernet interface.
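As an illustration, a systemd unit for one of these services might look like the sketch below. The unit name, binary path, and endpoint addresses are assumptions for illustration, not the actual files shipped in the image:

```ini
# /etc/systemd/system/mavlink-bridge.service (illustrative sketch)
[Unit]
Description=MAVLink bridge between Navq and FMUK66 over Ethernet
After=network-online.target

[Service]
# Assumed router binary and UDP endpoints; substitute your own addresses
ExecStart=/usr/local/bin/mavlink-routerd -e 127.0.0.1:14551 0.0.0.0:14550
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enabling such a unit with `systemctl enable` is what makes the service start by default at boot.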
The kernel source code can be checked out from my github fork in case you want to modify it
git clone https://github.com/droidifi/linux-imx.git -b imx_5.4.47_2.2.0_navq_hovergames2
Obstacle & Collision Avoidance
The Intel D455 3D imaging system contains multiple cameras that create a 3D view of objects in front of the drone at a range of up to 50 meters. The D455 is connected to the Navq Linux system over a USB bus, and the Navq transmits the raw data over its Ethernet port to the FMUK66 PX4 realtime drone control MCU. The FMUK66 samples the data several times a second and adjusts the drone’s course to avoid any obstacles encountered in its original flight path.
I used the ROS2 github project for the Intel RealSense Devices
https://github.com/intel/ros2_intel_realsense
And the ROS2 to PX4 obstacle detection and avoidance project code
https://github.com/PX4/PX4-Avoidance
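The kind of check this pipeline performs on each depth frame can be sketched in a few lines, assuming the frame arrives as a NumPy array of millimetre depths. The function names and thresholds below are illustrative assumptions, not code from PX4-Avoidance:

```python
import numpy as np

# Illustrative sketch of an obstacle check on a D455-style depth frame.
# The real system streams frames through ROS2 into PX4-Avoidance; here a
# depth image is just a uint16 array of millimetres, and the minimum-valid
# and safety thresholds are assumptions for illustration.

def nearest_obstacle_m(depth_mm, min_valid_mm=300):
    """Return the nearest valid depth in metres, or None if the view is clear."""
    valid = depth_mm[depth_mm >= min_valid_mm]   # 0 means no return; very near pixels are noise
    if valid.size == 0:
        return None
    return float(valid.min()) / 1000.0

def needs_evasion(depth_mm, safety_m=2.0):
    """True if anything in the frame is inside the safety bubble."""
    d = nearest_obstacle_m(depth_mm)
    return d is not None and d < safety_m
```

The real avoidance code reasons about where in the frame the obstacle sits so it can steer around it, but the per-frame distance test is the core of the loop.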
AI Recognition
The eIQ development framework is currently only available in Yocto Poky. Since I am using an Ubuntu 20.04 OS, I had to build the NXP eIQ machine learning framework from scratch. I followed the NXP README files and set up the NXP Yocto build using the repo tool, then read the Yocto Bitbake build recipes to use as a guide to compile the eIQ framework under my Navq’s Ubuntu 20.04 OS. Since we are using an NXP 5.4.47 kernel, the kernel drivers are compatible with the eIQ userspace libraries and tools.
I cloned the git repo following the README at https://source.codeaurora.org/external/imx/imx-manifest/tree/README?h=imx-linux-zeus and ran “repo init” and “repo sync”. This installed the Yocto Poky build system. The NXP ML libraries and tools are in the Yocto sources/meta-imx/meta-ml/recipes-libraries sub-directories. I read through the Bitbake recipes and used their contents to clone the eIQ source repositories and build them under Ubuntu 20.04.
NXP ML libraries and tools built for this project:
arm-compute-library
opencv-imx
armnn
onnxruntime
tensorflow-lite
Additional tools not built for this project:
pytorch
nn-imx (uses OpenVX GPU acceleration, not available on the i.MX8M Mini)
While the Navq is a powerful and capable embedded system, compiling the eIQ framework on the device itself would take quite a long time. So I used qemu emulation of an ARM v8-a CPU on my x86_64 Ubuntu development machine to do most of the build.
Steps to build eIQ under qemu on an x86_64 PC:
1) Install the qemu utilities on the Ubuntu x86_64 system via “sudo apt install qemu-user-static”
2) Carefully extract the Navq SD card (I used SMD tweezers), put it into an SD card USB dongle, and insert it into the development PC
3) Mount to a convenient location via “sudo mount /dev/sdb2 /mnt”
4) Copy qemu-aarch64-static to /usr/bin on the mounted SD card via “sudo cp /usr/bin/qemu-aarch64-static /mnt/usr/bin/”
5) Mount the necessary pseudo directories and /tmp under your SD card mount point:
sudo mount -o bind /proc /mnt/proc
sudo mount -o bind /sys /mnt/sys
sudo mount -o bind /dev /mnt/dev
sudo mount -o bind /tmp /mnt/tmp
6) Chroot into the /mnt SD card image via “sudo chroot /mnt”
7) Chdir to the /usr/src directory and run my script “eiq_compile.sh” which is attached to this project
We are now running a qemu-emulated 64-bit ARM CPU on our x86_64 Linux development machine and can compile at x86_64 CPU speeds. After several hours of compiling, your SD card will be up to date with the NXP eIQ machine learning framework.
Exit chroot, unmount the directories mounted in 5) and unmount the SD card. Put the SD card back into the Navq and run the Python tensorflow recognition script that also controls the Covid19 Raman SERS detection system.
sudo umount /mnt/tmp
sudo umount /mnt/dev
sudo umount /mnt/sys
sudo umount /mnt/proc
sudo umount /mnt
[NOTE: I did have trouble compiling Google’s bazel build tool under qemu, though everything else compiled fine. If you prefer to compile bazel on the Navq I recommend adding a zram compressed swapfile to help with memory issues while compiling.
I used this guide to set up and tune my Navq zram swap parameters:
https://haydenjames.io/linux-performance-almost-always-add-swap-part2-zram/]
The i.MX8M Mini CPU on the Navq uses ARM NEON instructions to accelerate the NXP eIQ machine learning framework. On other CPU models the GPU is also tapped to accelerate eIQ runtime calculations. And a new chip, dubbed the i.MX8M Plus, has an on-board dedicated Neural Processing Unit or NPU (similar to the Google TPU) that accelerates the eIQ framework directly in hardware.
https://www.nxp.com/docs/en/user-guide/IMX-MACHINE-LEARNING-UG.pdf
The NXP eIQ AI system allows the drone to hover over groups of people encountered below its flight path while it brings the Raman spectroscopy system online and samples the air for Covid19’s molecular signatures. If Covid19 is detected, the drone notifies remote first responders over the 802.11ah long-range Wi-Fi link to the Raspberry Pi 4 802.11ah base station.
Raman SERS Covid19 Detection
Raman radiation is named for the famed physicist Sir C.V. Raman, who first postulated its existence and then demonstrated its effects in 1928. Raman was awarded the Nobel Prize in Physics in 1930 for this work. Most people are familiar with the effect of excited electrons within atoms moving from a higher to a lower energy state and emitting photons of a specific frequency. This is the basis of laser operation, for example. In a laser, energy is pumped into an atomic system, the electrons absorb this energy and move to higher sub-shells within the atoms, then emit photons with a specific energy (i.e. frequency of light) while moving back to a lower energy state. The resulting photons leave the system as a coherent beam.
In a nutshell, Raman spectroscopy is based on the idea that when energized electrons within a molecule, as distinct from individual atoms, transit the bonds between atoms and change energy states, specific frequencies of light that are unique to that specific molecule are emitted.
Raman spectroscopy is used as a molecular identification technology where laser energy is pumped into an unknown molecular system and the resulting spectra are captured via a sensor and used to deduce what molecules exist within an unknown sample. This very useful identification technology has been used for decades to identify everything from IEDs to illicit drugs to everyday compounds and chemicals. Traditional Raman spectroscopy needs a large amount of laser energy and requires a concentrated sample. Identifying a miniscule trace of Covid19 dispersed in the air high overhead a group of human beings would be impossible in a traditional Raman spectroscopy system.
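Raman spectra are conventionally reported as a shift in wavenumbers relative to the excitation laser, which is what makes the signature independent of the laser chosen. The standard conversion is easy to sketch:

```python
# Raman spectra are plotted as the shift (in cm^-1) between the excitation
# laser and the scattered light, so the same molecule gives the same
# signature regardless of laser wavelength. Wavelengths are in nanometres.

def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman shift in wavenumbers: 1e7 * (1/laser_nm - 1/scattered_nm)."""
    return 1e7 * (1.0 / laser_nm - 1.0 / scattered_nm)

# Example: a 785 nm laser with a Stokes line observed at 880 nm
# corresponds to a shift of roughly 1375 cm^-1.
```

The example wavelengths are illustrative; the matching library compares whole spectra of such shifts, not single lines.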
So enter the mysterious and little-understood technology of SERS. SERS stands for “Surface Enhanced Raman Spectroscopy” and is a technique whereby the Raman signature of a specific molecule can be magnified by a factor of 1,000,000 or more. The molecule we are seeking must be adsorbed on a conductive substrate surface to create the Raman enhancement that is the basis of SERS.
In the paper, “SERS-Based Biosensors for Virus Determination with Oligonucleotides as Recognition Elements” (https://www.researchgate.net/publication/341301274_SERS-Based_Biosensors_for_Virus_Determination_with_Oligonucleotides_as_Recognition_Elements) the authors describe how SERS is used to detect very small quantities of viral particles. Additional SERS research is ongoing for detecting Covid19 and several other researchers and laboratories have described their successes. This is a technique that can detect trace amounts of DNA, RNA and proteins in liquid or gas samples.
But wouldn’t a Raman spectroscope capable of SERS be too bulky and power hungry for an airborne drone, you ask? Enter the Hamamatsu C13560. The C13560 is a Raman spectroscope that weighs just 90 grams, is powered and controlled over a USB port, and is designed specifically for SERS. I created a mount for a C13560 and attached it to my Hovergames2 drone in the battery bay, where the C13560 can connect to the Navq Linux computer over USB. The Hamamatsu C13560 is one of the smallest functional SERS devices available and the only one I am aware of that is powered and controlled over a USB bus. The maximum peak power draw during Raman scanning is 90 mW, and the VADS AI recognition system powers on the C13560 only when necessary.
https://www.hamamatsu.com/us/en/product/type/C13560/index.html
The wavelength of the Raman laser is critical, both for the length of time needed to gather sufficient Raman spectral energy to identify an unknown sample and for minimizing fluorescence of biological samples. Biological samples produce significant fluorescence at shorter wavelengths, and this will quickly overwhelm the detector, making it impossible to identify the unknown sample’s molecular composition. Raman spectroscopy is governed by the Rayleigh scattering relation, in which the amount of scattering is inversely proportional to the fourth power of the laser wavelength. This means that the longer the laser wavelength (i.e. the lower the frequency), the longer the process of acquiring enough scattered energy to energize the detector and deduce the unknown sample’s molecular composition.
If it were not for the fluorescence of biological molecules, we could use a laser with a wavelength as short as 500 nanometers. This would result in much shorter scan times but would unfortunately not be practical due to the high fluorescence these shorter wavelengths produce in our target Covid19 proteins. We could instead use a longer-wavelength near-infrared (NIR) laser at 1064 nanometers or above. This would significantly reduce biological sample fluorescence but would also lengthen our scan time considerably and would require Indium-Gallium-Arsenide (InGaAs) or Charge Coupled Device (CCD) detectors, since CMOS is only sensitive out to roughly 850 nanometers. The sweet spot is a balance between sample fluorescence and scan duration, which lands us at 785 nanometers. This wavelength produces acceptable fluorescence and scan times and fits the sensitivity range of relatively inexpensive CMOS detectors.
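The fourth-power wavelength dependence quantifies the scan-time trade-off just described. A quick sketch of the relative scattering efficiency at the three candidate wavelengths, normalized to the 785 nm sweet spot:

```python
# Relative Raman scattering efficiency follows the inverse fourth power of
# wavelength, so efficiency relative to a reference goes as (ref/lambda)^4.
# The 785 nm reference matches the sweet spot discussed in the text.

def relative_scattering(wavelength_nm, reference_nm=785.0):
    """Scattering efficiency relative to the reference wavelength."""
    return (reference_nm / wavelength_nm) ** 4

# 500 nm scatters ~6x more strongly than 785 nm (shorter scans, but
# overwhelming biological fluorescence); 1064 nm scatters ~3.4x less
# (weaker fluorescence, but much longer scans).
```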
[Note that the steps to create a Raman SERS substrate and to create the comparison signature library to detect Covid19 proteins would need to be carried out in a level 4 biolab. I will describe the steps to do this, but obviously did not complete them for this project].
To create a Covid19 Raman SERS-enhanced substrate, insert the Raman spectroscopy device and a glass substrate into a sterile sealed biochamber that contains an Au or Ag sputtering system and a sample of a Covid19 mRNA vaccine such as that from Moderna.
Steps:
1) Pour mRNA vaccine sample into a small glass tray
2) Immerse glass SERS substrate in the mRNA vaccine
3) Remove and allow the glass substrate to dry
4) Coat the SERS glass substrate with an Au or Ag particle layer via sputtering
5) Insert SERS glass substrate into Raman system
6) Save Raman SERS spectra of unadulterated air as a control
7) Introduce Covid19 virus to the chamber
8) Save Raman SERS spectra with Covid19 present
9) Expose to UVC for 30 minutes to destroy any live Covid19
10) Extract SERS substrate and place into VADS
While using glass as the SERS substrate is easiest for product prototyping and development, production use would substitute a metal nanoparticle or Ag-on-paper substrate:
https://pubs.rsc.org/en/content/articlelanding/2012/cc/c2cc31604h
The Covid19 sample spectrum and unadulterated air spectrum are used as comparators to detect Covid19 in Raman spectra taken during VADS operation. I used the rampy Python library to match unknown sample spectra against the known sample spectra.
https://github.com/charlesll/rampy
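The comparator idea can be sketched without rampy: score an unknown spectrum against the two calibration references and pick the better match. This NumPy-only sketch uses cosine similarity and illustrative data; the actual project uses rampy for baseline removal and matching:

```python
import numpy as np

# Sketch of the comparator: score an unknown spectrum against the clean-air
# and Covid19-positive reference spectra saved during calibration. Cosine
# similarity is used here for illustration; it ignores overall intensity,
# which varies with scan time, and compares only spectral shape.

def cosine_similarity(a, b):
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify_spectrum(unknown, clean_ref, covid_ref):
    """Return 'covid19' if the unknown matches the positive reference better."""
    if cosine_similarity(unknown, covid_ref) > cosine_similarity(unknown, clean_ref):
        return "covid19"
    return "clean"
```

A production matcher would first baseline-correct and normalize the spectra (rampy provides these steps) before any similarity scoring.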
In order to gain access to the flowing air churned by the drone’s rotors, we use a thinner glass substrate than the original Hamamatsu C13560 SERS plate, which permits air from outside the unit to flow over the SERS substrate.
Long-Range Wi-Fi
Most folks understand that traditional Wi-Fi has limited range and difficulty penetrating cement, brick or other dense materials. The frequencies used are in the 2.4 GHz and 5-6 GHz ranges, aka 802.11n and 802.11ac. Higher frequencies mean higher data rates but also shorter range and little or no ‘bounce’ around obstacles. The average range of a standard 2.4 GHz Wi-Fi Access Point is a few hundred meters, and a 5 GHz AP is quite a bit less than that. Also, the higher the frequency, the more direct line-of-sight is needed to connect to the AP. Even small obstacles can significantly degrade high-frequency 5 GHz Wi-Fi connections.
802.11ah operates at frequencies below 1 GHz. It was first proposed over a decade ago and was championed by the Wi-Fi industry titans as a technique to overcome the range limitations and line-of-sight issues of its higher-frequency brethren. The standard re-uses RF bands from the first mobile phones that have lain dormant since the advent of smartphones and 4G & 5G. The lower frequencies mean the signal can bounce around obstacles, extending the range to a thousand meters or more. The trade-off is lower data rates. The current max rate in the 802.11ah specifications is 16 Mbps, a fraction of the Gbps rates of 5 GHz Wi-Fi. But we don’t need gigabit rates for VADS; what we need is range, and 802.11ah fits the bill.
802.11ah is a new standard as far as practical working chipsets are concerned. To date, just a single company has actual working chipsets available from chip distributors: Newracom, from South Korea. Newracom was an early player in the 802.11ah market and sent out their first press releases back in 2016. I remember pestering them with inquisitive emails about getting my hands on their mythical chipset. Polite and patient to a fault, the Korean engineers put up with my incessant stream of emails and essentially told me, “wait, grasshopper, it will come.” Fast forward to 2020, and working chips arrived. Newracom teamed up with my favorite Qualcomm-Atheros partner, Silex, and together they have pushed a working 802.11ah module into the distributor channels that I could actually get my hands on.
https://www.silextechnology.com/connectivity-solutions/embedded-wireless/sx-newah
https://www.mouser.com/ProductDetail/Silex-Technology/SX-NEWAH-SP-US/
https://www.arrow.com/en/products/sx-newah-sp-us/silex-technology-america
The Silex/Newracom SX-NEWAH modules and devkits are new enough that the software is still playing catch-up with the hardware. The SX-NEWAH development kits arrived with Raspberry Pi 3B’s and an SD card running a Linux 4.14.x kernel from 2017. I wanted to move to the Raspberry Pi 4 with its beefy ARM Cortex-A72 cores and at least a 5.4.x kernel, as that is what is running on my Navq. I pestered the patient Newracom engineers about upgrading the drivers and they sent me to their github repo. So I rolled up my sleeves, cloned the repo, and set about moving it forward through a dozen-odd kernel releases. As anyone who has done kernel development knows, to call the Linux kernel a moving target is a massive understatement. The kernel is a moving target like the Shinkansen bullet train is ‘kind of fast’. After two weeks of walking kernel revs, I had a patchset for the Newracom driver that works on any Linux kernel through version 5.4.x.
After booting my shiny new Pi 4 and Raspbian OS, the Linux 5.4 kernel came up running and I could bring up the interface with Newracom’s home-grown Wi-Fi tools. I confirmed operation between two Pi 4’s with two of the Silex/Newracom development kits and then set about making the module work with the Navq over the SPI bus and a 6-pin JST connector. It took a bit of hacking on the NXP Navq’s kernel DTS file to get SPI enabled and the pins set up properly. Then I had to modify the kernel driver slightly, as the Pi’s kernel handles the SPI bus differently than the NXP i.MX8M Mini does. After a few days of work, voila, I had the 802.11ah module chewing-gum-and-baling-wired to the Navq.
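The DTS change looked roughly like the sketch below. This is an illustrative fragment only: the SPI bus number, pinctrl group, chip-select GPIO, and the compatible string are assumptions for illustration, not the exact VADS patch.

```dts
/* Illustrative sketch of enabling an i.MX8M Mini SPI bus for the SX-NEWAH
   module in the Navq device tree. Node names, pins and the compatible
   string are assumptions, not the actual patch. */
&ecspi2 {
    pinctrl-names = "default";
    pinctrl-0 = <&pinctrl_ecspi2>;
    cs-gpios = <&gpio5 13 GPIO_ACTIVE_LOW>;
    status = "okay";

    wifi@0 {
        compatible = "newracom,nrc7292";   /* assumed binding name */
        reg = <0>;
        spi-max-frequency = <16000000>;
    };
};
```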
One of the nicest features of 802.11ah, besides very low power operation, is that it is designed to accommodate up to 8,000 devices connected to a single Access Point. The first-generation Newracom chip currently tops out at several hundred devices, but this is still a much larger limit than 2.4 GHz 802.11n or 5 GHz 802.11ac offers.
For the latest Newracom kernel driver with my updates please see Newracom’s github -
https://github.com/newracom/nrc7292_sw_pkg
NXP T1 Ethernet
The FMUK66 comes with a two-wire Ethernet port, which is capable of much higher data rates than UART serial communication. Unfortunately, two-wire Ethernet is a relatively new standard and the Navq cannot connect to it directly. I lucked out when the NXP Hovergames engineering team loaned me a prototype T1 adapter that allowed the FMUK66’s two-wire Ethernet hardware to connect to the Navq’s standard Ethernet port. Without this link, the Intel D455 3D navigation system would have been reduced to serial port speeds and the obstacle avoidance would have been too slow to be practical.
Please see my YouTube post for a how-to on connecting the Navq and FMUK66 devices over the Ethernet interface -
3D Printer Mounts
I created a couple of 3D printed mounts for this project. They are available on Thingiverse and are covered in my forum post -
https://www.hackster.io/contests/hovergames2/discussion/posts/7996#comment-154332