Electricity shortages around the world, and particularly throughout Europe, have led to a renewed interest in nuclear energy. While some see this means of energy production as an unacceptable risk, others tout the small amounts of fuel required, the low cost of the generated electricity, and the environmental benefits of nuclear energy. One thing that we should all be able to agree on is that when nuclear materials are utilized, it should be done in the safest way possible. After all, the effects of radiation poisoning are well understood and can include serious illness or death — not to mention the impact on the environment from a large-scale radiation leak.
Aside from energy production, nuclear technology is used in many other applications, ranging from medical and industrial radiography to commercial smoke detectors and insect control for food processing. Over the years, these activities have contributed to the quarter million metric tons of highly radioactive waste that has accumulated in storage facilities worldwide. These storage areas, as well as the facilities making use of raw radioactive materials, need to be regularly inspected to ensure that there are no infrastructure problems leading to radiation leaks. Previous efforts to develop monitoring systems have largely focused on ground-based tele-operated robots, unmanned aerial vehicles, or underwater robots for pipe inspection.
Despite all of these efforts, mapping and characterization of distributed nuclear radiation fields via autonomous robots is not a solved problem. In particular, inspecting the interior of complex facilities presents a unique set of challenges that current techniques are inadequate to overcome. That may change in the near future, however, thanks to a collaboration between the University of Nevada, Reno and the Norwegian University of Science and Technology. They have developed an autonomous drone that can perform mapping and spectroscopic analysis of distributed radiation fields of the sort that might be found at a nuclear waste storage facility.
The prototype design consists of a quadcopter drone, called the RMF-γ aerial robot, with an mRo X2.1 Rev. 2 autopilot running ArduPilot firmware. A high-performance Khadas VIM3 Pro single-board computer provides the computing horsepower to run the necessary algorithms during flight. An Intel RealSense T265 stereo visual-inertial system and a RealSense D435i RGB-depth sensor give the drone the ability to sense its environment. A Scionix V10B10 thallium-activated cesium iodide, or CsI(Tl), scintillator is used to detect gamma radiation.
The first task of the drone is to determine the lay of the land — this is a prerequisite to associating the estimated radiation field with the environment being surveyed. This is accomplished by using data from the RealSense T265 sensor, which provides visual-inertial odometry estimation, and the RealSense D435i sensor, which provides point clouds at a distance of up to three meters. Fusing the measurements from the pair of sensors allows for the construction of a 3D occupancy map and a dense point cloud of everything the robot has explored.
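The published details do not include the team's mapping code, but the basic idea of fusing odometry poses with depth-camera point clouds into a 3D occupancy map can be sketched as follows. This is a minimal illustration with hypothetical names (`OccupancyGrid3D`, `integrate`), not the team's implementation: each point cloud is transformed into the world frame using the current pose estimate, and the voxels it falls into are marked as observed.

```python
import numpy as np

class OccupancyGrid3D:
    """Minimal 3D occupancy map: counts hits per voxel from posed point clouds."""

    def __init__(self, resolution=0.1):
        self.resolution = resolution  # voxel edge length in meters
        self.hits = {}                # voxel index -> observation count

    def integrate(self, points_body, pose):
        """Transform a point cloud (N x 3, sensor frame) into the world frame
        using a 4x4 pose from visual-inertial odometry, then mark the voxels
        the points fall into."""
        R, t = pose[:3, :3], pose[:3, 3]
        points_world = points_body @ R.T + t
        voxels = np.floor(points_world / self.resolution).astype(int)
        for v in map(tuple, voxels):
            self.hits[v] = self.hits.get(v, 0) + 1

    def occupied(self, min_hits=2):
        """Voxels observed at least min_hits times are treated as occupied,
        which filters out isolated depth-sensor noise."""
        return {v for v, n in self.hits.items() if n >= min_hits}

# Example: one small cloud observed from a pose translated 1 m along x
grid = OccupancyGrid3D(resolution=0.1)
cloud = np.array([[0.5, 0.0, 0.2], [0.5, 0.0, 0.2], [2.0, 1.0, 0.0]])
pose = np.eye(4)
pose[0, 3] = 1.0
grid.integrate(cloud, pose)
print(grid.occupied(min_hits=2))  # only the twice-seen voxel survives
```

A real system would also ray-trace free space between the sensor and each hit to distinguish "empty" from "unknown", but the accumulation step above is the core of building an occupancy map alongside the dense point cloud.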
With an awareness of its environment, the drone is then able to build a real-time distributed radiation field estimate map. By making the assumption that spatial variations in gamma radiation intensity are locally smooth, this problem can be addressed by running a series of local regressions. But to collect enough measurements to generate an accurate map, and to do it quickly to conserve the limited battery capacity of the drone, a specialized tactic was needed. Towards this end, a path planner was created that could lead the robot to cover as much area as possible, in the most efficient manner, but without redundantly covering uninteresting background radiation measurements. The planner also had to be capable of providing enough coverage in each area to ensure that correct estimates of the mean could be calculated — if the measurements were too sparse, noise could not be filtered out of the map.
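One simple way to realize a local regression under the smoothness assumption is kernel-weighted averaging (Nadaraya–Watson regression): the estimate at any point in space is a distance-weighted mean of nearby count-rate measurements. The sketch below is an illustrative stand-in, not the team's algorithm; the function name and bandwidth value are assumptions chosen for the example.

```python
import numpy as np

def radiation_field_estimate(positions, counts, query_points, bandwidth=0.5):
    """Estimate gamma count rate at each query point as a Gaussian-kernel
    weighted mean of nearby measurements, assuming the field is locally
    smooth. `bandwidth` sets the spatial scale (meters) of 'local'."""
    positions = np.asarray(positions, dtype=float)  # M x 3 measurement locations
    counts = np.asarray(counts, dtype=float)        # M measured count rates
    estimates = []
    for q in np.asarray(query_points, dtype=float):
        d2 = np.sum((positions - q) ** 2, axis=1)   # squared distances to query
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))    # Gaussian kernel weights
        estimates.append(np.sum(w * counts) / np.sum(w))
    return np.array(estimates)

# Sparse measurements: a hot spot near the origin, background 3 m away
pos = [[0.0, 0, 0], [0.2, 0, 0], [3.0, 0, 0], [3.2, 0, 0]]
cps = [900.0, 850.0, 30.0, 35.0]
est = radiation_field_estimate(pos, cps, [[0.1, 0, 0], [3.1, 0, 0]])
print(est)  # high estimate near the source, low in the background
```

This also makes the planner's sparsity constraint concrete: with too few measurements inside a bandwidth-sized neighborhood, the weighted mean is dominated by one or two noisy counts, so the noise cannot be averaged out of the map.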
A series of real-world tests were conducted to assess the accuracy of the system. In each, the drone was tasked with exploring an unknown area, creating a map of the environment, and mapping the radiation levels within it. Real radioactive substances were placed in each location to provide a source of radiation for the tests. The results were positive overall, with the robot creating accurate radiation field maps of each location within about five minutes of flight time.
The work reported on by this team shows promise for real-world inspection applications, and may help to make working with nuclear technologies safer in the future. It will be interesting to see how this research is built upon in the years to come.