OptiGap Turns 3D Printer Filament Into a Bend Sensing and Localization System for Soft Robots

With cleverly encoded air gaps, a length of optical fiber or 3D printer filament can be used as a detail-rich bend sensor.

Gareth Halfacree
1 month ago • Sensors / Robotics

Research and development engineer Paul Bupe, Jr has published details on a new approach to soft robotic sensing, developed during his PhD research: OptiGap, which started with cheap 3D printer filament and ended with a functional bend sensor offering bend-localization capabilities.

"In very general terms, this sensor is basically a rope that if bent can tell you where along its length you bent it. The fancy term for that is 'bend localization,'" Bupe explains. "OptiGap's application is mainly within the realm of soft robotics, which typically involves compliant (or 'squishy') systems, where the use of traditional sensors is often not practical. The name OptiGap, a fusion of 'optical' and 'gap,' reflects its core principle of utilizing air gaps within flexible optical light pipes to generate coded patterns essential for bend localization."

The idea of using optical light pipes as bend sensors isn't new: something as simple as a length of fiber-optic cable with an LED at one end and a photoresistor at the other can be used to get an idea of how much the sensor is bending, but what it can't do is tell you where. That's where OptiGap comes in β€” and initial prototypes were made using a simple length of transparent TPU filament designed for use in 3D printers.
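The simple single-fiber approach can be sketched in a few lines: all it can report is how much light was lost relative to a straight-fiber baseline, which is why it senses *how much* bend but never *where*. The function name and the linear loss model below are illustrative assumptions, not code from the project.

```python
# Minimal sketch of the simple (non-localizing) fiber bend sensor described
# above: an LED at one end, a photoresistor/ADC at the other. All names and
# the loss model are illustrative assumptions.

def bend_estimate(reading, baseline):
    """Estimate relative bend from light loss.

    reading  -- current light-intensity reading from the receiver
    baseline -- reading recorded with the fiber held straight
    Returns a 0.0-1.0 "fraction of light lost" figure: a proxy for total
    bend, carrying no information about where along the fiber it occurred.
    """
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    loss = 1.0 - (reading / baseline)
    return max(0.0, min(1.0, loss))
```

A straight fiber returns 0.0; tighter bends push the figure toward 1.0, but a sharp bend near one end is indistinguishable from the same bend near the other.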

"To verify [my] hypothesis, I attached a longer piece of TPU to a tape measure and began bending it at various points to observe how light transmission would change," Bupe explains, referring to his theory that adhesive from the electrical tape holding the two together was locally stretching the filament, reducing the amount of light transmitted as it bent.

"I wrote a small Linux I2C driver for the [STMicroelectronics] VL53L0X ToF [Time of Flight] sensor to run on a Raspberry Pi and push the data to a socket using ZeroMQ," Bupe continues. "I then created a rough GUI in Python to pull the sensor data from the socket and visualize the light transmission data in real time […] which very quickly validated my hypothesis. This validation marked the 'Eureka!' moment that sparked the eventual development of the OptiGap sensor."
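The Raspberry Pi side of that pipeline might look something like the sketch below: sample the sensor, package each reading, and publish it over a ZeroMQ socket for a separate GUI process to subscribe to and plot. The `read_fn` stub, port number, and message format are assumptions; the real project used a custom Linux I2C driver for the VL53L0X.

```python
# Sketch of the sensor-to-GUI pipeline Bupe describes: read the VL53L0X,
# push each sample to a ZeroMQ PUB socket, and let a separate Python GUI
# SUB to it and plot light transmission in real time. Port and field names
# are illustrative assumptions.
import json
import time

def make_sample(signal):
    """Package one ToF reading with a timestamp as a JSON string."""
    return json.dumps({"t": time.time(), "signal": signal})

def publish_samples(read_fn, endpoint="tcp://*:5556", n_samples=100):
    """Publish sensor samples over ZeroMQ (requires the pyzmq package)."""
    import zmq  # imported here so the pure helper above needs no pyzmq
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind(endpoint)
    try:
        for _ in range(n_samples):
            pub.send_string(make_sample(read_fn()))  # GUI side SUBs to this
            time.sleep(0.01)
    finally:
        pub.close()
```

On the GUI side, a SUB socket connected to the same endpoint would `recv_string()` each message, `json.loads()` it, and append the `signal` value to a live plot.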

Bupe then moved to experimenting with air gaps, cutting the filament into shorter lengths and attaching them together using soft rubber sleeves. As the sleeve bends, less of the light from one section of filament makes it to the next. By placing the air gaps in a particular pattern, inspired by the operation of a linear encoder, it's possible to figure out where along the sensor's length the bend is taking place.
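The encoder analogy can be made concrete with a small sketch. Assuming several light pipes run in parallel and each bendable segment places air gaps in a unique subset of them, a bend attenuates only the gapped channels at that segment, and the pattern of dimmed channels becomes a binary code for the bend's position. The two-channel code table below is invented for illustration and is not the pattern from the dissertation.

```python
# Illustrative sketch of the linear-encoder idea behind OptiGap's air-gap
# placement. Each segment has gaps in a unique subset of parallel light
# pipes, so the set of channels that dim under a bend encodes its location.
# The code table and threshold are made-up example values.

# Code table: (channel 0 dimmed?, channel 1 dimmed?) -> bend location.
SEGMENT_CODES = {
    (0, 1): "segment 1",
    (1, 0): "segment 2",
    (1, 1): "segment 3",
}

def localize(intensities, baselines, threshold=0.2):
    """Map per-channel light loss to a segment via the code table."""
    code = tuple(
        1 if (base - now) / base > threshold else 0
        for now, base in zip(intensities, baselines)
    )
    return SEGMENT_CODES.get(code, "no bend detected")
```

With n channels this scheme distinguishes up to 2^n - 1 bend locations, which is why only a handful of light pipes are needed for a usefully fine-grained sensor.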

Having proven OptiGap's capabilities, Bupe switched from filament to considerably thinner PMMA optical fibers β€” and dropped the VL53L0X ToF sensor in favor of using photodiodes and LEDs. "This also allowed me to use a microcontroller to read the sensor data," Bupe notes, "which was a significant improvement over the initial prototype." That microcontroller, an STMicro STM32, runs a naive Bayes classifier on-device to decode the encoded air-gap patterns and localize the bend.
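In practice the photodiode readings are noisy rather than cleanly binary, which is where the classifier earns its keep. The sketch below shows a Gaussian naive Bayes classifier of the general kind described, written in Python for clarity; the on-device version runs as firmware on the STM32, and the training data and class labels here are invented for demonstration.

```python
# Illustrative Gaussian naive Bayes classifier: each class is a bend
# location, each feature a photodiode reading. Written in Python for
# clarity; the actual sensor runs an equivalent on the STM32.
import math

def fit(samples):
    """samples: {label: [feature vectors]} -> per-class (means, variances)."""
    model = {}
    for label, vecs in samples.items():
        n = len(vecs)
        means = [sum(col) / n for col in zip(*vecs)]
        vars_ = [
            sum((x - m) ** 2 for x in col) / n + 1e-6  # variance floor
            for col, m in zip(zip(*vecs), means)
        ]
        model[label] = (means, vars_)
    return model

def predict(model, x):
    """Return the bend location with the highest log-likelihood for x."""
    def log_lik(means, vars_):
        return sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, means, vars_)
        )
    return max(model, key=lambda label: log_lik(*model[label]))
```

Because the model reduces to a handful of per-class means and variances, it is cheap enough to evaluate on a microcontroller at every sensor update.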

More information on the project is available on Bupe's website, while his dissertation on the topic is available on the University of Louisville's Institutional Repository under open access terms.
