Stirring Up Excitement in Fluid Manipulation
FluidLab simulates intricate interactions between fluids and solids to help robots learn complex fluid manipulation tasks.
Although fluid manipulation rarely seems difficult to us humans, it is a complex activity that poses many challenges. Whether scooping an item out of the water, rolling an ice cream cone, or creating art in a latte, we intuitively know what to do. But teaching a robot to do these same tasks remains an unsolved problem. To date, research in this area has been largely limited to idealized Newtonian fluid dynamics and to relatively simple tasks, like pouring water into a container.
To build capable, intelligent robots that can assist humans in a wide range of real-world fluid manipulation tasks, much work remains to be done. Fluids exhibit very complex material behaviors, with many parameters to consider. Moreover, different fluids and solids can interact with one another. Consider, for example, the frothed milk on top of coffee when making latte art.
For a machine learning algorithm to gain an understanding of complex fluid dynamics that generalizes across a broad range of real-world tasks, a huge amount of training data would need to be collected. Such an effort would be extremely labor-intensive and expensive, and perhaps not practical at all. Unlike large language models, which can be trained on Internet-scale data, no comparable dataset exists for learning fluid dynamics.
To address this issue, a team led by researchers at MIT decided to build a simulated environment for performing realistic fluid manipulations. Called FluidLab, this system can simulate a diverse set of manipulation tasks involving complex fluid dynamics. Unlike previous efforts, FluidLab can even simulate interactions between multiple types of fluids and solids. Since the environments are generated computationally, a large amount of data can be collected from them quickly, saving lots of time and money. And that data can be leveraged to train machine learning models to manipulate fluids.
FluidLab is powered by a physics engine named FluidEngine, which was developed in Python and Taichi. It can simulate solids, liquids, and gases, as well as elastic, plastic, and rigid objects. The engine follows the OpenAI Gym API and supports massive parallelism when running on GPUs. A GPU-accelerated renderer was also developed to render the simulations realistically in real time. By accurately simulating the physics of Newtonian and non-Newtonian liquids and gases, FluidLab can provide useful data for model training.
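Because FluidEngine follows the OpenAI Gym API, its environments expose the familiar reset/step rollout loop. Here is a minimal sketch of that pattern using a toy stand-in environment; the `ToyFluidEnv` class and its dynamics are purely illustrative and are not FluidLab's actual environments or API surface:

```python
import numpy as np

class ToyFluidEnv:
    """Illustrative stand-in following the classic OpenAI Gym API:
    reset() returns an observation; step(action) returns a
    (observation, reward, done, info) tuple."""

    def __init__(self, horizon=50):
        self.horizon = horizon
        self.t = 0
        self.state = np.zeros(3)  # crude stand-in for a fluid-state summary

    def reset(self):
        self.t = 0
        self.state = np.zeros(3)
        return self.state.copy()

    def step(self, action):
        # Toy dynamics: the action nudges the state; reward penalizes
        # distance from a target "fill level" of 1.0 per component.
        self.state += 0.1 * np.asarray(action, dtype=float)
        self.t += 1
        reward = -float(np.sum((self.state - 1.0) ** 2))
        done = self.t >= self.horizon
        return self.state.copy(), reward, done, {}

# Standard Gym-style rollout loop
env = ToyFluidEnv()
obs = env.reset()
total_reward = 0.0
done = False
while not done:
    action = np.ones(3)  # a fixed policy, for illustration only
    obs, reward, done, info = env.step(action)
    total_reward += reward
```

A real FluidLab environment would plug into this same loop, which is what lets standard reinforcement learning libraries interact with it unchanged.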
Several reinforcement learning algorithms, sampling-based optimization methods, and trajectory optimization techniques were evaluated using data produced by the simulator for their ability to accomplish a fluid manipulation task. The results of the experiments demonstrated that FluidLab provides valuable information for learning complex fluid dynamics. To further validate these findings, trajectories optimized in simulation were transferred to a real-world robotic setup (sim-to-real transfer), which was found to perform reasonably well in realistic scenarios.
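To illustrate the sampling-based trajectory optimization approach mentioned above, here is a minimal sketch of the cross-entropy method (CEM), a common choice in this family: it samples candidate action sequences, scores each with a rollout, and refits the sampling distribution to the best performers. The `rollout_cost` objective is a hypothetical stand-in; in FluidLab the score would come from simulating the trajectory and evaluating the resulting fluid state:

```python
import numpy as np

def rollout_cost(actions, target=1.0):
    # Stand-in for simulating a trajectory and scoring the outcome;
    # here, cost is the squared distance of each running action sum
    # from a target value.
    return float(np.sum((np.cumsum(actions) - target) ** 2))

def cem_optimize(horizon=10, iters=30, pop=64, elite_frac=0.125, seed=0):
    """Cross-entropy method over an action sequence of length `horizon`."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(horizon)
    std = np.ones(horizon)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        # Sample a population of candidate action sequences.
        samples = rng.normal(mean, std, size=(pop, horizon))
        costs = np.array([rollout_cost(s) for s in samples])
        # Keep the lowest-cost candidates and refit the distribution.
        elite = samples[np.argsort(costs)[:n_elite]]
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mean

best = cem_optimize()
```

Because CEM needs only rollout scores, not gradients, it pairs naturally with a fast parallel simulator: each candidate sequence can be evaluated in an independent simulated environment.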
There are still many challenges yet to be solved, however. For example, the high-dimensional data generated by the simulator made it difficult for some models to learn useful information. The team also points out that the tasks they have designed FluidLab around are relatively simple, and that more complex behaviors combining multiple actions, such as stirring and pouring, still need to be evaluated. Further, while the real-world experiments showed the potential of FluidLab to help robots learn to manipulate fluids, the testing was far from perfect and left much room for improvement. Additional work will be needed to understand how to improve these results.
The team hopes that FluidLab will benefit future research in robotic manipulation involving fluids. For those who would like to explore their work further, they have released the source code.