In a rush? Here is all you need to know about the lunaFlow project!
Dinoflagellates are marine plankton that emit blue light when subjected to strain. They can be observed at night along beaches when waves break against the shore (Figure 1). In the past, these organisms have allowed scientists to track ships in the ocean from space and to develop weather-forecasting models. We would like to use their strain-responsive luminescence to measure pressure fields in fluid flows (e.g. the flow around an impeller). We have assembled a team with a mixed fluid dynamics, engineering, chemical and biological background to do so. Our aim is to develop a cost-effective (i) incubator to grow the organisms and (ii) multi-camera system to acquire tomographic videos of the emitted light for three-dimensional pressure field reconstructions.
Measuring pressure fields
Pressure is a fundamental property of a fluid flow. Parcels of fluid exert a force on their surroundings, and the distribution of these forces is interlinked with the movement of the fluid.
Traditionally, pressure is measured at single points within a flow using devices such as Pitot tubes or pressure transducers. While these measurements can achieve a high degree of accuracy and temporal resolution, they are often intrusive to the flow and cannot capture how pressure is spatially distributed across the flow field.
Current three-dimensional techniques for measuring pressure are indirect: pressure is calculated from the measured velocity of seeding tracers in the flow field (known as tomographic PIV reconstructions). Obtaining pressure fields in this way requires a degree of accuracy in the velocity measurements that is often very hard to achieve. Furthermore, these techniques typically rely on expensive (and dangerous-to-operate) high-power pulsed lasers.
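For context, in incompressible flow the pressure field is typically recovered from the measured velocities by solving the pressure Poisson equation, obtained by taking the divergence of the Navier-Stokes momentum equation (a standard result, stated here for reference):

```latex
\nabla^2 p = -\rho \, \nabla \cdot \left[ (\mathbf{u} \cdot \nabla)\,\mathbf{u} \right]
```

Because the right-hand side involves gradients of the measured velocity field, small measurement errors are amplified by differentiation, which is why such a high degree of velocity accuracy is needed.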
Using strain-responsive plankton to measure pressure would provide an alternative that circumvents many of the problems associated with these techniques, with the added benefit of being much more cost-effective. These organisms can also be used in combination with UV lights to measure velocities and pressures simultaneously. We plan to develop open-source, easily deployable systems to grow these amazing “bio-pressure-sensors” and a multi-camera system to image them three-dimensionally.
The plankton are particularly sensitive to temperature, salinity and oxygen levels and require a 12-hour light/dark cycle. Off-the-shelf incubators are not designed with our particular application in mind, so we are limited to a small number of commercially available products that are rather expensive.
Tomographic camera systems
Tomographic camera systems share the same burden of excessive cost, often prohibitive to most researchers, which has been choking both their development and their deployment in the field. In recent years, flourishing alongside the fields of computer vision and robotics, low-cost electronics have made the design of complex acquisition systems much more accessible. We would like to contribute to the community by showcasing a design for an ultra-low-cost 3D imaging system (which could be used in a broad range of disciplines).
Various species of dinoflagellates are bioluminescent, emitting blue light upon physical stimulation. This bioluminescence is tied to the circadian cycle of the organisms and only occurs during the night phase. The two most studied species are Pyrocystis lunula and Lingulodinium polyedra. While they share some general features in their bioluminescence, they differ in their motility and in their biophysical response to a stress stimulus. As they can be cultured under very similar conditions, we will grow both species and assess their suitability as pressure sensors in the flow visualisation device.
We are currently assembling our incubator. We will control the temperature within the recommended range of 20-22 °C in two ways. For heating, we will operate resistance elements with PID-controlled switching. For cooling, a thermoelectric Peltier heat pump will cool the air above and around the batches. Tropical aquarium lights will be used for illumination, controlled by a timer switch.
We will be using Raspberry Pi camera modules / webcams controlled by multiple Raspberry Pi 3B+ boards. We aim to develop a four-camera system that will allow us to trigger the cameras simultaneously. In the long run, it should also allow us to embed the camera-acquisition system into an integrated system in which the cameras can be moved in response to the measured flow.
Ideally, we plan to achieve a system that can record at a 1-megapixel resolution (comparable to 720-1080p high definition) at frame rates of 20-30 fps.
We intend to release the designs for both the incubator hardware and control software and the entirety of the multi-camera acquisition system. The project will bring a number of benefits. It will substantially increase the accessibility of a technique for measuring a fundamental flow property (pressure), and both the incubator and the imaging system can be used in a wide range of scientific applications.
The project is currently under development - keep up with our progress below.
- Pyrocystis lunula (CCAP 1131)
- Lingulodinium polyedra (CCAP 1121)
- Buy from CCAP (https://www.ccap.ac.uk), £50 each
- f/2 fertiliser from amazon https://www.amazon.co.uk/Phyto-Plus-fertilizer-phytoplankton-1000ml-x/dp/B00RQUESJG
- L1 stock media from CCAP (£50 for 5L of media)
- Make your own L1 stock media, recipe: https://www.ccap.ac.uk/media/documents/L1.pdf
All of these options include all of the components needed for the dinoflagellates to survive (all the vitamins that they cannot produce, phosphates, nitrates, trace metals etc. – all the carbon is sourced via photosynthesis). All bioluminescent dinoflagellates are marine species, so the media will be prepared in artificial saltwater.
- CCAP will send 50 ml of liquid culture
- Add it to 500 ml of fresh media; it will take c. 2-3 weeks to grow to a high density of 10,000-15,000 cells/ml
- To quantify the density, place a known volume on a microscope slide and count the cells
- We can then keep diluting the culture to 10% density every c. 2 weeks (e.g. into 5 L next)
- A good guide for a more DIY setting https://www.ccap.ac.uk/documents/PyrocystisCulturing.pdf
Grow them in 2 L conical flasks, filled to a maximum of 500 ml.
- 1 x Raspberry Pi Zero W
- 4x NTC thermistors (Open Smart)
- 3 x heating units (https://uk.rs-online.com/web/p/enclosure-heating-elements/2995950/)
- 1 x circulating fan (https://uk.rs-online.com/web/p/axial-fans/7496950/)
- 1 x lighting strip (310 mm, https://www.allpondsolutions.co.uk/pled/)
- 1 x Timer switch socket (https://uk.rs-online.com/web/p/plug-in-time-switches/8215058/)
- 1 x Light sensor (Open Smart)
- 1 x fridge cooling unit (https://www.banggood.com/12V-6A-DIY-Electronic-Semiconductor-Refrigerator-Radiator-Cooling-Equipment-p-1074404.html)
- 1 x AC/DC 12 V adaptor
- 3 x MOSFETs (FQP30N06L, https://uk.rs-online.com/web/p/mosfets/8075863/)
- 1 x Relay board (https://www.banggood.com/5V-4-Channel-Relay-Module-For-Arduino-PIC-ARM-DSP-AVR-MSP430-Blue-p-87987.html)
- Misc. Wires
- Assorted resistors; 1k, 10k and 20k resistors
- 3 x 2N3904 transistors
- Biomaker board
- Raspberry Pi Zero W
- Thermometer (we used a –10 to 50 deg C thermometer)
- Thermal measurement chamber (ceramic mug with cats on it)
- Hot water and ice
The control system regulates the heating and cooling of our incubator to ensure the dinoflagellates are in the right conditions to grow. We have four sensors inside the incubator; one for light and three to measure temperature in different parts of the box, enabling identification of hotspots. As such, it is important that our temperature sensors are well calibrated.
For calibration purposes we connected 4 NTC thermistors to our Biomaker board and a Raspberry Pi Zero W, and placed the thermistors and a calibration thermometer in a water-filled thermal measurement chamber (i.e. a ceramic mug with cats on it), as shown in Figure 3. For convenience, a shield was constructed to plug all the temperature sensors into. The thermistors were tethered together and placed in the water, taking care that the water was well stirred and that the thermistors and thermometer sat close together. The temperature of the water was slowly adjusted with ice and hot water, and the raw analog data was recorded from the 4 NTC thermistors, using a low-pass filter to smooth the signals. When a touch button on the Biomaker board was pressed, the analog sensor values were sent to the Raspberry Pi via a serial connection (see “Remote monitoring”) and the operator noted down the thermometer temperature in a spreadsheet. To ensure proper data indexing, an incrementing integer is displayed on the TM1637 4-digit display and recorded in both data sets. The XOD sketch for thermistor calibration is here, and the data-logging Python script to run on the Raspberry Pi is here. Note that using the 4-digit display precludes data logging to an external SD card, as the D11 pin is used by both devices.
The analog values and the actual temperatures were assembled for all four temperature sensors, and the thermistor resistance was calculated given the 10 kΩ resistor on the thermistor board. The measured resistance and actual temperature were fitted to the first-order (B-parameter) Steinhart-Hart equation to derive T0 and B (https://learn.adafruit.com/thermistor/using-a-thermistor). The close correlation between the actual temperature and the (calibrated) measured temperature for sensor 3 is shown in Figure 4.
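The full conversion chain, from raw ADC reading to temperature via the B-parameter Steinhart-Hart equation 1/T = 1/T0 + (1/B) ln(R/R0), can be sketched in Python. The divider orientation and the nominal values here (B = 3950 K, R0 = 10 kΩ at 25 °C) are illustrative assumptions, to be replaced by the fitted calibration constants:

```python
import math

SERIES_R = 10_000.0  # fixed 10 kOhm resistor on the thermistor board
ADC_MAX = 1023.0     # 10-bit Arduino-class ADC

def adc_to_resistance(adc):
    """Thermistor resistance from a raw ADC reading
    (assumes the thermistor sits on the high side of the divider)."""
    return SERIES_R * (ADC_MAX / adc - 1.0)

def resistance_to_celsius(r, r0=10_000.0, t0=298.15, b=3950.0):
    """First-order (B-parameter) Steinhart-Hart: 1/T = 1/T0 + (1/B) ln(R/R0)."""
    inv_t = 1.0 / t0 + math.log(r / r0) / b
    return 1.0 / inv_t - 273.15
```

With R = R0 the equation returns T0 (25 °C) by construction, which is a handy sanity check after fitting.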
Incubator temperature control
Our first prototype for the incubator uses a simple state machine that changes state based on temperature, switching the fridge cooling unit, the fridge fan, the heating elements and the internal air-circulating fan on or off.
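The on/off logic can be sketched as follows. This is a Python illustration only (the actual controller is implemented in XOD on the Biomaker board), and the exact actuator combinations per state are an assumption; the thresholds come from the recommended 20-22 °C growth range:

```python
# Simple bang-bang temperature controller: heat below the range,
# cool above it, idle inside it.
T_LOW, T_HIGH = 20.0, 22.0

def actuator_states(temp_c):
    """Map a temperature reading to on/off commands for the actuators."""
    if temp_c < T_LOW:
        # too cold: run the heating elements and circulate the air
        return {"heater": True, "fridge": False, "fridge_fan": False, "circ_fan": True}
    if temp_c > T_HIGH:
        # too hot: run the Peltier fridge unit and both fans
        return {"heater": False, "fridge": True, "fridge_fan": True, "circ_fan": True}
    # in range: everything off
    return {"heater": False, "fridge": False, "fridge_fan": False, "circ_fan": False}
```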
All sensor readings are taken on the Biomaker board, implemented in XOD. A Raspberry Pi is used for internet access and data logging and receives serial data sent by the XOD program on the Biomaker board (see “Remote monitoring”). We have chosen an electromechanical relay for the first prototype to handle the 12 V, 6 A current required for the fridge, but we may change to N-MOSFETs, which can handle up to 30 A and operate silently. The relay activates on a GROUND signal (logical 0), so rather than invert the control signals in the XOD software, we have used 2N3904 transistors as “NOT” gates to invert the ON/1/Vcc signal to the required low signal. The 12 V power is supplied by an AC/DC power converter and conveniently broken out with a DC jack adaptor. The heating elements are controlled with N-MOSFETs and powered at 12 V. There are four indicator LEDs: yellow for the circulating fan, red for the heating elements, and two green LEDs for the fridge relay and fan.
For the first prototype the heating and cooling control is digital: on or off. However, by driving the heating elements through MOSFETs, we have built in the option to use a PID controller with a PWM control signal if initial testing reveals our simple heating system is not sophisticated enough. Some online research suggests that Peltier devices (i.e. our fridge unit) cannot be controlled with a PWM signal. An initial phase of data collection and analysis will indicate whether this is an issue for us, though the biology of the organism is such that being too cold is less of a problem than being too hot.
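Should bang-bang control prove too coarse, a discrete PID loop driving the heater PWM duty cycle might look like the sketch below. This is not our deployed controller, and the gains are placeholders that would need tuning against the incubator's thermal response:

```python
class PID:
    """Minimal discrete PID controller; the output is clamped to a 0-1 PWM duty cycle."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(1.0, out))  # clamp to a valid duty cycle
```

For example, with proportional-only gains, a reading 1 °C below a 21 °C setpoint yields a duty cycle of kp × 1.0.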
We have also added a 4DS display unit to the Biomaker board to display the internal temperature and indicate the ON/OFF status of the fridge, fan and heating units. In future iterations, we will develop this interface using a "scope" object to display the time-course of the temperature and add some interactive functionality.
After setting up the Pi for serial communication (see "Remote Monitoring" below), execute the Python script on the Pi, by navigating to the directory containing the script and typing:
$ sudo nohup python Pi_ThingSpeak.py &
The use of 'nohup' ensures the Python script keeps running even when the local session is closed (useful for headless SSH sessions). The '&' character runs the script in the background and returns control to the console. To shut the script down, find its process ID (type 'top' and look for the python process), then type 'kill ID', where ID is the process ID.
The circuit diagram for the incubator is shown below.
The incubator will run 24/7, maintaining the internal temperature and light conditions for optimal growth. So that the team can monitor the status of the incubator remotely, we have connected the Rich UNO R3 to a Raspberry Pi Zero W, which in turn is connected to the internet. The Rich UNO R3 periodically sends sensor readings to the Pi via a software serial UART on the UNO (using XOD); these are read over the Pi hardware UART by a short Python script (Python 2.7), which also posts the incubator data to an IoT ThingSpeak channel. Using ThingSpeak we will also be able to set up automatic alerts, for example if the incubator temperature exceeds a certain value. As the Rich UNO R3 uses 5 V logic and the Pi uses 3.3 V, a simple voltage divider is used on the Pi RX input. The sensor data is also recorded locally on an SD card for analysis.
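A minimal sketch of the Pi-side logger follows. The CSV field order and ThingSpeak field mapping here are hypothetical (the real script is linked above); as a side note, the 10 kΩ and 20 kΩ resistors from the parts list give a divider output of 5 V × 20/(10+20) ≈ 3.3 V for the RX line.

```python
def parse_line(line):
    """Split a CSV sensor line from the UNO (hypothetical order: temp1,temp2,temp3,light)
    into the field names expected by the ThingSpeak update API."""
    t1, t2, t3, light = (float(x) for x in line.strip().split(","))
    return {"field1": t1, "field2": t2, "field3": t3, "field4": light}

def main():
    # non-stdlib dependencies: pip install pyserial requests
    import serial
    import requests
    port = serial.Serial("/dev/serial0", 9600, timeout=2)  # Pi hardware UART on GPIO 14/15
    while True:
        raw = port.readline().decode(errors="ignore")
        if not raw.strip():
            continue  # serial timeout or empty line
        data = parse_line(raw)
        data["api_key"] = "YOUR_WRITE_KEY"  # your ThingSpeak channel write key
        requests.post("https://api.thingspeak.com/update", data=data, timeout=10)

if __name__ == "__main__":
    main()
```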
Some configuration of the Raspberry Pi is required to use the serial UART port on the GPIOs (14 and 15). First disable console use of the serial port:
$ sudo raspi-config
Go to Interfacing Options --> Serial --> Disable console, Enable hardware serial. Reboot the Pi:
$ sudo reboot
Now we need to add a couple of things:
$ sudo nano /boot/config.txt
Confirm that the line "enable_uart=1" appears in the file (usually at the end). When using the Raspberry Pi Zero W, we also need to add the following line (this moves Bluetooth to the mini-UART, freeing the hardware UART for GPIOs 14 and 15) and save:

dtoverlay=pi3-miniuart-bt
Reboot the Pi (sudo reboot) and GPIOs 14 and 15 should now be correctly configured.
- (reclaimed) plywood boards, cut to size on a table saw:
> 840 x 540 x 20 mm x2 (base & lid)
> 840 x 400 x 20 mm x2 (long sides)
> 500 x 400 x 20 mm x2 (short sides)
- brass hinges x2
- waterproof paint
- insulation board
- (reclaimed) stainless steel handles x2
- draught excluding double-sided tape
- castor wheels with brakes x4
- wood screws
> No 6 x 5/8" (for castors and hinges) x 28
> No 8 x 2" (for box joints) x 40
> No 4 x 5/8'' (for mounting TotemMaker frame onto plywood) x 16
- Battery drill
- Drill bits (5mm)
- Wood saw
- Phillips screwdriver (PH2)
- Wood file
We are developing software that allows users to control an imaging system on different platforms and with different devices.
As a basic requirement, you will need:
- a PC running Windows 7/8/10 or Debian Linux with 2-4 USB ports (not tested on macOS)
- 2-4 cameras (tested on Logitech C310 Webcams)
We are also developing an image acquisition (AQ) system. This runs on multiple Raspberry Pi (hereafter rpi) boards which are controlled from a PC and communicate over a private local area network through an ethernet network switch.
Using multiple devices that run independently substantially increases the bandwidth (i.e. resolution and acquisition rate) that can be achieved relative to a single machine.
To acquire with multiple cameras on a single machine, jump to interfacing.
For more information on how to develop a multi-device imaging system, keep reading.
The image AQ system requires:
- 2-4 raspberry pi 3B+ boards with 8GB+ Micro SD cards
- a network switch with 5+ ports
- 3-5 cat5/cat6 cables
alongside the PC and webcams. You can also substitute the webcams with rpi camera modules.
Here we outline the steps required to set up the imaging system. These include:
- setting up rpi boards to work with MATLAB
- setting up a private network
- setting up an MQTT Protocol configuration
Setting up Raspberry Pi boards
The imaging system runs on a combination of MATLAB (for app interfacing, see interfacing) and Python (for rpi interfacing) scripts.
First, you will need to configure the Raspberry Pi boards to interface with MATLAB following the instructions at:
We recommend cloning the MATLAB Raspbian image onto a new Micro SD card. Run the add-on manager in MATLAB to set up your rpi board. To connect to eduroam, choose a WPA2-Enterprise connection and fill in your details using your network access token (for University of Cambridge students, https://tokens.csx.cam.ac.uk/). You will need an internet connection to download the required software to set up the private network.
Creating a private network
Here, we will look at how to create a private network over an ethernet network switch between the rpis and a PC.
We will configure one of the rpis as a uDHCP server (micro Dynamic Host Configuration Protocol) and the other rpis and the PC as clients.
Proceed with caution while setting up network connections and ask for help if you are unsure. We do not want to be liable for anything going wrong in your home or department!
Setting up server:
1. First, open a terminal on your server rpi and install udhcpd
$ sudo apt-get update
$ sudo apt-get install udhcpd
2. Second, you will need to determine the name and MAC address of your server rpi's ethernet port (i.e. the unique identifier of that network interface). Run
$ ifconfig
On older versions of Raspbian, the port is named eth0 by default (case 1). On newer versions of Raspbian, the port name is given as enxXXXXXXXXXXXX (case 2, where the capital Xs are the digits and letters of your MAC address without the colons). The ethernet MAC address itself is listed after "ether" in the ifconfig output.
3. Next, you will need to set up a static IP address for your server rpi on this private network. Edit:
$ sudo nano /etc/network/interfaces
and add to the bottom (for case 1)
auto eth0
iface eth0 inet static
    address yyy.yyy.YY.1/24
or (for case 2)
auto enxXXXXXXXXXXXX
iface enxXXXXXXXXXXXX inet static
    address yyy.yyy.YY.1/24
where yyy.yyy.YY.1 is the IP address you want to assign to the rpi server. For simplicity, we allocated the number 1 to the first device we added to the network. Save and exit.
>> Do not choose an IP address that is likely to clash with existing ones in your network! <<
4. Next, you will need to enable the DHCP server. Edit
$ sudo nano /etc/default/udhcpd
and change the status from the default "no" to a "yes"
Save and exit.
5. Next, you will need to edit the configuration file for the network. Edit
$ sudo nano /etc/udhcpd.conf
and comment out everything except:

# your local ethernet port (eth0 in case 1)
interface enxXXXXXXXXXXXX
max_leases 0
option subnet 255.255.255.0
option domain local
option lease 864000
at the end of the file, also add the MAC addresses of the other devices in the network (below represented by Zs) and the IP address you will allocate to them (i.e. the clients).
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.2
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.3
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.4
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.5
Save and exit. You can come back to this step later if you do not know the MAC addresses yet.
6. Finally, you will want to restart the server by running
$ sudo systemctl enable udhcpd
$ sudo systemctl restart udhcpd
The server should now be running.
Setting up clients (rpis):
Repeat steps 1-4 from setting up server. For step 3, allocate an IP Address with the appropriate device number (e.g. YYY.YYY.YY.2 and so on).
Setting up clients (pc windows):
1. Simply find your MAC address in your cmd prompt by running
> ipconfig /all
and identifying under
Ethernet adapter Ethernet:
   Physical Address . . . . . . . : ZZ:ZZ:ZZ:ZZ:ZZ:ZZ
Setting up an MQTT Protocol
The imaging system communicates over an MQTT protocol. This is a simple communication protocol in which devices can publish to a server (referred to as a broker) and subscribe to messages sent to that broker. More information on how MQTT works is available on:
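The pattern is easy to see in miniature. The toy broker below is purely illustrative: in the real system, paho-mqtt clients talk to a mosquitto broker over the network, and the topic names here are hypothetical.

```python
class ToyBroker:
    """In-process stand-in for an MQTT broker, to illustrate publish/subscribe."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # deliver the payload to every client subscribed to this topic
        for callback in self.subscribers.get(topic, []):
            callback(topic, payload)

broker = ToyBroker()
received = []
# each rpi would subscribe to an acquisition topic...
broker.subscribe("lunaflow/acquire", lambda topic, payload: received.append(payload))
# ...and the PC would publish a single trigger message to all of them at once
broker.publish("lunaflow/acquire", "start")
```

The key property for our use case is that one published trigger reaches every subscribed rpi, which is what makes simultaneous multi-device acquisition straightforward.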
Download the MATLAB support packages for MQTT from their file exchange:
MQTT communication requires a server to act as the broker. We are running our communication over a mosquitto channel. You will need to install and run mosquitto on the machine that acts as the broker.
To install mosquitto on a PC visit:
and follow the installation instructions.
Alternatively, you could use one of the rpis as a broker and install mosquitto on it by running:
$ sudo apt install -y mosquitto mosquitto-clients
in a terminal. You will need to run mosquitto on your broker machine to enable communications while running the imaging system.
To run on a Windows PC, first add mosquitto to your environment variables. Then run mosquitto in the background with
> START /B mosquitto
To run on the rpi, run mosquitto in background (as a daemon, -d) by running
$ mosquitto -d
You can find examples on how to use MQTT in Python (using the paho-mqtt module) from:
To connect your devices you will have to find your IP addresses by running ipconfig (in a Windows command prompt) or ifconfig (in a Linux terminal).
For file transfer, the imaging system sends files over SSH. You will have to install and enable ssh on your rpis by running
$ sudo apt-get update
$ sudo apt install openssh-server
$ sudo systemctl enable ssh
in the terminal.
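Once SSH is enabled, files can be pulled off an rpi with scp. For example (the remote path is hypothetical; the IP follows the addressing scheme from the private network setup):

```shell
$ scp pi@YYY.YYY.YY.2:/home/pi/captures/run01.h264 ./data/
```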
>> Remember to change your rpi passwords from default! <<
To change password, run
$ sudo raspi-config
Following the setup you can test whether acquisition via the rpis works using the camFlowAq app (see below).
A bespoke app has been developed by the lunaFlow team to help users acquire with multiple cameras simultaneously.
The app has been designed to work with any device connected to the computer (for example, multiple webcams connected via USB). A remote AQ mode is available to use the app over an MQTT connection established over a local area network (see connectivity).
The app runs on MATLAB (tested on v2019) or as a standalone. It is available for download from the following github link:
While running on MATLAB, the following toolboxes are needed:
- image acquisition toolbox
- image processing toolbox
- computer vision toolbox
You will also have to install the hardware support package for your particular camera interface. Run
in the MATLAB command window. For webcams, install the OS generic video interface package:
The camFlowAqApp also works with scientific cameras (tested on a JAI Spark 5000M USB with a GenICam interface)! You will have to add your camera's software development kit (SDK) to your environment variables.
Add the app to your MATLAB search path (Home > Environment > Search Path) and run the app from the command window with
By default, the camFlowAqApp runs on devices connected directly to your machine.
To access remote device acquisition, click on the File > Remote Aq Mode tab.
You will have to perform the setup instructions outlined in connectivity.
More documentation and examples in the README file. The lunaFlow suite is released under an MIT open-source licence.
Single computer system
To run the single computer system, simply connect the webcams via USB. After configuring MATLAB for image acquisition (see connectivity), you can launch the app and start acquiring with multiple cameras simultaneously.
The multi-computer system works with rpi boards and a network switch. You will need CAT5/6 cables to connect the rpis and the PC to the switch, and power supplies to power the devices. Rpi camera modules connect via the CSI camera connector (make sure the camera interface is enabled on your rpi) and webcams connect via USB.
An example imaging system is shown below.
The imaging system is mounted onto a custom-built Totem Maker kit.
Visit the TotemMaker website for instructions on how to assemble connections:
We used M2 screws and nuts to mount the rpi camera modules and nylon screws to secure the boards and the network switch to the chassis.
1. Choose the device interface, format and ID to use in the camera tab. Press the load camera button to connect to the device for acquisition. You can use the quickload button to connect to multiple devices in one click. Use the reset button to restart the camera connection to MATLAB.
2. The preview tab can be used to start a livestream. It will display images from devices whose Checkbox (top right) is ticked.
3. Adjust camera settings interactively in the control panel.
4. Acquire images in the acquire panel. Acquisition is triggered simultaneously on active cameras (whose Checkbox is ticked). The acquire background button saves a mean image from a short video for background subtractions. Choose your project folder to save your images.
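The background-subtraction step is conceptually simple; here is a sketch in Python/NumPy (the app itself does this in MATLAB, so this is for illustration only):

```python
import numpy as np

def subtract_background(frames):
    """Remove the static background by subtracting the temporal mean image.

    frames: array-like of shape (N, H, W). Returns a float32 array of the same
    shape, clipped at zero so pixels darker than the background do not go negative.
    """
    stack = np.asarray(frames, dtype=np.float32)
    background = stack.mean(axis=0)  # the "acquire background" mean image
    return np.clip(stack - background, 0.0, None)
```

Subtracting a mean image in this way isolates the transient bioluminescent flashes from any static illumination in the scene.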
Remote access mode
1. To start the mosquitto channel, press the mosquitto launch icon in the start panel.
2. Log in with your Raspberry Pi credentials.
3. Establish an SSH connection for file transfer and reboot controls.
4. Launch your PC as an MQTT broker under the MQTT protocol settings panel.
5. Acquire synchronised images or videos using the PiCameraControl panel.
6. Calibrate camera extrinsics manually (guided checkerboard acquisition) or automatically (based on SURF feature detection).