People with limited use of their hands can find touchscreens very difficult to use. That's why Zack Freedman created Hypervisor, a wearable device that tracks a user's eye movements and pupil position to place a mouse cursor on a screen. He built the head-mounted unit as an entry in the 2020 Hackaday Prize, which challenged entrants to address the needs of United Cerebral Palsy.
The brain of the Hypervisor is a Raspberry Pi Compute Module 3+ slotted into a StereoPi carrier board, which communicates with all of the sensors and processes the acquired data. A pair of IR transceivers transmits device information and gaze data between the headset and the devices it controls. Two CSI cameras connected to the carrier board provide environmental and eye imagery for computer vision processing. Finally, several LEDs indicate the wearable's current state so the user can see what is happening.
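The article doesn't document the wire format the headset and receivers use over the IR link, so the packet layout below is purely an assumption for illustration: a minimal fixed-size frame carrying a device ID, cursor coordinates, and a gesture code.

```python
import struct

# Hypothetical framing for the headset-to-receiver IR link. The actual
# Hypervisor protocol is not documented; this layout is an assumption.
# Gesture codes (assumed): 0 = move only, 1 = click, 2 = drag, 3 = scroll
PACKET_FMT = "<BHHB"  # device id (u8), cursor x (u16), cursor y (u16), gesture (u8)

def encode_packet(device_id: int, x: int, y: int, gesture: int) -> bytes:
    """Pack one cursor update into a 6-byte little-endian frame."""
    return struct.pack(PACKET_FMT, device_id, x, y, gesture)

def decode_packet(raw: bytes) -> dict:
    """Unpack a frame back into its fields."""
    device_id, x, y, gesture = struct.unpack(PACKET_FMT, raw)
    return {"device": device_id, "x": x, "y": y, "gesture": gesture}

pkt = encode_packet(2, 640, 360, 1)
decoded = decode_packet(pkt)
print(decoded)  # → {'device': 2, 'x': 640, 'y': 360, 'gesture': 1}
```

A compact fixed-size frame like this suits a low-bandwidth IR link, since every receiver can parse it without negotiation.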
In order to determine where a user is looking, one of the CSI cameras (an IR-enabled one) sends a continuous video feed to the Raspberry Pi Compute Module, which runs OpenCV. From there, the second camera works in conjunction with the first to triangulate two things: which device is being looked at and where the eye's pupil is positioned.
Once the system calculates where the eye is pointed, it places a cursor at that location, allowing the person's eye to act like a mouse. The wearer can blink to click, squint to drag an item, and blink twice in rapid succession to scroll. Each device is connected to a dedicated receiver that gets cursor position data from the headset.
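The blink/squint gestures above amount to a small event-to-action mapper. The sketch below is a hypothetical version of that logic, with an assumed double-blink time window; for simplicity the first blink clicks immediately, whereas a real implementation might delay the click until the double-blink window expires.

```python
DOUBLE_BLINK_WINDOW = 0.4  # assumed max seconds between blinks for a double

class GestureMapper:
    """Map eye events ('blink', 'squint') to mouse actions."""

    def __init__(self):
        self._last_blink = None  # timestamp of the previous blink

    def on_event(self, event, now):
        if event == "squint":
            return "drag"
        if event == "blink":
            if (self._last_blink is not None
                    and now - self._last_blink <= DOUBLE_BLINK_WINDOW):
                self._last_blink = None
                return "scroll"  # two blinks in rapid succession
            self._last_blink = now
            return "click"       # single blink clicks right away
        return None

mapper = GestureMapper()
print(mapper.on_event("squint", 0.0))  # → drag
print(mapper.on_event("blink", 1.0))   # → click
print(mapper.on_event("blink", 1.2))   # → scroll (within the window)
print(mapper.on_event("blink", 5.0))   # → click (window expired)
```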