College students face numerous problems; one of the most common and under-diagnosed is mood swings, a leading contributor to depression and suicide. Gait analysis lets us detect changes in body and facial motion. Without invading privacy, we evaluate walking patterns and body posture, enabling early intervention and continuous monitoring to enhance overall well-being and quality of life. Our focus on measuring gait parameters through posture and force during walking offers a distinctive methodology for detecting irregularities and supporting suicide-prevention efforts.
To analyze human gait, we use a combination of optical cameras and force platforms. By analyzing the recorded data intensively, we can determine whether a person is experiencing behavioral changes that could be harmful. This approach could open new avenues for preventing suicide among young adults. Depressed patients, compared with non-depressed controls, showed reduced walking speed, reduced vertical up-and-down movement, and a slumped posture during everyday episodes of walking. We record these patterns using three methods:
1. Gait analysis to detect walking patterns.
2. Posture detection using ML to identify posture styles (slow, hyper, angry, prideful, etc.).
3. Facial emotion detection using ML to understand expressed emotions.
If a student shows signs of depression, we alert the authorities so they can provide timely assistance; if a student has suicidal thoughts, this system helps recognize and support them. We observed that angry gaits were relatively heavy-footed, while sad gaits had less arm swing than other gaits. Proud and angry gaits had longer stride lengths than happy or sad gaits, and happy gaits were faster in pace than the others.
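The gait observations above can be sketched as a simple rule-based classifier. The feature names and thresholds below are hypothetical illustrations, not values from our trained model:

```python
def classify_gait(stride_length, pace, arm_swing, foot_force):
    """Toy rule-based emotion guess from normalized gait features (0..1).

    All thresholds are illustrative placeholders; a real system
    would learn them from labeled gait recordings.
    """
    # Angry gaits: heavy-footed with long strides.
    if foot_force > 0.8 and stride_length > 0.7:
        return "angry"
    # Proud gaits: long strides without the heavy footfall.
    if stride_length > 0.7:
        return "proud"
    # Happy gaits: faster pace than the others.
    if pace > 0.7:
        return "happy"
    # Sad gaits: reduced arm swing.
    if arm_swing < 0.3:
        return "sad"
    return "neutral"

print(classify_gait(0.9, 0.5, 0.5, 0.9))  # angry
print(classify_gait(0.4, 0.3, 0.1, 0.4))  # sad
```

The rule order matters: heavy footfall separates angry from proud before stride length alone is considered.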
Furthermore, we have created ML models to classify facial expressions and analyze posture. Data is collected through a camera, and the models are trained on pre-existing datasets. By combining all three results, we can monitor students exhibiting anomalous behavior.
**Data Acquisition:**
We collect data using two components: a floor mat with pressure (force) sensors and cameras fixed around the college environment. The floor mat acquires the pressure map of the foot, while the cameras capture the person's posture and motion. The foot pressure pattern is compared with that of a healthy person, and any differences are recorded. If an anomalous pattern is detected, the person is monitored for the following week and their data is continuously analyzed. If the pattern persists, the concerned authorities are informed for further counseling. The NPU in the AMD Ryzen AI PC helps analyze large amounts of real-time data with ML models for accurate predictions.
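The comparison against a healthy pressure pattern can be sketched as follows, assuming the mat yields a flat list of per-sensor pressure values and we have a reference profile from healthy walkers; the sensor layout and threshold are illustrative assumptions:

```python
def deviation_score(pressure_map, reference_map):
    """Mean absolute difference between a foot pressure map and a
    healthy-reference map (both flat lists of sensor readings)."""
    assert len(pressure_map) == len(reference_map)
    diffs = (abs(p - r) for p, r in zip(pressure_map, reference_map))
    return sum(diffs) / len(pressure_map)

def is_anomalous(pressure_map, reference_map, threshold=0.15):
    # Flag the recording for follow-up monitoring if the average
    # per-sensor deviation exceeds the (illustrative) threshold.
    return deviation_score(pressure_map, reference_map) > threshold

healthy = [0.2, 0.5, 0.9, 0.4]        # reference heel-to-toe profile
sample  = [0.6, 0.5, 0.3, 0.4]        # heavy heel strike, weak toe-off
print(is_anomalous(sample, healthy))  # True
```

A flagged walker is not diagnosed at this point; the flag only triggers the week of follow-up monitoring described above.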
**Data Acquisition from the Mat for Gait Analysis:**
Force sensors attached to a floor mat detect foot pressure and angle as students walk over it. The sensors are connected to an Arduino, which reads the sensor data and formats it into a CSV-compatible structure. We established serial communication between the Arduino and the PC over a USB cable, configuring the same baud rate on both ends for data consistency. A Python script on the AMD AI PC uses pyserial to read the serial data stream from the Arduino, capturing and processing the incoming readings and appending them to the CSV file in real time.
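The acquisition loop can be sketched as below. The port name, baud rate, and sensor field layout are assumptions for illustration; the line-parsing step is separated from the serial loop so it can be tested without hardware:

```python
import csv

FIELDS = ["t_ms", "heel", "midfoot", "toe", "angle"]  # assumed sensor layout

def parse_line(raw):
    """Parse one comma-separated Arduino line into a row dict,
    or return None for malformed or partial lines."""
    parts = raw.strip().split(",")
    if len(parts) != len(FIELDS):
        return None
    try:
        return dict(zip(FIELDS, (float(p) for p in parts)))
    except ValueError:
        return None

def append_row(path, row):
    """Append one parsed row to the CSV file, writing a header
    first if the file is new or empty."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:
            writer.writeheader()
        writer.writerow(row)

def acquire(port="/dev/ttyACM0", baud=9600, path="gait_data.csv"):
    """Read the Arduino stream indefinitely, appending valid rows to CSV.
    Call this with the mat's Arduino attached; port and baud are assumptions."""
    import serial  # pyserial
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            row = parse_line(ser.readline().decode("ascii", errors="ignore"))
            if row is not None:
                append_row(path, row)
```

Dropping malformed lines instead of raising keeps the logger alive across the partial reads that are common right after the serial port opens.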
During the initial stages, we used the collected data to train an ML model for gait analysis. Later, once new recordings are saved in CSV format, we feed them directly to the trained model, which classifies the gait patterns.
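One way to sketch the classification step (the actual model is an ML classifier trained on our recordings; the nearest-centroid rule and all numeric values below are illustrative stand-ins):

```python
# Per-class centroids of feature vectors (stride, pace, arm_swing),
# as would be computed from the training CSV; values are made up.
CENTROIDS = {
    "happy": (0.55, 0.90, 0.70),
    "sad":   (0.35, 0.30, 0.15),
    "angry": (0.80, 0.60, 0.50),
    "proud": (0.85, 0.50, 0.60),
}

def classify(features):
    """Assign the class whose centroid is closest (squared Euclidean)."""
    def dist(label):
        return sum((f - c) ** 2 for f, c in zip(features, CENTROIDS[label]))
    return min(CENTROIDS, key=dist)

print(classify((0.34, 0.28, 0.20)))  # sad
```

A nearest-centroid rule is the simplest model that can be fit from a labeled CSV; the production pipeline can swap in any classifier with the same predict-from-features interface.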
By combining all three data sources (results from gait analysis, facial expression analysis, and posture analysis) and monitoring a person’s activity for a few days, we can determine if they are sad, happy, angry, etc.
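The multi-day fusion can be sketched as a majority vote over the daily labels produced by the three analyses; a real deployment would weight the sources and require the pattern to persist before alerting anyone:

```python
from collections import Counter

def overall_mood(daily_labels):
    """daily_labels: list of (gait, face, posture) label triples,
    one per monitored day. Returns the most frequent label overall."""
    votes = Counter(label for day in daily_labels for label in day)
    return votes.most_common(1)[0][0]

week = [
    ("sad", "sad", "neutral"),
    ("sad", "neutral", "sad"),
    ("happy", "sad", "sad"),
]
print(overall_mood(week))  # sad
```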
Created February 27, 2024