Stick Out Your Tongue and Say AI

This AI-based tool analyzes tongue images to detect signs of disease with nearly 99% accuracy, making diagnostic testing easier and more accessible.

Nick Bild
Health & Medical Devices

Modern medicine can detect many diseases earlier and more accurately than ever before thanks to advances in areas such as medical imaging and genetic testing. But a major hurdle that the field still needs to overcome is the timely delivery of these diagnostic tests. In general, people will not seek out medical care until after their symptoms have already progressed. Furthermore, because testing is often expensive, intrusive, or time-consuming, patients may simply put it off.

It is hoped that wearable devices and artificial intelligence (AI)-powered tools will soon make medical diagnostic tests much less cumbersome and expensive, such that problems can be detected before they get out of hand. One such tool was recently described by a team led by researchers at the Middle Technical University. Using it is as easy and quick as sticking out your tongue and snapping a picture (and who among us does not already do that from time to time?). It then uses AI to diagnose a large number of potential medical conditions with a high degree of accuracy.

It is known that the color, shape, thickness, and other characteristics of the tongue can be strong indicators of underlying medical conditions such as diabetes, stroke, anemia, asthma, liver and gallbladder conditions, COVID-19, and several vascular and gastrointestinal issues. The researchers leveraged this knowledge to build a machine learning algorithm that analyzes an image of the tongue and classifies its color, which serves as a proxy for diagnosing specific diseases.

To find the best approach, six different machine learning algorithms, including support vector machines and decision trees, were evaluated. Each model was trained on a dataset of over 5,000 tongue images. After comparing the algorithms' performance, the team found that Extreme Gradient Boosting (XGBoost) performed the best. It correctly classified the color of the tongue in nearly 99 percent of cases, even under challenging lighting conditions.
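The paper's exact feature set and model internals are not described here, so the sketch below is only a rough illustration of the core idea: reduce a tongue image to a color feature and map that feature to a color class. The class names, reference RGB values, and function names are hypothetical stand-ins for the decision boundaries the trained XGBoost model actually learns from labeled images.

```python
import math

# Hypothetical tongue-color classes with illustrative RGB reference values.
# In the researchers' system these boundaries are learned by XGBoost from
# over 5,000 labeled images; they are hand-picked here purely for the sketch.
CENTROIDS = {
    "pale":   (235, 205, 200),
    "red":    (200,  60,  70),
    "purple": (130,  60, 140),
    "yellow": (210, 180,  90),
}

def mean_color(pixels):
    """Average the (R, G, B) values over a list of pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def classify_tongue_color(pixels):
    """Return the color class whose reference value is nearest (in RGB
    Euclidean distance) to the image's mean color."""
    feature = mean_color(pixels)
    return min(CENTROIDS, key=lambda c: math.dist(feature, CENTROIDS[c]))
```

A nearest-centroid rule like this would struggle with exactly the lighting variation the article mentions, which is presumably why a learned model such as XGBoost, trained on thousands of real images, is needed in practice.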

At present, the tool is still under development and runs on a laptop with a webcam. But the relative simplicity of the algorithm and the robustness of the classifications suggest that the methods could be transferred to more convenient platforms, like smartphones, in the future. That would significantly lower the friction in performing diagnostic tests. A notification every few weeks or months to snap a picture of the tongue would be sufficient to screen a large percentage of the population — even individuals who do not have any known risk factors and who have not presented any warning signs.

It was noted that in some cases, camera reflections could hinder the accuracy of the tool. The team intends to apply advanced image processing techniques in a future version of the system to overcome this issue. Improvements of this sort could ultimately lead to the development of a cost-effective and efficient way to diagnose a wide range of medical conditions earlier than is now possible.
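The article does not say which image processing techniques the team plans to apply. One common first step against camera reflections is to mask out specular highlights — pixels that are very bright and nearly gray — before analyzing color. The function name and thresholds below are illustrative assumptions, not the researchers' method.

```python
def mask_specular(pixels, brightness=230, chroma=20):
    """Drop probable camera reflections from a list of (R, G, B) pixels.

    A pixel is treated as a specular highlight when its brightest channel
    exceeds `brightness` and its channels are nearly equal (spread at most
    `chroma`), i.e. it is bright and close to gray/white.
    """
    kept = []
    for r, g, b in pixels:
        hi, lo = max(r, g, b), min(r, g, b)
        if hi >= brightness and (hi - lo) <= chroma:
            continue  # likely glare from the camera flash; exclude it
        kept.append((r, g, b))
    return kept
```

Filtering out glare pixels before computing color statistics keeps a bright white reflection from dragging a red tongue's measured color toward pale, which is the kind of error the team reported.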
