The last of those is still under development; however, it was proven possible in a study published this week in npj Digital Medicine by a group of researchers from the University of Washington and the University of California San Diego.
They developed an artificial intelligence algorithm that aims to decipher blood oxygen readings using only close-up videos of a user’s finger captured by a smartphone camera, allowing users to perform their own check-ups at home with technology already in their pockets. “This way, you could have multiple measurements with your own device at either no cost or low cost,” said Matthew Thompson, M.D., Ph.D., the study’s co-author and a professor of family medicine in the UW School of Medicine. “In an ideal world, this information could be seamlessly transmitted to a doctor’s office. This would be really beneficial for telemedicine appointments or for triage nurses to be able to quickly determine whether patients need to go to the emergency department or if they can continue to rest at home and make an appointment with their primary care provider later.”
In the study, six participants were asked to place a finger over a smartphone’s camera and flash. The researchers’ deep-learning AI then parsed through the resulting videos, looking specifically at how the light from the flash was absorbed by each person’s blood. “Every time your heart beats, fresh blood flows through the part illuminated by the flash,” said senior author Edward Wang, an assistant professor in UCSD’s Department of Electrical and Computer Engineering. “The camera records how much that blood absorbs the light from the flash in each of the three color channels it measures: red, green and blue. Then we can feed those intensity measurements into our deep-learning model.”
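As a rough illustration of that signal-extraction step (not the authors’ actual code), the sketch below averages each video frame’s red, green, and blue channel intensities to produce the per-channel time series that a deep-learning model could consume. The use of OpenCV, the file path, and the downstream model object are all assumptions for the example.

```python
import cv2           # OpenCV, assumed here for reading the fingertip video
import numpy as np

def extract_rgb_intensities(video_path: str) -> np.ndarray:
    """Return an (n_frames, 3) array of mean R, G, B intensities per frame.

    Each frame of a flash-illuminated fingertip video is reduced to three
    numbers: the average brightness of the red, green, and blue channels.
    The beat-to-beat variation in these traces is the kind of signal the
    study's deep-learning model works from.
    """
    capture = cv2.VideoCapture(video_path)
    intensities = []
    while True:
        ok, frame = capture.read()          # frame is BGR, shape (H, W, 3)
        if not ok:
            break
        mean_bgr = frame.reshape(-1, 3).mean(axis=0)
        intensities.append(mean_bgr[::-1])  # reorder BGR -> RGB
    capture.release()
    return np.asarray(intensities)

# Hypothetical usage: the (n_frames, 3) time series would be windowed and
# passed to a trained model that outputs an SpO2 estimate.
# signal = extract_rgb_intensities("fingertip_clip.mp4")
# spo2_estimate = model.predict(signal[np.newaxis, ...])  # "model" is assumed
```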
They further tested the algorithm by having each participant inhale a mixture of oxygen and nitrogen to slowly reduce their oxygen levels. To help validate the results, the participants simultaneously wore pulse oximeters on other fingers of the same hand undergoing the smartphone test throughout the process.
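One way such reference readings could be turned into training labels is sketched below: each fixed-length window of the smartphone-derived signal is paired with the pulse-oximeter reading closest in time to the window’s midpoint. The windowing scheme and every name here are assumptions for illustration, not the study’s actual pipeline.

```python
import numpy as np

def label_windows(frame_times, reference_times, reference_spo2, window_s=10.0):
    """Pair each window of the smartphone signal with a reference SpO2 label.

    frame_times     : timestamps (seconds) of the video-derived samples
    reference_times : timestamps of the pulse-oximeter readings
    reference_spo2  : the corresponding SpO2 values (%)
    Returns a list of (window_start_time, spo2_label) pairs.
    """
    labels = []
    start = frame_times[0]
    while start + window_s <= frame_times[-1]:
        midpoint = start + window_s / 2
        nearest = np.argmin(np.abs(np.asarray(reference_times) - midpoint))
        labels.append((start, reference_spo2[nearest]))
        start += window_s
    return labels
```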
In total, the researchers gathered tens of thousands of data points from the participants — spanning oxygen levels from 100% down to 61% — that were in turn used to hone the algorithm. According to the study’s results, the AI was ultimately able to detect cases of low blood oxygen — meaning an SpO2 reading below 90% — with 81% sensitivity and 79% specificity.
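For readers unfamiliar with those two metrics, the short sketch below shows how sensitivity and specificity would be computed against the 90% hypoxemia threshold; the example readings are made up for illustration and are not the study’s data.

```python
import numpy as np

HYPOXEMIA_THRESHOLD = 90.0  # SpO2 below this is treated as low blood oxygen

def sensitivity_specificity(true_spo2, predicted_spo2, threshold=HYPOXEMIA_THRESHOLD):
    """Sensitivity: fraction of truly low-SpO2 readings flagged as low.
    Specificity: fraction of truly normal readings flagged as normal."""
    true_low = np.asarray(true_spo2) < threshold
    pred_low = np.asarray(predicted_spo2) < threshold
    tp = np.sum(true_low & pred_low)    # correctly flagged hypoxemia
    fn = np.sum(true_low & ~pred_low)   # missed hypoxemia
    tn = np.sum(~true_low & ~pred_low)  # correctly cleared
    fp = np.sum(~true_low & pred_low)   # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative example: ground truth from a pulse oximeter vs. the
# smartphone model's estimates (values invented for this sketch).
truth = [98, 95, 88, 85, 92, 61, 99, 89]
estimates = [97, 96, 87, 91, 93, 64, 98, 88]
print(sensitivity_specificity(truth, estimates))  # -> (0.75, 1.0)
```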
Though the smartphone-based method helps surmount some obstacles to continuous, hospital-grade pulse oximetry, the researchers have not yet been able to overcome another major issue currently facing pulse oximeters: racial bias. In recent years, several studies have shown that standard pulse oximeters may consistently return inaccurate results for people with darker skin tones, since their methods of measuring how blood absorbs light have traditionally been developed and tested primarily on people with less-pigmented skin. The devices may also be less effective when used by people with thicker skin, further pointing to the need for more rigorous testing on a variety of skin tones and types.
In this study, all but one of the six participants identified as white, and the researchers noted that they did indeed have issues gathering accurate readings from one subject with heavily callused fingers. “If we were to expand this study to more subjects, we would likely see more people with calluses and more people with different skin tones. Then we could potentially have an algorithm with enough complexity to be able to better model all these differences,” said co-lead author Jason Hoffman, a doctoral student in UW’s School of Computer Science & Engineering.
REFERENCE: Fierce Biotech; 21 Sep 2022; Andrea Park