Smartphones contain a variety of sensors that can help monitor a patient's vital signs, and tech companies are developing ways to use those sensors to help diagnose a range of conditions. But experts say many of the tools still need fine-tuning before providers can rely on them, Hannah Norman reports for Kaiser Health News.
Tech companies are using phones' built-in cameras, microphones, accelerometers, gyroscopes, and speakers to feed data into machine learning algorithms that help connect patients and health care providers.
"It's very hard to put devices into the patient home or in the hospital, but everyone is just walking around with a cellphone that has a network connection," said Andrew Gostine, CEO of senor network company Artisight.
Google has heavily invested in this area, Norman reports. In the Google Fit app, users can check their heart rate by placing their finger on the rear-facing camera lens or use the front-facing camera to track their breathing.
"If you took the sensor out of the phone and out of a clinical device, they are probably the same thing," said Shwetak Patel, director of health technologies at Google.
Google has also been looking into using a phone's built-in microphone to detect heart murmurs and using the camera to screen for diabetic eye disease, according to the company's blog.
Meanwhile, Binah.ai has been using phones' cameras to calculate a person's vital signs. The software analyzes the region around a person's eyes, where the skin is thinner, measuring the light that reflects off blood vessels back to the lens.
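Binah.ai hasn't published its pipeline, but the general approach, known as remote photoplethysmography, is well documented: the blood-volume pulse causes tiny frame-to-frame changes in how much light the skin reflects. Below is a minimal Python sketch of that idea, assuming a pre-cropped stack of skin-region video frames; the function name, array shape, and frequency band are illustrative, not Binah.ai's.

    import numpy as np

    def estimate_heart_rate(frames, fps):
        # frames: array of shape (n_frames, height, width, 3), RGB values
        # of a skin patch. The green channel carries the strongest
        # blood-volume-pulse signal because hemoglobin absorbs green
        # light strongly.
        green = frames[:, :, :, 1].mean(axis=(1, 2))  # mean green per frame
        green = green - green.mean()                  # drop the DC offset
        spectrum = np.abs(np.fft.rfft(green))
        freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
        band = (freqs >= 0.7) & (freqs <= 4.0)        # 42-240 bpm plausibility band
        peak = freqs[band][np.argmax(spectrum[band])] # dominant pulse frequency
        return peak * 60.0                            # Hz -> beats per minute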
Canary Speech uses the same underlying technology as Amazon's Alexa to analyze patients' voices, captured through the phone's microphone, for signs of mental health conditions. The software also allows clinicians to screen for anxiety and depression using a library of vocal biomarkers and predictive analytics, according to Canary CEO Henry O'Connell.
ResApp Health received FDA clearance last year for SleepCheckRx, an iPhone app that listens to a patient's breathing and snoring to help diagnose moderate to severe obstructive sleep apnea.
And the Reflex app, developed by Brightlamp, uses a phone's camera to measure how a person's pupils react to changes in light, helping clinicians manage concussions and vision rehabilitation. The app uses machine learning analysis to provide clinicians with data points for evaluating patients.
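Brightlamp hasn't detailed Reflex's internals, but measuring the pupillary light reflex from video generally reduces to estimating pupil diameter frame by frame. The OpenCV sketch below shows one simple way to do that, assuming a grayscale image already cropped to one eye; the threshold value and darkest-blob heuristic are illustrative assumptions, not Brightlamp's method.

    import cv2

    def pupil_diameter(eye_gray):
        # eye_gray: grayscale image already cropped to one eye.
        blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
        # Treat the darkest pixels as the pupil (the threshold of 40 is
        # a guess; a real app would calibrate per device and lighting).
        _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return float("nan")
        pupil = max(contours, key=cv2.contourArea)   # largest dark blob
        (_, _), radius = cv2.minEnclosingCircle(pupil)
        return 2.0 * radius                          # diameter in pixels

Tracking this estimate across frames before and after a flash of light yields the constriction-and-recovery curve a clinician would review.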
While using smartphones as clinical devices offers great potential, experts say accuracy and clinical validation will be key to getting buy-in from providers. According to Eugene Yang, a professor of medicine at the University of Washington, many of the tools still need to be fine-tuned.
For example, researchers have found that some applications using computer vision, like heart rate or blood pressure monitoring, are less accurate for patients with darker skin. In addition, small algorithm glitches can produce false alarms for patients.
And because the tools rely on algorithms built with machine learning and artificial intelligence rather than on the physical instruments used in hospitals, they are harder to evaluate: researchers can't "compare apples to apples" against medical industry standards, Yang said.
Doctors still have to verify the results of these tools, undermining their goal of reducing costs and improving access, Yang added. "False positives and false negatives lead to more testing and more cost to the healthcare system."
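A quick Bayes' rule calculation illustrates Yang's point: even a fairly accurate screen produces mostly false positives when the condition it looks for is rare. The numbers in this Python sketch are illustrative and not drawn from any of the apps above.

    def positive_predictive_value(sensitivity, specificity, prevalence):
        # Bayes' rule: the chance a positive result is a true positive.
        true_pos = sensitivity * prevalence
        false_pos = (1.0 - specificity) * (1.0 - prevalence)
        return true_pos / (true_pos + false_pos)

    # A hypothetical screen that is 95% sensitive and 95% specific,
    # applied to a condition affecting 1% of users:
    print(positive_predictive_value(0.95, 0.95, 0.01))  # ~0.16

Here, roughly five of every six positive results are false, and each one triggers the extra testing and cost Yang describes.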
"We're not there yet," Yang said. "That's the bottom line." (Norman, Kaiser Health News, 1/17)