Last week, Mark Zuckerberg and his wife, Priscilla Chan, announced a $3 billion plan to cure, prevent, or manage “all diseases in our children’s lifetime.”
For the past two years, a startup called Beyond Verbal has been working on disease detection through voice samples and machine learning, an application Zuckerberg himself has talked about as having interesting potential.
Today, the company launched the Beyond mHealth Research Platform to collaborate with research institutes, hospitals, businesses, and universities to collectively search for unique markers in voice samples.
“[This idea] really got my imagination,” Beyond Verbal CEO Yuval Mor said. “Here you have people who say, ‘We know that there are new tools that can be utilized on a global basis, and let’s do something about this,’ so I think it’s an amazing thing.”
Beyond Verbal wants to get the attention of Zuckerberg and Chan, as well as that of researchers, hospitals, and tech companies interested in exploring how voice can be used to track “acoustic abnormalities to identify specific medical conditions.”
“One of the things that we’ve been doing is research with the Mayo Clinic on the ability to detect some heart problems just by analyzing the tone of voice. And when we realized that these types of correlations exist, we’ve decided to [expand] the work we’ve been doing,” Mor said.
Beyond Verbal and others claim to have found unique biomarkers for diseases like Parkinson’s and ALS. Partners have already agreed to submit voice samples to help develop detection of PTSD and neurological diseases. Early partners include the Scripps Research Institute, Mayo Clinic, and Hadassah Medical Center in Jerusalem.
Beyond Verbal has also been working on a related program — emotion detection through voice analysis. To date, the startup has gathered more than 2.5 million voice samples in 40 languages to help it tackle this machine learning task.
Emotion analytics is now available through an API, and app users can help label emotions found in voice samples shared by other users.
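The crowd-labeling step described above can be sketched as a simple majority-vote aggregation. This is a hypothetical illustration only; the function name, vote threshold, and labels are assumptions, not Beyond Verbal's actual pipeline:

```python
from collections import Counter

def aggregate_labels(labels, min_votes=3):
    """Pick a consensus emotion for one voice sample from crowd labels.

    Returns the most common label if it has at least `min_votes`
    supporting votes; otherwise None (the sample stays unlabeled).
    """
    if not labels:
        return None
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= min_votes else None

# Example: five app users tag the same voice clip.
votes = ["joy", "joy", "surprise", "joy", "neutral"]
print(aggregate_labels(votes))  # -> joy
```

Requiring a minimum number of agreeing votes is one common way to keep noisy individual labels from polluting a machine learning training set.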
The company wants to follow a similar process to track health through voice analysis, and it is using another app, Beyond Clinic, to collect voice samples for this research. Medical records can be shared on the research platform to corroborate voice analysis for health.
Like many in the quantified self movement, Mor believes that as long as people know how their data is being used, individuals should be allowed to carry out ongoing collection of voice samples to detect specific health conditions. This might include monitoring a heart patient who’s just returned home after surgery, keeping tabs on aging parents who live far away, or monitoring your kids after school.
Mor believes Beyond Verbal can put health tracking into consumer products in one year, but the company will first have to substantiate its claims.
“We need to substantiate [this idea], which is again why we are coming with this research platform to really engage with institutions all over the world,” he said. “We are still in the research phase.”
Beyond Verbal wants to create a common API to detect health changes through voice data collected continuously through smart cars, Internet of Things devices in smart homes, and personal assistants like Siri and Alexa.
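Conceptually, a common API like this would put one analysis interface in front of many different voice sources. The minimal sketch below shows the shape of the idea; every name and the placeholder scoring logic are assumptions, since the article does not describe Beyond Verbal's actual interface:

```python
from typing import Protocol

class VoiceSource(Protocol):
    """Anything that can yield raw audio: a car mic, a smart speaker, a phone."""
    def capture(self) -> bytes: ...

class HealthVoiceAPI:
    """One shared entry point: feed it audio, get back an analysis result."""

    def __init__(self, baseline_len: int = 44):
        # Hypothetical: minimum audio size taken as a stand-in for an
        # enrolled per-user baseline.
        self.baseline_len = baseline_len

    def analyze(self, audio: bytes) -> dict:
        # Placeholder scoring: a real system would extract acoustic
        # biomarkers; here we only report the sample size and flag
        # samples shorter than the baseline.
        return {"bytes": len(audio), "flagged": len(audio) < self.baseline_len}

class SmartSpeaker:
    """Toy source standing in for an IoT device or personal assistant."""
    def capture(self) -> bytes:
        return b"pretend-pcm-audio-from-speaker"

api = HealthVoiceAPI()
sample = SmartSpeaker().capture()
print(api.analyze(sample))
```

The design point is that each device only needs to implement `capture()`; the analysis logic lives in one place, which is what would let cars, smart homes, and assistants share the same detection models.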
The hope is that the API could someday be used to detect cardiological conditions and diseases like dementia, Parkinson’s disease, and ALS from the sound of a person’s voice.
Mor hopes that, eventually, people will be able to call a doctor or caretaker and have specific medical conditions identified using this type of voice analysis. The same technology could theoretically help doctors provide medical care to people in remote parts of the world.
Emotion analytics, or what is called “affective computing,” is an area of expanding interest.
Startups like Retinad are tracking head movement to determine the emotional state of virtual reality users. Affectiva monitors facial movement to detect emotion in video. In January, Apple acquired San Diego-based emotion analytics company Emotient.
IBM Watson can track the tone of your message, and companies like Cogito use a smartphone app to listen to the voice of people who have experienced a trauma, depression, or an ongoing mental health condition.
Mor wants to see more collaboration across fields of biometric tracking, which is why Beyond Verbal and Affectiva recently held a hackathon.
“We feel this research platform is an important tool to bring together researchers and engineers and let them collaborate under one umbrella, so if we have hospitals use our application, and if we can get…reliable medical data, this is really a way to call the different forces in the world and say ‘Let’s collaborate and do some fascinating things together,’” Mor said.
Beyond Verbal works with 25 data scientists, researchers, and contractors primarily based in Israel. It was created in 2011 but uses technology based on decades of research obtained through an acquisition. On Sept. 1, the company announced that it had received $3 million in venture capital funding to continue its emotion analytics work.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.