The analysis of the words people use in chat environments can help businesses make money and improve the customer service experience for consumers. At Bark.us, however, analysis of 500 million messages has so far helped save the lives of 25 kids who were considered imminently suicidal but whose parents didn't know it.
The announcement was made onstage by Bark.us chief parenting officer Titania Jordan at MB 2017, a gathering of artificial intelligence and bot industry leaders held July 11-12 at Fort Mason in San Francisco.
Bark.us uses machine learning and statistical analysis to crawl conversations teens have on email, SMS, and platforms like Snapchat, Instagram, and WhatsApp. Analysis is performed to determine if a kid is suffering from cyberbullying, suicidal thoughts, possible depression, hate speech, or other attacks that can happen online without a parent or guardian being aware of it. Sexting and drug usage are also flagged. When warning signs are recognized, Bark alerts parents via text or email, then suggests potential next steps.
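Bark has not published its detection model, so the workflow described above can only be sketched in broad strokes. The toy example below illustrates the general shape of a flag-then-alert pipeline; the category names, keyword lists, and `alert_parent` function are all hypothetical stand-ins, not Bark's actual implementation.

```python
# Toy illustration only -- Bark's real system uses machine learning and
# statistical analysis, not a simple keyword lookup like this one.

# Hypothetical phrase lists per risk category (illustrative, not real data).
RISK_KEYWORDS = {
    "cyberbullying": {"nobody likes you", "everyone hates you"},
    "self_harm": {"kill myself", "end it all", "want to die"},
}

def flag_message(text):
    """Return the set of risk categories whose phrases appear in the text."""
    lowered = text.lower()
    return {
        category
        for category, phrases in RISK_KEYWORDS.items()
        if any(phrase in lowered for phrase in phrases)
    }

def alert_parent(categories):
    """Placeholder for a Bark-style text/email alert with suggested next steps."""
    return f"ALERT ({', '.join(sorted(categories))}): review recent activity"

# A flagged message triggers an alert; a benign one does not.
categories = flag_message("I just want to end it all")
if categories:
    print(alert_parent(categories))
```

In a production system, the keyword check would be replaced by a trained classifier, and the alert step would integrate with SMS and email delivery, as the article describes.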
Bark knows it’s saved that many lives, CEO Brian Bason told VentureBeat, because parents contacted the company directly to tell them as much. Thousands of suicide alerts have been sent since the launch of Bark in 2015.
Bark is just one of the solutions that rely on machine learning to protect or safeguard young internet users.
Crisis Text Line is a chat service for people ranging from those who are just having a bad day to those who are experiencing depression or suicidal thoughts. The majority of its users are below the age of 25.
CTL also uses natural language understanding to recognize when teens are experiencing depression or suicidal thoughts. Once alerted, trained volunteers who work with Crisis Text Line step in.
Integrations are being sought with platforms like Kik and Facebook so that, if a person uses the specific kinds of words that signal imminent risk of suicide, emergency services can be dispatched in the real world to help them. CTL also powers suicide risk chat at places like the Golden Gate Bridge and in cities or states with a higher than average risk of suicide.
The AI Buddy Project recently launched to give the children of active duty military service members a cartoon avatar to speak with on their favorite digital platforms while their parent is deployed. Progress reports based on analysis of the words the child uses are then sent to the parent or guardian back home who is caring for the child. The children of deployed service members face a higher-than-average risk of suicide.
After the project learns how to competently serve the children of service members, its creators want to help other children suffering from trauma, such as child refugees or kids in war zones.
Bark costs $9 a month or $99 a year for each family. The company has 9 employees and is based in Atlanta, Georgia.