Amazon released a new device last week, the Fire Phone, that features four sensors, one at each corner of the front of the device. Using these sensors the phone can track the user’s head movements, and adjust the 3D image on the screen according to the angle from which the user is looking at it.
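The head-tracked parallax effect can be sketched with simple perspective math. This is a minimal illustration of the general technique, not Amazon's actual rendering code; the function name and units are hypothetical:

```python
def parallax_offset(head_x, head_y, head_z, layer_depth):
    """On-screen shift for a UI layer rendered 'behind' the glass.

    head_x, head_y: head position relative to screen center (cm)
    head_z: head distance from the screen (cm)
    layer_depth: simulated depth of the layer behind the screen (cm)
    """
    # Perspective projection: the sightline from the head to a point
    # at layer_depth behind the screen crosses the screen plane at a
    # point shifted toward the head, scaled by relative depth.
    scale = layer_depth / (head_z + layer_depth)
    return (head_x * scale, head_y * scale)

# Head 5 cm right of center, 30 cm from the screen, layer drawn
# 10 cm "behind" the glass:
dx, dy = parallax_offset(5.0, 0.0, 30.0, 10.0)  # -> (1.25, 0.0)
```

Deeper layers get a larger `scale` and therefore shift more as the head moves, which is what creates the depth illusion.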
But these sensors are like cameras, too, according to Amazon. They can sense the length, width and depth of an object in front of them.
This could enable some intriguing digital health applications.
On the same day that it announced the Fire Phone, Amazon released an SDK for the developers who will create apps for the new phone. The more those apps exploit the phone’s unique features (the sensors, the 3D display), the better and more engaging they will be. Health apps could exploit those features especially well.
Amazon’s director of product management for the Fire Phone, Cameron Janes, told VentureBeat his team has not explored the possible health-related capabilities of the device, but said he found the idea “fascinating.”
It’s easy to imagine a doctor talking to a patient from the phone’s screen while reading the objective data gathered as the sensors pass over the patient’s body.
The Fire Phone’s sensors could be useful for patients undergoing physical therapy.
“Orthopedic surgeons love having the latest and greatest technology; I’m betting this product will be very useful for doctors trying to figure out range of motion problems,” says Dr. Molly Maloof, an expert in digital health technologies. “Physical therapists will also find this a valuable tool for remote physical therapy, which is becoming a burgeoning digital health industry in itself.”
A number of startups, like Home Team Therapy and Reflexion Health, have created home physical therapy programs around Microsoft’s Kinect gaming interface, which uses sensors to detect the movements of gamers — and, potentially, patients — in the living room.
As the patient is guided through exercises on the TV set, the Kinect’s sensors measure physical range of motion and other metrics and report the data to doctors.
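The core range-of-motion measurement such systems perform reduces to vector math on tracked joint positions. A minimal sketch, assuming the 3D joint coordinates come from whatever sensor does the tracking (the function name is hypothetical):

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist for elbow flexion."""
    v1 = [a[i] - b[i] for i in range(3)]  # vector from joint to point a
    v2 = [c[i] - b[i] for i in range(3)]  # vector from joint to point c
    dot = sum(v1[i] * v2[i] for i in range(3))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Straight arm: shoulder, elbow, wrist roughly collinear -> ~180 degrees
straight = joint_angle((0, 0, 0), (0, -30, 0), (0, -60, 0))
# Forearm raised perpendicular to the upper arm -> ~90 degrees
bent = joint_angle((0, 0, 0), (0, -30, 0), (30, -30, 0))
```

Comparing these angles across sessions is one way an app could quantify a patient’s progress for the doctor.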
The Kinect senses movement from about 8 feet away. The sensors on the front of the Fire Phone are equally powerful, but the body being scanned is much closer.
Scanning the eye and face
The Fire Phone could be used to scan injuries or diseases of the eye and face.
“This technology offers great promise for adding greater sophistication of monitoring and diagnostics at the point of care — vision impairment such as amblyopia [also called lazy eye] in children, and retinal and glaucoma diseases in adults, come to mind,” says Jim Bloedau of the Information Advantage Group, a San Francisco consultancy that helps startups enter the health market.
Bloedau adds that the sensors on the Fire Phone might also be useful in detecting impaired facial motor functions caused by stroke.
The 3D presentation of scanned results on the Fire’s screen might also be important in a clinical setting, Bloedau says. “I think we need to consider what 3D presentation did for radiology, and how that advanced diagnostics.”
The sensors might also be used to diagnose concussions.
Reebok has developed a head sensor pad that, when worn under a football helmet, can detect if (and when) the wearer has suffered an impact severe enough to cause a concussion.
Another way to tell if a person has suffered a concussion is to measure how quickly the eyes can follow and focus on a moving object. The sensors on the Fire Phone might be able to track the response time of the eyes as they follow a moving marker on the screen.
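One way such an app might estimate that response time is to record the marker’s on-screen position alongside the measured gaze position, then find the time shift that best aligns the two traces. A minimal sketch with hypothetical, idealized data:

```python
def estimate_lag(marker, gaze, max_lag):
    """Estimate reaction time (in samples): the shift of the gaze
    trace that minimizes mean squared error against the marker trace."""
    best_lag, best_err = 0, float("inf")
    for lag in range(max_lag + 1):
        pairs = list(zip(marker[:len(marker) - lag], gaze[lag:]))
        err = sum((m - g) ** 2 for m, g in pairs) / len(pairs)
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag

# Gaze copies the marker 3 samples late (e.g. ~50 ms at 60 Hz):
marker = [0, 0, 1, 2, 3, 4, 5, 5, 5, 5, 5, 5]
gaze   = [0, 0, 0, 0, 0, 1, 2, 3, 4, 5, 5, 5]
lag = estimate_lag(marker, gaze, 6)  # -> 3
```

An abnormally long lag relative to a person’s baseline could then be flagged for a clinician to review.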
“Certain potentially cancerous and noncancerous growths (e.g., moles and hemangiomas) could be tracked by doctors to see how fast they are changing,” says Dr. Maloof.
“This could help them identify people who may need a biopsy sooner than six months,” Dr. Maloof says. “This could literally change the game for ongoing monitoring of concerning lesions and lumps.”
Teledermatology companies might develop apps that use the Fire Phone’s four sensors to scan and measure growths on a remote patient’s skin. The scans could create a detailed 3D picture of the growths for the doctor to examine in her office.
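The change-tracking Dr. Maloof describes could be as simple as comparing measured lesion area across scans. A minimal sketch — the function name is hypothetical and the flagging threshold is an arbitrary placeholder, not clinical guidance:

```python
from datetime import date

def monthly_growth_rate(area_then_mm2, area_now_mm2, d0, d1):
    """Percent change in lesion area per 30 days between two scans."""
    days = (d1 - d0).days
    change = (area_now_mm2 - area_then_mm2) / area_then_mm2
    return change / days * 30 * 100

# A lesion measured at 20 mm^2, then 26 mm^2 three months later:
rate = monthly_growth_rate(20.0, 26.0, date(2014, 1, 15), date(2014, 4, 15))

FLAG_THRESHOLD = 5.0  # % per month; hypothetical cutoff for illustration
needs_earlier_biopsy = rate > FLAG_THRESHOLD
```

In this example the lesion grew about 10% per month, so the app would flag it for earlier follow-up.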
The Fire Phone could also be a powerful vehicle for neurogaming, which is already being used to help people with ADHD, depression, and anxiety.
Neurogaming uses inputs like heart rate, brain waves, and hand-and-body gestures to control the game. The Fire Phone’s head tracking capability would create a new, and potentially game-changing, input, says neurogaming expert Zack Lynch, CEO of the Neurotechnology Industry Association.
Neurogaming on the Fire Phone could be used to treat nervous system disorders like Parkinson’s, Lynch says. “If you’ve got a sensor that knows how your head is moving it would be easy to detect tremors. This could allow doctors to tell if the patient needs to adjust their medications.”
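Lynch’s tremor idea can be sketched as a frequency-domain check on a head-position signal. The sketch below assumes a hypothetical head-position trace and checks the 4–6 Hz band typical of Parkinsonian resting tremor; it is an illustration of the approach, not a diagnostic tool:

```python
import numpy as np

def tremor_band_ratio(head_pos, fs):
    """Fraction of signal power in the 4-6 Hz band (typical of
    Parkinsonian resting tremor) of a head-position trace sampled at fs Hz."""
    x = np.asarray(head_pos) - np.mean(head_pos)  # remove slow drift/offset
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 4.0) & (freqs <= 6.0)
    total = power.sum()
    return power[band].sum() / total if total > 0 else 0.0

# Synthetic 5 Hz tremor sampled at 60 Hz for 4 seconds:
fs = 60.0
t = np.arange(0, 4, 1 / fs)
ratio = tremor_band_ratio(0.5 * np.sin(2 * np.pi * 5.0 * t), fs)  # ~1.0
```

A game could log this ratio over time; a sustained rise in tremor-band power is the kind of signal a doctor might use when deciding whether medication needs adjusting.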
The main barrier to games helping patients and improving outcomes is poor engagement, Lynch says. Patients get bored and stop playing.
The Fire Phone’s head tracking and 3D experience could make therapeutic neurogames more fun to play, Lynch says. “The more the functionality can be integrated into the game mechanics, the more the game would take hold,” he says.
We’ll have to wait and see if health app developers see the potential in the Fire Phone. The Fire Phone’s app ecosystem is just now getting off the ground.
But if Amazon actively encourages developers and qualified health professionals to work together on new health apps, it could create some viable businesses that really help healthcare providers and their patients.