Synaptics likes to stay in touch with the way that people interact with devices. The company makes touchscreen, touchpad, and display components, as well as fingerprint identification technologies.
Because of that, it likes to stay on the leading edge of thinking around how humans interact with machines. For instance, Synaptics just announced the ability to recognize a fingerprint and authenticate it, even when viewed through touchscreen glass.
That means that we might no longer need a home button on our smartphones. Is that a good idea? There might be an outcry if we take away the security of the home button. Rick Bergman, CEO of Synaptics, and his engineers have to think about these kinds of challenges. You could invent a new user interface technology, but if people don’t like it, or they don’t want to learn it, then it won’t fly.
We talked about technology and the human-machine interface at CES 2017, the big tech trade show held in Las Vegas last week.
Here’s an edited transcript of our conversation.
VB: It looks like there’s still a lot of innovation in this space, a lot of things happening.
Rick Bergman: There are three areas: touch, display drivers, and fingerprint. The market isn’t sitting still. We continue to find ways to innovate and add value. OLED screens, which I’m sure you saw, are a big trend. Authentication technologies — if you think about phones, just three years ago, virtually no phone shipped with that. This year, there could be 700 or 800 million phones with fingerprint readers.
Bergman: Fingerprint itself, everyone saw the market trend. There’s been a rush of companies to get into the space. Now, specifically under-glass solutions, that’s a different requirement than the solutions we’ve had to date. Right now, no one’s introduced an equivalent solution, at least to my knowledge. A company in Florida called Sonavation is the only startup I’m aware of that’s trying to do fingerprint under glass.
VB: It’s interesting, because then it enables a new kind of device. We all got used to having that one Home button, holding it down to see if it recognizes something. Now you can get rid of the button and have the whole glass be able to recognize a fingerprint.
Bergman: Certainly that’s the goal. Samsung has gone edge-to-edge on the two vertical edges. Every square millimeter of your phone becomes capable of displaying. That’s where people want to be. If you get rid of the home button, that takes away entries for moisture or dust. It reduces costs from a system-level perspective. You have the visual benefits of a complete display.
VB: Are there still ways to introduce new things and get people to adapt to them? It almost seems like there might be an outcry if you get rid of the home button. People have gotten used to it, even if it wasn’t that great an idea in the first place.
Bergman: It’s a safety button for a lot of people. Samsung and Apple have the two iconic home buttons out there. We may not actually get rid of it, though. As a physical button it may go away, but you could still keep it there electronically. It’ll feel the same if you use haptics appropriately. It just isn’t a button anymore. A home spot, something like that.
VB: There’s the combination of facial recognition and fingerprint, and fingerprint sensors in cars as well. It seems like two-factor authentication is important for some applications.
Bergman: Yesterday we announced two-factor authentication. It can be for convenience, or for security. The obvious thing is you’re skiing in Tahoe, and you really don’t want to take off your gloves to fingerprint, so you turn the facial mode on as an alternative to read your email or texts. Two-factor is for security. Many of the banking applications are already asking for that.
VB: For a car, is there a reason to use a fingerprint instead of a key?
Bergman: We haven’t seen it as a substitute for a key yet. We’re seeing interest in two areas, and I’m sure it will grow. The first big picture is people are getting comfortable with using fingerprints, because of the iPhone and other phones. It’s becoming very natural. It could be something as simple as driver settings. You or your spouse or whatever can have separate settings that it’ll recognize. Also, as vehicles become rolling commerce interfaces — you go through the McDonald’s drive-through and approve the transaction right from your vehicle.
VB: In the car, is there going to be something new that people would see as a fingerprint button?
Bergman: Different OEMs have different visions. There could be something close to the steering wheel, or something on the center console. More and more people want their vehicles to have all the capabilities of a phone. If you pay $50,000 for a vehicle, you don’t want it to have old, lagging consumer technologies. They want state of the art. It’s good for us, because being a leader on the consumer side is opening the door for us on the automotive side.
VB: What do you think of the combination of haptic and touch? Is that still something somebody wants, generally speaking?
Bergman: Almost all phones use some level of haptics. It’s usually just one actuator, though, on your home button. You feel like you’re depressing a button, but you’re really not, as one example. That click isn’t really there.
VB: With some of these other things coming, like virtual reality, people are saying they want the sense of touch to come in somewhere. I don’t know if that’s something you guys have put thought into as well. It may be farther afield from where you are.
Bergman: No, AR and VR are areas of high interest to us. Not so much related to touch opportunities. Something like Gear VR has lower-end touch, where we don’t play very much. What’s of interest in VR is the displays. As you saw from our demos, we do high-res displays. In VR you have two of them, which presents a lot of interesting opportunities for us to focus on.
VB: The fingers in the VR world — they’re getting represented in a lot of the sensing that’s happening now. Oculus Touch represents your fingers and shows what they’re doing, but you don’t really touch anything. You never get any real feedback. I’ve seen some guys doing ultrasound touch feedback, blowing sound back at your finger with a lot of little speakers.
Bergman: You can use ultrasound, or you can actually use IR and visible light to look at the fingers and get a third dimension. There are a number of companies doing 3D gestures out there. But to date we haven’t done anything in that area. We don’t directly do haptics ourselves, but we do work with guys like Immersion in that space. We do reference designs to improve the touch experience. It’s always a potential area for growth, but we don’t have immediate plans to do haptics technologies.
VB: Is force an area of potential further innovation?
Bergman: We’ve offered force for several years now. We’re starting to see phones introduced with it. What held it back for a while was the higher cost of implementation. We don’t quite get it for free, but it no longer requires a separate chip or separate sensors. We’ve seen adoption by guys like Xiaomi and others.