Waymo, the self-driving project owned by Google’s parent company Alphabet, is preparing for the day when the century-old relationship between car and driver is upended.
On Monday, within the confines of its typically off-limits testing facility known as Castle, the company showed off how it plans on helping people adjust to a future where the car is the driver and humans are no longer behind the wheel. The heavy subtext: we’re getting close to the moment when we pull our human test drivers out of our self-driving vehicles.
Waymo is still developing and testing its software and hardware, which includes a robust suite of sensors that allow its self-driving Chrysler Pacifica minivans to see and hear the world around them. This includes a vision system, radar, and light detection and ranging sensors known as LiDAR.
But as that technology matures, engineers and designers are also building out user experience features that aim to help humans interact with and, eventually, trust self-driving cars.
The basic functions of a car—and its relationship with the driver—are fundamentally the same as they were when the Ford Model T arrived in 1908. The driver uses the steering wheel, accelerator and brake pedals to navigate the vehicle on city streets, highways, and rural roads. Even when people become passengers—say while using a ride-hailing service like Uber or Lyft—there is a human driver they can communicate with.
“That communication builds trust between you and the driver,” said Ryan Powell, Waymo’s head of user experience design, during a presentation at the Castle proving grounds near Atwater, Calif. “And so you feel at ease on that ride, you feel safe, you feel secure, you know what to expect.”
Waymo is trying to recreate that experience with a variety of features, from audio cues and visual messages during the ride to an app that is still under development. For example, Waymo is exploring a feature that would allow the car to recognize when riders are near so it can pull over and let them in instead of continuing to the designated pick-up spot.
“Every moment in a self-driving car matters,” said Juliet Rothenberg, product manager of Waymo’s in-car user experience. “Small touches are critical for safety. They’re also vital for creating a more convenient in-car experience.”
The Rider Experience
People who are participating in Waymo’s “early rider program,” which launched this April in the Phoenix area, have been interacting with the user interface and regularly provide feedback, according to Waymo. Now the company is sharing information and images of these features with the broader public.
When riders first step into the self-driving Chrysler Pacifica Hybrid minivan, they’ll notice two displays and a button console with four choices: “help,” “lock and unlock,” “pull over,” and a blue button that says “start ride.”
The two screens, which are attached to the back of the headrests, display a welcome message.
Once the “start ride” button is pushed, the welcome message disappears and is replaced with a real-time view of the route. The displays show relevant information, like pedestrians, other vehicles, and objects in the road; less relevant objects, like trees and buildings, are also rendered about every five seconds. Even here, Waymo has tinkered with the best mix of information for the rider and how to present it. For instance, greater emphasis is placed on the most relevant objects, like pedestrians and bicyclists, while buildings are less visible.
At the top of the screen, an estimated arrival message constantly updates as the vehicle makes progress towards its destination.
Other messages show up towards the bottom of the display, typically to explain the behavior of the vehicle. For instance, if the self-driving minivan stops at an intersection because a pedestrian is about to cross, passengers would see a “yielding to pedestrians” message. They might also notice the crosswalk that the pedestrian is using is illuminated on the screen.
The screen also shows when the self-driving minivan has entered a construction zone. If there are multiple construction cones, those too will be displayed.
The company has also tried to anticipate the needs of apprehensive passengers by mimicking the behavior of human drivers. For instance, as the self-driving minivan prepares to make a right turn, the camera angle displayed on the screen shifts to the left to signal that the car is checking for oncoming traffic.
Waymo’s designers are hardly finished with these features. The company is constantly testing and tweaking, including experimenting with using sound to alert riders who are blind when their vehicle has arrived.