The low roar of a combustion engine. It’s one of the most recognizable sounds on the streets, instantly alerting pedestrians that a car, and perhaps danger, is approaching. To urban pedestrians, it is an entirely common sound, yet it is remarkable in its ability to serve as a warning.

In 2016, sales of electric vehicles (EVs) grew by 41 percent globally. This number is set to increase dramatically in the coming decades: by 2040 — which isn’t as far away as it seems — 35 percent of new cars will be electric. This is great news for drivers, who will benefit from the cost savings and convenience of electric cars, and of course, reduced emissions are a net benefit for the environment.

But there is a potentially dangerous and unintended consequence built into the silent motors EVs run on. Because they emit almost no sound, they pose a far greater danger to pedestrians and cyclists than a combustion engine does. A study by the National Highway Traffic Safety Administration shows that hybrid and electric vehicles are 37 percent more likely to be involved in accidents with pedestrians. When it comes to cyclists, that statistic jumps to 57 percent.

Already, governments around the world are issuing regulations mandating that these cars emit a sound while in operation at certain speeds. In order to comply, auto manufacturers will need to develop digital engine sounds and other warning signals broadcast by external car speakers that can warn pedestrians and reduce collisions.

Unlike with the advent of the combustion engine and its default roar, auto manufacturers are now in a unique position to decide what the next era of automobiles will sound like. This opportunity comes at a time when cars aren’t just getting quieter, but smarter and more connected, with collision avoidance technologies and autonomous driving capabilities emerging at the same time.

Here at Ustwo, we saw this convergence as a huge opportunity for exploration, experimentation, and research. As electric vehicles become ubiquitous, the sounds they make and the ways they behave will become a regular feature of the urban landscape, with the potential to save lives. Getting it right is critical.

So, in partnership with the sound experts at Man Made Music, we embarked upon an exploration that sought to increase pedestrian safety by re-imagining the sounds of electric cars. We developed a perspective that cars should use contextual information and emit sound only when pedestrians are at risk. We created sound concepts, then tested those concepts by observing pedestrians in a virtual reality environment as they reacted to a variety of sounds emitted by an electric vehicle. The VR experiment was prefaced and followed by weeks of research into the topic.

Here’s what we learned:

  • Sound should cover three critical aspects, in order of importance: increasing safety of pedestrians, minimizing sound pollution and reducing overall traffic noise, and using sound design as a means of brand expression.
  • A context-aware external audio system based on vision and sensory systems may be the best solution. This would take into consideration data such as the car’s speed and direction, driving conditions, and number and type of pedestrians in the surroundings, and adjust the sound emitted accordingly.
  • When it comes to sounds themselves, high-intensity sound alone doesn’t communicate risk, but a sudden change in sound does.
  • While there is a certain amount of intuition that can guide us towards what will grab people’s attention, individuals interpret sounds differently, assessing different risk levels from the same sounds. A level of education will be needed.
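To make the context-aware idea above concrete, here is a minimal sketch of how a car might map sensor context (speed and distance to a pedestrian) to a risk level and sound profile. All names (`Context`, `risk_level`, `sound_for`) and the thresholds are hypothetical illustrations, not Ustwo’s actual design; the one finding it encodes from the study is that a sudden *change* in sound, not loudness alone, signals risk.

```python
from dataclasses import dataclass

@dataclass
class Context:
    speed_kmh: float   # car speed from the vehicle bus (assumed input)
    distance_m: float  # distance to nearest pedestrian from vision/sensors
    closing: bool      # whether the gap to the pedestrian is shrinking

def risk_level(ctx: Context) -> int:
    """Map sensor context to a 0-3 risk level (0 = no sound needed).

    Thresholds here are illustrative placeholders, not tested values.
    """
    if not ctx.closing or ctx.distance_m > 50:
        return 0
    # Time for the car to reach the pedestrian, guarding against a stopped car.
    speed_ms = max(ctx.speed_kmh / 3.6, 0.1)
    time_to_reach = ctx.distance_m / speed_ms
    if time_to_reach < 1.5:
        return 3   # imminent: sharp alert
    if time_to_reach < 3.0:
        return 2   # close: noticeably raised pitch/volume
    return 1       # present: gentle ambient tone

def sound_for(level: int, previous: int) -> str:
    """Pick a sound profile; per the finding above, escalations are marked
    by a sudden onset rather than by sheer intensity."""
    profiles = ["silent", "ambient_hum", "rising_tone", "urgent_alert"]
    if level > previous:
        return profiles[level] + "_onset"  # sudden change communicates risk
    return profiles[level]
```

A real system would of course fuse far richer data — driving conditions, number and type of pedestrians — but the shape of the mapping would be similar.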

What we still need to explore:

  • How will cars and sensors react in complex scenarios? As sensor and vision technologies advance, this will influence the way cars communicate with sound.
  • How many risk levels should be communicated to pedestrians? Our experiment relied on four levels of risk as a car approached a pedestrian. More testing is needed to see if fewer levels could be more effective or intuitive.
  • Could verbal communication be integrated? If electric vehicles have external speakers, would it be effective to allow them to speak directly to pedestrians?
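On the open question of how many risk levels to use: our experiment fixed the count at four, but in testing it helps to treat the number of levels as a parameter. The helper below is a hypothetical illustration (the function name and the 50-meter range are assumptions, not values from the study) of splitting the approach distance into a configurable number of bands.

```python
def quantize_distance(distance_m: float, max_m: float = 50.0, levels: int = 4) -> int:
    """Split the approach distance into `levels` equal bands.

    Returns 0 when the pedestrian is beyond `max_m` (no alert), and
    `levels` (the highest band) when the car is closest. The band count
    is a parameter so experiments can compare four levels against fewer.
    """
    if distance_m >= max_m:
        return 0
    band_width = max_m / levels
    return levels - int(distance_m // band_width)
```

Re-running a pedestrian study with `levels=2` or `levels=3` would show whether coarser steps are more intuitive than the four we tested.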

When we started this journey, we thought that the solution would be something akin to the sound a parking sensor makes inside a car with a backup camera, but after weeks of research and testing, we realized that this issue is much more complex. As technology advances, and cities and the ways we get around them evolve, different problems and solutions may emerge. What’s clear is that sound design must no longer be only in service to the driver, but to pedestrians and the larger environment.

Kota Kobayashi is a product designer at Ustwo, a digital design studio.

Above: VB Profiles Connected Cars Landscape. (Disclosure: VB Profiles is a cooperative effort between VentureBeat and Spoke Intelligence.) This article is part of our connected cars series. A high-resolution version of the landscape, featuring 250 companies, is available for download.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.