Lidar sensors — laser-based devices that produce depth data by measuring the distance between themselves and surrounding objects — are the cornerstone of self-driving systems from Google parent company Alphabet’s Waymo, Uber, and others. They’ve been employed for decades in space travel (as far back as the Apollo 15 mission) and in military reconnaissance, most recently in Afghanistan. But they’re not perfect: they have a limited range of roughly 200 to 300 meters, and they’re susceptible to interference from rain, fog, dust, snow, and other elements.
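The ranging principle behind lidar is simple time-of-flight: a laser pulse travels out, reflects, and returns, and the distance is half the round trip multiplied by the speed of light. A minimal illustrative sketch (not any vendor's actual code):

```python
# Illustrative time-of-flight ranging, the principle lidar sensors use.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance from the round-trip time of a laser pulse: d = c * t / 2."""
    return C * round_trip_s / 2.0

# A pulse returning after about 2 microseconds corresponds to roughly 300 m,
# near the upper end of the typical lidar range cited above.
print(round(tof_distance_m(2e-6)))  # -> 300
```

At longer ranges the returning pulse is weaker and harder to distinguish from noise, which is one reason lidar range tops out where it does.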
That’s why when PerceptIn set out to design a perception system for autonomous high-speed trains, it ditched lidar to explore an alternative. The Santa Clara startup, which earlier this year debuted a $40,000 compact self-driving car and a driverless vending machine, this week took the wraps off a stereo “visual perception device” it says can reliably spot obstacles up to 1,000 meters ahead.
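PerceptIn hasn't published how its device works, but stereo perception in general recovers depth by triangulation: two cameras a known baseline apart see an object at slightly different pixel positions (the disparity), and depth falls out of the pinhole model Z = f·B/d. A generic sketch with hypothetical focal length and baseline values shows why 1,000-meter stereo is hard:

```python
# Generic stereo triangulation (pinhole model) -- not PerceptIn's implementation.
# focal_px and baseline_m below are hypothetical, for illustration only.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

f, B = 4000.0, 1.0  # assumed 4000-px focal length, 1 m camera baseline
print(stereo_depth_m(f, B, 4.0))  # -> 1000.0 m at just 4 px of disparity
print(stereo_depth_m(f, B, 3.0))  # a single pixel of error -> ~1333 m
```

At such small disparities, sub-pixel matching accuracy, lens quality, and calibration dominate the error budget, which is consistent with the list of components PerceptIn says it had to combine.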
“Lidar sensors, which are commonly used in autonomous driving scenarios, suffer from a lack of detection range and semantic information, making them unsuitable for high-speed applications. Visual perception is currently the best solution that can be used in high-speed scenarios,” said Shaoshan Liu, founder and CEO of PerceptIn.
According to Liu and PerceptIn’s CTO, Bo Yu, achieving accurate detection at long distances required combining seven distinct elements: a custom-designed lens, proprietary image signal processing algorithms, image sensing technology, mechanical design, calibration, computing units, and machine learning algorithms. The resulting system can detect objects even at speeds of up to 300 kilometers per hour (186 miles per hour), the company claims.
“With our 1,000-meter visual perception technology, we’ve created the longest-range visual perception capabilities for high-speed rail, while ensuring safety with critical functions such as obstacle avoidance and object detection,” Liu said.
PerceptIn isn’t the only company with a perception platform it claims can surpass lidar’s performance, of course. In March, three-year-old San Diego-based TuSimple said its camera-based system — one purpose-built for long-haul autonomous trucks — can recognize objects up to 1,000 meters away and can work in inclement weather, like rain.
Yu contends that PerceptIn’s solution has a significant advantage in that it can also be modified to cover a shorter range.
“While developing our 1,000-meter visual perception module, we ensured it remained flexible enough to be applied to any autonomous vehicle,” he said. “Based on our ongoing discussions with global automotive manufacturers, we designed it to have a shorter 300-meter perception range, the common requirement for commercial vehicles, such as trucks.”
The visual perception tech, which hasn’t been priced yet, is based on PerceptIn’s modular DragonFly computer vision hardware and will be made broadly available in the coming months.
PerceptIn emerged from stealth in March, but it has already raised a decent chunk of change: $11 million overall, including a $2 million seed round in 2016 and an $8 million series A in 2017. Samsung Ventures, Samsung’s investment arm, was an early partner, and PerceptIn said it has “visual intelligence” hardware and software projects in the works for over 100 customers, including Huawei, ZTE, UC Irvine, Wayne State University, a partner in Detroit, and unnamed industrial parks in China.