London-based Slamcore today announced it has raised $5 million to develop and commercialize AI algorithms that give robots better positional awareness in any environment.

Simultaneous localization and mapping — or SLAM, as it’s known — is a decades-old computational problem centered on how best to get robots, drones, and other autonomous entities to move around a given environment without bumping into things. Achieving accurate positional awareness in dynamic or unmapped surroundings is no easy feat, particularly indoors or in built-up areas where GPS is less effective. Thus, engineers have developed algorithms that map a new environment and guide the robot through that space at the same time, using data gathered from on-device sensors such as cameras, sonar, radar, and lidar.
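Slamcore hasn’t published its algorithms, but the core SLAM loop — predict the pose from odometry, then jointly correct the pose and the map against landmark observations — can be sketched in a few lines of Python. Everything here (2D poses, landmark IDs, the 0.5 correction gain) is an illustrative assumption, not anything from Slamcore:

```python
def slam_step(pose, odometry, observations, landmarks):
    """One simplified SLAM-style update.

    pose:         current (x, y) estimate of the robot
    odometry:     (dx, dy) motion reported by the wheels
    observations: {landmark_id: (rel_x, rel_y)} positions seen
                  by sensors, relative to the robot
    landmarks:    {landmark_id: (x, y)} the map built so far
    """
    # Predict: dead-reckon the new pose from odometry.
    x, y = pose[0] + odometry[0], pose[1] + odometry[1]

    for lid, (mx, my) in observations.items():
        if lid in landmarks:
            lx, ly = landmarks[lid]
            # Residual between the pose implied by this landmark
            # sighting and the dead-reckoned pose.
            rx, ry = (lx - mx) - x, (ly - my) - y
            # Split the correction between pose and map (gain 0.5):
            # this is the "simultaneous" part of SLAM.
            x, y = x + 0.5 * rx, y + 0.5 * ry
            landmarks[lid] = (lx - 0.5 * rx, ly - 0.5 * ry)
        else:
            # First sighting: add the landmark to the map,
            # anchored to the current pose estimate.
            landmarks[lid] = (x + mx, y + my)
    return (x, y), landmarks
```

Real systems replace the fixed gain with a probabilistic filter or factor-graph optimization, but the structure — one estimate improving the other — is the same.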

And that is effectively what Slamcore is setting out to achieve, with “spatial AI” smarts that help robots and drones glean deeper situational understanding from sensor data.

Deep learning

According to Slamcore cofounder and CEO Owen Nicholson, robots should be able to answer three questions accurately: Where am I in 3D space? Where are the objects around me? And what are the objects around me? But all too often, at least one of these questions draws a blank.

“The number one cause of failure for a robotic system is when the robot gets the answer to one of these questions wrong,” Nicholson told VentureBeat. “Another way to put this is when there is a discrepancy between what the robot thinks the world is — and its position within it — and reality. This is what leads to a robot crashing into something, getting lost, or failing to get back to the charging station in time.”

Arguably, the most intriguing part of Slamcore’s proposition is the semantics it promises to bring to spatial understanding, helping robots answer the “what are the objects around me” question. Determining an accurate position in relation to an environment is important, but understanding the different objects in that space adds context that helps a robot build more accurate maps. For example, a dynamic object — such as a person or forklift truck — will likely move around frequently; thus, it shouldn’t be mapped in the same way as a table or a wall.
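As a toy illustration of that distinction (the class labels and map structure here are assumptions for the sketch, not Slamcore’s API), a mapping step might route static detections into the persistent map while keeping dynamic ones as short-lived tracks that never anchor localization:

```python
# Semantic classes treated as dynamic -- assumed labels for illustration.
DYNAMIC_CLASSES = {"person", "forklift", "vehicle"}

def update_map(static_map, detections):
    """Split labeled detections into persistent map entries and
    transient tracks.

    static_map: {(x, y): label} long-lived map of fixed structure
    detections: [(label, (x, y))] semantically classified sightings
    """
    dynamic_tracks = []
    for label, position in detections:
        if label in DYNAMIC_CLASSES:
            # Dynamic objects are tracked but not baked into the map.
            dynamic_tracks.append((label, position))
        else:
            # Walls, tables, etc. become stable map features.
            static_map[position] = label
    return static_map, dynamic_tracks
```

The payoff is that a passing forklift never corrupts the map the robot later relies on to localize itself.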

It’s also worth noting that while machine learning does factor into Slamcore’s technology, the company says that the majority of its algorithms are designed on “first principles,” which means no training data is required. “This makes it much more reliable when operating in the real world,” Nicholson said.

Rise of the machines

Robots have been steadily infiltrating society for years through gadgets such as automated vacuum cleaners, but the COVID-19 crisis has seemingly increased demand for technologies that reduce the need for human-to-human contact — this has helped robotic baristas and cleaners gain traction. Drones, too, have broken into the spotlight via medical supply deliveries, while aerospace systems giant Honeywell this week launched a new business unit aimed at the autonomous aviation space, covering drones, air taxis, and unmanned cargo delivery vehicles.

Augmented reality (AR) and virtual reality (VR) have also gained widespread attention through the pandemic, as companies and consumers have sought new ways to interact with each other from afar. These areas are also heavily reliant on spatial intelligence technology.

“Both a robot and an AR/VR headset require spatial intelligence in order to operate,” Nicholson said. “They both need to track their position relative to the static world as they move through space. A robot uses this information to control its wheels and motors to get from A to B; a headset uses this information to continually update the 3D graphics so that the virtual world appears locked in reality.”
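The same pose estimate simply feeds different consumers downstream. A minimal sketch, assuming a 2D pose `(x, y, theta)` in a fixed world frame — purely illustrative, not either device’s actual math:

```python
import math

def heading_to_goal(pose, goal):
    """Robot: turn the shared pose estimate into a steering heading
    toward a goal point in the world frame."""
    x, y, _theta = pose
    return math.atan2(goal[1] - y, goal[0] - x)

def world_to_view(pose, point):
    """Headset: re-express a world-anchored virtual point in the
    device frame, so rendered graphics stay locked to the real world
    as the wearer moves."""
    x, y, theta = pose
    dx, dy = point[0] - x, point[1] - y
    # Rotate the displacement by -theta into the device's frame.
    c, s = math.cos(-theta), math.sin(-theta)
    return (c * dx - s * dy, s * dx + c * dy)
```

One function drives wheels; the other drives pixels. Both fail the same way if the underlying pose estimate is wrong.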

For its part, Slamcore said inquiries for early access to its alpha program have increased by 50% in the wake of the COVID-19 crisis.

There are, of course, many intelligent, spatially aware products on the market already, but they are typically built by deep-pocketed companies such as Microsoft with HoloLens and iRobot with its Roomba robotic vacuum cleaner. Slamcore, by contrast, is looking to bring this type of functionality and intelligence to devices of all shapes, sizes, and price points. It also wants to do so without relying on existing open source technologies that may not be good enough for commercial use.

“Open source solutions, whilst good enough for demos and proof-of-concepts, were not designed for commercial use and therefore most companies run into issues as they try to scale,” Nicholson explained. “At this point, companies try to modify or build solutions from scratch but due to the complexities involved, it is just not achievable for all but the largest of companies.”

Over the past year, Slamcore said it has conducted a series of alpha tests with a range of startups and multinational corporations in the U.S., U.K., South Korea, Japan, Taiwan, and Singapore. “Our system has been tested on industrial robots, inspection drones, hospitality robots, and construction robots, to name a few,” Nicholson added.

Slamcore was spun out of London’s Imperial College back in 2016 and today claims a team of 25 that includes PhD graduates in computer vision and robotics. Before this round, the company had raised $5 million; with another $5 million in the bank from Octopus Ventures, MMC Ventures, Toyota AI Ventures, and Amadeus Capital Partners, it is well financed to meet the “growing demand” for its spatial AI software development kit (SDK).
