Perhaps no company is as synonymous with robot vacuums as iRobot. To date, the Bedford, Massachusetts-based firm has sold more than 25 million units to customers around the world, and it has an estimated 88% share of the robot vacuum market. But beyond the iconic Roomba, iRobot counts in its sprawling portfolio Mira, a pool-cleaning robot; RP-VITA, a medical robot that can plug into diagnostic devices like otoscopes and ultrasound machines; and Seaglider, a long-range dual-role autonomous underwater vehicle. In truth, the Roomba didn’t emerge publicly until over a decade after iRobot’s founding — and years after the company had devoted substantial resources to robotics contracts with the U.S. military.
For a glimpse into iRobot’s fascinating history and the Roomba’s origin story, we spoke with iRobot cofounder and CEO Colin Angle and chief technology officer Chris Jones. The pair shed light on the company’s current product lineup and offered insights into the barriers standing in the way of versatile, affordable, and highly capable home robots.
iRobot has its roots at MIT, where Angle studied as an undergraduate. In 1990, he founded iRobot with two partners: Australian roboticist and MIT professor Rodney Brooks, who’s credited with advancing the idea that robots’ cognitive abilities must be based on actions within the environment, and NASA Jet Propulsion Laboratory researcher Helen Greiner.
The small team built Genghis and Ariel, robots designed for space exploration and mine disarmament, respectively. Genghis, a six-legged robot with an 8-bit microprocessor and 256 bytes of RAM, was inspired by insects; Angle noted in his undergraduate thesis that flies could navigate environments with incredibly basic neural pathways. This crystallized what Angle and colleagues dubbed “subsumption architecture,” in which simple processes subsume one another in sequences of hundreds, together demonstrating emergent behavior like climbing over terrain and progressing toward an endpoint.
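The subsumption idea can be sketched in a few lines: each behavioral layer decides whether it wants control, and higher-priority layers override (subsume) the lower ones. The layer and command names below are purely illustrative, not iRobot's implementation.

```python
# Illustrative sketch of a subsumption-style controller (not iRobot's code):
# behaviors are layered by priority, and a higher layer that wants control
# "subsumes" (overrides) the output of the layers beneath it.

class Cruise:
    """Lowest layer: default forward motion."""
    def active(self, sensors):
        return True  # always willing to act
    def command(self, sensors):
        return "drive_forward"

class AvoidObstacle:
    """Higher layer: triggered by the bump sensor, overrides cruising."""
    def active(self, sensors):
        return sensors.get("bump", False)
    def command(self, sensors):
        return "back_up_and_turn"

def arbitrate(layers, sensors):
    """Highest-priority active layer wins; layers are ordered low -> high."""
    command = None
    for layer in layers:  # later (higher) layers overwrite earlier ones
        if layer.active(sensors):
            command = layer.command(sensors)
    return command

layers = [Cruise(), AvoidObstacle()]
print(arbitrate(layers, {"bump": False}))  # → drive_forward
print(arbitrate(layers, {"bump": True}))   # → back_up_and_turn
```

There is no central planner here: complex-looking behavior emerges from the interaction of simple layers, which is the core of Brooks' argument.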
The pioneering work led to the creation of NASA’s Mars Sojourner rover, and soon after iRobot’s services were in demand. The company worked with Johnson Wax to build a product that autonomously cleaned floors in supermarkets and shopping centers, and iRobot collaborated with Hasbro on toys like My Real Baby, which employed animatronic facial expressions meant to mimic those of a human infant. In 1998, it nabbed one of its first research contracts from DARPA, the branch of the U.S. Department of Defense that spearheads R&D of emerging technologies for military applications.
“What we were really trying to say with iRobot was ‘Forget all these preconceptions of what things are supposed to be — how can we actually solve [hard] problems [in robotics]?'” said Angle. In DARPA’s case, that problem was stairs. The agency wanted a machine that could climb steps on its own, without human guidance.
“We won a contract for $120,000 to write a proposal on how to build a stair-climbing robot, and we were [competing] against JPL, Northrop Grumman, Boeing, Lockheed — a bunch of huge defense companies against little iRobot,” Angle said. “We got to go present, and it was like this big conference room and all these [people] lined up in evaluation panels.”
Their prototype would later become PackBot, a sensor-packed robot that’s been deployed in Iraq and Afghanistan. It was used to search the debris of the World Trade Center after 9/11, and it helped scientists assess the damaged Fukushima nuclear power plant in the aftermath of the 2011 Tōhoku earthquake and tsunami. Its large caterpillar track treads, which human pilots control with video game-style joysticks and buttons, enable it to traverse mud, rocks, stairs, and other obstacles (including shallow pools of water) and climb an up to 60-degree incline. It can do all of this while carrying payloads weighing more than 40 pounds.
The selection committee wasn’t impressed by the team’s description. “They told us, ‘We gave you a chance with this robot, and there’s no way this design would actually work,'” recalled Angle. Anticipating their skepticism, iRobot’s engineers brought in a proof of concept — Urbie — constructed from lightweight, machined aluminum, with compartments for LEDs and cameras and a handheld remote control with an LCD screen.
“Everybody wanted to have a robot that would climb up the stairs like a human, but that costs 1,000 times more and is 10 times slower than what we did with treads. We showed [the panel how it] drove up the stairs, and that was the moment,” said Angle. “It was a $4 million [follow-on] contract, and this was a turning point for the company.”
In some respects, PackBot — now the domain of iRobot’s defense division, which the company sold to Arlington Capital Partners for $45 million in February 2016 — became a symbol of iRobot’s commitment to practical robots engineered with bankable applications in mind. “Being practical at the end of the day and focusing on a task you’re trying to accomplish is going to give you the best overall result,” said CTO Jones, who was involved in robotics research and development at the Artificial Intelligence Lab at the University of Zurich prior to joining iRobot in 2005. “This is as opposed to trying to build something that does many things.”
Sweeping carpets with AI
Enter Roomba. “When we started with Roomba, people didn’t even think it was a robot. It was an automatic vacuum,” explained Angle. “The mental image of how robots are going to vacuum was a humanoid pushing a manual upright vacuum, and that’s so profoundly wrong on many levels. It’s just about the most complicated, expensive way of creating a robot vacuum you can possibly imagine.”
Roomba’s form factor might not resemble Rosie from The Jetsons, but Angle asserts it’s ideally suited for ducking around tables and chairs because of its small size. The two-wheeled, disc-shaped autonomous vacuum can detect the presence of obstacles and sense steep drops (with cliff sensors) to keep it from falling down stairs or off tall balconies. Most models have a pair of brushes rotating in opposite directions and a horizontally mounted side-spinning brush that sweeps against walls, followed by a vacuum that directs airflow through a narrow slit.
Second- and third-generation Roomba models (the 400 series and 500 series) have a self-charging home base that they seek out at the end of each cleaning session via embedded infrared beacons. First-generation Roomba units lacked it, which, according to Angle, severely inhibited their autonomy. “Prior to the home base, the idea that the robot had to go out [and] do this job successfully every single time was nice, but not quite what we’d pictured in our heads,” he said. “Turning off the robot or letting it die is killing it — you have to go to it, pick it up, and move it back to its place. When you add the home base, you change the game.”
Early Roomba models were also relatively static in their approach to sweeping. They relied on iRobot’s in-house iAdapt Responsive Cleaning Technology, which stemmed from Brooks’ research: a set of heuristics and simple algorithms like spiral cleaning, room crossing, wall-following, and random-walk angle-changing triggered by collisions with walls and furniture. As a result, primitive Roomba units covered some areas more frequently than others and took several times longer to clean rooms than a human would.
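The random-walk behavior described above can be simulated in a toy model: drive straight until a wall is hit, then turn by a random angle and continue. This is a pedagogical sketch of the general technique, not iRobot's actual iAdapt code, and the square-room geometry is an assumption for illustration.

```python
import math
import random

# Toy bump-and-bounce cleaner in a square room: move straight until a "bump"
# (a wall), then pick a new random heading. Coverage grows, but unevenly --
# which is exactly why early random-walk Roombas recleaned some spots and
# took far longer than a human.

def random_bounce(steps, room=10.0, seed=0):
    rng = random.Random(seed)
    x, y, heading = room / 2, room / 2, 0.0
    cells = int(room)
    visited = set()
    for _ in range(steps):
        nx, ny = x + math.cos(heading), y + math.sin(heading)
        if 0.0 <= nx <= room and 0.0 <= ny <= room:
            x, y = nx, ny
        else:
            # "bump": don't move this step, just pick a new heading
            heading = rng.uniform(0.0, 2.0 * math.pi)
        visited.add((min(int(x), cells - 1), min(int(y), cells - 1)))
    return len(visited) / cells**2  # fraction of 1m grid cells touched

print(f"coverage after 500 steps: {random_bounce(500):.0%}")
```

Running it with ever more steps shows coverage creeping toward 100% only slowly — the inefficiency that the mapping-based navigation in later models was built to eliminate.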
Subsequent Roombas (the 600 series, 700 series, and 800 series) expedited dust busting with a forward-looking, obstacle-detecting infrared sensor. New sensors used odometry to infer distance traveled from wheel turns, and internal sensors identified particularly dirty spots on floors.
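Odometry of this kind is standard for two-wheeled robots: each wheel's travel comes from encoder ticks, and the difference between the wheels' travel gives the change in heading. The sketch below uses the textbook differential-drive model with hypothetical encoder and wheel dimensions — iRobot's actual firmware and constants are not public.

```python
import math

# Differential-drive odometry sketch (textbook model, assumed constants --
# not iRobot's firmware). Encoder ticks are converted to per-wheel travel;
# the average gives forward motion, the difference gives the heading change.

TICKS_PER_REV = 512      # hypothetical encoder resolution
WHEEL_DIAMETER = 0.072   # metres, hypothetical
WHEEL_BASE = 0.235       # metres between wheel centers, hypothetical

def odometry_step(x, y, theta, left_ticks, right_ticks):
    per_tick = math.pi * WHEEL_DIAMETER / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2          # forward travel
    d_theta = (d_right - d_left) / WHEEL_BASE  # heading change, radians
    # integrate at the midpoint heading for a slightly better estimate
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

# Equal tick counts -> straight-line motion along the current heading.
x, y, theta = odometry_step(0.0, 0.0, 0.0, 1024, 1024)
print(round(x, 4), round(y, 4), round(theta, 4))  # → 0.4524 0.0 0.0
```

Because each step compounds small errors, pure odometry drifts over time — one reason the later vSLAM work mattered so much.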
The most substantial hardware upgrade to date arrived in the Roomba 900 series, which added a visual simultaneous localization and mapping (vSLAM) system that generated pixel maps of unknown rooms and tracked the robot’s location within those rooms. iRobot wasn’t the first to market with a vSLAM-capable vacuum, but it laid the groundwork for improvements to come.
The advent of modern AI techniques has accelerated the pace of robotics innovation, particularly in computer vision, according to Jones. It’s the subfield that deals with how computers gain high-level understanding from images or videos, and it seeks to replicate digitally what the human visual cortex can do naturally.
“There’s been a step change of improvement … with deep learning,” said Jones, referring to the neural networks at the heart of most AI systems today. Deep neural networks consist of “neurons,” or mathematical functions loosely modeled after biological neurons, that are arranged in layers and connected by “synapses” that transmit signals to other neurons. Those signals — the product of input data — travel from layer to layer and slowly “tune” the network by adjusting the synaptic strength (weights) of each connection. Over time, the network extracts features from the data set and identifies cross-sample relationships, eventually learning to make predictions.
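The layers-and-weights mechanism Jones describes can be made concrete with a tiny network trained by gradient descent. This is a purely pedagogical sketch — the architecture, learning rate, and XOR task are assumptions for illustration, nothing like the large models used in production perception systems.

```python
import math
import random

# Minimal two-layer neural network trained by backpropagation to reduce
# error on XOR. Signals flow input -> hidden -> output, and each update
# nudges the connection weights -- the "tuning" described above.

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# 2 inputs -> 2 hidden neurons -> 1 output; each weight row includes a bias.
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(3)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    return h, sig(w2[0] * h[0] + w2[1] * h[1] + w2[2])

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

before = mse()
for _ in range(5000):  # gradient-descent training loop
    for x, t in data:
        h, o = forward(x)
        do = (o - t) * o * (1 - o)           # output-layer error signal
        for j in range(2):                   # backpropagate to hidden layer
            dh = do * w2[j] * h[j] * (1 - h[j])
            w1[j][0] -= 0.5 * dh * x[0]
            w1[j][1] -= 0.5 * dh * x[1]
            w1[j][2] -= 0.5 * dh             # hidden bias
            w2[j] -= 0.5 * do * h[j]
        w2[2] -= 0.5 * do                    # output bias
after = mse()
print(f"mean squared error: {before:.3f} -> {after:.3f}")
```

The error drops as the weights tune themselves to the data — the same principle, scaled up by many orders of magnitude, behind the deep-learning vision systems Jones credits.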
“The ability of the robots to visually perceive the world really forms the basis of how they’re able to kind of carry out the tasks you’re seeing [them complete] today,” continued Jones. “You can have a robot clean your kitchen autonomously or have a team of robots mop and vacuum your entire house. [Deep learning is] key to the innovations we’re showing in our latest products.”
To that end, iRobot’s Roomba i7, which debuted in late 2018, features an upgraded imaging sensor that retains a memory of the room maps it generates and an understanding of the spatial relationships among subroom visual markers. Owners can command the i7 via iRobot’s mobile app or an intelligent voice assistant (like Amazon’s Alexa or Google Assistant) to clean any room, and it’s able to localize itself even if you pick it up and move it to another place.
Getting the i7 to reliably figure out where it is in space was deceptively difficult, said Angle, because of home environments’ dynamic nature. For instance, the Roomba engineering team had to deal with changing light conditions, which still throw some Roomba units for a loop — according to iRobot, the 900 series’ vision-based system needs at least some light for autonomous navigation, and its range is severely limited in very low light.
“It [takes] a lot of AI to get the robot to figure out where [it] is in space, but it’s so important,” explained Angle. “Success isn’t cleaning a home twice in a row or even three times in a row. Success is doing it 20, 50, or 100 times in a row.”
With positional awareness more or less solved, the Roomba team moved to address another longstanding problem plaguing the Roomba lineup: a lack of interconnectivity with complementary products. The Roomba s9+ and the Braava jet m6, which were announced in May, tap iRobot’s Imprint Link Technology to communicate with each other wirelessly over the internet. First, the s9+ (or i7+, which is also supported) vacuums the floor in selected areas, and when it’s finished and docked, the Braava jet m6 mops those same floors with an electrostatic pad.
The s9+’s other innovations are a 3D sensor that helps it find its way around large spaces, along edges, and deep into corners, and an anti-allergen system that traps and locks pollen and mold allergens to keep them from escaping the robot or its dock. As for the Braava jet m6, it features a wet mopping mode designed to tackle sticky messes, grime, and kitchen grease.
Beyond robots that mop and sweep up dust from tiled floors, iRobot is dipping its toes into the autonomous lawnmower market with Terra, which features “state-of-the-art” mapping techniques borrowed from the company’s Roomba line. Using Imprint Smart Mapping, it’s able to “remember” its location in any yard it’s seen before and cut the grass in parallel back-and-forth lines, like a human would. And unlike most robot lawnmowers on the market today, which require boundary wires, Terra is able to suss out its location from beacons placed strategically around the yard.
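One generic way a robot can fix its position from beacons is trilateration: given ranges to three beacons at known locations, the circle equations reduce to a small linear system. The sketch below is the textbook method, not Terra's actual (unpublished) technique, and it assumes noise-free range measurements.

```python
import math

# 2D trilateration sketch: subtracting the first range-circle equation from
# the other two yields a linear 2x2 system in (x, y), solved by Cramer's rule.
# A generic textbook method, not iRobot's actual beacon localization.

def trilaterate(beacons, ranges):
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # beacons must not be collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
ranges = [math.dist(true_pos, b) for b in beacons]
print(trilaterate(beacons, ranges))  # ≈ (3.0, 4.0)
```

In practice range measurements are noisy, so a real system would fuse many readings (e.g. with a least-squares fit or a filter) rather than trust three exact circles.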
Like its vacuuming counterparts, Terra automatically returns to a home base to recharge before completing mowing assignments if the battery dips below a certain threshold. It sports a ruggedized exterior designed to protect against inclement weather, and you can adjust things like precision, grass height, and time-of-day preferences from the iRobot Home App.
Angle noted that, as previously announced, Terra will be available in Germany and as part of a beta program in the U.S. sometime later this year.
Home robot stasis
What lies beyond lawnmowers, mops, and vacuums for iRobot in the future? Angle declined to share specifics but said that dollars and cents will largely dictate the company’s product roadmap, as has been the case historically. “Ultimately, what the robotics industry needs most isn’t more roboticists, but better business models,” he said. “In addition to trying to solve hard problems … companies [like ours] need a consumer central to what [they’re] trying to do.”
Speaking to VentureBeat in an interview last year, Misty Robotics CEO Tim Enwall predicted that every home and office will have a robot within 20 years. But realists like Ken Goldberg, a professor at the University of California, Berkeley, anticipate that it’ll be 5-10 years before we see a mass-produced home robot that can pick up after kids, tidy furniture, prep meals, and carry out other domestic chores.
The consumer robotics market appears to be in a kind of stasis, in any case, exemplified by Bosch’s decision to dissolve home robot startup Mayfield Robotics; Honda’s cancellation of its Asimo program; and the shuttering of Anki Robotics, which had raised nearly $200 million in venture capital from big-name backers like J.P. Morgan and Andreessen Horowitz. In more bad news, Jibo, a startup developing a robot with a custom built-in assistant, shut down earlier this year, and seven months ago industrial robotics company Rethink Robotics closed its doors after trying unsuccessfully to find a buyer.
Robotics isn’t an easy pursuit at commercial scale, said Jones, who described it as an “art.” Despite the emergence of powerful robotics software development platforms like Microsoft’s Robotics Developer Studio and AWS RoboMaker, he says formidable supply chain challenges stand in the way of companies’ success.
“You have electrical, mechanical, software … and all that has to come together in a practical package that actually does something valuable, and getting those to work together efficiently and effectively is a challenge,” he said. “Every home is different — people interact with robots differently. It’s a tall order, and that’s why staying focused on practicality really matters.”
iRobot didn’t arrive at this philosophy overnight. Angle points to the Ava 500, a roving telepresence robot the company created in partnership with Cisco, as one example of an early misfire. Ava didn’t want for features — it sported a 21.5-inch HD screen, an iOS companion app that enabled operators to direct it to rooms or employees on a map, and integration with Cisco’s video collaboration platform — but according to Angle, its marketing failed to properly convey its capabilities.
“We went through a long list of all the things that it could do, which we quickly learned is a terrible way to sell a product,” he said. “Even if it did two out of the 10 things or even five out of the 10, people wouldn’t buy it because it wouldn’t do 10 out of the 10 things. There’s a whole separate, non-technologically driven side to help people understand what robots are meant to do.”
A ‘family’ of robots
That’s why both Angle and Jones believe a single home robot capable of doing it all — the sort that has dominated science fiction for decades — won’t be feasible in the foreseeable future. They instead predict that a family of machines will work together to perform individual chores like folding clothes, washing the dishes, and assisting older or disabled family members.
“The home can handle several different types of robot. You’re going to be able to buy them incrementally, each specialized to do a purpose really well, and there’s going to be some things where combining functionality into one robot makes sense,” explained Angle.
Paired with spatial knowledge gleaned from mapping data, these innovations could usher in robots whose behaviors take into account the presence or state of furniture, fixtures, electronics, carpets, and even people, according to Angle. “I don’t want the robot to come in when I’m watching TV — the robot needs to understand where the TV is, whether it’s on or off,” he said. “The robot should be able to discover that.”
Angle, perhaps anticipating questions about data privacy, asserted that this sharing paradigm would comply with legislation such as the General Data Protection Regulation (GDPR), the legal framework that requires businesses to protect the personal data and privacy of European Union citizens for transactions that occur within EU member states. Two years ago, Reuters erroneously reported that iRobot planned to sell maps to third parties like Amazon, Google, and Apple, which Angle says still isn’t in the cards. Instead, he surmises that iRobot will eventually reach a deal to share its maps for free with customers’ consent, extracting value by making its devices more useful in the home.
“Robots have this very unique and valuable position within [the home] … and it’s [with] that understanding of the home that we’re working with partners to develop insight that can improve the value of other products,” said Angle. “The [connected devices] industry is asymptotically hitting a ceiling, because so-called smart devices in the home don’t really understand anything about the home — they’re very narrow. Robots really have an opportunity to make smart homes smarter.”
He believes that in order to overcome this hump, voice assistants — which are fast becoming one of the most popular ways Roomba customers interact with their vacuums — must improve in their ability to parse intent rather than continue to rely on “magical phrases” like “Roomba, clean my living room.” Angle’s not alone in this: Amazon recently began piloting AI systems that guess which apps to launch from vague commands and attempt to resolve ambiguous requests.
“‘Roomba, vacuum the kitchen’ is pretty straightforward. It’s an utterance that people can remember, and that kind of works,” said Angle. “But when you’re dealing with more than one machine, it changes on an order of magnitude the complexity of the utterances that you may need to remember. [Voice assistants] should be able to understand a phrase like, ‘Well, can you vacuum the kitchen and then mop the den after I leave to go to the party?'”
He added: “[Amazon’s and Google’s] problem is that they’re trying to be everything to everyone, and what I’d like to do instead is to find a way to more richly interact with a more curated set of devices.”