Sometimes robots replace human workers, but other times they augment those workers instead. At Amazon’s re:MARS 2019 conference, a quartet of companies showed off a set of telerobotic arms and hands that let a human operator “feel” what the robot hands were touching. The idea is that the robotic limbs and digits become essentially an avatar for the human operator, who may be standing right there (as was the case with the show floor demo), or may be operating remotely thousands of miles away.
Four-part robotic harmony
The system on display was curiously unnamed, but it’s the result of four companies bringing their technologies together.
It uses SynTouch’s BioTac biomimetic tactile sensor, which is designed to work like a human fingertip; Shadow Robot Company’s dexterous hand; and haptic feedback and advanced motion capture technology from HaptX. Demonstrators were able to pick up and move a small ball, stack rings on a toddler toy, and even stack red Solo cups.
The biomimetic sensor imitates the sensation of human touch, where the robot fingertips interact with objects and surfaces. The company website describes the sensor as using “an elastic liquid-filled skin” on top of a firm core, which could just as well be an apt yet slightly twisted description of human skin, flesh, and bone. The sensor can detect force, vibration, and temperature, with all the key electronics protected inside of the core.
HaptX’s technology transmits a simulated sensation to the gloves that are being worn by the human operator. This is married to the dexterity afforded by the hand itself, which rounds out the human-ish feeling of being able to grasp and manipulate objects.
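The pipeline described above — tactile readings flowing from the robot hand back to actuators in the operator's glove — can be sketched in a few lines. To be clear, everything below (the class, function, and field names, and the force-scaling constant) is a hypothetical illustration of the general idea, not any vendor's actual API.

```python
from dataclasses import dataclass

# Hypothetical sketch of the robot-to-operator half of a teleoperation
# feedback loop; no names here come from SynTouch, Shadow Robot, or HaptX.

@dataclass
class TactileReading:
    force_n: float        # contact force, newtons
    vibration_hz: float   # dominant vibration frequency
    temperature_c: float  # sensed surface temperature

def to_glove_pressure(reading: TactileReading, max_force_n: float = 20.0) -> float:
    """Map sensed contact force to a 0..1 actuator pressure, clamped."""
    return max(0.0, min(1.0, reading.force_n / max_force_n))

def feedback_step(reading: TactileReading) -> dict:
    """One tick: translate a sensor reading into glove actuation targets."""
    return {
        "pressure": to_glove_pressure(reading),
        "thermal": reading.temperature_c,
        "vibrotactile": reading.vibration_hz,
    }

if __name__ == "__main__":
    grip = TactileReading(force_n=5.0, vibration_hz=250.0, temperature_c=31.5)
    print(feedback_step(grip))
```

In a real system this loop would run continuously over a network link, which is where the latency noted below comes from.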
There’s clearly a certain amount of latency, and the dexterity looks — at least for novice users — to be about on par with that of a pre-kindergarten child.
It’s obvious how the aforementioned technologies work together, but it’s less apparent how the fourth company, Japan-based ANA Holdings, Inc., contributes. A press release vaguely stated that the system is “united by the ingenuity of ANA,” but it didn’t elaborate further. ANA is an airline, but in March 2018 it launched its ANA Avatar Division, which the company describes as “a multi-faceted breakthrough endeavor to advance and pioneer real-world avatar technologies.” It’s meant to incorporate “robotics, haptics, AR/VR, and AI to enable humanity to instantaneously teleport their presence, consciousness, knowledge and skills to a remote location.” It appears that ANA’s contributions thus far have been less technical and more financial.
It’s perhaps strange that an airline would dive into telerobotics, telepresence, and teleoperation, but it’s certainly less confounding than an online bookseller’s founder diverting his attention to lunar colonization.
Here, there, and anywhere
One of the most compelling aspects of telerobotics is that its goal isn’t necessarily to replace humans with robots, but to provide new ways for humans to perform tasks remotely. In that sense, the “avatar” metaphor is ideal, particularly in the case of this demo from re:MARS, which not only allows a human to control a robot from afar but also to feel what it’s touching. Crucial information, like temperature or vibration, reaches the operator almost immediately and intuitively.
The whole package reframes how some industry jobs will work. As the internet and the ocean of services built on top of it over the decades have made remote work both commonplace and practical, telerobotics could make geographic location irrelevant for a new wave of jobs. A person could work from home but use the avatar system to dial in to a telerobotic system anywhere in the world and perform key tasks. For example, if a given task required specialized expertise, a company may need only one operator with that knowledge and wouldn’t have to fly them all over the world to perform it.
Telerobotic systems are also ideal for work that requires a human touch but superhuman strength, as well as for tasks that are too dangerous for a human to be in the immediate proximity, like defusing a bomb.
Ironically, a side effect of feeling in your own body what a robot hand feels can be a more human experience. Touch isn’t merely informational — hot, cold, pointy, smooth, and so on — it’s connective. A mundane action, like gingerly picking up an incubated egg without breaking it, is one thing, but feeling the embryonic animal moving inside is quite another.
The companies did not announce further specific plans for a rollout.