Researchers from MIT and Yale University have devised a method that lets robots pick out paper, plastic, and metal to sort recycling.
You might presume a trash-sorting robot would rely on computer vision to tell different materials apart. Machine sight is indeed the primary technology behind AMP Robotics' system, used at a Denver, Colorado recycling facility, as well as TrashBot and Oscar, AI-powered sorting trash cans sold for use in homes and commercial offices.
Instead, the RoCycle system relies solely on sensors and soft robotics to identify and sort paper, plastic, and metal through touch alone.
“Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance,” MIT professor Daniela Rus said in a statement provided to VentureBeat.
Relying on a purely optical object-sorting process introduces inaccuracy because material type is not a visual property, but a tactile one, researchers note in the paper “Automated Recycling Separation Enabled by Soft Robotic Material Classification.”
Amid the broader push for sustainability, durable versions of previously disposable items are increasingly common and visually indistinguishable from their disposable counterparts, such as plasticware with a metallic finish.
Researchers next plan to incorporate a camera and computer vision in conjunction with RoCycle’s sense of touch to improve its accuracy.
RoCycle was 85% accurate at identifying and sorting the three materials through touch alone when operating from a fixed position. Initial tests with 27 objects found it was 63% accurate when collecting items from a conveyor belt, the way trash and recycling are often sorted today.
RoCycle can be attached to any robotic arm. Its gripping appendage is made from auxetic materials, which get wider when pulled. The use of auxetics also allows the robotic hand to conform to an object's surface and to form twisted strands.
A sensor is first used to determine the size of an object, and pressure sensors measure how much force is exerted to grasp it. This information is in turn used to determine what kind of material the arm has picked up.
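The size-plus-force approach could be sketched as a simple stiffness heuristic: rigid objects like metal cans barely deform, so the gripper registers high pressure relative to object size, while compliant paper deforms easily. The function, sensor units, and thresholds below are all illustrative assumptions, not the published RoCycle method.

```python
def classify_material(size_mm: float, grip_pressure_kpa: float) -> str:
    """Guess a material from object size and the grip pressure measured.

    Hypothetical sketch: stiffer objects (metal) resist squeezing, so the
    pressure sensors read high for little deformation; compliant objects
    (paper) read low. Thresholds here are invented for illustration.
    """
    # Normalize pressure by size so large and small objects are comparable
    # (an assumed heuristic, not a published calibration).
    stiffness_proxy = grip_pressure_kpa / max(size_mm, 1.0)

    if stiffness_proxy > 2.0:      # rigid, e.g. a metal can
        return "metal"
    elif stiffness_proxy > 0.5:    # semi-rigid, e.g. a plastic bottle
        return "plastic"
    else:                          # compliant, e.g. paper or cardboard
        return "paper"
```

In practice a learned classifier over the raw sensor signals would replace these hand-set thresholds, but the sketch shows why material type is recoverable from touch when it is invisible to a camera.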
The research, supported by Amazon, JD, the Toyota Research Institute, and the National Science Foundation (NSF), will be presented later this month at the IEEE International Conference on Soft Robotics in Seoul, South Korea.
Properly sorting and disposing of trash and recycling has become an even bigger priority since 2017, when China announced it would no longer accept imports of plastic waste for recycling.
Soft robotics has been used to create elastic machines capable of wiggling into tight spaces and is being considered as a way to help care for the elderly.
MIT researchers have introduced a number of novel approaches to robotics in recent months, including a gripper attached to a robotic arm, inspired by origami, that can pick up objects 120 times its weight, and methods for teaching a robot to pick up objects it's never seen before. Last year, MIT researchers also devised ways to control robots with your brain.