A University of California, Berkeley robotics lab is developing AI systems for polyculture gardening as part of AlphaGarden, an AUTOLAB project exploring whether humans can train robotic control systems to fully automate a polyculture garden of edible plants and invasive species. The AUTOLAB robotics laboratory is perhaps best known for creating the Dex-Net system for robotic grasping.
AUTOLAB director Ken Goldberg said the goal is to find out if AI can learn a function as complex as polyculture gardening, or farming with multiple species of plants growing alongside one another, rather than monoculture, the single-crop strategy commonly practiced today.
“I think that’s an open question. I don’t know if we can,” he said. “It certainly might be interesting to be able to have a fully automated garden. In my own view it’s probably unlikely to be viable as a real functioning productive garden. I think that it’s going to be very hard to learn, and that’s the art side of the lesson, which is that nature is very complex, and that we can put some very complex machinery on it, but it’s not going to necessarily open up and be controllable.”
At launch, eight students in the AlphaGarden Collective pruned and planted alongside a robotic FarmBot Genesis system that automates water dispensing inside a greenhouse at UC Berkeley. With COVID-19 forcing the school to close, students will now focus on polyculture garden simulations and models instead of working in the greenhouse.
Participants still want to learn from the real-life garden, because simulations can only approximate reality, and polyculture gardens are unpredictable.
“For every real garden, we have 100,000 or millions of gardens that can be generated,” Goldberg said. “This runs at 100,000 times faster than nature so you can accelerate time dramatically, and for each one you can say, ‘Well, if I tweak these parameters in my control policy, here’s what the outcome will be in terms of how often you water, in what conditions you water, etc.'”
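The policy sweep Goldberg describes can be sketched in a few lines. The growth model, plant counts, and watering parameters below are all illustrative assumptions, not AlphaGarden's actual simulator; the sketch only shows how thousands of simulated gardens let you compare watering policies far faster than a real growing cycle would.

```python
import random

def simulate_garden(water_interval, water_amount, days=60, n_plants=8, seed=0):
    """Return mean final plant health under one watering policy.

    A toy model: plants grow best at moderate soil moisture; too dry
    or waterlogged slows growth. All constants are made up.
    """
    rng = random.Random(seed)
    moisture = 0.5
    health = [0.5] * n_plants
    for day in range(days):
        if day % water_interval == 0:
            moisture = min(1.0, moisture + water_amount)
        # Daily evaporation with a little randomness standing in for weather.
        moisture = max(0.0, moisture - 0.1 - rng.uniform(0, 0.05))
        for i in range(n_plants):
            delta = 0.05 - abs(moisture - 0.6) * 0.1
            health[i] = min(1.0, max(0.0, health[i] + delta))
    return sum(health) / n_plants

def sweep(policies, n_gardens=100):
    """Average each policy's outcome over many randomized gardens."""
    return {
        p: sum(simulate_garden(*p, seed=s) for s in range(n_gardens)) / n_gardens
        for p in policies
    }

# Three hypothetical policies: (days between waterings, amount per watering).
results = sweep([(1, 0.2), (3, 0.5), (7, 0.9)])
best_policy = max(results, key=results.get)
```

Because each simulated season takes milliseconds rather than months, a sweep like this can score every candidate policy across many randomized gardens before anything is tried on real plants.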
AlphaGarden Collective members started the first growing cycle on January 1 and, due to limited access to the university greenhouse, plan to begin the second cycle in April or May. The project will continue for the coming two to three years, he said.
Goldberg says AlphaGarden is both an art and science installation meant to contrast the complexity of nature with the complexity of AI.
“AI is incredibly complex — you’re throwing a lot of technology and theory and processing at these problems — but when faced with the complexity of just a single polyculture garden, it’s meeting its match because gardens are incredibly complex,” he said.
AlphaGarden is part of The Question of Intelligence, an exhibit of more than a dozen projects at The New School in New York City that examines the contrast between human and machine learning and the impact of automation on human senses. The exhibit was initially scheduled to run through April but is now closed as a result of the coronavirus pandemic.
Harvard MetaLAB senior researcher Sarah Newman, an advisor to the project, called AlphaGarden a study of the nature of diversity and an exploration of AI's limitations in the context of ecology and sustainability.
“AlphaGarden foregrounds the beauty of nature and exposes the limitations of AI and robots,” she said. “There will always be a distance between simulation and reality.”
AlphaGarden resembles TeleGarden, an online gardening project Goldberg led from 1995 to 2004. With AlphaGarden, despite limited access at the moment, a time-lapse visualization is updated daily to show progress.
AlphaGarden uses FarmBot hardware mounted on an overhead gantry, together with cameras for data collection and sensors for measurements like soil moisture.
“We just took that [FarmBot] off the shelf, but where we’re coming in is by putting a camera way up top overhead, and that’s the global image that you see from a bird’s-eye view; then we’re basically taking images every day, and we’re basically trying to monitor the state of the garden every single day so that we can see how it evolves, and then start to understand what the effect of actions are,” Goldberg said.
AlphaGarden also builds on the PlantCV computer vision system for leaf identification to recognize specific plants in the garden of herbs and vegetation. Such leaf identification systems can be used by plant growers to monitor growth with plant phenotyping.
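The day-over-day monitoring idea can be illustrated with a deliberately crude stand-in: estimate how much of the bed is covered by plants by counting green-dominant pixels in an overhead image. Real pipelines such as PlantCV use far more sophisticated segmentation and per-leaf phenotyping; the threshold and pixel format below are assumptions for illustration only.

```python
def plant_coverage(image):
    """Estimate plant coverage from an overhead RGB image.

    image: list of rows, each row a list of (r, g, b) tuples in 0-255.
    Returns the fraction of pixels where green dominates red and blue --
    a crude proxy for "plant" versus "soil".
    """
    total = green = 0
    for row in image:
        for r, g, b in row:
            total += 1
            if g > r and g > b and g > 60:  # crude "plant-ish" pixel test
                green += 1
    return green / total if total else 0.0

# Tiny synthetic frame: hypothetical soil and leaf colors.
soil = (120, 90, 60)
leaf = (40, 160, 50)
frame = [[leaf, leaf, soil],
         [soil, leaf, soil]]
coverage = plant_coverage(frame)  # 3 of 6 pixels are green-dominant -> 0.5
```

Logging a number like this from each day's bird's-eye image yields a rough growth curve, which is the starting point for relating watering and pruning actions to outcomes.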
Demos of the robot, simulations, and computer vision system in action are available on alphagarden.org.