Friday, 14 December 2018
Could robots that learn by imitation soon be replacing humans in some of the repetitive and manual tasks in horticulture? Aran Sena, a PhD student in the Department of Informatics at King's College London, believes so.
“The challenge of recruiting and retaining labour for horticultural businesses is more acute than ever before. Greater levels of robotics and automation have long been heralded as one of the potential answers to this challenge, but this can only happen if we’re able to overcome some of the unique technical challenges the industry presents.
Some forms of automation, such as tray-filling and seeding machines, are already familiar, and self-driving tractors are more than a pipe dream. When it comes to handling the crops themselves, however, progress in robotic manipulation has been much slower.
When you work with plants, you do so without having to think about how you move your hand. But it’s the very nature of plant material that makes grasping it with a robotic ‘hand’ so difficult. The variation in plant ‘architecture’ – the number of leaves, buds or fruits, for example – and the relatively fragile nature of the tissues all contribute to making plant material an extremely challenging target for robotics.
Robots work best in environments with a high degree of certainty – knowing exactly where to find a machine component or how hard it can press against a surface – but it’s rarely so clear-cut with plants.
A new generation of so-called ‘collaborative’ robots could offer some of the solutions to these kinds of horticultural operations. Collaborative robots are designed to be inherently safe to work alongside people, and to allow people without a programming background, or any specialist robotics knowledge, to quickly and easily set them up to undertake simple tasks and reconfigure them later to do a different job.
With ‘imitation learning’ the grower would perform the task they want automated, while the robot monitors their actions, through its vision system and through being connected to sensors embedded in a ‘smart glove’ worn by the grower.
This process is split into three stages, starting with the collection of sensor data as the user performs the task – this includes data describing movements, interaction forces, visual records, the location of objects the user is working on, and so on. This gives the robot a lot of data, but at this stage it’s essentially just a string of 1s and 0s.
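As an illustration, the collection stage could be sketched in Python. The sensor names, values and sampling scheme below are purely hypothetical, not the actual Growbot data format:

```python
import time
from dataclasses import dataclass, field

@dataclass
class DemonstrationLog:
    """One recorded demonstration: time-stamped sensor samples."""
    samples: list = field(default_factory=list)

    def record(self, hand_pose, grip_force, object_position):
        # Each sample pairs a timestamp with the raw sensor readings
        self.samples.append({
            "t": time.time(),
            "hand_pose": hand_pose,              # e.g. (x, y, z) from the smart glove
            "grip_force": grip_force,            # glove force sensor, newtons
            "object_position": object_position,  # from the robot's vision system
        })

# Hypothetical recording of a short pot-lifting demonstration
log = DemonstrationLog()
log.record(hand_pose=(0.10, 0.00, 0.05), grip_force=0.0, object_position=(0.10, 0.00, 0.00))
log.record(hand_pose=(0.10, 0.00, 0.02), grip_force=2.5, object_position=(0.10, 0.00, 0.00))
log.record(hand_pose=(0.10, 0.00, 0.20), grip_force=2.5, object_position=(0.10, 0.00, 0.18))
print(len(log.samples))  # 3 samples of raw data
```

At this point the log is just numbers; it carries no notion yet of what the user was trying to achieve.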
To get useful information from it about a task, we next have an ‘inference stage’, in which the robot’s machine-learning algorithms extract the task information from the raw data. This is a way of getting the robot to ‘understand’ what the user intended from their demonstrations, so it will know what to do when it encounters a particular situation – for example, when a pot is placed in front of it, it should pick it up and transfer it to a tray.
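A minimal sketch of such an inference step, assuming the demonstrations have already been reduced to (hand pose, object position) pairs, and simplifying the learned task to an average grasp offset from the object (a real learning system would extract far richer structure):

```python
def infer_grasp_offset(demonstrations):
    """Estimate where to grasp, relative to the object, from demonstrations.

    Each demonstration is a list of (hand_pose, object_position) pairs; we
    take the hand pose at the moment of grasping (here: the final sample)
    and average its offset from the object across all demonstrations.
    """
    offsets = []
    for demo in demonstrations:
        hand, obj = demo[-1]
        offsets.append(tuple(h - o for h, o in zip(hand, obj)))
    n = len(offsets)
    return tuple(sum(o[i] for o in offsets) / n for i in range(3))

# Two hypothetical demonstrations of picking up a pot from different spots
demos = [
    [((0.10, 0.00, 0.30), (0.10, 0.00, 0.00)), ((0.10, 0.00, 0.02), (0.10, 0.00, 0.00))],
    [((0.30, 0.20, 0.30), (0.30, 0.20, 0.00)), ((0.30, 0.20, 0.02), (0.30, 0.20, 0.00))],
]
offset = infer_grasp_offset(demos)
print(offset)  # (0.0, 0.0, 0.02): grasp 2 cm above the pot base

# The inferred rule now generalises to a pot in a location never demonstrated
new_pot = (0.50, -0.10, 0.00)
grasp = tuple(p + o for p, o in zip(new_pot, offset))
```

The key point is that the robot stores the relationship between hand and object, not the raw coordinates of any one demonstration.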
Finally, having determined the appropriate actions to take for particular situations, the system must translate this into action. It must move swiftly, safely and accurately to execute the learned action effectively. Modern controllers for collaborative robots can allow the rigid system to behave in a compliant, or “soft”, manner, which greatly reduces the risk involved when there are unexpected collisions with the surroundings. The overall performance of the system does, however, depend on the quality of the data provided to it by the person.
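The compliant behaviour described above is commonly achieved with an impedance-style control law, in which the arm is commanded through a virtual spring and damper rather than a rigid position demand. A one-dimensional sketch (the gains are illustrative, not tuned values for any real robot):

```python
def compliant_force(position, target, velocity, stiffness=50.0, damping=10.0):
    """Virtual spring-damper law behind impedance control.

    Low stiffness lets the arm 'give way' when it meets unexpected contact,
    instead of driving through it the way a rigid position controller would.
    """
    return stiffness * (target - position) - damping * velocity

# Far from the target, the controller pulls the arm towards it
assert compliant_force(position=0.0, target=0.1, velocity=0.0) > 0

# An obstacle pushing the arm off course meets only a bounded restoring
# force, proportional to the displacement, so collisions stay gentle
assert compliant_force(position=0.2, target=0.1, velocity=0.0) < 0
```

Lowering the stiffness parameter trades tracking accuracy for safety, which is exactly the trade-off collaborative robots are built around.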
While imitation learning presents a promising mechanism for transferring human skills to a robot, a question central to my research is whether people such as growers, who have no expertise in robotics or programming, are able to give demonstrations successfully to such a system.
When you perform the task during the collection phase, you’re ‘teaching’ the robot: not just showing it how the task is done in one case, but in a range of scenarios. For example, to demonstrate how to pick up a pot you would show the robot how to do so from various locations, enabling it to deal with situations where the pot might not be placed in front of it in exactly the same spot each time.
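One reason varied demonstrations matter: by comparing them, the robot can work out which parts of the motion the teacher kept deliberately consistent and which simply varied with the scenario. A common idea in the imitation-learning literature is to look at variance across demonstrations; a small sketch with made-up numbers:

```python
def constrained_dimensions(grasp_offsets, tolerance=1e-5):
    """Flag dimensions the teacher kept consistent across demonstrations.

    Dimensions whose grasp offset barely varies between demonstrations are
    treated as deliberate (e.g. always grasp 2 cm above the pot base), while
    high-variance dimensions are left free for the robot to choose.
    """
    n = len(grasp_offsets)
    flags = []
    for d in range(len(grasp_offsets[0])):
        values = [o[d] for o in grasp_offsets]
        mean = sum(values) / n
        variance = sum((v - mean) ** 2 for v in values) / n
        flags.append(variance < tolerance)
    return flags

# Hypothetical grasp offsets from three demonstrations: x and y vary with
# where the pot was placed, but z is always about 2 cm above the base
offsets = [(0.01, -0.03, 0.020), (-0.02, 0.04, 0.021), (0.03, 0.01, 0.019)]
print(constrained_dimensions(offsets))  # [False, False, True], only z is constrained
```

With only one demonstration, every dimension would look equally important, which is why a single example teaches the robot so much less than a varied set.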
We need to be able to guide people toward good teaching practices if these learning systems are going to be used by growers directly. Initial research was conducted with volunteers from J&A Growers to try to better understand how those who are not robotics experts interact with robot systems. I’ve also been developing our prototype system using a commercially available industrial collaborative robot with trials conducted on commercial nurseries.”
Aran’s PhD project is funded by AHDB and he will be speaking at the SmartHort 2019 conference. To find out more about the conference, please click here.
The latest reports for the research project 'Growbot: a grower-reprogrammable robot for ornamental plant production tasks' are available here.