Robotics is not a new technology, so what has scientists so excited about the field recently? Fruit. That's right: robots are coming to agriculture, and Agri-Tech East is bringing together scientists, industry and investors to explore their potential
The introduction of robots into daily life is edging closer to reality: a Global Market Insights report forecasts that the service robotics market will grow by over 17.8% between 2016 and 2022, reaching $21.7 billion.
“All of a sudden, robotics feels really exciting again. Instead of just getting 1% faster or more accurate year on year, it feels like we are on the edge of something revolutionary – the technology has got to the point where things which were pure science fiction five or ten years ago can now happen in the real world,” says Chris Roberts, Head of Industrial Robotics at Cambridge Consultants. The challenge now is to convince industry that the technology is sufficiently robust to move from lab to field, and secure the investment to go beyond prototypes. To facilitate this, membership organisation Agri-Tech East is bringing together scientists with industry and investors to explore the potential for robots in agriculture.
“Most robots can only cope with environments like a car production line, where everything is always exactly the same, in exactly the same place,” says Roberts. “Where I’m excited to be working is in adding intelligence to the system so it can better cope with real-world environments.” Roberts is testing robots within semi-structured environments, where the system encounters some variation. For example, Cambridge Consultants’ fruit-picker robot can pick up and move pieces of fruit, each slightly different from the last. The fruit-picker’s gripper has been developed to avoid bruising the fruit, using both machine vision and mechanics; delicate, precise manipulation is vital for wider application of robotics. Roberts explains the process: “First, we needed to accurately find the top of the fruit; this is achieved by combining an optical picture of the fruit with a depth map from Microsoft’s Kinect. Then, we can position the gripper so that the soft tips of the fingers make contact with the fruit. The gripper has been designed to allow individual articulation of the fingers so it can rest against fruit of different shapes and sizes. Use of multiple fingers allows distribution of the suction, so we’re not putting too much pressure on any individual part of the fruit.”
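The localisation step Roberts describes can be sketched in a few lines. The code below is a hypothetical illustration, not Cambridge Consultants' implementation: it assumes a segmentation mask of the fruit has already been extracted from the optical picture and aligned with the depth map, and simply finds the fruit pixel nearest the camera.

```python
import numpy as np

def find_fruit_top(fruit_mask, depth_map):
    """Locate the top of a fruit by combining an optical segmentation
    mask with a per-pixel depth map (e.g. from a Kinect).

    fruit_mask: 2D boolean array, True where the optical image shows fruit.
    depth_map:  2D float array of distances (metres) from the camera.
    Returns (row, col, depth) of the fruit pixel closest to the camera --
    the point a downward-facing gripper should descend towards.
    """
    masked = np.where(fruit_mask, depth_map, np.inf)  # ignore background
    idx = np.unravel_index(np.argmin(masked), masked.shape)
    return idx[0], idx[1], depth_map[idx]

# Toy example: a 3x3 scene where the fruit's centre pixel is nearest.
mask = np.array([[False, True, False],
                 [True,  True, True],
                 [False, True, False]])
depth = np.array([[1.00, 0.90, 1.00],
                  [0.95, 0.80, 0.95],
                  [1.00, 0.90, 1.00]])
row, col, d = find_fruit_top(mask, depth)  # centre pixel, 0.80 m away
```

For a camera looking down on the crop, the nearest in-mask pixel is a reasonable proxy for the top of the fruit, which is why combining the two data sources suffices here.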
Dr Andre Rosendo, a Research Associate at the Department of Engineering, University of Cambridge, says the movement of robots is restricted at present, as it is not yet possible to replicate how the muscles deform when walking. He says: “For manipulation, however, the movement happens at the tendon and joint level so we are able to create manipulators so soft humans would be unafraid of shaking hands with them. This would allow future robots to help elderly people from their beds, handle glasses as a bartender or interact with children.” At present, this soft-robotics approach is being used in a ‘Vege-bot’ – an iceberg lettuce-harvesting robot, developed with support from a regional producer, which handles and cuts lettuces with the same care that human harvesters adopt. Dr Rosendo says: “This robot can be applied to any ‘fragile’ produce, including strawberries and mushrooms too.”
In order to operate in a changing environment, Vege-bot is trained using reinforcement learning. Dr Rosendo says: “Harvesting a lettuce requires the right and left arm to move following a specific trajectory, and small deviations in this trajectory will result in either a lettuce with a flawed cut or no lettuce at all. In these cases the machine is rewarded negatively, and will make small modifications in the trajectories that its arms follow to increase its own reward. What looks like ‘intelligence’ is simple number crunching.”
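The learning loop Dr Rosendo describes – make small modifications to the arm trajectories, keeping those that increase a negative reward – can be illustrated with a simple stochastic hill climber. This is a toy sketch, not the Vege-bot's algorithm: the reward function and the trajectory encoding (a short list of waypoints) are invented for illustration.

```python
import random

def harvest_reward(trajectory, ideal):
    """Hypothetical reward: 0 for a perfect cut, increasingly negative
    as the arm's trajectory deviates from the ideal cutting path."""
    return -sum((t - i) ** 2 for t, i in zip(trajectory, ideal))

def improve_trajectory(trajectory, ideal, episodes=2000, step=0.05, seed=0):
    """Stochastic hill climbing: perturb the trajectory slightly and keep
    the perturbation only if the reward improves -- 'simple number
    crunching' rather than intelligence."""
    rng = random.Random(seed)
    best, best_r = list(trajectory), harvest_reward(trajectory, ideal)
    for _ in range(episodes):
        candidate = [w + rng.uniform(-step, step) for w in best]
        r = harvest_reward(candidate, ideal)
        if r > best_r:              # negative reward shrinks towards zero
            best, best_r = candidate, r
    return best, best_r

ideal = [0.0, 0.5, 1.0, 0.5, 0.0]   # target cutting path (made up)
start = [0.3, 0.2, 0.6, 0.9, 0.4]   # initial flawed trajectory
learned, reward = improve_trajectory(start, ideal)
```

Real systems use more sample-efficient reinforcement-learning methods, but the principle is the same: deviations are penalised, and the policy drifts towards trajectories that earn higher reward.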
Robots learning on their own, without human supervision and checking, are still a long way off. Professor Tom Duckett of the Lincoln Centre for Autonomous Systems, at the University of Lincoln, believes full automation is not necessarily desirable. “I would envisage a new generation of robotic helpers that can work alongside and assist their human counterparts, enabling them to be more productive and deliver the ‘sustainable intensification’ of agriculture while minimising the impact on the environment. “Smart robots that run on battery power rather than fossil fuels could also be part of the solution for a cleaner, greener future. The underpinning technologies for robotic perception, learning and action are already reaching the required level of maturity to leave our research laboratories and start working in challenging environments on the farm or in the factory,” he says.
One of these technologies – 3D imaging – is improving robotic spatial awareness. The Lincoln team has been working on 3D mapping techniques for improving the precision of agricultural sprayers, including the detection, mapping and quality analysis of crops to ensure operations happen at the right growth stage.
The process for harvesting broccoli is expensive and labour-intensive. However, using 3D technology, a robot can collect depth data in the field describing the individual broccoli heads. A classifier labels the clusters, passing their positions to a tracker, which creates a map with the broccoli heads showing up in bright red. Low-cost 3D cameras are also being used in robotic weeding machinery, enabling the robot to discriminate between weeds and the crop. While these 3D-vision robots have their precise tasks, Professor Duckett believes there is potential for multiple capabilities. “Already we can envisage agricultural robots that could perform multiple tasks, for example, if they had interchangeable tools for switching between tasks such as seeding, tillage, spraying and harvesting.
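The detect-cluster-classify pipeline described above can be sketched as follows. This is a toy reconstruction, not the Lincoln system: it greedily groups nearby 3D points and accepts a cluster as a broccoli head only if it has enough supporting points; a tracker would then place accepted heads on the field map. The radius and point-count thresholds are invented.

```python
import numpy as np

def detect_heads(points, head_radius=0.06, min_points=5):
    """Cluster 3D depth points (metres), then classify each cluster as a
    broccoli head if it has enough supporting points (a dense surface).
    Returns the centre of each accepted head."""
    clusters = []
    for p in points:
        for c in clusters:  # greedy nearest-cluster assignment
            if np.linalg.norm(p - np.mean(c, axis=0)) < head_radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [np.mean(c, axis=0) for c in clusters if len(c) >= min_points]

# Synthetic scene: two dense heads plus two stray noise points.
rng = np.random.default_rng(0)
head_a = rng.normal([0.0, 0.0, 1.0], 0.01, size=(8, 3))
head_b = rng.normal([0.5, 0.0, 1.0], 0.01, size=(8, 3))
noise = np.array([[1.5, 1.5, 1.0], [2.5, 2.5, 1.0]])
heads = detect_heads(np.vstack([head_a, head_b, noise]))  # two heads found
```

Production systems typically use a trained classifier on richer features (size, shape, texture) rather than a bare point count, but the flow – segment the depth data, label clusters, hand positions to a tracker – is the one described.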
You could also have robots for agriculture and food production that perform other useful tasks “on the side”, such as surveillance – keeping a watchful eye on crops, livestock and expensive farm machinery – while carrying out their primary duties on the farm or in the factory.” As the sophistication of robotics continues to improve, sensor fusion is playing a significant role in navigation. Autonomous robots require continuous information about their environment, gathered by sensors and processed by real-time software. Dr Kevin Rathbone, CEO of Cambridge-based robotics consultancy Robotae, has expertise in mechatronics, machine vision and machine learning. He explains: “Sensor fusion takes data from a number of sources, such as direction (magnetometer), speed (accelerometer), weight distribution (pressure sensors) and rotation (gyroscope), and combines the strengths of each to provide meaningful information for decision support.”
Drones are increasingly used in agriculture for imaging, and the next stage is to use them for precision application of pest control. The robot needs to be able to orientate the camera and take moving pictures without camera shake. To assist this, Robotae has developed sensor fusion algorithms for a brushless gimbal, which allows control of a camera mounted on a drone. Dr Rathbone says: “We developed control software to provide stabilisation and smooth motion to a brushless gimbal, to be used either on a drone or hand-held. Using a sensor fusion algorithm, we combined the gyro, accelerometer and magnetometer data to give a low-latency, low-drift attitude estimation.”
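A complementary filter is the textbook way to obtain the kind of low-latency, low-drift attitude estimate Dr Rathbone describes. Robotae's actual algorithm is not public, so the single-axis sketch below is only illustrative: the gyro is integrated for fast response, while a small fraction of the accelerometer-derived tilt angle is blended in at each step to cancel the gyro's drift.

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Minimal one-axis complementary filter.

    gyro_rates:   angular-rate samples in deg/s (responsive, but drifts)
    accel_angles: tilt angles in deg derived from gravity (no drift, noisy)
    dt:           sample period in seconds; alpha: gyro weighting
    Returns the filtered angle estimate at each step.
    """
    angle = accel_angles[0]
    estimates = []
    for rate, acc in zip(gyro_rates, accel_angles):
        # Integrate the gyro for low latency; lean slightly on the
        # accelerometer each step so bias cannot accumulate.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        estimates.append(angle)
    return estimates

# Stationary sensor with a biased gyro: the gyro falsely reads 1 deg/s.
# Pure integration would drift to 5 deg over 500 samples; the
# accelerometer term holds the estimate bounded near zero.
est = complementary_filter([1.0] * 500, [0.0] * 500)
```

In a full gimbal, the same blending idea extends to three axes, with the magnetometer playing the accelerometer's drift-correcting role for heading.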
This type of technology is now moving into consumer electronics, and Robotae has also supported the development of Motion Metrics Ltd’s ‘Carv’ – a digital ski coach that uses a smart boot insert to capture motion and pressure. Fusing data from a number of sources in real time, while a person is moving at speed, is technically challenging and a specialist area of mechatronics. Optimistic for the future, Dr Rathbone sees change as incremental: “The world won’t change overnight; instead we will see gradual development, with robots becoming increasingly visible in daily life as the cost of the technology falls. Businesses are taking interest, now seeing technologies as mature enough for application.” Many of the technologies behind the advancement of robotics are indeed mature: while there has been no big breakthrough in any one technology, each individual part has gradually evolved to become better and better. Experts agree that investment is the critical factor in closing the gap between laboratory prototypes and real-world application.
There are signs that this is happening: Cambridge Innovation Capital is part of a consortium that has invested over $20 million in Series A funding for Cambridge Medical Robotics, a company developing a surgeon’s assistant for keyhole surgery. Agri-Tech East sees demystifying robotics as an important step in gaining industry support. According to its director, Dr Belinda Clarke: “Many of our members are innovative producers of salads and other crops that are now harvested 24/7. There is also increasing interest in urban farming and year-round cultivation under LED lighting. “These situations create opportunities for higher levels of automation, and we see a number of technologies coming together to make this possible. Getting the scientists in the room with end users and investors is our contribution to making it happen.”