A helping hand
07/10/16 | Science & Technology
Ocado Technology is leading an ambitious EU project to develop the world's most advanced collaborative robots. SecondHands co-ordinator Dr Graham Deacon tells PEN more
Warehouses up and down the UK could soon look very different if British innovator Ocado has anything to do with it. The technology division of the online giant, Ocado Technology, is spearheading an ambitious five-year (2015-2020) project to develop an autonomous humanoid robot that will provide its maintenance technicians with a literal second pair of hands. It’s being joined in this endeavour by four European research institutions and universities with funding from the EU’s research and innovation programme Horizon 2020.
The SecondHands robot – which will hopefully be integrated into Ocado’s warehouses by 2020 – will combine artificial intelligence (AI) and machine learning with advanced vision systems and human-like flexibility in order to reliably and adequately assist (but not replace) human workers performing routine and preventative maintenance in real time. It is expected that the robot will be able to both predict when its help would be useful and know which actions to take to provide assistance – increasing both safety and efficiency on the warehouse floor.
Speaking to PEN, project co-ordinator Dr Graham Deacon, Ocado Technology’s robotics research team leader, discusses the impact of assistive robotics on maintenance technicians, shares his thoughts on Horizon 2020, and reflects on some of the challenges involved in building one of the world’s most advanced collaborative robots.
What are some of the main features of the SecondHands robot?
The objective of the SecondHands robot is to proactively offer assistance to the maintenance technicians in the Ocado warehouses who have to repair and maintain the warehouse automation. Because of the scientific complexity involved in proactively offering assistance, the capabilities we are endowing the robot with are limited to passing and collecting tools and fasteners (such as screws), holding and collaboratively moving safety guards with the technician, and performing other simple tasks. The robot, which is being built by the Karlsruhe Institute of Technology (KIT) in Germany, will therefore need two arms with hands; the ability to move and interact with the workspace and the human; and the ability to perceive the environment, and its interaction with both the environment and the technician, through an array of sensors and a natural language dialogue system. In addition to KIT, the universities contributing to the groundbreaking science include École polytechnique fédérale de Lausanne (Switzerland), University College London (UK) and the Sapienza University of Rome (Italy).
What impact might the use of such an assistive robot have in terms of jobs and the workforce?
The SecondHands robot has been designed to help technicians perform their job more effectively and to reduce the need for the technicians to perform tasks that could result in injury. The intention is for the robot to help people in the tasks they perform; the people are essential to the robot's effectiveness, and the goal is to create a partnership that is more effective than the sum of its parts. Not only is this intended to improve Ocado's operational efficiency by raising productivity, but it would also allow the technicians to focus more on the detail of the task, resulting in a higher quality of work. Furthermore, the robot could be equipped to detect the indicators of the onset of a problem that a human simply cannot perceive due to biological limitations; for example, the robot could be endowed with thermal imaging to see components that are operating outside their normal range.
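The thermal-imaging idea above can be reduced to a simple check: compare each component's reading against its expected operating range and report anything outside it. The following is an illustrative sketch only; the component names, temperature ranges, and function names are hypothetical examples, not part of the actual SecondHands system.

```python
# Hypothetical sketch: flag components whose thermal readings fall outside
# their normal operating range, so they can be reported to a technician.

NORMAL_RANGES = {
    # component: (min_celsius, max_celsius) -- illustrative values only
    "drive_motor": (10.0, 70.0),
    "conveyor_bearing": (5.0, 55.0),
    "control_cabinet": (15.0, 45.0),
}

def out_of_range(component: str, temp_c: float) -> bool:
    """Return True if a thermal reading is outside the component's
    expected operating range."""
    lo, hi = NORMAL_RANGES[component]
    return not (lo <= temp_c <= hi)

def flag_anomalies(readings: dict[str, float]) -> list[str]:
    """Return the known components whose readings look anomalous."""
    return [c for c, t in readings.items()
            if c in NORMAL_RANGES and out_of_range(c, t)]
```

For example, `flag_anomalies({"drive_motor": 82.0, "conveyor_bearing": 30.0})` would flag only the overheating motor. A production system would of course use trend analysis rather than fixed thresholds.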
What are some of the main challenges involved in the design and building of such a robot that can work effectively alongside humans and machines in an industrial environment?
The challenges of producing a robot, with even the simple skill set that we aim for, are hugely complex:
- We need reliable visual perception of the environment, which is dynamic, full of highly reflective metal surfaces, and subject to significant variations in lighting;
- There will be a dialogue system so that the robot can be given commands by a technician and in return can either confirm the course of action it is taking or initiate a dialogue to ask which course of action the technician would prefer. The speech understanding system needs to work despite variations in technicians' speech patterns and turns of phrase, and against an unpredictably noisy background;
- The cognitive system needs, in effect, to learn knowledge. This will be a very basic level of knowledge compared to a human's, but even so this type of capability will be a necessary keystone in the development of robots that help people and interact with them in their world. The robot has to have an internal conceptualisation, or model, of the task; it has to be able to track what the technician is doing and determine how that relates to the model; and it has to do this sufficiently quickly, and in such a way, that it can accurately predict what tool the technician will need next and then plan to offer assistance. This is truly state of the art; and
- The robot will need cutting-edge manipulation and grasping capabilities, so that it can work collaboratively with the technician to complete a task, such as jointly moving a safety guard. The motion and speed of the robot need to be sufficiently human-like that the human really feels helped by the robot rather than hindered by it, whilst at the same time the robot needs to be intrinsically safe during interactions.
It is perhaps fair to say that AI is at present not that ‘intelligent’. Where might research look to advance this, and how far will you take into account the ethics of doing so?
It all depends on what one means by intelligent and one's frame of reference for observation. AI can be very sophisticated: it can yield insights into complex problems involving vast amounts of data, and it can be developed to make decisions based on statistical analysis of a collection of input parameters. But it is not self-aware, and it is still bound by the limitations of the rules that are programmed into it and the data that it is trained with.
As touched on above, one of the key scientific challenges of this project is to develop a highly sophisticated AI system that can learn in a generalised way how to support a maintenance technician in a small number of activities. This type of capability is a key step towards the development of the level of sophistication in robots that we see in the movies, and it will be a long road to travel with many steps.
However, this is still based on code written by people, which the computational system is forced to execute; the programme is not self-aware, and there is no scope in the project to explore this. Therefore, no significant research is being done in SecondHands into the ethics of truly intelligent systems that comprehend.
Horizon 2020 includes one of the world’s largest civilian robotics programmes – how would you assess support and funding for robotics research and innovation in Europe?
World-leading. The Horizon 2020 funding stream is an amazing way of stimulating collaboration and research between academics and industrial partners to solve real problems and make a difference to the economy of Europe. The field of robotics is so complex and diverse that we need to be closely joined with academic experts all across Europe to remain the world leaders that we are. And we are very lucky to be working with some of the best on this project and on our other Horizon 2020 project, Soma.
Dr Graham Deacon
SecondHands Project Co-ordinator
Robotics Research Team Leader
This article first appeared in issue 20 of Pan European Networks: Science & Technology.