Cobot programming for collaborative industrial tasks: An overview

https://doi.org/10.1016/j.robot.2019.03.003

Abstract

Collaborative robots (cobots) have been increasingly adopted in industry to facilitate human–robot collaboration. Despite this, programming cobots for collaborative industrial tasks remains challenging, as the programming has two distinct elements that are difficult to implement: (1) an intuitive element, to ensure that the operations of a cobot can be composed or altered dynamically by an operator, and (2) a human-aware element, to support cobots in producing flexible and adaptive behaviours dependent on their human partners. Research efforts on this subject have emerged recently, but a systematic summary is lacking. In this paper, an overview of collaborative industrial scenarios and of the programming requirements cobots must meet to collaborate effectively is given. Then, detailed reviews of cobot programming, categorised into communication, optimisation and learning, are conducted. Additionally, a significant gap between cobot programming as implemented in industry and in research is identified, and research working towards bridging this gap is pinpointed. Finally, future directions for cobots in industrial collaborative scenarios are outlined, including potential points of extension and improvement.

Introduction

Manufacturing in the Industry 4.0 era necessitates rapid, proactive responses to ever-changing consumer demands. This has led to a trend of mass customisation, where certain aspects of a product, and hence of its manufacturing processes, are tailored to the requirements of individual customers. Meanwhile, manufacturers need to continuously improve sustainability, production efficiency and quality throughout the product life cycle to maintain their competitive edge. Industrial automation is capable of maintaining high efficiency and repeatability for mass production, but it lacks the flexibility to deal with the workspace uncertainties that mass customisation introduces. Humans, in such situations, can handle uncertainty and variability, but they are restricted by their physical capabilities in terms of repeatability, strength, stamina, speed, etc. [1]. These limitations often result in reduced efficiency and quality [2]. A balance of automation and flexibility is thus required to achieve these overarching manufacturing goals during mass customisation, which motivates research into combining the benefits of automation and manual labour. This research has culminated in Human–Robot Collaboration (HRC), a promising robotics discipline focused on enabling robots and humans to operate jointly to complete collaborative tasks.

HRC refers to application scenarios where a robot, usually a collaborative robot (cobot), and a human occupy the same workspace and interact to accomplish collaborative tasks [3]. Following the introduction of the UR5, a cobot produced by Universal Robots in 2008 [4], industrial interest in applying HRC and cobots on factory floors has escalated. Many other robotics manufacturers, such as KUKA, ABB and Rethink Robotics, have also developed their own cobots, each tackling a particular niche. An overview of cobots on the market is given in [5], comparing their costs, payload capabilities and safety features. Since cobots are built for close-proximity interaction with humans, they must adhere to stringent safety requirements, such as power and speed limiting, soft padding and the absence of trap points (i.e. points that can trap body parts or clothing) (ISO/TS 15066 [6]). Given these distinguishing characteristics over regular industrial robots, cobots are envisioned to pave the way for mass customisation, decrease required floor space, increase product quality and production efficiency, and improve working conditions for humans [1], [7].

Relevant research in HRC and cobots has revolved around enhancing particular enabling functions such as visual perception, action recognition, intent prediction and safe on-line motion planning. These technologies enable human-awareness, which results in flexible cobot behaviour as opposed to traditional fixed action-sequence cobot programs. Another line of research has revolved around Learning from Demonstration (LfD), Reinforcement Learning (RL), human–robot communication, collaborative task semantics, etc. These fields enable intuitive cobot programming, allowing non-expert operators to create and alter robot programs quickly and intuitively. This paper explores the union of these two research directions, which yields human-aware, intuitive robot programs for collaborative industrial tasks. Imbuing cobots with flexibility, reliability and autonomy indeed remains a persistent research bottleneck for HRC scenarios in industry and elsewhere.

It is noted that several papers have provided related literature reviews, such as on HRC applications [8], methods for safe HRC [9] and more specific topics such as LfD [10], [11], gesture recognition [12] and Augmented Reality [13]. Bauer et al. reviewed the technologies enabling HRC such as machine learning, action planning and intention estimation [14], which are all relevant to the programming of cobots. However, their review only covers works until 2008, prompting an update considering the surge of relevant recent works.

The goal of this paper is to provide research communities with a guide on deploying cobots in collaborative industrial scenarios. In particular, programming features that support cobots for collaborative scenarios will be reviewed. The work scopes and contributions of this paper are:

  • an overview of collaborative industrial scenarios

  • a general structure for cobot programming, including safety measures and on-line/off-line human involvement

  • a review of three main programming features for cobots: communication, optimisation and learning

This paper is organised as follows. Section 2 presents an overview of HRC scenarios, applied safety measures and the general program structure. Sections 3 (Communication), 4 (Optimisation) and 5 (Learning) elaborate on the cobot programming features used in collaborative industrial scenarios. Section 6 concludes the work by providing recommendations regarding the deployment of cobots in industrial settings and general guidance for advancing HRC-related research towards expanding and improving industrial HRC implementations.

Section snippets

Overview on collaborative scenarios and cobot programming

A human operator and a cobot can collaborate on a variety of industrial tasks, which are defined here as collaboration scenarios. In such a scenario, the human operator and the cobot share the same workspace to perform manufacturing processes on work pieces, such as pick-and-place, assembly, screwing or inspection. That is, each scenario involves a cobot, a human operator, work piece(s) and manufacturing process(es). Collaboration scenarios, safety measures, and cobot programming to support the…

Communication

Humans rely heavily on communication to work in teams and to complete tasks fluently and efficiently. Communication can be used to issue orders, convey intentions and ask or answer questions. Researchers have been working on enabling communication between humans and cobots such that the human can command the cobot through different communication modes. The works mentioned in this subsection are categorised by communication mode: body language and speech, user interfaces, and haptics.
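As a minimal sketch of how such multimodal commanding might be structured in software (all class and function names here are hypothetical illustrations, not taken from the reviewed works), a dispatcher can map command events from any recognised mode, whether gesture, speech, user interface or haptics, onto cobot actions:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class CommandEvent:
    """Hypothetical event emitted by a perception module
    (gesture recogniser, speech-to-text, GUI, haptic sensor)."""
    mode: str      # "gesture", "speech", "ui" or "haptic"
    command: str   # e.g. "stop", "handover", "resume"

class CobotCommandDispatcher:
    """Maps multimodal command events onto cobot actions,
    so the same command works regardless of its source mode."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], str]] = {}

    def register(self, command: str, handler: Callable[[], str]) -> None:
        self._handlers[command] = handler

    def dispatch(self, event: CommandEvent) -> str:
        handler = self._handlers.get(event.command)
        if handler is None:
            return f"unknown command: {event.command}"
        return handler()

dispatcher = CobotCommandDispatcher()
dispatcher.register("stop", lambda: "cobot stopped")
dispatcher.register("handover", lambda: "cobot moving to handover pose")

# a "stop" gesture and a spoken "stop" trigger the same behaviour
print(dispatcher.dispatch(CommandEvent(mode="gesture", command="stop")))
```

Decoupling command recognition from command handling in this way is one reason the communication mode (body language, speech, UI or haptics) can be varied without reprogramming the cobot's task logic.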

Optimisation

Optimality is a primary goal during industrial design processes (product, process and production-line design), since it ultimately yields a “maximum” profit. The main challenge in HRC scenarios is to optimise around the human, i.e. to model the human and incorporate them into the cost function. This subsection reviews work on optimising different aspects of cobot action to yield optimal or semi-optimal behaviour in different industrial HRC scenarios.
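To illustrate what "incorporating the human in the cost function" can mean, the toy sketch below (function names, weights and geometry are all hypothetical, not drawn from any reviewed method) scores candidate robot paths by a weighted sum of a task-time proxy and a penalty for waypoints that approach the human closer than a chosen separation distance:

```python
import math

def human_aware_cost(path, human_pos, w_time=1.0, w_safety=5.0, d_safe=0.5):
    """Toy human-aware cost for a 2D path.
    path: list of (x, y) waypoints; human_pos: (x, y) position of the human."""
    # Task-time proxy: total path length (assumes constant speed).
    time_cost = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    # Safety penalty: grows linearly for waypoints within d_safe of the human.
    safety_cost = sum(
        max(0.0, d_safe - math.dist(p, human_pos)) for p in path
    )
    return w_time * time_cost + w_safety * safety_cost

# A shorter path passing near the human vs. a longer detour around them.
direct = [(0.0, 0.0), (0.5, 0.25), (1.0, 0.5)]
detour = [(0.0, 0.0), (0.5, -0.4), (1.0, 0.5)]
human = (0.5, 0.3)

# With these weights, the safety penalty outweighs the extra path length,
# so the detour is preferred.
best = min([direct, detour], key=lambda p: human_aware_cost(p, human))
```

Varying the weights trades cycle time against human comfort, which is exactly the balancing act described above; real formulations add ergonomics, separation monitoring and human-motion prediction in place of this static distance penalty.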

Learning

Humans learn new tasks by observing them being done, by trying to do them, and by asking questions and receiving feedback on their performance. A human teacher serves to demonstrate a task, answer questions and provide feedback, none of which requires programming skills. Researchers have attempted to enable a learner–teacher relationship between the cobot and the operator due to its naturalness and wide potential. In HRC, it is advisable to equip the cobot with learning capabilities since the…
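A deliberately simplified illustration of the learner–teacher idea (a stand-in for the LfD methods this section reviews, assuming equal-length, time-aligned demonstrations; all names are hypothetical): several kinaesthetic demonstrations of the same motion can be aggregated into one reference trajectory for the cobot to replay, with no programming on the teacher's part.

```python
def learn_from_demonstrations(demos):
    """Average several demonstrations into one reference trajectory.
    demos: list of trajectories, each a list of (x, y) waypoints,
    assumed equal in length and time-aligned."""
    n = len(demos)
    length = len(demos[0])
    return [
        (
            sum(d[t][0] for d in demos) / n,
            sum(d[t][1] for d in demos) / n,
        )
        for t in range(length)
    ]

# Two slightly different demonstrations of the same reaching motion.
demo_a = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.0)]
demo_b = [(0.0, 0.0), (0.5, 0.4), (1.0, 0.0)]
reference = learn_from_demonstrations([demo_a, demo_b])
# the midpoint of the learned trajectory is roughly (0.5, 0.3)
```

Practical LfD systems replace this naive average with probabilistic encodings (e.g. movement primitives) that also capture the variance across demonstrations, but the teacher's role is the same: demonstrate, correct, repeat.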

Recommendations for industrial parties

Different programming features enable different degrees and forms of cobot autonomy. As cobot autonomy increases, an operator is more likely to feel unease due to the cobot’s decreased predictability. However, as cobot autonomy decreases, the operator is required to make decisions on behalf of both, which increases their mental workload. Therefore, no single programming feature is strictly better than the others; rather, they can be mixed to exploit their benefits while negating, or limiting, their…

Acknowledgements

This work is part of a project funded by Coventry University, UK, Unipart Powertrain Applications Ltd., UK, and the High Speed Sustainable Manufacturing Institute (HSSMI), UK.

References (134)

  • de Gea Fernandez, J. et al.

    Multimodal sensor-based whole-body control for human–robot collaboration in industrial settings

    Robot. Auton. Syst.

    (2017)
  • Schou, C. et al.

    Skill-based instruction of collaborative robots in industrial settings

    Robot. Comput.-Integr. Manuf.

    (2018)
  • Pedersen, M.R. et al.

    Robot skills for manufacturing: From concept to industrial deployment

    Robot. Comput.-Integr. Manuf.

    (2016)
  • Koch, P.J. et al.

    A skill-based robot co-worker for industrial maintenance tasks

    Proc. Manuf.

    (2017)
  • Wojtara, T. et al.

    Human–robot collaboration in precise positioning of a three-dimensional object

    Automatica

    (2009)
  • M. Rußmann, M. Lorenz, P. Gerbert, M. Waldner, J. Justus, P. Engel, M. Harnisch, Industry 4.0: The future of...
  • Bicchi, A. et al.

    Safety for physical human–robot interaction

  • Our history

    (2018)
  • Robots and robotic devices – Collaborative robots, ISO Standard ISO/TS 15066 (2016)...
  • Peternel, L. et al.

    Towards ergonomic control of human–robot co-manipulation and handover

  • Chandrasekaran, B. et al.

    Human–robot collaboration: A survey

  • Lasota, P.A. et al.

    A survey of methods for safe human–robot interaction

    Found. Trends Robot.

    (2017)
  • Lee, J.

    A survey of robot learning from demonstrations for human–robot collaboration

    (2017)
  • Zhu, Z. et al.

    Robot learning from demonstration in robotic assembly: A survey

    Robotics

    (2018)
  • Green, S.A. et al.

    Human–robot collaboration: A literature review and augmented reality approach in design

    Int. J. Adv. Robot. Syst.

    (2008)
  • Bauer, A. et al.

    Human–robot collaboration: A survey

    Int. J. Humanoid Robot.

    (2008)
  • Haddadin, S. et al.

    Physical human–robot interaction

  • Sylla, N. et al.

    Implementation of Collaborative Robot Applications: A Report from the Industrial Working Group, Tech. Rep.

    (2017)
  • A. Cesta, A. Orlandini, G. Bernardi, A. Umbrico, Towards a planning-based framework for symbiotic human–robot...
  • Munzer, T. et al.

    Efficient behavior learning in human–robot collaboration

    Auton. Robots

    (2017)
  • European Commission

    Periodic reporting for period 1 - COLROBOT (collaborative robotics for assembly and kitting in smart manufacturing), Tech. Rep.

    (2018)
  • Innovative human–robot cooperation in BMW group production

    (2013)
  • Winkelmann, N.

    Human–robot cooperation at Audi

    (2017)
  • Many wrenches make light work: KUKA flexFELLOW will provide assistance during drive train pre-assembly

    (2016)
  • UR10 Cobots offer aging workforce solution and reduce relief worker costs for global car manufacturer

    (2018)
  • Innovative skoda factory introduces human–robot collaboration with KUKA LBR iiwa

    (2017)
  • S. Lichiardopol, N. van de Wouw, H. Nijmeijer, Control scheme for human–robot co-manipulation of uncertain,...
  • Nikolaidis, S. et al.

    Improved human–robot team performance through cross-training, an approach inspired by human team training practices

    Int. J. Robot. Res.

    (2015)
  • Huang, C.-M. et al.

    Adaptive coordination strategies for human–robot handovers

  • Johannsmeier, L. et al.

    A hierarchical human–robot interaction-planning framework for task allocation in collaborative industrial assembly processes

    IEEE Robot. Autom. Lett.

    (2017)
  • V. Gabler, T. Stahl, G. Huber, O. Oguz, D. Wollherr, A game theoretic approach for adaptive action selection in close...
  • G. Maeda, A. Maloo, M. Ewerton, R. Lioutikov, J. Peters, Anticipative interaction primitives for human–robot...
  • Wongphati, M. et al.

    Gestures for manually controlling a helping hand robot

    Int. J. Soc. Robot.

    (2015)
  • C. Lenz, M. Rickert, G. Panin, A. Knoll, Constraint task-based control in industrial settings, in: 2009 IEEE/RSJ...
  • I.E. Makrini, K. Merckaert, D. Lefeber, B. Vanderborght, Design of a collaborative architecture for human–robot...
  • K.R. Guerin, S.D. Riedel, J. Bohren, G.D. Hager, Adjutant: A framework for flexible human-machine collaborative...
  • Robots and robotic devices – Safety requirements for industrial robots – Part 1: Robots, ISO Standard ISO 10218-1 (2011)...
  • Schmidt, B. et al.

    Depth camera based collision avoidance via active robot control

    J. Manuf. Syst.

    (2014)
  • Y. Wang, X. Ye, Y. Yang, W. Zhang, Collision-free trajectory planning in human–robot interaction through hand movement...

    Shirine El Zaatari is currently a PhD researcher at the Institute of Advanced Manufacturing and Engineering in Coventry University. She graduated with a Bachelor’s of Engineering in Mechanical Engineering from the American University of Beirut with a focus in robotics. Shirine gained research experience in computer vision as a visiting student in King Abdallah University of Science and Technology. Her current research interests revolve around using learning from demonstration to program collaborative robots for industrial tasks.

    Mohamed Marei is a PhD researcher at the Institute of Advanced Manufacturing and Engineering in Coventry University, researching real-time monitoring and optimisation of production lines and predictive plans. His main research focus is cloud real-time predictive maintenance capabilities for manufacturing processes, through developing intelligent digital twin models of manufacturing assets. He graduated from the University of Sheffield in 2016, with a Master of Engineering (MEng) in Mechatronic and Robotic Engineering. He gained research experience in Sheffield Robotics, where he researched implementing computer vision for a system of self-reconfigurable modular robots. His other research interests are in the areas of Industry 4.0, industrial automation, and robotics in manufacturing.

    Weidong Li received the Ph.D. degree from the Mechanical Engineering at National University of Singapore in 2002. He is a full professor and an academic leader at Research Center for Advanced Manufacturing and Engineering, Coventry University, U.K. Before joining Coventry in 2007, he worked at Singapore Institute of Manufacturing Technology, University of Bath, and Cranfield University as a researcher and academic. His primary research interests include computer aided manufacturing and automation, sustainable manufacturing, and Big Data analytics for smart manufacturing. In the past decade, he has participated in a number of EU projects in the areas of sustainable or digital manufacturing, and cooperated with automotive, aeronautical and manufacturing industries (e.g., Airbus, Jaguar Land Rover, Sandvik, and some manufacturing SMEs). In the research areas, he has published 150 research papers in international journals and conferences, and 4 books (Springer and World Scientific Publisher).

    Dr Zahid Usman, CEng, MIMechE, is currently working as Manufacturing Systems Lead Engineer in Rolls Royce plc. Before this he was working as a Lecturer in Robotics and Automation at Coventry University. He has extensive industrial and academic experience within the fields of Manufacturing Informatics, Robotics and Automation and Systems Engineering. He has lead several research and industrial projects with world leading academic institutes and industries. He has published over 20 peer reviewed research articles. Dr Usman is a reviewer for several international journals and has been on technical committees for various journals and conferences. Dr Usman continues to be involved in academic research and applied work in industry.
