A Framework for Symbiotic Human-Robot Collaboration in the Industrial Domain

Nikhil Somani, Emmanuel Dean-León, Caixia Cai and Alois Knoll

Abstract— Industrial robotics is currently witnessing a phase where considerable effort is directed towards applications of standard industrial robots in smaller industries with short production lines, where the environment is rather unstructured and rapidly changing. Standard industrial robot systems face limitations in their ability to adapt to these environments and to cope with the complexity of tasks that seem relatively easy for humans. We present a framework for intuitive symbiotic human-robot collaboration in industrial scenarios, where the differing capabilities of human and robot can be combined in a way that enhances the overall effectiveness of the process.

I. INTRODUCTION

This architecture combines state-of-the-art perception capabilities [1] with first-order logic reasoning [2] to generate a semantic description of the system states. This abstract representation reduces the complexity of problem analysis by allowing process planning at the semantic level, thereby isolating the problem description and analysis from the execution and scenario-specific parameters. One of the applications demonstrating the potential of this architecture is an intuitive teaching interface that allows humans to teach assembly process plans to a standard industrial robot through physical HRI. This interface is capable of automatically generating a generic process plan for a robot using the semantic information produced during the teaching phase. The semantic description of the process plan also makes it possible to combine instructions for the robot and for the human in the same plan, enhancing the adaptability and robustness of the overall system. This property is also evaluated in an assembly scenario where a co-operative plan containing activities for both the human and the robot is created.
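The idea of a single semantic process plan containing activities for both agents can be illustrated with a minimal sketch; the class and field names below are hypothetical and for illustration only, not part of the framework's actual interface:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """One semantic step in the plan (names are illustrative)."""
    action: str   # semantic action, e.g. "pick", "hold", "insert"
    target: str   # object the action applies to
    agent: str    # "human" or "robot"

@dataclass
class ProcessPlan:
    """A shared plan holding human and robot tasks together."""
    name: str
    tasks: List[Task] = field(default_factory=list)

    def tasks_for(self, agent: str) -> List[Task]:
        # Project the shared plan onto one agent's task list.
        return [t for t in self.tasks if t.agent == agent]

# A co-operative assembly plan: the same semantic description
# carries instructions for both the human and the robot.
plan = ProcessPlan("assembly")
plan.tasks += [
    Task("pick", "housing", "robot"),
    Task("hold", "housing", "human"),
    Task("insert", "shaft", "robot"),
]

print([t.action for t in plan.tasks_for("robot")])  # ['pick', 'insert']
```

Because the plan is expressed at this semantic level, reassigning a step between human and robot is a change to the plan description only, not to any robot- or scenario-specific program.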
The complete architecture is evaluated with a standard industrial robot in an industrial scenario.

II. ARCHITECTURE OVERVIEW

Fig. 1 shows the overall architecture of our system. There are three key aspects of this design:
- Semantic description of process plans, which enables modeling human and robot tasks in the same process plan.
- A clear separation between the problem and solution space, which makes the high-level planning independent of robot- and scenario-specific information.
- Compared to traditional industrial robot program instructions, the process plan is now in a format that is more intuitive to edit, extend, or even create from scratch.

Authors' affiliation: †Technische Universität München, Fakultät für Informatik. ‡Cyber-Physical Systems, fortiss - An-Institut Technische Universität München. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 287787.

Fig. 1. Overview of the architecture.
Fig. 2. a,b,c) Teaching application. d,e,f) Execution and plan generation of the taught task. g,h,i) HRC in an assembly process.

III. APPLICATIONS

A video demonstrating a co-operative assembly task based on the proposed architecture can be found at: http://youtu.be/TQB6GsUnbDI

REFERENCES

[1] N. Somani, E. Dean, C. Cai, and A. Knoll, "Scene perception and recognition in industrial environments for human-robot interaction," Proceedings of the 9th International Symposium on Visual Computing, 2013.
[2] N. Somani, E. Dean, C. Cai, and A. Knoll, "Perception and reasoning for scene understanding in human-robot interaction scenarios," Proceedings of the 2nd Workshop on Recognition and Action for Scene Understanding at the 15th International Conference on Computer Analysis of Images and Patterns, 2013.