Real-virtual components interaction for assembly simulation and planning

https://doi.org/10.1016/j.rcim.2016.03.005

Highlights

  • Enable real-virtual components interaction during augmented reality assembly.

  • Facilitate intuitive manipulation of virtual components with bare-hand interface.

  • An industrial case is conducted to validate the effectiveness of the system.

Abstract

Assembly motion simulation is crucial during conceptual product design to identify potential assembly issues and enhance assembly efficiency. This paper presents a novel assembly simulation system incorporating real-virtual components interaction in an augmented-reality-based environment. With this system, users can manipulate virtual components and assemble them onto real components during assembly simulation using natural gestures through an enhanced bare-hand interface. Accurate and realistic movements of virtual components are obtained based on constraint analysis and on the determination of the resultant forces arising from contacts with real components and from manipulation by the user's hands (applied forces and torques, etc.) during assembly simulation. A prototype system has been developed, and a case study of an automobile clutch was conducted to validate the effectiveness and intuitiveness of the system.

Introduction

Assembly simulation, which allows users to evaluate an assembly process and address assembly issues, is crucial during conceptual product design [17]. Nowadays, assembly operations are becoming more complex due to the growing number of product variants, smaller batch sizes, and shorter product life cycles; hence, there is a growing demand for flexible and reconfigurable assembly simulation and planning systems. The main task of assembly process simulation and planning is to formalize the details of assembly constraints and relationships [38], so as to simulate assembly operations and processes accurately. Typically, expert assembly planners use traditional interaction tools (e.g., mouse, keyboard) to manipulate 3D CAD models of the components of an assembly, so as to determine the geometrical characteristics of the components as well as the assembly features of the assembly [21], [27]. However, expert assembly planners are usually required to identify constraints manually. Purely virtual reality (VR) based assembly motion simulation and planning does not allow users to relate their assembly experience to the physical context, leading to a lack of real spatial feeling. Augmented reality (AR) technology, a seamless interface that bridges the gap between the real and virtual worlds, can be used to enhance the assembly simulation process [22]. An AR-based assembly simulation system can augment useful virtual contents (e.g., virtual prototypes, information, tools, etc.) onto a user's view to help the user verify assembly operations and sequences and assess ease of assembly, thus closing the gap between product design and assembly operation.

AR has been applied to simulate assembly operations and address various assembly-related issues throughout a product's lifecycle, e.g., assembly planning and design [23], ergonomics assessment [24], operation guidance and training [20], etc. However, previous work on AR assembly simulation and planning focused primarily on manipulating virtual objects (assembly components or tools) and carrying out virtual assemblies in a real environment, and neglected real-virtual components interaction and contact phenomena. Information from real-virtual components interaction, e.g., the spatial relation of a new virtual part design with respect to an existing real component, physical constraints and manipulation obstructions from the real component, etc., is important for decision making in assembly design and planning [33], [34]. Accurate and realistic motions of the virtual components should be determined based on their interaction with real components and the manipulation forces from the user's hands. Essential factors contributing to a realistic and reliable AR assembly simulation system should also be considered, e.g., intrinsic properties of virtual components (material, shape, dimensions, etc.) and human factors (e.g., ergonomics such as assembly position and orientation, and accessibility to components).

This paper presents a novel AR-assisted assembly planning and simulation (ARAS) system. ARAS provides (1) an enhanced bare-hand interface (EBHI) enabling users to manipulate virtual components realistically using natural hand gestures, (2) a method to calculate the resultant forces exerted on virtual components from contacts with real components and from manipulation by the user's hands (applied forces and torques, etc.) during an assembly process, and (3) constraint analysis and virtual snapping force generation. Thus, ARAS can position virtual components accurately, simulate assembly motions realistically, and allow users to assess a product's assembly design and identify potential assembly issues. The calculated forces/torques (position, orientation, magnitude, etc.) can be rendered as 3D symbols while virtual components are manipulated by the users. A case study has been conducted to validate the benefits of ARAS for assembly operators.
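To make the force computation in contribution (2) concrete, the sketch below shows one plausible way to accumulate contact forces from real components and manipulation forces from the user's hands into a single resultant force/torque pair about a component's centre of mass. It is an illustration under our own assumptions; the struct and function names are hypothetical and not taken from ARAS.

```cpp
// Minimal sketch (not the authors' implementation): accumulating contact
// forces from real components and hand manipulation forces into a single
// resultant wrench acting on a virtual component.
#include <array>
#include <vector>

using Vec3 = std::array<double, 3>;

static Vec3 add(const Vec3& a, const Vec3& b) {
    return {a[0] + b[0], a[1] + b[1], a[2] + b[2]};
}
static Vec3 sub(const Vec3& a, const Vec3& b) {
    return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// One force applied at a point on the component (a contact or a fingertip).
struct PointForce {
    Vec3 point;   // application point in world coordinates
    Vec3 force;   // force vector in world coordinates
};

// Resultant force and torque about the component's centre of mass.
struct Wrench {
    Vec3 force{};
    Vec3 torque{};
};

Wrench resultantWrench(const std::vector<PointForce>& contacts,
                       const std::vector<PointForce>& handForces,
                       const Vec3& centreOfMass) {
    Wrench w;
    auto accumulate = [&](const PointForce& pf) {
        w.force = add(w.force, pf.force);
        // Torque contribution r x F, with r measured from the centre of mass.
        w.torque = add(w.torque, cross(sub(pf.point, centreOfMass), pf.force));
    };
    for (const auto& pf : contacts) accumulate(pf);
    for (const auto& pf : handForces) accumulate(pf);
    return w;
}
```

The resultant wrench can then drive the per-frame pose update of the virtual component during simulation.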

The rest of the paper is organized as follows. Section 2 reviews state-of-the-art assembly motion simulation systems and AR-based interaction methods for assembly applications. An overview of the proposed system is presented in Section 3, and the details are described in Section 4 (Enhanced bare-hand interface), Section 5 (Assembly information management) and Section 6 (Assembly motion simulation). Section 7 presents the system implementation and a case study. Section 8 summarizes the study and future work.

Section snippets

Assembly motion simulation

An effective and popular method for assembly planning is to carry out assembly motion simulation (AMS) in an AR/VR environment. Early AMS systems focused on constraint-based methods. Jayaram et al. [16] reported the VADE system (Virtual Assembly Design Environment), in which pertinent geometrical information (e.g., tolerances, locations, orientations, etc.) is extracted from a CAD system before component motion simulation. Trajectories of virtual components are pre-determined to support user…

ARAS system architecture

ARAS (Fig. 1) consists of the AR-based assembly environment (ARAE), where users perform AR-based assembly operations and planning based on real-virtual components interaction; the AR functions (ARF) module, for tracking of real components and the workspace, and for registration, rendering and display of augmented contents in ARAE; the assembly information management (AIM) module, which collects, stores and reasons over assembly information; and the enhanced bare-hand interface (EBHI), which supports natural…
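As a reading aid, here is a minimal sketch of how three of the modules named above might be expressed as C++ interfaces, to show one plausible decomposition. The method names are our assumptions, not the authors' API.

```cpp
// Illustrative sketch only: ARAS modules as C++ interfaces.
#include <string>

struct Pose { double position[3]; double rotation[9]; };

// ARF: tracking, registration and rendering of augmented contents in ARAE.
class ARFunctions {
public:
    virtual ~ARFunctions() = default;
    virtual bool trackRealComponent(const std::string& id, Pose& pose) = 0;
    virtual void renderVirtualComponent(const std::string& id, const Pose& pose) = 0;
};

// AIM: collects, stores and reasons over assembly information.
class AssemblyInformationManager {
public:
    virtual ~AssemblyInformationManager() = default;
    virtual void loadComponent(const std::string& cadFile) = 0;
    virtual bool queryMatingConstraint(const std::string& a, const std::string& b) = 0;
};

// EBHI: natural bare-hand manipulation of virtual components.
class BareHandInterface {
public:
    virtual ~BareHandInterface() = default;
    virtual bool handPose(Pose& left, Pose& right) = 0;
};
```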

Enhanced bare-hand interface

Many assembly tasks involve object manipulation with multiple degrees of freedom (DOFs), and 3D bare-hand interactions are effective and natural for manipulating components, tools and subassemblies [23] in an AR environment. In ARAS, an EBHI is developed to support dexterous manipulation by considering the manipulation forces exerted, so as to move virtual components precisely and realistically in an assembly simulation task, using both single- and dual-hand gestures (Fig. 2). The EBHI improves the…
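The force model of the EBHI is not detailed in this snippet, but the reference list includes Borst et al.'s spring model for whole-hand virtual grasping, which suggests the general approach. The sketch below shows that style of model, where each tracked fingertip penetrating the virtual component's surface applies a penalty spring force along the surface normal; the names and the stiffness parameter are illustrative assumptions, not necessarily the EBHI's actual force model.

```cpp
// Hedged sketch of a penalty-spring fingertip force (cf. Borst et al.).
#include <array>

using Vec3 = std::array<double, 3>;

struct Fingertip {
    Vec3 position;       // tracked fingertip position in world coordinates
    double penetration;  // depth inside the component surface (>0 if inside)
    Vec3 surfaceNormal;  // outward surface normal at the contact point
};

// Spring force pushing back along the surface normal: F = k * d * n.
Vec3 fingertipForce(const Fingertip& tip, double stiffness) {
    const double mag = (tip.penetration > 0.0) ? stiffness * tip.penetration : 0.0;
    return {mag * tip.surfaceNormal[0],
            mag * tip.surfaceNormal[1],
            mag * tip.surfaceNormal[2]};
}
```

Forces of this kind, gathered over all fingertips of one or both hands, feed the resultant wrench computation sketched in the introduction.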

Assembly information management

AIM undertakes the representation and management of component information and assembly relations, which are essential for capturing the mating relationships between components and the kinematics of these mating relationships, e.g., constraint analysis [29], [9]. AIM consists of an ontology-based assembly information model (OAIM), a contact reasoner and assembly information instances. An ontology is used to classify data associated with the components so that ARAS can recognize their…
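The actual contact reasoner is ontology-based (OWL with the Pellet reasoner, per the prototype setup). As a simple illustration of the kind of output such reasoning produces, the sketch below hard-codes two mating rules mapping feature-type pairs to constraints and remaining degrees of freedom. The feature types and DOF counts are textbook examples, not the paper's ontology.

```cpp
// Hedged sketch: a tiny rule-based stand-in for the contact reasoner.
#include <string>

enum class FeatureType { CylindricalShaft, CylindricalHole, PlanarFace };

struct Constraint {
    std::string name;
    int remainingDofs;  // DOFs left after the constraint is applied
};

Constraint classifyMating(FeatureType a, FeatureType b) {
    auto pairIs = [&](FeatureType x, FeatureType y) {
        return (a == x && b == y) || (a == y && b == x);
    };
    if (pairIs(FeatureType::CylindricalShaft, FeatureType::CylindricalHole))
        return {"concentric", 2};  // translation along + rotation about the axis
    if (pairIs(FeatureType::PlanarFace, FeatureType::PlanarFace))
        return {"coplanar", 3};    // two in-plane translations + one rotation
    return {"none", 6};           // unconstrained: full rigid-body motion
}
```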

Assembly motion simulation

Fig. 4 shows the AR assembly motion simulation process. First, the user loads a component into ARAS, which extracts and stores the pertinent information in AIM automatically. A registration process is performed for real components by selecting a set of feature points to obtain the pose of the components, so that the geometrical models can be superimposed accurately in ARAE. Virtual components have a default initial pose, and their geometrical models are rendered accordingly. Next, at…
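Once the resultant force and torque on a virtual component are known, a per-frame pose update closes the simulation loop. The following self-contained sketch uses semi-implicit Euler integration with a scalar inertia, a damping factor, and a small-angle orientation representation as simplifying assumptions; the paper's actual integrator is not specified in this snippet.

```cpp
// Hedged sketch: one plausible per-frame rigid-body update for a
// manipulated virtual component.
#include <array>

using Vec3 = std::array<double, 3>;

struct RigidState {
    Vec3 position{};
    Vec3 velocity{};
    Vec3 orientation{};      // small-angle axis representation, for brevity
    Vec3 angularVelocity{};
};

void stepComponent(RigidState& s, const Vec3& force, const Vec3& torque,
                   double mass, double inertia, double dt, double damping) {
    for (int i = 0; i < 3; ++i) {
        // Linear part: v += (F/m) dt, then x += v dt (semi-implicit Euler).
        s.velocity[i] = damping * (s.velocity[i] + force[i] / mass * dt);
        s.position[i] += s.velocity[i] * dt;
        // Angular part, with a scalar inertia standing in for the tensor.
        s.angularVelocity[i] = damping * (s.angularVelocity[i] + torque[i] / inertia * dt);
        s.orientation[i] += s.angularVelocity[i] * dt;
    }
}
```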

Prototype setup

A prototype system has been developed based on the methodologies presented in the previous sections. The ARF, AMS and EBHI modules have been implemented using Visual Studio 2010. The tracking approach and initial registration process are implemented with the ViSP tracking platform (Visual Servoing Platform, http://www.irisa.fr/lagadic/visp/visp.html). The AIM module has been developed using the OWL API (http://owlapi.sourceforge.net/) and the Pellet reasoner (http://clarkparsia.com/pellet/), and…
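For readers unfamiliar with ViSP, the sketch below outlines the kind of model-based tracking and click-based initial registration pipeline the text describes, using ViSP's vpMbEdgeTracker. File names are placeholders, frame acquisition is omitted, and the exact API may differ across ViSP releases; treat this as a hedged sketch rather than the authors' code.

```cpp
// Hedged sketch: model-based tracking and initial registration with ViSP.
#include <visp/vpMbEdgeTracker.h>
#include <visp/vpImage.h>
#include <visp/vpImageIo.h>
#include <visp/vpHomogeneousMatrix.h>
#include <visp/vpDisplayX.h>   // X11 display; vpDisplayGDI on Windows

int main() {
    vpImage<unsigned char> I;
    vpImageIo::read(I, "frame0.pgm");        // first camera frame (placeholder)
    vpDisplayX display(I);                    // needed for click-based init

    vpMbEdgeTracker tracker;
    tracker.loadConfigFile("component.xml");  // camera and moving-edge settings
    tracker.loadModel("component.cao");       // CAD model of the real component

    // Initial registration: the user clicks feature points whose 3D
    // coordinates are listed in component.init, yielding the initial pose.
    tracker.initClick(I, "component.init");

    vpHomogeneousMatrix cMo;                  // object pose in the camera frame
    for (int frame = 0; frame < 1000; ++frame) {
        // Acquire the next camera frame into I here (device-specific, omitted).
        tracker.track(I);
        tracker.getPose(cMo);                 // register virtual content with cMo
    }
    return 0;
}
```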

Conclusion and future work

This paper presents a real-time AR-assisted assembly simulation platform that facilitates assembly planning and the identification of design issues during early product design. The methodology implemented considers actual constraints and interactions from real components in a real assembly environment, as well as the dexterous component manipulation associated with different assembly operations during assembly simulation. The EBHI, which requires neither calibration nor attached devices, is implemented to…

References (39)

  • A.K. Swain et al., Extended liaison as an interface between product and process model in assembly, Robot. Comput.-Integr. Manuf. (2014)

  • R.D. Yang et al., Virtual assembly technologies based on constraint and DOF analysis, Robot. Comput.-Integr. Manuf. (2007)

  • J. Bender et al., Interactive simulation of rigid body dynamics in computer graphics, Comput. Graph. Forum (2014)

  • P. Boonbrahm et al., Assembly of the virtual model with real hands using augmented reality technology

  • C.W. Borst et al., A spring model for whole-hand virtual grasping, Presence (2006)

  • S. Boyd et al., Convex Optimization (2009)

  • M.T. Ciocarlie et al., Hand posture subspaces for dexterous robotic grasping, Int. J. Robot. Res. (2009)

  • A.I. Comport et al., Real-time markerless tracking for augmented reality: the virtual visual servoing framework, IEEE Trans. Vis. Comput. Graph. (2006)

  • F. Faure, S. Barbier, J. Allard, F. Falipou, Image-based collision detection and response between arbitrary volumetric...