
Off-line programming of an industrial robot in a virtual reality environment

  • Original paper
  • Published in: International Journal on Interactive Design and Manufacturing (IJIDeM)

Abstract

Industrial robots need to be programmed quickly in order to be practically deployable in the production of small batches. In programming by teaching or demonstration, when most of the program’s content involves handling tasks, gesture recognition or other multimodal interfaces may be exploited. However, when the main task concerns manufacturing processing, typically tracing an edge in seam welding, deburring or cutting, the tool must be positioned and oriented with considerable accuracy. This can only be achieved if suitable tracking sensors are used. The current work employs a 6 degree-of-freedom magnetic sensor, but any other equivalent sensor could be used instead. The sensor is attached to a hand-held teaching tool constructed in accordance with the real end-effector tool, enabling continuous, interactive tracking of its position and orientation. A virtual reality platform records this stream of data in real time, making it possible to exploit it primarily in off-line programming of the robot. In this mode both the robot and the manufacturing cell are virtual, inverse kinematics allowing joint coordinates to be calculated from end-effector coordinates. Collision and clearance checks are also straightforward to implement. An edge-tracing application in 3D space was programmed following this paradigm. The resulting tool-tip curves in the virtual and the real environment were sufficiently close when compared using photogrammetry. If required, the VR environment also allows for remote on-line programming without major modifications.
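The core off-line programming step the abstract describes is recovering joint coordinates from the tracked end-effector pose via inverse kinematics. The paper's robot has six joints and relies on the VR platform's own solver; purely as an illustration of the principle, the following sketch solves the closed-form inverse kinematics of a hypothetical planar two-link arm (link lengths and target point are made-up values, not taken from the paper):

```python
import math

def two_link_ik(x, y, l1=0.4, l2=0.3):
    """Closed-form inverse kinematics for a planar 2-link arm:
    given the tool-tip position (x, y), return joint angles
    (theta1, theta2) in radians (elbow-down solution)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def two_link_fk(theta1, theta2, l1=0.4, l2=0.3):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Round-trip check on a reachable point: IK then FK should
# reproduce the commanded tool-tip position.
t1, t2 = two_link_ik(0.5, 0.2)
x, y = two_link_fk(t1, t2)
```

In the paper's setting, this mapping is evaluated continuously as the magnetic sensor streams the teaching tool's pose, so each recorded sample yields a joint-space waypoint for the virtual robot.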



References

  1. Pedersen, M.R., Nalpantidis, L., Andersen, R.S., et al.: Robot skills for manufacturing: from concept to industrial deployment. Robot. Comput. Integr. Manuf. 37, 282–291 (2016). https://doi.org/10.1016/J.RCIM.2015.04.002

  2. Villani, V., Pini, F., Leali, F., Secchi, C.: Survey on human–robot collaboration in industrial settings: safety, intuitive interfaces and applications. Mechatronics (2018). https://doi.org/10.1016/j.mechatronics.2018.02.009

  3. Vosniakos, G.-C., Chronopoulos, A.: Industrial robot path planning in a constraint-based computer-aided design and kinematic analysis environment. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 223, 523–533 (2009). https://doi.org/10.1243/09544054jem1234

  4. Pan, Z., Polden, J., Larkin, N., et al.: Recent progress on programming methods for industrial robots. Robot. Comput. Integr. Manuf. 28, 87–94 (2012). https://doi.org/10.1016/J.RCIM.2011.08.004

  5. Makris, S., Tsarouchi, P., Surdilovic, D., Krüger, J.: Intuitive dual arm robot programming for assembly operations. CIRP Ann. 63, 13–16 (2014). https://doi.org/10.1016/J.CIRP.2014.03.017

  6. Pellegrinelli, S., Pedrocchi, N., Molinari Tosatti, L., et al.: Multi-robot spot-welding cells: an integrated approach to cell design and motion planning. CIRP Ann. 63, 17–20 (2014). https://doi.org/10.1016/J.CIRP.2014.03.015

  7. Alami, R., Gharbi, M., Vadant, B., et al.: On human-aware task and motion planning abilities for a teammate robot. In: Human–Robot Collaboration for Industrial Manufacturing Workshop, RSS 2014 (2014)

  8. Massa, D., Callegari, M., Cristalli, C.: Manual guidance for industrial robot programming. Ind. Robot. Int. J. 42, 457–465 (2015). https://doi.org/10.1108/IR-11-2014-0413

  9. Nathanael, D., Mosialos, S., Vosniakos, G.-C.: Development and evaluation of a virtual training environment for on-line robot programming. Int. J. Ind. Ergon. 53, 274–283 (2016). https://doi.org/10.1016/j.ergon.2016.02.004

  10. Jara, C.A., Candelas, F.A., Gil, P., et al.: EJS + EjsRL: an interactive tool for industrial robots simulation, computer vision and remote operation. Robot. Auton. Syst. 59, 389–401 (2011). https://doi.org/10.1016/j.robot.2011.02.002

  11. Aleotti, J., Caselli, S.: Physics-based virtual reality for task learning and intelligent disassembly planning. Virtual Real. 15, 41–54 (2011). https://doi.org/10.1007/s10055-009-0145-y

  12. Chen, H., Sheng, W.: Transformative CAD based industrial robot program generation. Robot. Comput. Integr. Manuf. 27, 942–948 (2011). https://doi.org/10.1016/J.RCIM.2011.03.006

  13. Angelidis, A., Vosniakos, G.-C.: Prediction and compensation of relative position error along industrial robot end-effector paths. Int. J. Precis. Eng. Manuf. 15, 63–73 (2014). https://doi.org/10.1007/s12541-013-0306-5

  14. Neto, P., Mendes, N., Araújo, R., et al.: High-level robot programming based on CAD: dealing with unpredictable environments. Ind. Robot. Int. J. 39, 294–303 (2012). https://doi.org/10.1108/01439911211217125

  15. Michas, S., Matsas, E., Vosniakos, G.-C.: Interactive programming of industrial robots for edge tracing using a virtual reality gaming environment. Int. J. Mechatron. Manuf. Syst. 10, 237–259 (2017). https://doi.org/10.1504/IJMMS.2017.087548

  16. Pai, Y.S., Yap, H.J., Singh, R.: Augmented reality-based programming, planning and simulation of a robotic work cell. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 229, 1029–1045 (2015). https://doi.org/10.1177/0954405414534642

  17. Fang, H.C., Ong, S.K., Nee, A.Y.C.: A novel augmented reality-based interface for robot path planning. Int. J. Interact. Des. Manuf. 8, 33–42 (2014). https://doi.org/10.1007/s12008-013-0191-2

  18. Sucan, I.A., Moll, M., Kavraki, L.E.: The open motion planning library. IEEE Robot. Autom. Mag. 19, 72–82 (2012). https://doi.org/10.1109/MRA.2012.2205651

  19. Kaltsoukalas, K., Makris, S., Chryssolouris, G.: On generating the motion of industrial robot manipulators. Robot. Comput. Integr. Manuf. 32, 65–71 (2015). https://doi.org/10.1016/J.RCIM.2014.10.002

  20. Berenson, D., Abbeel, P., Goldberg, K.: A robot path planning framework that learns from experience. In: 2012 IEEE International Conference on Robotics and Automation, pp. 3671–3678. IEEE (2012)

  21. Billard, A., Calinon, S., Dillmann, R., Schaal, S.: Robot programming by demonstration. In: Siciliano, B., Khatib, O. (eds.) Springer Handbook of Robotics, pp. 1371–1394. Springer, Berlin (2008)

  22. Zolkiewski, S., Pioskowik, D.: Robot Control and Online Programming by Human Gestures Using a Kinect Motion Sensor, pp. 593–604. Springer, Cham (2014)

  23. Neto, P., Norberto Pires, J., Paulo Moreira, A.: High-level programming and control for industrial robotics: using a hand-held accelerometer-based input device for gesture and posture recognition. Ind. Robot. Int. J. 37, 137–147 (2010). https://doi.org/10.1108/01439911011018911

  24. Pedersen, M.R., Krüger, V.: Gesture-based extraction of robot skill parameters for intuitive robot programming. J. Intell. Robot. Syst. 80, 149–163 (2015). https://doi.org/10.1007/s10846-015-0219-x

  25. Tsarouchi, P., Athanasatos, A., Makris, S., et al.: High level robot programming using body and hand gestures. Procedia CIRP 55, 1–5 (2016). https://doi.org/10.1016/J.PROCIR.2016.09.020

  26. Ni, D., Yew, A.W.W., Ong, S.K., Nee, A.Y.C.: Haptic and visual augmented reality interface for programming welding robots. Adv. Manuf. 5, 191–198 (2017). https://doi.org/10.1007/s40436-017-0184-7

  27. Liu, H., Wang, L.: Gesture recognition for human–robot collaboration: a review. Int. J. Ind. Ergon. 1, 1–12 (2017). https://doi.org/10.1016/j.ergon.2017.02.004

  28. Qi, L., Zhang, D., Zhang, J., Li, J.: A lead-through robot programming approach using a 6-DOF wire-based motion tracking device. In: 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 1773–1777. IEEE (2009)

  29. Adept Technology Inc.: V+ Language User's Guide, Version 12.1, Part # 00962-01130, Rev. A (1997)

  30. McCarthy, C., Callele, D.: Virtools User Guide (2006)

  31. 3DVIA Virtools: Behavior Libraries—VR Library/VR Publisher 2.6 (Virtools 5.0.0.8—5.0) User Guide (2009)

  32. LaScalza, S., Arico, J., Hughes, R.: Effect of metal and sampling rate on accuracy of Flock of Birds electromagnetic tracking system. J. Biomech. 36, 141–144 (2003). https://doi.org/10.1016/S0021-9290(02)00322-6

  33. Milne, A.D., Chess, D.G., Johnson, J.A., King, G.J.W.: Accuracy of an electromagnetic tracking device: a study of the optimal operating range and metal interference. J. Biomech. 29, 791–793 (1996)

  34. McQuade, K.J., Finley, M.A., Harris-Love, M., McCombe-Waller, S.: Dynamic error analysis of Ascension's Flock of Birds™ electromagnetic tracking device using a pendulum model. J. Appl. Biomech. 18, 171–179 (2002)

  35. Ascension Technology Corporation: The Flock of Birds® Installation and Operation Guide (2004)

  36. Vosniakos, G.-C., Levedianos, E., Gogouvitis, X.V.: Streamlining virtual manufacturing cell modelling by behaviour modules. Int. J. Manuf. Res. 10, 17–43 (2015). https://doi.org/10.1504/IJMR.2015.067616

  37. Luhmann, T.: Close range photogrammetry for industrial applications. ISPRS J. Photogramm. Remote Sens. 65, 558–569 (2010). https://doi.org/10.1016/J.ISPRSJPRS.2010.06.003

  38. Mourelatos, A., Nathanael, D., Gkikas, K., Psarakis, L.: Development and evaluation of a wearable motion tracking system for sensorimotor tasks in VR environments. In: Bagnara, S., Tartaglia, R., Albolino, S., Alexander, Th., Fujita, Y. (eds.) Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018). Volume V: Human Simulation and Virtual Environments, Work With Computing Systems (WWCS), Process Control, pp. 181–188. Springer, Berlin (2019)


Acknowledgements

Mr Nikolaos Melissas, Chief Technician in the NTUA Manufacturing Technology Laboratory, is gratefully acknowledged for his contribution to the construction of the tools. Ms Margeaux Beaubet, of ENISE, France, is gratefully acknowledged for her contribution to the photogrammetry measurements.

Author information


Corresponding author

Correspondence to George-Christopher Vosniakos.


About this article


Cite this article

Manou, E., Vosniakos, GC. & Matsas, E. Off-line programming of an industrial robot in a virtual reality environment. Int J Interact Des Manuf 13, 507–519 (2019). https://doi.org/10.1007/s12008-018-0516-2

