Orientation estimation in modern wearables with visual feature tracking

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

In this paper we propose and examine a robust method for orientation estimation in wearables (such as Google Glass) through compensation of the gyroscope bias. When raw angular rates are integrated, the resulting orientation estimate drifts, because the non-zero bias of the gyroscope measurements is accumulated as well. A simple error model was constructed for the measurement capabilities of the device in terms of inertial sensing of rotation. Exploiting feature point displacements (optical flow) from the camera of the device, a sensor fusion algorithm was developed to reduce and compensate the bias. The introduced orientation estimator and bias removal method are based on an adaptive reliability filter for the optical flow feature points. To fuse the remaining feature point displacements into a single value, various aggregation methods were tested, and a maximum a posteriori estimator was applied to the dataset. To further reduce the gyroscope bias, the output of the system was fed back to calculate a bias estimate. To measure the performance and investigate the benefits of the asynchronous workflow, the optical flow computation was run on various devices in real time. To validate the simulation results, real-world measurements were performed with industrial robots. Our adaptive reliability filter met our expectations: it compensated the bias well, and the integrated output of the system showed no significant drift.
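
The following minimal sketch (in Python, not the authors' implementation) illustrates the idea described in the abstract: the gyroscope angular rate is integrated into an orientation estimate, the per-feature optical flow displacements are fused into a single rotation increment by a robust aggregation (a weighted median is used here as a stand-in for the aggregation/MAP step), and the difference between the vision-based and gyroscope-based increments is fed back to update a bias estimate. All names, gains, and the single-axis simplification are illustrative assumptions, not values from the paper.

```python
import numpy as np


def aggregate_flow(displacements, weights=None):
    """Fuse per-feature rotation increments [rad] into one value using a
    weighted median, a robust stand-in for the aggregation/MAP step."""
    d = np.asarray(displacements, dtype=float)
    w = np.ones_like(d) if weights is None else np.asarray(weights, dtype=float)
    order = np.argsort(d)
    cum = np.cumsum(w[order])
    return d[order][np.searchsorted(cum, 0.5 * cum[-1])]


class BiasCompensatedGyro:
    """Single-axis gyroscope integrator with optical-flow bias feedback."""

    def __init__(self, k_theta=0.05, k_bias=0.01):
        self.theta = 0.0        # integrated orientation [rad]
        self.bias = 0.0         # current bias estimate [rad/s]
        self.k_theta = k_theta  # gain of the orientation correction
        self.k_bias = k_bias    # gain of the bias feedback (integral term)

    def update(self, gyro_rate, flow_increment, dt):
        # 1. Propagate the orientation with the bias-corrected angular rate.
        gyro_increment = (gyro_rate - self.bias) * dt
        self.theta += gyro_increment
        # 2. Error between the vision-based and gyroscope-based increments.
        error = flow_increment - gyro_increment
        # 3. Feed the error back: pull the orientation toward the visual
        #    measurement and adapt the bias estimate.
        self.theta += self.k_theta * error
        self.bias -= self.k_bias * error / dt
        return self.theta, self.bias
```

In such a scheme, `update(gyro_rate, aggregate_flow(feature_rotations), dt)` would run once per camera frame, while the gyroscope integration itself can proceed at a higher rate between frames, which reflects the asynchronous workflow mentioned above.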

Acknowledgments

This work was partially supported by the European Union and the European Social Fund through project FuturICT.hu (Grant No.: TAMOP-4.2.2.C-11/1/KONV-2012-0013) organized by VIKING Zrt. Balatonfüred. This work is connected to the scientific program of the “Development of quality-oriented and harmonized R + D + I strategy and functional model at BME” project. This project is supported by the New Széchényi Plan (Project ID: TÁMOP-4.2.1/B-09/1/KMR-2010-0002).

Author information

Corresponding author

Correspondence to László Kundra.

About this article

Cite this article

Kundra, L., Ekler, P. & Charaf, H. Orientation estimation in modern wearables with visual feature tracking. J Multimodal User Interfaces 9, 313–322 (2015). https://doi.org/10.1007/s12193-015-0180-9

