
Hard negative mining for correlation filters in visual tracking

  • Original Paper
  • Published in Machine Vision and Applications

Abstract

Visual tracking is a fundamental computer vision task. In recent years, many tracking methods based on correlation filters have shown excellent performance. The strength of these methods comes from their ability to efficiently learn changes in the target's appearance over time. A fundamental drawback of these methods is that the background surrounding the object is not modeled over time, which leads to suboptimal results. In this paper, we propose a robust tracking method in which a hard negative mining scheme is employed in each frame. In addition, a target verification strategy is developed by introducing a peak signal-to-noise ratio (PSNR) criterion. The proposed method achieves strong tracking results while maintaining a real-time speed of 30 frames per second without further optimization. Extensive experiments on multiple tracking datasets show the superior accuracy of our tracker compared to state-of-the-art methods, including those based on deep learning features.
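The abstract gives only a high-level description of the two components. As a rough, hypothetical sketch (not the authors' implementation), the Python snippet below shows one plausible way to score a correlation response map with a PSNR-style confidence and to collect high-response background locations as hard negatives; the function names, the exclusion radius, and the acceptance threshold are all placeholder assumptions.

    import numpy as np

    def response_psnr(response):
        # PSNR-style confidence of a correlation response map: squared peak
        # value over the mean squared response, on a decibel scale.
        # Hypothetical adaptation; the paper defines its own PSNR criterion.
        peak = float(response.max())
        mse = float(np.mean(np.square(response))) + 1e-12
        return 10.0 * np.log10(peak ** 2 / mse)

    def mine_hard_negatives(response, target_xy, k=5, radius=10):
        # Return the k highest-response locations that lie outside a small
        # exclusion window around the estimated target position. These
        # distracting background locations serve as hard negative samples
        # for the next correlation-filter update.
        flat_order = np.argsort(response, axis=None)[::-1]
        ys, xs = np.unravel_index(flat_order, response.shape)
        tx, ty = target_xy
        negatives = []
        for y, x in zip(ys, xs):
            if abs(int(x) - tx) > radius or abs(int(y) - ty) > radius:
                negatives.append((int(x), int(y)))
                if len(negatives) == k:
                    break
        return negatives

    # Usage sketch (tracker API assumed): accept the new position only when
    # the response is confident enough, then update with the mined negatives.
    # response = tracker.detect(search_patch)
    # if response_psnr(response) > 20.0:            # placeholder threshold
    #     peak_xy = np.unravel_index(response.argmax(), response.shape)[::-1]
    #     negatives = mine_hard_negatives(response, peak_xy)
    #     tracker.update(search_patch, negatives)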





Author information


Corresponding author

Correspondence to Yong Wang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sun, Z., Wang, Y. & Laganière, R. Hard negative mining for correlation filters in visual tracking. Machine Vision and Applications 30, 487–506 (2019). https://doi.org/10.1007/s00138-019-01004-0



