Traffic flow detection and statistics via improved optical flow and connected region analysis

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Moving vehicle detection plays an important role in intelligent transportation systems, and optical flow is one of the most common methods for it. However, the conventional Horn–Schunck algorithm spends too much time computing dense optical flow and therefore cannot meet real-time requirements. This paper proposes an improved Horn–Schunck optical flow algorithm based on the inter-frame difference method. In our algorithm, the full optical flow update is computed only for pixels with larger gray values in the difference image, while the remaining pixels receive only iterative smoothing. The number of vehicles in traffic videos is counted by placing a virtual detection loop and examining the optical flow information within it. To extract moving vehicles as accurately as possible, we also propose a method that obtains the minimum bounding rectangle of each moving vehicle through connected region analysis. Finally, we compare the improved algorithm with four other optical flow algorithms on moving vehicle extraction and vehicle flow detection; our method gives considerably more accurate results.
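To make the two main ideas in the abstract concrete, the following is a minimal Python/NumPy sketch of (a) a Horn–Schunck iteration whose full data-term update is restricted to pixels with a large inter-frame difference, with plain smoothing applied elsewhere, and (b) connected-region analysis of the flow magnitude to obtain minimum bounding rectangles. This is an illustration under assumed parameters (`alpha`, `n_iter`, `diff_thresh`, `mag_thresh`, `min_area` are placeholders), not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import convolve, label, find_objects

# 3x3 averaging kernel used in the classic Horn-Schunck smoothing step.
AVG_KERNEL = np.array([[1/12, 1/6, 1/12],
                       [1/6,  0.0, 1/6],
                       [1/12, 1/6, 1/12]])

def masked_horn_schunck(prev, curr, alpha=1.0, n_iter=50, diff_thresh=15.0):
    """Horn-Schunck flow whose data-term update is applied only where the
    inter-frame difference is large; elsewhere the field is merely smoothed."""
    prev = prev.astype(np.float32)
    curr = curr.astype(np.float32)

    # Pixels with a large gray-value change between frames get the full update.
    mask = np.abs(curr - prev) > diff_thresh

    # Spatial and temporal derivatives, averaged over the two frames.
    Ix = 0.5 * (np.gradient(prev, axis=1) + np.gradient(curr, axis=1))
    Iy = 0.5 * (np.gradient(prev, axis=0) + np.gradient(curr, axis=0))
    It = curr - prev

    u = np.zeros_like(curr)
    v = np.zeros_like(curr)
    for _ in range(n_iter):
        u_avg = convolve(u, AVG_KERNEL)
        v_avg = convolve(v, AVG_KERNEL)
        update = (Ix * u_avg + Iy * v_avg + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        # Full Horn-Schunck update on masked pixels, plain smoothing elsewhere.
        u = np.where(mask, u_avg - Ix * update, u_avg)
        v = np.where(mask, v_avg - Iy * update, v_avg)
    return u, v

def vehicle_bounding_boxes(u, v, mag_thresh=1.0, min_area=200):
    """Connected-region analysis: threshold the flow magnitude, label connected
    components and return each component's minimum bounding rectangle."""
    moving = np.hypot(u, v) > mag_thresh
    labels, _ = label(moving)
    boxes = []
    for sl in find_objects(labels):
        if sl is None:
            continue
        h = sl[0].stop - sl[0].start
        w = sl[1].stop - sl[1].start
        if h * w >= min_area:          # discard small noisy regions
            boxes.append((sl[1].start, sl[0].start, w, h))  # (x, y, w, h)
    return boxes
```

The virtual-loop counting mentioned in the abstract would then amount to monitoring flow activity (or the detected boxes) inside a fixed image region and incrementing a counter each time a vehicle enters it; that bookkeeping is omitted from the sketch.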

Acknowledgements

This work has been supported by the National Natural Science Foundation of China (61203261), the China Postdoctoral Science Foundation funded project (2012M521335), the Jiangsu Key Laboratory of Big Data Analysis Technology/B-DAT (Nanjing University of Information Science and Technology, Grant No. KXK1404), the Research Fund of the Guangxi Key Lab of Multi-source Information Mining and Security (MIMS16-02), and the Fundamental Research Funds of Shandong University (2015JC014).

Author information

Corresponding author

Correspondence to Zhenxue Chen.

About this article

Cite this article

Peng, Y., Chen, Z., Wu, Q.M.J. et al. Traffic flow detection and statistics via improved optical flow and connected region analysis. SIViP 12, 99–105 (2018). https://doi.org/10.1007/s11760-017-1135-2
