DOI: 10.1145/3373419.3373457

Effects of Color Systems' Transformation on Optical Flow Estimation of Noisy and Degraded Images

Published: 24 January 2020

ABSTRACT

Varying illumination and image blur are among the major challenges faced by contemporary methods of optical flow estimation. Despite significant advances, these aspects have received little attention from modern methods, and recent work in this field produces poor results when dealing with images containing variable illumination and blur. In this paper, we investigate the effects of color space transformations on optical flow estimation from degraded and noisy images. Our experiments use clean and noisy image sequences containing different kinds of blur and atmospheric effects such as fog, mist, shadows and dark regions. By estimating optical flow on three types of sequences in parallel (super clean, clean and noisy), using four popular color systems, we observe the effects of color space transformation on the estimated flow fields. The four color systems are RGB (red, green, blue), HSV (hue, saturation, value), HSL (hue, saturation, lightness) and XYZ (as standardized by the International Commission on Illumination in 1931). We find that the output of an optical flow algorithm not only depends on the color system adopted, but that some color spaces tend to favor particular types of image sequences. For instance, the XYZ color system is more favorable for images that abide by the brightness constancy assumption, while the HSV color space is more suitable for blurry and noisy images. We estimate optical flow while keeping all other parameters unchanged and transforming only the color space. Naturally, the flow estimated by an algorithm on clean images will not be consistent with the flow estimated from the same images after noise is added; the objective is to compare this adverse effect across different color spaces. The flow estimation errors in the four color systems are reported and compared, and the best color space is identified in each case. The paper also discusses the possible factors behind these variable outcomes, with an insight into the basic frameworks of traditional optical flow methods.
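As a concrete illustration of the pipeline described above (not the authors' exact implementation, which the abstract does not specify), the sketch below converts an image pair into each of the four color systems with OpenCV and estimates a dense flow field. The brightness constancy assumption mentioned above is the usual one, I(x, y, t) ≈ I(x + u, y + v, t + 1). The Farnebäck estimator and the choice of a single representative channel per color system are assumptions made for brevity.

```python
# Illustrative sketch only: the flow method (Farneback) and the single-channel
# handling per color system are assumptions, not the paper's stated setup.
import cv2
import numpy as np

# Color system -> (OpenCV conversion code, index of the channel fed to the
# flow estimator). OpenCV names the HSL space "HLS".
COLOR_SPACES = {
    "RGB": (cv2.COLOR_BGR2GRAY, None),  # grayscale stands in for the RGB case
    "HSV": (cv2.COLOR_BGR2HSV, 2),      # V (value) channel
    "HSL": (cv2.COLOR_BGR2HLS, 1),      # L (lightness) channel
    "XYZ": (cv2.COLOR_BGR2XYZ, 1),      # Y (luminance) channel
}

def flow_in_color_space(frame1_bgr, frame2_bgr, space):
    """Estimate dense optical flow after transforming both frames to `space`."""
    code, channel = COLOR_SPACES[space]
    f1 = cv2.cvtColor(frame1_bgr, code)
    f2 = cv2.cvtColor(frame2_bgr, code)
    if channel is not None:
        f1 = np.ascontiguousarray(f1[:, :, channel])
        f2 = np.ascontiguousarray(f2[:, :, channel])
    # Farneback returns an H x W x 2 array of (u, v) displacements.
    return cv2.calcOpticalFlowFarneback(f1, f2, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)

def average_endpoint_error(flow, gt_flow):
    """Average endpoint error of an estimated flow against ground truth."""
    return float(np.mean(np.linalg.norm(flow - gt_flow, axis=2)))
```

Given ground-truth flow for the clean and noisy versions of a sequence, errors such as the average endpoint error can then be compared across the four color systems while everything else in the estimator is held fixed, which is the comparison the paper reports.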


Published in
ICAIP '19: Proceedings of the 2019 3rd International Conference on Advances in Image Processing
November 2019, 232 pages
ISBN: 9781450376754
DOI: 10.1145/3373419
Copyright © 2019 ACM
Publisher: Association for Computing Machinery, New York, NY, United States
