research-article
DOI: 10.1145/3552482.3556554

Towards a Calibrated 360° Stereoscopic HDR Image Dataset for Architectural Lighting Studies

Published: 10 October 2022

ABSTRACT

High-fidelity 360° images enhance user experience and offer realistic representations for architectural design studies. In particular, VR and hyper-realistic imaging technologies are helpful tools for studying daylight in architectural spaces, thanks to their high level of immersion and their ability to create perceptually accurate and faithful scene representations.

In this paper, we present a novel method for collecting and processing physically calibrated 360° stereoscopic high-dynamic-range (HDR) images of daylit indoor spaces. The resulting dataset aims to provide a high degree of realism and a wide range of luminous interior spaces, each supplied with information on the physical characterization of the space. This paper presents the first applications of the method in different spaces and discusses the challenges of assessing visual perception in these images. In the near future, the dataset will be made publicly available for architectural as well as multimedia studies.
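Processing physically calibrated HDR images typically involves merging a bracket of differently exposed captures into a single relative radiance map. The paper does not detail its pipeline here, so the sketch below is only an illustration of the general principle, using a simplified Debevec-style weighted average; the function name `merge_exposures` and the hat weighting are assumptions, and real calibration would also require camera response recovery and luminance scaling against a reference measurement.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge linearized exposure brackets into a relative radiance map.

    Illustrative sketch (not the authors' pipeline): assumes each image
    is already linearized (camera response removed) and scaled to [0, 1].
    Each pixel's radiance estimate is a weighted average of value/time,
    where a hat-shaped weight trusts mid-tones more than clipped pixels.
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * im - 1.0)  # hat weight: 1 at 0.5, 0 at 0 and 1
        num += w * im / t                  # value / exposure time ~ radiance
        den += w
    return num / np.maximum(den, 1e-6)     # avoid division by zero where all clipped
```

For example, a pixel with relative radiance 0.25 reads 0.25 at a 1 s exposure and 0.5 at 2 s; the weighted merge recovers 0.25 from both brackets combined.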


Supplemental Material

PIES-ME22-pies03.mp4 (mp4, 165.7 MB)


Published in

PIES-ME '22: Proceedings of the 1st Workshop on Photorealistic Image and Environment Synthesis for Multimedia Experiments
October 2022, 47 pages
ISBN: 9781450395007
DOI: 10.1145/3552482

Copyright © 2022 ACM. Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

PIES-ME '22 Paper Acceptance Rate: 5 of 5 submissions, 100%
