Imagery datasets for photobiological lighting analysis of architectural models with shading panels

Abstract
This paper describes eight imagery datasets comprising around 12,000 images grouped into 1220 sets. The images were captured inside an architectural model aimed at exploring the impact of shading panels on photobiological lighting parameters. The architectural model represents a generic space at a 1:10 scale with a fully glazed façade on one side used to install shading panels. The datasets present interior lighting conditions under different shading configurations in terms of surface colors and glossiness; horizontal and vertical orientations; upwards, downwards, and left/right inclinations of panels; V-shape openings; low to high densities; and top and bottom positions at the window. The shading panel configurations were tested under four to six exterior overcast daylighting conditions, simulated inside an artificial sky chamber with very cool to very warm color temperatures and high to low intensities. The datasets include bracketed low dynamic range (LDR) images which enable generating high dynamic range (HDR) images for photobiological lighting evaluations. Images were captured from the side and back viewpoints inside the model using Raspberry Pi camera modules mounted with fisheye lenses. The datasets are reusable and useful for architects, lighting designers, and building engineers studying the impact of architectural variables and shading panels on photobiological lighting conditions in space. The datasets will also be of interest to computer vision specialists for applying machine learning techniques and training artificial intelligence for architectural applications. The datasets are partially used in Parsaee et al. [1]. They were compiled as part of a doctoral dissertation in architecture at Laval University authored by Mojtaba Parsaee [2] and are shared through two Mendeley data repositories [3,4].

Value of the Data
• These imagery datasets can be used to evaluate the impact of external shading panel characteristics on photobiological lighting conditions inside buildings.
• The datasets are highly useful for educating architectural and engineering students on façade and daylighting design. They also guide students in reproducing such lighting experiments and developing architectural configurations and prototypes that address the lighting needs of occupants.
• The datasets can be used by architects, lighting specialists, interior designers, and building engineers who study the impact of architectural configurations and façades on indoor lighting conditions and individuals' perceptions and circadian responses.
• The shared datasets could be used by computer vision researchers and machine learning specialists to train artificial intelligence for architectural applications and lighting-color interactions.
• The datasets could be used for perception studies using questionnaires to enquire about emotional responses to façade configurations, lighting conditions, and color rendering in architecture.

Data Description
Eight datasets, including around 12,000 images grouped into 1220 sets, are shared through two Mendeley Data repositories based on shading panel configurations. Fig. 1 displays the data classification tree and the number of capture folders and experimented lighting conditions shared in each dataset. Table 1 gives a brief description of each dataset. HDR images, their tone-mapped plots, and false color maps of photobiological factors, i.e., photopic, melanopic, melanopic/photopic ratio, and CCT units, are generated for the side view captures of all datasets. The generated false color maps of side views are stored in a subfolder entitled 'Analysis-Results' inside each capture folder. Note that the saturated LDR images of all side view captures are separated and stored in a subfolder. LDR images of back view captures, however, have not been checked for saturated images. An Excel file entitled 'Classifications' is provided in each dataset, classifying the capture folders in terms of exterior lighting conditions, finishing colors, glossiness, and, where applicable, panel size, density, and position at the window. Tables 2 and 3 display the legend of acronyms and cell colors used in the Classifications files. LDR images are provided in JPG format with an sRGB color space.
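For readers who want to script over the repositories, the sketch below shows one way the 'Classifications' file and capture folders of a downloaded dataset could be indexed with pandas. The local folder name, file extension, and column contents are assumptions based on the description above, not a fixed layout guaranteed by the datasets.

```python
# A minimal sketch (not part of the original datasets' tooling) for browsing one
# downloaded dataset. Paths and names below are placeholders; adapt them to the
# actual repository layout.
from pathlib import Path

import pandas as pd  # reading .xlsx files also requires the openpyxl package

dataset_root = Path("Dataset-1")  # hypothetical local copy of one Mendeley dataset

# Load the 'Classifications' table shipped with each dataset (extension assumed).
classifications = pd.read_excel(dataset_root / "Classifications.xlsx")
print(classifications.head())

# Collect the bracketed LDR captures (JPG) of every capture folder, ignoring the
# 'Analysis-Results' subfolders that hold the rendered false color maps.
ldr_sets = {}
for folder in sorted(p for p in dataset_root.iterdir() if p.is_dir()):
    jpgs = sorted(folder.glob("*.jpg"))
    if jpgs:
        ldr_sets[folder.name] = jpgs

print(f"Found {len(ldr_sets)} capture folders with LDR images.")
```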

Experimental Setup, Architectural Configurations and Materials
The datasets are produced from experimental studies aimed at exploring the impacts of shading panel configurations on lighting conditions in architecture. The overall experimental setup is presented in Fig. 2. The base model represents a generic space which can be used as an office, classroom, or cafeteria. The experiments were performed inside an artificial sky chamber simulating different exterior daylighting conditions. The artificial sky chamber is a human-scale mirror box equipped with a custom-made, tunable RGB light-emitting diode (LED) lighting system manufactured as Sunlike technology by Seoul Semiconductor [13]. The LED system is installed inside the chamber to simulate exterior overcast daylighting conditions with different color temperatures and intensities.

The base model and shading panels are made of plywood at a 1:10 scale. The size and proportions of the base model represent a generic architectural space based on Jafarian et al. [5,6] and Poirier et al. [7]. The model has three fully opaque sides and a fully glazed façade made of double-pane glazing with about 70-80% visual light transmission and a 12 mm air gap. All joints were completely sealed to prevent light leakage, as recommended by Ruck et al. [8] and Baker et al. [9]. The interior walls of the model are painted in a cool white color with about 20% to 30% glossiness produced by SICO Evolution [15]. The ceiling of the model is painted with the same SICO Evolution [15] cool white color with 50% to 60% glossiness. A SICO Evolution [15] light-gray matt color with about 5% to 10% glossiness is used for the floor of the model.

Twenty-three sets of shading panels were produced in different colors, glossiness, and sizes, as presented in Fig. 3. Eight sets of shading panels were painted and varnished with about 20-30% gloss in cool white, warm white, saturated blue, light blue, saturated green, light green, saturated yellow, and saturated red colors. Two sets of panels were painted with double-colored sides of similar glossiness, including saturated blue/saturated red and light blue/saturated yellow colored sides. Ten shading panel sets with the same painted colors were also produced with matt finishing. Note that the double-colored panels offer two basic configurations in a horizontal orientation, as shown in Fig. 3. The relative spectral reflectance curves of all matt-colored panels were measured by Parsaee et al. [1], as depicted in Fig. 4 alongside the melanopic and photopic efficiency curves retrieved from Enezi et al. [16], Amundadottir et al. [17], DiLaura et al. [18], and CIE [19].

Colored shading panels were tested in horizontal and vertical orientations, upwards, downwards, and left/right inclinations, and folded in a V-shape, as illustrated in Fig. 5. The matt saturated blue panels were additionally built in four different sizes, as shown in Fig. 6. These panels with multiple sizes were also tested as low- and high-density installations at the window, and the size and density variations were further tested at the top and bottom positions of the window.

Side and back view scenes inside the model are captured by two Raspberry Pi cameras installed in the middle of the side and back façades. The cameras were manufactured as RPi Camera-I by WaveShare [10]. The RPi Camera-I is mounted with a fisheye lens offering about a 185-degree diagonal field of view. The camera has a fixed aperture of f/2, a focal length of 1.55 mm, and a 5-megapixel OV5647 sensor with a 1/4-inch CCD, capturing images at resolutions of up to 2592 by 1944 pixels.
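As a rough, hypothetical illustration of how efficiency curves such as those plotted in Fig. 4 are typically used, the sketch below weights a panel reflectance spectrum by photopic and melanopic efficiency curves to compare the two responses. All arrays are placeholders, not the measured data of Parsaee et al. [1] or the published efficiency curves [16-19].

```python
# Placeholder spectra only: a flat 40% panel reflectance and crude Gaussian
# stand-ins for the photopic and melanopic efficiency curves, sampled over the
# visible range. Replace these with the measured/published curves in practice.
import numpy as np

wavelengths = np.arange(380, 781, 5)  # nm
reflectance = np.full(wavelengths.shape, 0.4)
photopic = np.exp(-0.5 * ((wavelengths - 555) / 50.0) ** 2)
melanopic = np.exp(-0.5 * ((wavelengths - 490) / 45.0) ** 2)

def weighted_reflectance(spectrum, efficiency):
    """Efficiency-weighted average reflectance over the sampled range."""
    return float(np.sum(spectrum * efficiency) / np.sum(efficiency))

print("photopic-weighted reflectance:", weighted_reflectance(reflectance, photopic))
print("melanopic-weighted reflectance:", weighted_reflectance(reflectance, melanopic))
```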

Image Capturing and Post-Processing Procedures
The following steps explain the overall workflow conducted to capture LDR images and generate, calibrate and post-process HDR files.
1. Capture bracketed images: the Python program RaspiCamera [20] was developed to automatically capture bracketed LDR images via the Raspberry Pi cameras. As shown in Fig. 8, the bracketed LDR images were captured from very dark to very bright exposure values (i.e., around -2 EV to +3 EV) by changing shutter speeds from around 15 to 1/2 seconds. All LDR images were photographed with a fixed white balance (D65) and an ISO of 100. LDR images are stored in the sRGB (standard Red, Green, and Blue) color space in JPG format. Both Raspberry Pi units are connected to the operator's laptop via remote desktop applications, and the captured LDR images are transferred to the laptop for post-processing.
2. Generate HDR images: the Python program HDR Generator [21] was developed to produce HDR images by recovering camera response functions (CRFs) and merging LDRs. This program uses the OpenCV [22] and ExifTool [23] libraries to generate HDR images. As shown in Fig. 9, the CRFs are calculated for both cameras based on Debevec and Malik [24], which is available as an OpenCV method; a minimal sketch of this step is given after this list. The CRFs are shared alongside the datasets. Note that the CRFs are generated from the first reference model captures (under lighting condition AS-1) and subsequently used to produce all other HDR images. LDR images are also cropped to cut the black regions of the fisheye captures. Over- and under-saturated LDR images are manually discarded, as recommended by Debevec and Malik [24] and Pierson et al. [25]. HDR images are stored with a *.hdr file extension.
3. Calibrate color channels of HDR images: the RGB and XYZ (CIE tristimulus) channels of the HDR images are calibrated for both cameras to enable accurate photometric and photobiological studies. Protocols for chromaticity calibration of cameras are fully explained by Jung [11] and Jung and Inanici [12]. In brief, photometric and chromaticity properties, i.e., illuminance and CIE-XYZ, of multiple scenes were recorded using a calibrated Konica Minolta CL-200A photometer [26]. Note that the illuminance intensity is equal to CIE-Y. HDR images of the same scenes were simultaneously generated by both cameras. Photometric and chromaticity properties of all HDR images generated by each camera are calculated based on Equations (1)-(3) provided by Jung [11] and Jung and Inanici [12]. Note that the averages of the RGB and CIE-XYZ values of all pixels in an HDR image are used to determine the calibration coefficients. The Python program HDRI Photobiological Visualizer [27] was developed to facilitate photometric and chromaticity calculations of HDR images. Fig. 10 shows the fitness curves of the photometric and chromaticity calibrations of the side and back cameras, based on Jung [11] and Jung and Inanici [12]. Fig. 11 illustrates an example of false color maps rendered for a side view.
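The sketch below illustrates step 2 under stated assumptions: it uses OpenCV's implementation of the Debevec and Malik [24] method, which the text says the HDR Generator [21] relies on, but the file names, exposure times, and overall structure are illustrative and are not the authors' actual program.

```python
# A minimal sketch of CRF recovery and HDR merging with OpenCV [22].
import cv2
import numpy as np

# Bracketed LDR captures of one scene (dark to bright) and their shutter speeds
# in seconds; in the datasets these range from roughly 1/2 s to 15 s, and the
# exact values come from the image EXIF data (e.g., via ExifTool [23]).
ldr_files = ["ldr_ev-2.jpg", "ldr_ev00.jpg", "ldr_ev+1.jpg", "ldr_ev+3.jpg"]  # placeholder names
exposure_times = np.array([0.5, 2.0, 8.0, 15.0], dtype=np.float32)            # placeholder times

images = [cv2.imread(f) for f in ldr_files]

# Recover the camera response function (CRF) once per camera,
# following Debevec and Malik [24] as implemented in OpenCV ...
calibrate = cv2.createCalibrateDebevec()
crf = calibrate.process(images, exposure_times)

# ... then merge the bracketed set into a radiance map and store it as *.hdr.
merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, crf)
cv2.imwrite("scene.hdr", hdr)
```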
Calibrated HDR images are then used to compute photobiological lighting factors per pixel. Photopic luminance (in cd/m²) is computed per Equation (4), and the equivalent melanopic luminance (in EM-cd/m²) is computed from the calibrated RGB channels as

EML_pixel-i = 0.0013 × R_pixel-i × C_R + 0.3812 × G_pixel-i × C_G + 0.6175 × B_pixel-i × C_B    (5)

The correlated color temperature (CCT) is calculated for the CIE-D65 white point as

CCT = 449 × n³ + 3525 × n² + 6823.3 × n + 5518.87    (6)

where n = (x − 0.3320) / (0.1858 − y) and (x, y) are the chromaticity coordinates derived from the calibrated CIE-XYZ channels.

The described methodology could also be used for similar computational lighting analyses employing HDR imagery and post-processing techniques. The methodology is simplified to be fully understandable by architects and designers. The developed Python programs also enable batch processing to compute and render false color maps for large imagery datasets.
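The sketch below is a minimal, hypothetical implementation of Equations (5) and (6) as reconstructed above; it is not the authors' HDRI Photobiological Visualizer [27], and the calibration coefficients and example values are placeholders.

```python
def melanopic_luminance(r, g, b, c_r, c_g, c_b):
    """Equivalent melanopic luminance (EM-cd/m2) per Eq. (5); c_r, c_g, c_b are
    the per-channel calibration coefficients obtained in step 3."""
    return 0.0013 * r * c_r + 0.3812 * g * c_g + 0.6175 * b * c_b


def cct(X, Y, Z):
    """Correlated color temperature per Eq. (6), from CIE-XYZ tristimulus values."""
    x = X / (X + Y + Z)
    y = Y / (X + Y + Z)
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5518.87


# Placeholder values; in practice these come from the calibrated HDR channels,
# applied per pixel or to pixel averages.
print(melanopic_luminance(120.0, 110.0, 95.0, c_r=1.0, c_g=1.0, c_b=1.0))
print(cct(95.0, 100.0, 108.0))  # roughly a D65-like white point, about 6500 K
```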

Ethics Statements
The presented datasets do not involve human subjects, animal experiments, or data collected from social media platforms.

Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data Availability
Imagery datasets for photobiological lighting analysis of architectural models with shading panels (No. 1) (Original data) (Mendeley Data).
Imagery datasets for photobiological lighting analysis of architectural models with shading panels (No. 2) (Original data) (Mendeley Data).