ORION software tool for the geometrical calibration of all-sky cameras

  • Juan Carlos Antuña-Sánchez ,

    Roles Conceptualization, Data curation, Methodology, Software, Validation, Visualization, Writing – original draft

    jcantuna@goa.uva.es

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • Roberto Román,

    Roles Conceptualization, Formal analysis, Methodology, Software, Supervision, Validation, Writing – original draft

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • Juan Luis Bosch,

    Roles Methodology, Writing – review & editing

    Affiliation Departamento de Química y Física, Universidad de Almería, Almería, Spain

  • Carlos Toledano,

    Roles Resources, Supervision, Writing – review & editing

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • David Mateos,

    Roles Resources, Writing – review & editing

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • Ramiro González,

    Roles Data curation, Resources, Writing – review & editing

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • Victoria Cachorro,

    Roles Funding acquisition, Project administration, Writing – review & editing

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

  • Ángel de Frutos

    Roles Funding acquisition, Project administration, Writing – review & editing

    Affiliation Group of Atmospheric Optics, Universidad de Valladolid (GOA-UVa), Valladolid, Spain

Abstract

This paper presents the software application ORION (All-sky camera geOmetry calibRation from star positIONs). This software has been developed with the aim of providing a geometrical calibration of all-sky cameras, i.e., determining which sky coordinates (zenith and azimuth angles) correspond to each camera pixel. This is useful to locate bodies on the celestial vault, such as stars and planets, in the camera images. To obtain the calibration matrices, the user feeds ORION with a set of cloud-free sky images captured at night-time. ORION searches for the positions of several stars in the sky images; this search can be automatic or manual. The sky coordinates of the stars and the corresponding pixel positions in the camera images are used together to determine the calibration matrices. The calibration is based on three parameters: the pixel position of the sky zenith in the image; the shift angle of the azimuth viewed by the camera with respect to the real North; and the relationship between the sky zenith angle and the pixel radial distance with respect to the sky zenith in the image. In addition, ORION includes other features to facilitate its use, such as checking the accuracy of the calibration. An example application of ORION is shown, obtaining the calibration matrices for a set of images and studying the accuracy of the calibration to predict a star position. The accuracy is about 9.0 arcmin (about 1.7 pixels) for the analyzed example, using a camera with an average resolution of 5.4 arcmin/pixel.

1 Introduction

All-sky cameras are ground-based instruments capable of capturing images of the full sky. There are many varieties: with electronic sensors (CMOS or CCD) or film (especially in the past; see [1] and references therein); with monochromatic sensors or with filters, typically RGB Bayer filters but also narrower ones; looking at a mirror oriented to the sky or looking directly at the sky through a fish-eye lens; static cameras or moving cameras, usually installed on a sun-tracker with a shadow ball to block the direct sunlight; operating at daytime, night-time or both; and others. Some of the main advantages of current all-sky cameras are: they allow changing the exposure time, sensor gain and other parameters to adapt the camera to the sky scenario; they can obtain a snapshot of the full sky radiance, covering every sky position and various spectral ranges; the capture time is short; and they are inexpensive compared to other instruments. Conversely, obtaining an accurate radiometric calibration of the images is difficult, the spectral filters are generally too wide for some applications, and there are also issues with the presence of hot pixels, pixel saturation, lens aberrations and image vignetting, among others.

This kind of instrument is generally used to observe and quantify clouds and cloud cover [2–10] or as a proxy of the sky conditions. However, all-sky cameras are highly versatile and can also be used for different purposes, among others: to derive more complex cloud properties such as cloud base height by stereoscopic methods [11, 12]; to detect and observe aurorae, celestial bodies or bolides [13–15]; to estimate the sky radiance [16–18]; to study the cloud effects on solar radiation [19, 20]; to study the polarization of the sky light [21, 22]; to detect and retrieve atmospheric aerosol properties [23–25]; and to obtain synergy in combination with other instruments like radiometers, ceilometers or photometers [26–28].

The knowledge of the sky coordinates that correspond to each pixel of an all-sky camera is crucial in several applications; for example, to extract the sky radiance at specific sky angles [25, 28], to forecast solar irradiance [29, 30], to calculate aurora and cloud base altitudes by stereographic methods [31, 32], or simply to locate bodies on the celestial vault. This can be achieved by the geometrical calibration of the all-sky camera, which consists of obtaining two matrices of the same size as the camera images, containing the azimuth and zenith angles viewed by each pixel. Several calibration methods have been described in the literature. Intrinsic calibration determines the internal parameters of the camera; images of a chessboard are usually recorded for this purpose. For instance, the OcamCalib toolbox [33, 34] identifies the corners of the chessboard in such images to characterize the distortion of the camera optics. Extrinsic camera calibrations determine the camera orientation and require identifying the position of the Sun [30, 32, 35, 36] or of stars [28, 37, 38] in several images and correlating these positions with the Sun or star coordinates.

We have developed the software application ORION (all-sky camera geOmetry calibRation from star positIONs) with the main objective of providing an open and free tool for the geometrical calibration of all-sky cameras that is accurate, simple and user-friendly. Stars are used instead of the Sun for the geometrical calibration because the Sun appears larger in camera images (which makes identifying its center more difficult) and, in addition, multiple stars cover more sky angles. ORION is written in Python 3, with Qt5 for the graphical interface [39], and it is capable of geometrically calibrating an all-sky camera by identifying star positions in the celestial vault. The source code and example data are hosted at Zenodo (https://doi.org/10.5281/zenodo.5595851). The Windows build of the application is hosted on the GOA-UVa (Atmospheric Optics Group, University of Valladolid) website (http://goa.uva.es/orion-app/).

This paper introduces the ORION application and is structured as follows: Section 2 describes the instrumentation, the workflow and the theoretical principles behind ORION, while Section 3 presents an example of its use. Finally, the main conclusions are summarized in Section 4.

2 Instrumentation, data and method

2.1 All-sky camera

In order to show how ORION works, sky images from an all-sky camera have been used. This camera is installed at the scientific platform of the GOA-UVa located on the rooftop of the Faculty of Sciences at Valladolid, Spain (41.664° N, 4.706° W, 705 m a.s.l.). More information about the GOA-UVa platform and the climate conditions of Valladolid can be found in [18, 40–42].

The all-sky camera model used here is the OMEA-3C from the manufacturer Alcor System. It consists of a Sony IMX178 RGB CMOS sensor with a fish-eye lens, both encapsulated in a weatherproof case with a BK7 glass dome on top. The housing includes a heating system to avoid water condensation on the dome. This camera captures images with a size of 3,096 × 2,080 pixels, a pixel scale of 5.4 arcmin/pixel and a 14-bit resolution.

This all-sky camera is configured to capture a multi-exposure set of raw sky images every 2 minutes at night-time and every 5 minutes at daytime. These raw images are stored and then converted into 8-bit true-color or gray-scale images. The 8-bit night-time images are used to carry out the geometrical calibration. A set of these sky images captured on August 23rd, 2020 from 20:20 UTC to 23:58 UTC (110 images) is used for the example shown in Section 3.1.

2.2 Identification of stars

The data needed for the geometrical calibration are the pixel position (x and y coordinates) of the center of a star in the image and the sky coordinates (azimuth and zenith angles) of that star. Several images obtained at different times are used in the geometrical calibration to cover a wider range of pixel positions and angles. The ORION user must put this set of images in a folder and introduce the folder path in ORION. ORION reads the date and time of each image directly from the image filename. For that, the user must prepare the image set with the following naming convention: “text_YearMonthDay_HourMinute” (example: C006_20200817_0300.jpg). ORION also includes a utility to change the filename format of the images.
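
As an illustration of this naming convention, the following minimal Python sketch (our own, not part of ORION) shows how the capture date and time could be recovered from such a filename:

    from datetime import datetime
    from pathlib import Path

    def parse_capture_time(image_path):
        """Recover the capture date/time from a filename following the
        'text_YearMonthDay_HourMinute' convention (e.g. C006_20200817_0300.jpg)."""
        stem = Path(image_path).stem                 # 'C006_20200817_0300'
        _, date_str, time_str = stem.rsplit("_", 2)  # keep only the two last fields
        return datetime.strptime(date_str + time_str, "%Y%m%d%H%M")

    print(parse_capture_time("C006_20200817_0300.jpg"))  # 2020-08-17 03:00:00 (UTC)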

The flowchart for obtaining the data required for the calibration is described in Fig 1. Two different ways of doing this have been implemented in ORION: manual and automatic mode. In automatic mode, the user can select whether the initial calibration matrices are Custom or Default. ORION uses these initial matrices to find the pixel closest to the chosen star; the haversine distance function [43] is used for this calculation (a sketch of this search is given after Fig 1). Each analyzed image is first converted into gray scale.

Fig 1. Flowchart for star selection modes.

The rhombuses represent decisions made by the user.

https://doi.org/10.1371/journal.pone.0265959.g001
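
As an illustration of the nearest-pixel search mentioned above, a minimal sketch based on the haversine formula could look as follows; the function names are ours, and the initial calibration matrices are assumed to be NumPy arrays of zenith and azimuth angles in radians:

    import numpy as np

    def haversine_separation(zen1, az1, zen2, az2):
        """Angular separation (radians) between two sky directions given by their
        zenith and azimuth angles (radians); the zenith angle acts as a colatitude."""
        lat1, lat2 = np.pi / 2 - zen1, np.pi / 2 - zen2
        a = (np.sin((lat2 - lat1) / 2) ** 2
             + np.cos(lat1) * np.cos(lat2) * np.sin((az2 - az1) / 2) ** 2)
        return 2 * np.arcsin(np.sqrt(a))

    def closest_pixel(zen_matrix, az_matrix, star_zen, star_az):
        """Pixel (row, col) whose calibrated direction is closest to the star."""
        sep = haversine_separation(zen_matrix, az_matrix, star_zen, star_az)
        return np.unravel_index(np.nanargmin(sep), sep.shape)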

The size of the ROI (Region of Interest) depends on the selected mode. The manual mode allows the user to choose the dimensions of the ROI. For each sky image, ORION opens an additional window where the user must select the ROI as a rectangular pixel box. To do this, ORION uses the selectROI function from the OpenCV library [44]. To find out where to manually select the ROI, or to check whether ORION correctly identifies star positions in either Manual or Automatic mode, we recommend using Stellarium (https://stellarium.org), a free open-source planetarium that displays a realistic hemispheric sky similar to what is captured by a fish-eye lens [45]. The automatic mode skips the image-by-image selection of the ROI, simplifying the process. The size of the ROI box depends on the image resolution as well as on the chosen option (Custom or Default).
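
A hedged sketch of how such a manual selection could look with OpenCV is shown below (cv2.selectROI returns the box as x, y, width and height; the filename is hypothetical and the brightest-pixel assumption is the one described later in this section):

    import cv2
    import numpy as np

    img = cv2.imread("C006_20200823_2200.jpg")        # hypothetical image file
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)      # images are converted to gray scale

    # Manual mode: the user drags a rectangular box (ROI) around the chosen star.
    x, y, w, h = cv2.selectROI("Select the star", gray, showCrosshair=True)
    cv2.destroyAllWindows()

    # The brightest pixel inside the ROI is taken as the star center (Section 2.2).
    roi = gray[y:y + h, x:x + w]
    dy, dx = np.unravel_index(np.argmax(roi), roi.shape)
    star_x, star_y = x + dx, y + dy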

In the “Find the star position” step, the sky coordinates of the selected star are obtained. To calculate the azimuth and zenith angles of a chosen star, the position of the observer (the all-sky camera) is required, determined by its latitude, longitude and elevation above sea level. ORION obtains the star coordinates for each image from the “PyEphem” library [46], using as input the mentioned all-sky camera coordinates and the date and time when the image was captured. If the zenith angle of a star is above 83° in an image, ORION does not consider this star for calibration in that image, since it is close to the horizon, where star identification is more difficult (in this particular case, due to city lights).
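
The sketch below illustrates this step with the PyEphem library for the Valladolid camera coordinates given in Section 2.1, using a single star and a single date/time; the 83° threshold is applied at the end:

    import math
    import ephem

    observer = ephem.Observer()
    observer.lat, observer.lon = '41.664', '-4.706'   # camera position (degrees as strings)
    observer.elevation = 705                          # metres a.s.l.
    observer.date = '2020/08/23 22:00'                # UTC date/time read from the filename

    star = ephem.star('Capella')                      # star from the PyEphem catalogue
    star.compute(observer)

    azimuth = math.degrees(star.az)                   # azimuth in degrees from North
    zenith = 90.0 - math.degrees(star.alt)            # zenith angle = 90 deg - elevation angle
    if zenith <= 83.0:                                # stars closer to the horizon are discarded
        print(round(azimuth, 2), round(zenith, 2))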

After obtaining the coordinates of the stars, ORION only needs to identify the position of the center of the chosen star in each sky image. In manual mode, ORION assumes that the center of the chosen star is located at the brightest pixel of the ROI (chosen manually) and stores the x and y position of that pixel. In automatic mode, ORION uses the initial matrices (previously selected) to place the ROI and then automatically finds the brightest pixel inside it. If the initial calibration matrices are reasonably accurate, the actual star in the sky image will be close to the predicted pixel (the ROI is centered on this pixel), although not necessarily at the same pixel.
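
In automatic mode, this step can be sketched as follows (closest_pixel is the nearest-pixel helper sketched above; the box size in pixels is passed as an argument; the helper name and signature are our own):

    import numpy as np

    def star_center_auto(gray, zen_matrix, az_matrix, star_zen, star_az, box):
        """Center a box-pixel ROI on the pixel predicted by the initial calibration
        matrices and return the (x, y) position of the brightest pixel inside it."""
        row, col = closest_pixel(zen_matrix, az_matrix, star_zen, star_az)
        half = box // 2
        r0, r1 = max(row - half, 0), min(row + half + 1, gray.shape[0])
        c0, c1 = max(col - half, 0), min(col + half + 1, gray.shape[1])
        roi = gray[r0:r1, c0:c1]
        dr, dc = np.unravel_index(np.argmax(roi), roi.shape)
        return c0 + dc, r0 + dr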

The Custom matrices option uses the calibration matrices obtained in a previous calibration, for example with the manual mode. In this case, the ROI used to find the position of the stars is a square box whose side is the height (or the width, if larger) of the sky image divided by 100. This value was chosen empirically after several tests, since it worked well for different images.

The Default option is similar to the Custom one, but the calibration matrices are generated from two input parameters instead of the matrices obtained from a previous calibration. These input parameters are: 1) the azimuth shift from North, which is the angle between the North of the image (assumed to be at the top center of the image) and the real geographical North observed in the image; and 2) the extreme zenith, which is the sky zenith angle viewed by the pixel located in the top row and center column of the image. With this information, and assuming that the center of the sky (zenith) corresponds to the center of the image, the azimuth and zenith calibration matrices can be calculated (a sketch of this construction is given after Fig 2). Fig 2 shows two examples of the matrices generated by this method for a 2,000 × 2,000 pixel image. The first case (Fig 2a and 2b) has no shift from North, with the azimuth angle going from 0° to 360° counterclockwise, and a wide range of zenith angles; the second case (Fig 2c and 2d) has a well-defined shift from North of -50° and a lower variation in zenith angles due to the lower value of the extreme zenith angle. Finally, the ROI used to find the stars in the Default option is a square box similar to that of the Custom mode, but dividing by 50 instead of 100 (i.e., a larger box), since the Default method is less accurate. This mode is useful to perform a quick calibration as an input to the automatic star detection mode.

Fig 2. Default matrices for azimuth and zenith.

a) Shift from north 0°. b) Extreme zenith 100°. c) Shift from north -50°. d) Extreme zenith 80°.

https://doi.org/10.1371/journal.pone.0265959.g002
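
The following sketch reproduces the idea of the Default matrices under the stated assumptions (image center at the sky zenith, linear zenith-radius relation, azimuth counted counterclockwise from the top of the image as in Fig 2a); the exact sign and rotation conventions used by ORION may differ:

    import numpy as np

    def default_matrices(height, width, shift_from_north_deg, extreme_zenith_deg):
        """Default azimuth/zenith matrices from the two input parameters described above."""
        yc, xc = (height - 1) / 2.0, (width - 1) / 2.0
        y, x = np.indices((height, width), dtype=float)
        r = np.hypot(x - xc, y - yc)                           # radial distance in pixels
        # Polar angle: 0 deg at the top-center of the image, increasing counterclockwise
        # (orientation chosen only to match Fig 2a).
        phi = np.degrees(np.arctan2(xc - x, yc - y)) % 360.0
        azimuth = (phi - shift_from_north_deg) % 360.0         # sign convention assumed here
        zenith = extreme_zenith_deg * r / yc                   # linear relation; yc = radius of the top-center pixel
        return azimuth, zenith

    # Example corresponding to Fig 2c and 2d:
    az, zen = default_matrices(2000, 2000, shift_from_north_deg=-50.0, extreme_zenith_deg=80.0)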

The user must decide whether the position of the brightest pixel is correct. If it is, the azimuth, zenith and pixel position (x, y) are stored and ORION goes to the next image. This process is repeated until the list of selected images is complete. These data are stored in an .npy file for use in the calibration. Additional files can be generated for other stars.

2.3 Calibration algorithm

The position of a pixel in an image is given in Cartesian coordinates, where the x and y coordinates are the column and row numbers, respectively, and the system is centered at the top-left pixel (x = 0, y = 0). The first element corresponds to the zero position for x and y because ORION is programmed in Python 3. It is convenient to convert this system into polar coordinates, which correspond more directly to the zenith and azimuth angles in the sky. In this sense, a polar coordinate system centered at the zenith of the sky, whose Cartesian coordinates are x = xC and y = yC, can be described as:

r = \sqrt{(x - x_C)^2 + (y - y_C)^2}    (1)

\Phi = \arctan\!\left(\frac{x - x_C}{y - y_C}\right)    (2)

where r is the radial distance (in pixels) from the center, and Φ is the polar angle in this new system. This polar angle has its zero value in the direction of the y-axis, in a similar way as in Fig 2a.

Radial distance and polar angle are directly related to the sky zenith and azimuth angles, respectively. Assuming polar symmetry and that the camera is well leveled, the polar angle must be equal to the sky azimuth plus a shift caused by the imperfect alignment of the camera with respect to the North. This shift is the same as the one used in the Default option of the Automatic mode (Section 2.2). The sky zenith angle is related to the radial distance, and the relationship between both can be linear (as assumed in the mentioned Default option) or a higher-degree polynomial. Hence, once the xC and yC coordinates are determined, the polar coordinates are obtained from Eqs 1 and 2. Then, if we determine the shift of the polar angle with respect to the sky azimuth and the relationship between radial distance and zenith angle, we can transform the obtained polar coordinates into the sky angles viewed by each pixel. This is how ORION calculates the calibration matrices.
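
Under these assumptions, the transformation applied to every pixel can be sketched as follows (the sign convention of the shift and the orientation of Φ are illustrative choices of ours; zen_poly holds the coefficients of the fitted polynomial, highest degree first):

    import numpy as np

    def pixel_to_sky(x, y, xc, yc, az_shift_deg, zen_poly):
        """Sky azimuth and zenith angles (degrees) for pixel coordinates (x, y),
        given the image center, the azimuth shift and the zenith-vs-radius fit."""
        r = np.hypot(x - xc, y - yc)                          # Eq (1): radial distance
        phi = np.degrees(np.arctan2(x - xc, y - yc)) % 360.0  # Eq (2): polar angle, zero along the y-axis
        azimuth = (phi - az_shift_deg) % 360.0                # remove the shift with respect to North
        zenith = np.polyval(zen_poly, r)                      # polynomial zenith-vs-radius relation
        return azimuth, zenith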

The required xC and yC values, the azimuth shift of the polar coordinate and the relationship between zenith angle and radial distance can all be obtained from the data stored in Section 2.2: the real azimuth and zenith angles of the stars and the x and y pixel positions where the stars were found.

First, ORION calculates the xC and yC values. This involves four iterations, each one improving the precision of the previous one. Each iteration consists of scanning different pixel coordinates and assuming that they are the xC and yC values. The first iteration assumes that the xC and yC values must be near the center of the image itself (xci, yci): xci = (width-1)/2 and yci = (height-1)/2. A scan from x = xci-250 to x = xci+250 in 5-pixel steps is done. For each value of this x scan, an additional scan in y is done from y = yci-250 to y = yci+250, also in 5-pixel steps. This implies that ORION tests 101x101 positions (10,201) in this iteration for the xC and yC values. For each one of these 10,201 potential centers, ORION calculates the Φ values by Eq 2 for all the star positions that were chosen and previously stored, and then calculates the difference between these Φ values and the real azimuths of the stars (this information was also stored). If the center of the image is well determined, these differences must be constant and equal to the mentioned shift angle. Then, ORION calculates the standard deviation of these differences; the x and y position (among the 10,201) showing the lowest standard deviation is taken as the true xC and yC values.

This first iteration provides a good approximation of the real xC and yC values; however, the precision can still be improved, since the scans were done every 5 pixels in order to cover a large part of the image while keeping the computation time low. Three more iterations are done, taking the result of the previous iteration as the initial condition. The second iteration scans from -50 to 50 around the previous result in 1-pixel steps, while the third and fourth scan from -0.5 to 0.5 and from -0.005 to 0.005 in 0.01 and 0.001 pixel steps, respectively. After the four iterations, ORION stops and provides the xC and yC values with a precision of 0.001 pixels.
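
A sketch of this iterative search, with placeholder star data, could look like the following (the stored star pixel positions and azimuths would come from the .npy files of Section 2.2; the placeholder values are ours):

    import numpy as np

    # Placeholder stored star data: pixel positions and real azimuths (degrees).
    star_x = np.array([1200.0, 850.0, 1430.0, 600.0])
    star_y = np.array([700.0, 1500.0, 1100.0, 900.0])
    star_az = np.array([35.2, 210.7, 120.4, 300.1])

    def refine_center(x0, y0, half_span, step):
        """Scan candidate (xC, yC) values on a grid around (x0, y0) and keep the one
        minimising the standard deviation of (polar angle - real star azimuth)."""
        best_xc, best_yc, best_std = x0, y0, np.inf
        for xc in np.arange(x0 - half_span, x0 + half_span + step / 2, step):
            for yc in np.arange(y0 - half_span, y0 + half_span + step / 2, step):
                phi = np.degrees(np.arctan2(star_x - xc, star_y - yc)) % 360.0
                diff = (phi - star_az + 180.0) % 360.0 - 180.0   # wrap differences to [-180, 180)
                if diff.std() < best_std:
                    best_xc, best_yc, best_std = xc, yc, diff.std()
        return best_xc, best_yc

    # Four passes with the spans and steps quoted in the text.
    height, width = 2080, 3096
    xc, yc = (width - 1) / 2.0, (height - 1) / 2.0
    for half_span, step in [(250, 5), (50, 1), (0.5, 0.01), (0.005, 0.001)]:
        xc, yc = refine_center(xc, yc, half_span, step)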

Once xC and yC are calculated, ORION computes the azimuth shift using the stored star information. A linear fit between the polar angle obtained with the calculated xC and yC and the real azimuth of the stars in the sky is performed by ORION, and the y-intercept is taken as the azimuth shift angle. This linear fit excludes star positions with zenith angles below 20°, in order to avoid pixels for which a small variation in the x and y position implies a large variation in the polar angle.
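
This step could be sketched as follows; which variable plays the role of x or y in the fit, and the handling of the 0°/360° azimuth wrap-around, are simplifications of our own:

    import numpy as np

    def azimuth_shift(phi_deg, star_az_deg, star_zen_deg):
        """Azimuth shift as the y-intercept of a linear fit between the polar angle
        and the real star azimuth, excluding stars with zenith angles below 20 deg."""
        keep = star_zen_deg >= 20.0
        slope, intercept = np.polyfit(star_az_deg[keep], phi_deg[keep], 1)
        return intercept          # the slope should be close to 1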

Once the azimuth shift and the xC and yC values are calculated, ORION computes the radial distance of each stored star position using Eq 1. A polynomial fit between the stored real zenith angles of the stars and the obtained radial distances in the image is performed; the degree of this polynomial fit is chosen by the user. After that, the zenith angle viewed by each pixel can be directly obtained by applying the fit coefficients to the known radial distance of that pixel, which is available from the previously calculated xC and yC values. This information is enough to obtain the geometrical calibration matrices of sky azimuth and zenith angles.
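
Putting the pieces together, the construction of the calibration matrices can be sketched as follows (the degree is chosen by the user; the helper reuses the conventions assumed in the previous sketches and is not the actual ORION code):

    import numpy as np

    def calibration_matrices(height, width, xc, yc, az_shift_deg,
                             star_r, star_zen_deg, degree=2):
        """Fit zenith angle vs radial distance with a polynomial of the chosen degree
        and evaluate sky azimuth and zenith angles for every pixel of the image."""
        zen_poly = np.polyfit(star_r, star_zen_deg, degree)    # zenith = P(r)
        y, x = np.indices((height, width), dtype=float)
        r = np.hypot(x - xc, y - yc)
        phi = np.degrees(np.arctan2(x - xc, y - yc)) % 360.0
        azimuth = (phi - az_shift_deg) % 360.0
        zenith = np.polyval(zen_poly, r)
        return azimuth, zenith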

3 Results

3.1 Star position detection and calculation of azimuth and zenith matrices

An example of an ORION calibration with real images is shown in this section. The images used were captured by the all-sky camera described in Section 2.1, installed in Valladolid. We selected images taken every 2 minutes from 20:20 UTC to 23:58 UTC on August 23rd, 2020. We chose this set of images since it corresponds to a clear night with fully cloudless conditions and no Moon. It is nevertheless possible to use moonlit nights and even partly cloudy nights, as long as there are visible stars. First of all, the camera location information and the local path of the image folder are introduced as input parameters. After that, we can choose between the manual and automatic modes (see Fig 1). The Automatic Default option is selected since no previous custom calibration is available. The default matrices in this example are defined by a shift from North of -5° and an extreme zenith of 95°. Both parameters are introduced in the application as can be observed in the screenshot of Fig 3.

Fig 3. ORION screenshot of the star identification process in Automatic mode with the Default matrices option.

The chosen star is Capella.

https://doi.org/10.1371/journal.pone.0265959.g003

Once this information has been entered, we start to identify and detect the positions of the stars in the sky images. The first star chosen in this example is Capella, selected from the list of available stars. The path where the data file will be stored is also an input. The procedure described here is the same for each star. ORION obtains, from the “PyEphem” library, the azimuth and zenith angles corresponding to the time and place of the first image; then, the position of the pixel closest to the coordinates of the star is obtained using the calibration matrices given by the Default input parameters. The brightest pixel inside the ROI (a square box centered on the mentioned “nearest pixel”) is considered the position of the center of the chosen star in the image. ORION displays the position of the chosen star marked with a white circle in the main sky image (left part of the application; Fig 3) as well as in an enlarged image in the bottom-right corner of the application (“Region of Image” in Fig 3). This enlarged image is useful to discern whether the position of the chosen star is correct or not (hot pixels can lead to erroneous identifications). If the position of the star is correctly identified, the data are stored and we can continue with the next image. Correctly selected points are displayed in green in the image; wrong points are marked in red.

When a star position is added or removed (either in Manual or Automatic mode), ORION analyzes the next sky image (see Fig 1) and repeats the process until all the available images are analyzed. Once it is finished, ORION stores in the chosen file (in this case “capella.npy”) the azimuth and zenith angles of the star and the x and y pixel positions assigned to this star in each sky image. This file contains the information needed for the calculation of the calibration matrices. However, a single star in one night does not usually cover the full range of zenith and azimuth angles. Therefore, it is recommended to perform the calibration with the positions of several stars.

In this example, we repeat the process to obtain the positions of Capella, Altair, Vega and Deneb. Once the four files are generated with the Default mode, they are used to obtain the azimuth and zenith calibration matrices of the camera. ORION calculates the image center and the relationship between the zenith angle and the radial distance, as explained in Section 2.3, from the previous files (star positions and coordinates) chosen by the user. The degree of the polynomial fit between zenith angle and radial distance can be chosen; it is set to 2 in our example (Fig 4), since the 1st degree option (linear fit) does not fit the data at high zenith angle values (see S1 Fig). In this case, the obtained center of the image is (1000.148, 1000.341) and the offset with respect to North is -5.3°. It can be seen in Fig 4 that there are no data for zenith angles between 40 and 70 degrees. It is recommended to cover the maximum range of zenith angles in the calibration; therefore, in this example we need to look for more star positions to fill the zenith angle gap. For this purpose, two additional stars are selected: Arcturus and Alphecca.

Fig 4. Relationship between zenith angle and radial distance with polynomial adjustment of degree 2.

https://doi.org/10.1371/journal.pone.0265959.g004

The pixel positions of Arcturus and Alphecca are retrieved using the Automatic mode with the Custom option. This option should be more accurate than the Default one. This mode is equal to the Default one but uses pre-calculated azimuth and zenith matrices (their path is introduced as an input, see Fig 5) and a smaller ROI (a square box with a side 100 times smaller than the maximum dimension of the analyzed sky image; see Section 2.2) for finding the brightest pixel. Fig 5 presents the calculation of the pixel and star positions for Alphecca. In this case, some pixels have not been correctly identified by ORION (the star was confused with a hot pixel), and those are marked with red circles. The positions of these pixels and the coordinates of the star are not stored in these cases.

Fig 5. Selected (green dots) and discarded (red dots) points in the image sequence for the Alphecca star.

https://doi.org/10.1371/journal.pone.0265959.g005

In this example, the calibration matrices are recalculated adding the information of these two new stars. Fig 6a shows the azimuth calibration matrix and Fig 6b the zenith matrix. As a final result, the pixel corresponding to the sky zenith is found to be very close to the geometric center of the image, and the shift from North is -5.23°. Fig 6c shows that the fit between zenith angles and radial distances is now performed over the full range of zenith angles, and the second-order polynomial behaviour can be better observed. In addition, zenith angles from 2° to 82° have been covered.

Fig 6. Calibration result: a) azimuth matrix, b) zenith matrix, c) relationship between zenith angle and radial distance with polynomial adjustment of degree 2.

https://doi.org/10.1371/journal.pone.0265959.g006

3.2 Calibration check

Once the final calibration matrices have been calculated, their performance for the detection of stars in a sky image can be evaluated with the “Check calibration” functionality. For this purpose, one star must be chosen, for example Alioth, which has not been used in the calibration process, as shown in Fig 7. ORION analyzes each single image of the set in the image path. The calibration matrices indicate that the center of the chosen star should be at a certain pixel (marked with a red circle in Fig 7). ORION also looks for the brightest pixel in a box centered on that pixel; this box has the size of the box used in the Default option of the Automatic mode (see Section 2.2). The brightest pixel (marked with a green circle; see Fig 7) is taken as the real position of the star center in the analyzed sky image (a sketch of this comparison is given after Fig 7).

Fig 7. ORION screenshot of the calibration check feature using the star Alioth.

https://doi.org/10.1371/journal.pone.0265959.g007
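
A sketch of this per-image comparison is shown below; it reuses the closest_pixel helper sketched in Section 2.2, and the 5.4 arcmin/pixel scale is the average value given in Section 2.1 (the function name and signature are ours):

    import numpy as np

    def check_star(gray, az_matrix, zen_matrix, star_az, star_zen, box, scale_arcmin=5.4):
        """Distance (pixels and arcmin) between the star position predicted by the
        calibration matrices and the brightest pixel found in a box around it."""
        pred_row, pred_col = closest_pixel(zen_matrix, az_matrix, star_zen, star_az)
        half = box // 2
        r0, c0 = max(pred_row - half, 0), max(pred_col - half, 0)
        roi = gray[r0:r0 + box, c0:c0 + box]
        dr, dc = np.unravel_index(np.argmax(roi), roi.shape)
        dist_px = float(np.hypot(r0 + dr - pred_row, c0 + dc - pred_col))
        return dist_px, dist_px * scale_arcmin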

The software calculates the real star position (brightest pixel) and the one predicted by the calibration matrices for all images, and then uses both positions to quantify the agreement between the predicted and real star positions. Five panels with different analyses are provided for this verification (see Fig 8).

Fig 8.

Performance of the obtained calibration matrices for Alioth star positions: a) azimuth obtained with the calibration for the located star position (brightest point) as a function of the real star azimuth; b) zenith obtained with the calibration for the located star position (brightest point) as a function of the real star zenith; c) absolute pixel distance between the star position given by the calibration matrices and the assumed real center of the star in the image (brightest point in a square box sized from the height or width of the sky image) as a function of the star azimuth angle; d) the same distance as a function of the star zenith angle; e) the same distance for each available image. In panels c) to e), the red dotted line represents the mean absolute distance.

https://doi.org/10.1371/journal.pone.0265959.g008

Fig 8a, which is the default panel shown by ORION (see Fig 7), represents the pixel distance between the predicted and real star positions for each image in the analyzed set. This distance is between 0 and 15 arcmin in the analyzed example, with a mean distance of about 7.25 arcmin (dotted red line). This means that the obtained calibration matrices predicted the center of the star Alioth with an average difference of 7.25 arcmin (about 1.5 pixels) with respect to the real position. These differences are also presented as a function of the star azimuth (Fig 8b) and zenith (Fig 8c) angles. No azimuth or zenith angle dependence is observed in the analyzed example. Finally, ORION shows the azimuth (Fig 8d) and zenith (Fig 8e) angles, given by the calibration matrices, of the real star positions in the image (brightest pixel) as a function of the real star angles for all analyzed sky images. As can be observed in the example, the angles assigned by the calibration matrices to the star position in the image (brightest pixel) are highly correlated with the real coordinates of the star.

This calibration check has also been carried out for 9 additional stars (see Table 1). The mean distance for each star ranges from 4 to 12 arcmin. The best results are obtained for Kochab, with a mean of 4.50 arcmin and a standard deviation of 3.24 arcmin. The highest values are observed for Dubhe, Mirach and Sirrah, and correspond to the presence of hot pixels close to these stars, which prevents their correct identification. In general, the mean accuracy for the 10 stars is about 9.0 arcmin (1.7 pixels) and the mean precision, given by the standard deviation, is about 7.5 arcmin (1.4 pixels).

4 Conclusions

This paper presents ORION, a new software application that provides the geometrical calibration of all-sky cameras using a set of sky images captured at night-time under cloudless conditions. An example of use has shown the capability of this application to obtain the azimuth and zenith angles viewed by each pixel of the camera. The accuracy of the calibration depends on the chosen stars and the sky positions covered by them; this accuracy can also be checked with ORION itself. A simple calibration was able to estimate the star positions with an average accuracy of about 9.0 arcmin (1.7 pixels) in the provided example, using a camera with a 5.4 arcmin/pixel resolution. The average precision (standard deviation) in this case is about 7.5 arcmin (1.4 pixels). We encourage other users and researchers to use the ORION application for the easy geometrical calibration of all-sky cameras, which will be helpful to locate any body (stars, planets, Sun, Moon, among others) in their sky images if the sky coordinates of that body are known. Moreover, ORION includes other features, such as exporting data in different formats or calculating the field of view of each pixel, which are not detailed in this paper but can be useful for ORION users.

Supporting information

S1 Fig. Relationship between zenith angle and radial distance with polynomial adjustment of degree 1.

https://doi.org/10.1371/journal.pone.0265959.s001

(TIFF)

Acknowledgments

The authors gratefully thank AERONET for the aerosol products used. The authors also thank the GOA-UVa staff members Rogelio Carracedo, Daniel González-Fernández, Sara Herrero and Patricia Martín, who helped with the operation and maintenance of the camera.

References

  1. McGuffie K, Henderson-Sellers A. Almost a century of “imaging” clouds over the whole-sky dome. Bulletin of the American Meteorological Society. 1989;70(10):1243–1253.
  2. Tapakis R, Charalambides AG. Equipment and methodologies for cloud detection and classification: A review. Solar Energy. 2013;95:392–430.
  3. Cazorla A, Olmo F, Alados-Arboledas L. Development of a sky imager for cloud cover assessment. JOSA A. 2008;25(1):29–39. pmid:18157209
  4. Wacker S, Groebner J, Zysset C, Diener L, Tzoumanikas P, Kazantzidis A, et al. Cloud observations in Switzerland using hemispherical sky cameras. Journal of Geophysical Research: Atmospheres. 2015;120(2):695–707.
  5. Calbo J, Sabburg J. Feature extraction from whole-sky ground-based images for cloud-type recognition. Journal of Atmospheric and Oceanic Technology. 2008;25(1):3–14.
  6. Long CN, Sabburg JM, Calbó J, Pagès D. Retrieving cloud characteristics from ground-based daytime color all-sky images. Journal of Atmospheric and Oceanic Technology. 2006;23(5):633–652.
  7. Ghonima MS, Urquhart B, Chow CW, Shields JE, Cazorla A, Kleissl J. A method for cloud detection and opacity classification based on ground based sky imagery. Atmospheric Measurement Techniques. 2012;5(11):2881–2892.
  8. Yabuki M, Shiobara M, Nishinaka K, Kuji M. Development of a cloud detection method from whole-sky color images. Polar Science. 2014;8(4):315–326.
  9. Liu S, Zhang L, Zhang Z, Wang C, Xiao B. Automatic cloud detection for all-sky images using superpixel segmentation. IEEE Geoscience and Remote Sensing Letters. 2014;12(2):354–358.
  10. Koehler T, Johnson R, Shields J. Status of the whole sky imager database. Proc Cloud Impacts on DOD Operations and Systems, El Segundo, CA, USA, Department of Defense. 1991; p. 77–80.
  11. Janeiro FM, Carretas F, Kandler K, Ramos PM, Wagner F. Automated cloud base height and wind speed measurement using consumer digital cameras. In: Proc. IMEKO World Congress; 2012.
  12. Savoy FM, Dev S, Lee YH, Winkler S. Stereoscopic cloud base reconstruction using high-resolution whole sky imagers. In: 2017 IEEE International Conference on Image Processing (ICIP). IEEE; 2017. p. 141–145.
  13. Wang Q, Liang J, Hu ZJ, Hu HH, Zhao H, Hu HQ, et al. Spatial texture based automatic classification of dayside aurora in all-sky images. Journal of Atmospheric and Solar-Terrestrial Physics. 2010;72(5):498–508.
  14. Kenyon DA, Watson WT. The All Sky Camera Fireball Detector. In: Society for Astronomical Sciences Annual Symposium. vol. 24; 2005. p. 11.
  15. Trigo-Rodriguez JM, Madiedo JM, Gural PS, Castro-Tirado AJ, Llorca J, Fabregat J, et al. Determination of meteoroid orbits and spatial fluxes by using high-resolution all-sky CCD cameras. In: Advances in Meteoroid and Meteor Science. Springer; 2008. p. 231–240.
  16. Zibordi G, Voss KJ. Geometrical and spectral distribution of sky radiance: comparison between simulations and field measurements. Remote Sensing of Environment. 1989;27(3):343–358.
  17. Román R, Antón M, Cazorla A, de Miguel A, Olmo FJ, Bilbao J, et al. Calibration of an all-sky camera for obtaining sky radiance at three wavelengths. Atmospheric Measurement Techniques. 2012;5(8):2013–2024.
  18. Antuña Sánchez JC, Román R, Cachorro VE, Toledano C, López C, González R, et al. Relative sky radiance from multi-exposure all-sky camera images. Atmospheric Measurement Techniques. 2021;14(3):2201–2217.
  19. Calbó J, Pages D, González JA. Empirical studies of cloud effects on UV radiation: A review. Reviews of Geophysics. 2005;43(2).
  20. Antón M, Gil J, Cazorla A, Fernández-Gálvez J, Foyo-Moreno I, Olmo F, et al. Short-term variability of experimental ultraviolet and total solar irradiance in Southeastern Spain. Atmospheric Environment. 2011;45(28):4815–4821.
  21. Kreuter A, Emde C, Blumthaler M. Measuring the influence of aerosols and albedo on sky polarization. Atmospheric Research. 2010;98(2):363–367. pmid:24068851
  22. Zhang W, Cao Y, Zhang X, Yang Y, Ning Y. Angle of sky light polarization derived from digital images of the sky under various conditions. Appl Opt. 2017;56(3):587–595. pmid:28157914
  23. Cazorla A, Olmo FJ, Alados-Arboledas L. Using a Sky Imager for aerosol characterization. Atmospheric Environment. 2008;42(11):2739–2745.
  24. Kreuter A, Blumthaler M. Feasibility of polarized all-sky imaging for aerosol characterization. Atmospheric Measurement Techniques. 2013;6(7):1845–1854.
  25. Román R, Antuña Sánchez JC, Cachorro VE, Toledano C, Torres B, Mateos D, et al. Retrieval of aerosol properties using relative radiance measurements from an all-sky camera. Atmospheric Measurement Techniques. 2022;15(2):407–433.
  26. Martínez-Chico M, Batlles F, Bosch J. Cloud classification in a mediterranean location using radiation data and sky images. Energy. 2011;36(7):4055–4062.
  27. Román R, Cazorla A, Toledano C, Olmo FJ, Cachorro VE, de Frutos A, et al. Cloud cover detection combining high dynamic range sky images and ceilometer measurements. Atmospheric Research. 2017;196:224–236.
  28. Román R, Torres B, Fuertes D, Cachorro VE, Dubovik O, Toledano C, et al. Remote sensing of lunar aureole with a sky camera: Adding information in the nocturnal retrieval of aerosol properties with GRASP code. Remote Sensing of Environment. 2017;196:238–252.
  29. Alonso-Montesinos J, Batlles FJ, Portillo C. Solar irradiance forecasting at one-minute intervals for different sky conditions using sky camera images. Energy Conversion and Management. 2015;105:1166–1177.
  30. Kazantzidis A, Tzoumanikas P, Blanc P, Massip P, Wilbert S, Ramirez-Santigosa L. Short-term forecasting based on all-sky cameras. In: Renewable energy forecasting. Elsevier; 2017. p. 153–178.
  31. Kataoka R, Miyoshi Y, Shigematsu K, Hampton D, Mori Y, Kubo T, et al. Stereoscopic determination of all-sky altitude map of aurora using two ground-based Nikon DSLR cameras. Annales Geophysicae. 2013;31(9):1543–1548.
  32. Nguyen DA, Kleissl J. Stereographic methods for cloud base height determination using two sky imagers. Solar Energy. 2014;107:495–509.
  33. Scaramuzza D, Martinelli A, Siegwart R. A toolbox for easily calibrating omnidirectional cameras. In: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE; 2006. p. 5695–5701.
  34. Crispel P, Roberts G. All-sky photogrammetry techniques to georeference a cloud field. Atmospheric Measurement Techniques. 2018;11(1):593–609.
  35. Lalonde JF, Narasimhan SG, Efros AA. What do the sun and the sky tell us about the camera? International Journal of Computer Vision. 2010;88(1):24–51.
  36. Urquhart B, Kurtz B, Kleissl J. Sky camera geometric calibration using solar observations. Atmospheric Measurement Techniques. 2016;9(9):4279–4294.
  37. Mori Y, Yamashita A, Tanaka M, Kataoka R, Miyoshi Y, Kaneko T, et al. Calibration of fish-eye stereo camera for aurora observation. In: Proceedings of the International Workshop on Advanced Image Technology (IWAIT2013); 2013. p. 729–734.
  38. Barghini D, Gardiol D, Carbognani A, Mancuso S. Astrometric calibration for all-sky cameras revisited. Astronomy & Astrophysics. 2019;626:A105.
  39. The Qt Company. Qt for developers by developers | Cross-platform development. Available from: https://www.qt.io/developers.
  40. Bennouna YS, Cachorro VE, Torres B, Toledano C, Berjón A, de Frutos AM, et al. Atmospheric turbidity determined by the annual cycle of the aerosol optical depth over north-center Spain from ground (AERONET) and satellite (MODIS). Atmospheric Environment. 2013;67:352–364.
  41. Román R, Bilbao J, de Miguel A. Uncertainty and variability in satellite-based water vapor column, aerosol optical depth and Angström exponent, and its effect on radiative transfer simulations in the Iberian Peninsula. Atmospheric Environment. 2014;89:556–569.
  42. Cachorro VE, Burgos MA, Mateos D, Toledano C, Bennouna Y, Torres B, et al. Inventory of African desert dust events in the north-central Iberian Peninsula in 2003–2014 based on sun-photometer–AERONET and particulate-mass–EMEP data. Atmospheric Chemistry and Physics. 2016;16(13):8227–8248.
  43. Brummelen GV. Heavenly Mathematics: The Forgotten Art of Spherical Trigonometry. Princeton University Press; 2013.
  44. OpenCV. OpenCV: Operations on arrays. Available from: https://docs.opencv.org/3.4/d2/de8/group__core__array.html#gab473bf2eb6d14ff97e89b355dac20707.
  45. Zotti G, Hoffmann SM, Wolf A, Chéreau F, Chéreau G. The Simulated Sky: Stellarium for Cultural Astronomy Research. Journal of Skyscape Archaeology. 2020;6(2):221–258.
  46. Rhodes BC. PyEphem: astronomical ephemeris for Python. Astrophysics Source Code Library. 2011; p. ascl–1112.