ORIENTATION AND CALIBRATION REQUIREMENTS FOR HYPERSPECTRAL IMAGING USING UAVs: A CASE STUDY

Flexible tools for photogrammetry and remote sensing using unmanned airborne vehicles (UAVs) have been attractive topics of research and development. The lightweight hyperspectral camera based on a Fabry-Pérot interferometer (FPI) is one of the most interesting tools for UAV-based remote sensing in environmental and agricultural applications. The camera used in this study acquires images at different wavelengths by changing the FPI air gap and using two CMOS sensors. Owing to this acquisition principle, the interior orientation parameters (IOPs) can vary for each spectral band and sensor, and changing the band configuration also changes these sets of parameters, posing an operational problem when several band configurations are used. The objective of this study is to assess the impact of using IOPs estimated for some bands of one configuration on other bands of a different configuration of the FPI camera, considering different IOP and EOP constraints. The experiments were performed with two FPI hyperspectral camera data sets: the first was collected in a 3D terrestrial close-range calibration field and the second onboard a UAV over a parking area in the interior of São Paulo State.


INTRODUCTION
Environmental mapping and monitoring of forest areas have been greatly facilitated by the use of Unmanned Aerial Vehicles (UAVs) carrying imaging sensors, such as hyperspectral cameras. Higher temporal and spatial resolution can be achieved with UAVs, making feasible several detailed analyses that are more complex to achieve with existing remote sensing sensors. Recently, hyperspectral cameras have been adapted for UAV platforms; these cameras are lightweight and some of them acquire frame-format images (Saari et al., 2009; Honkavaara et al., 2013; Aasen et al., 2015; Näsi et al., 2015). Existing pushbroom hyperspectral cameras require high-grade Inertial Navigation Systems (INS) to provide instantaneous position and attitude, and such systems are expensive and sometimes heavy. Frame-format hyperspectral cameras are attractive alternatives because bundle adjustment can be used to compute platform attitude and position, relaxing the need for a high-grade IMU; stereo reconstruction is feasible; and multidirectional reflectance features can be measured (Honkavaara et al., 2013).
One example of a frame-based hyperspectral camera is the Rikola camera (Rikola Ltd., 2015), which acquires a sequence of images in different spectral bands with frame geometry. This camera uses a technology based on a tuneable Fabry-Pérot Interferometer (FPI), which is placed into the lens system and transmits spectral bands as a function of the interferometer air gap (Saari et al., 2009). By cycling through several FPI air gap values, a set of wavelengths is acquired, enabling capture of hyperspectral frame-format imagery at the desired spectral bands.
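The gap-to-wavelength relation can be illustrated with an idealized etalon model. This is only a sketch under simplifying assumptions, not the camera's actual optical model (which also involves mirror phase shifts, finite finesse and order-sorting filters on the sensors); the function name `fpi_wavelengths` is hypothetical:

```python
def fpi_wavelengths(gap_nm, lo_nm=500.0, hi_nm=900.0):
    """Wavelengths passed by an ideal air-gap Fabry-Perot etalon.

    At normal incidence, constructive interference occurs when
    m * lambda = 2 * d for integer order m = 1, 2, ..., so a single
    gap d transmits a comb of wavelengths; only those inside the
    sensor's spectral range [lo_nm, hi_nm] are returned.
    """
    waves = []
    m = 1
    while True:
        lam = 2.0 * gap_nm / m
        if lam < lo_nm:          # higher orders only get shorter; stop
            break
        if lam <= hi_nm:
            waves.append(round(lam, 2))
        m += 1
    return waves

# A 900 nm gap transmits orders m=2 (900 nm) and m=3 (600 nm)
# within the 500-900 nm range of the camera.
```

This illustrates why a single exposure with one gap can excite more than one order, and hence why the two sensors are restricted to separate spectral ranges.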
Estimating the Exterior Orientation Parameters (EOPs), which describe platform position and attitude, is fundamental when accurate 3D information is required, as is increasingly common in environmental applications. EOPs can be determined directly, using an INS; indirectly, by Bundle Block Adjustment (BBA); or by Integrated Sensor Orientation (ISO), which is a combined strategy. In any case, rigorous sensor modelling is required. The most common sensor model in Photogrammetry uses a set of Interior Orientation Parameters (IOPs) to recover the optical path within the sensor. The IOPs must be accurately determined by camera calibration, which can be done with laboratory or field techniques. However, it cannot be ensured that IOPs determined by laboratory or terrestrial techniques remain stable, because of the camera's operational characteristics and the harsh environmental conditions, with vibrations, impacts and temperature variations. If the IOPs vary, the results of BBA or ISO are likely to be affected. For this reason, it is relevant to assess the behaviour of these new cameras in this combined process (terrestrial calibration followed by aerial survey and BBA). Also, since this camera is reconfigurable, the set of bands can be changed depending on the reflectance properties of the objects to be sensed. This poses an additional and critical problem, since it is unfeasible to generate IOPs for all possible band configurations that could be tested and used in a survey.
The main aim of this paper is to present an experimental assessment of the bundle block adjustment (BBA) and on-the-job calibration (OJC) performed with different spectral bands of the Rikola hyperspectral frame camera, using IOPs estimated by close-range terrestrial calibration. In both cases, different sets of IOP and EOP constraints were considered and assessed. This study can drive future efforts to set requirements for the calibration and orientation of cameras with similar features. The assessment of these different configurations is relevant to the future use of the UAV-FPI camera in forest areas, where limited access to the interior of the area restricts the collection of control points.

Hyperspectral cameras
Hyperspectral cameras can grab up to hundreds of spectral bands and, from these data, the reflectance spectrum of each image pixel can be reconstructed. Traditionally, these cameras have been based on pushbroom or whiskbroom scanning (Schowengerdt, 2007).
Alternatives are hyperspectral cameras that grab two-dimensional frame-format images based on tuneable filters (Saari et al., 2009; Lapray et al., 2014; Aasen et al., 2015). Some of these cameras acquire the image cube as a time-dependent image sequence (Honkavaara et al., 2013), whilst others grab the entire cube at the same time, but at the cost of low-resolution images (Aasen et al., 2015). Unsynchronised cameras require a post-processing registration step to produce a co-registered hyperspectral image cube. Some of these cameras are of particular interest for applications requiring high temporal, spectral and spatial resolution because they are lightweight and thus can be carried by UAVs. One such lightweight hyperspectral camera, based on a Fabry-Pérot Interferometer (FPI), was developed by the VTT Technical Research Centre of Finland (Saari et al., 2009). Rikola Ltd. (2015) manufactures cameras based on this technology for the commercial market, and one of these cameras was used in this work. A summary of technical specifications is given in Table 1, and the camera and its components are shown in Fig. 1.a. The camera has several internal components, of which the most important is the FPI: two partially reflective parallel plates with a variable separation (air gap) controlled by piezoelectric actuators (Saari et al., 2009). The wavelengths transmitted through the interferometer depend on the FPI gap (Mäkynen et al., 2011).

Table 1. Specifications of the Rikola camera, model 2014.

Figure 1.a shows the 2014 model with its external components: an irradiance sensor and a GPS receiver. The ray bundle passes through the primary lens and other internal optical components, including the FPI, and is redirected to two CMOS sensors by a beam-splitting device (Fig. 1.b). Sensor 1 receives energy from the near infrared (650-900 nm) and Sensor 2 is optimised to record bands in the visible part of the spectrum (500-636 nm). The spectral bands and their range limits can be selected by the user, depending on the application. Due to these features, it is unfeasible to define a single set of IOPs for this camera, and alternatives have to be derived. Deriving a set of IOPs for each image band is troublesome because the configuration can change depending on the application. On the other hand, a single set of IOPs for each sensor, or for both sensors, is also unsuitable to cope with the sensor misalignments and eventual changes in the IOPs due to internal changes in the ray paths. Depending on the required accuracy, some reference bands can be chosen and their IOPs, generated by calibration, used for the remaining bands.

Bundle Adjustment and camera calibration
Bundle adjustment is a standard photogrammetric technique for the determination of sensor orientation that can also be used to estimate the parameters of the internal sensor model (Brown, 1971; Clarke and Fryer, 1998; Remondino and Fraser, 2006). The camera IOPs are usually the principal distance, the principal point coordinates, the lens distortion coefficients and the affinity coefficients (Brown, 1971; Habib and Morgan, 2002).
The well-known collinearity equations are used as the main mathematical model:

x = x_0 − c [m_11(X − X_0) + m_12(Y − Y_0) + m_13(Z − Z_0)] / [m_31(X − X_0) + m_32(Y − Y_0) + m_33(Z − Z_0)] + δx
y = y_0 − c [m_21(X − X_0) + m_22(Y − Y_0) + m_23(Z − Z_0)] / [m_31(X − X_0) + m_32(Y − Y_0) + m_33(Z − Z_0)] + δy     (1)

in which x and y are the image point coordinates; X, Y and Z are the ground coordinates of the corresponding point; m_ij are the elements of the rotation matrix; X_0, Y_0 and Z_0 are the ground coordinates of the camera's perspective centre (PC); x_0 and y_0 are the coordinates of the principal point; c is the camera's principal distance; and δx and δy are the effects of radial and decentring distortions and affinity.
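As an illustration, a minimal Python sketch of the collinearity projection follows. It ignores the distortion terms δx and δy and assumes one common omega-phi-kappa rotation convention; the paper does not state which convention its software uses, so this is only an assumed example:

```python
import math

def rotation_matrix(omega, phi, kappa):
    # One standard photogrammetric rotation matrix (object-to-image),
    # omega about X, phi about Y, kappa about Z.
    co, so = math.cos(omega), math.sin(omega)
    cp, sp = math.cos(phi), math.sin(phi)
    ck, sk = math.cos(kappa), math.sin(kappa)
    return [
        [cp * ck,  co * sk + so * sp * ck,  so * sk - co * sp * ck],
        [-cp * sk, co * ck - so * sp * sk,  so * ck + co * sp * sk],
        [sp,       -so * cp,                co * cp],
    ]

def collinearity(point, pc, angles, c, x0=0.0, y0=0.0):
    """Project a ground point (X, Y, Z) into image coordinates (x, y),
    following Eq. 1 without the distortion terms."""
    X, Y, Z = point
    X0, Y0, Z0 = pc
    m = rotation_matrix(*angles)
    dX, dY, dZ = X - X0, Y - Y0, Z - Z0
    denom = m[2][0] * dX + m[2][1] * dY + m[2][2] * dZ
    x = x0 - c * (m[0][0] * dX + m[0][1] * dY + m[0][2] * dZ) / denom
    y = y0 - c * (m[1][0] * dX + m[1][1] * dY + m[1][2] * dZ) / denom
    return x, y
```

For a vertical image (zero angles) taken from (0, 0, 100) with c = 9, the ground point (10, 0, 0) projects to (0.9, 0.0), i.e. at the scale c/H = 9/100.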
The IOPs, the EOPs and the ground coordinates of tie points can be estimated simultaneously, using the image coordinates of these points and additional observations, such as the ground coordinates of some points or the coordinates of the PC determined by GNSS. Another technique, known as self-calibration (Kenefick et al., 1972; Merchant, 1979), does not use ground control points (GCPs) but a minimum set of seven constraints to define the object reference frame, avoiding the error propagation of the GCP surveying. The frame can be defined by constraining the six EOPs of one selected image and one or more distances (Kenefick et al., 1972).

METHODOLOGY AND EXPERIMENTS
There are several differences between FPI cameras and conventional digital photogrammetric cameras. In the FPI camera system, the set of bands can be selected according to the target spectral responses and the application. Thus, a formal camera calibration certificate is unfeasible to use in the photogrammetric data processing. In the current study, the objects of interest are trees of a tropical forest, whose reflectance factors can be estimated by field or laboratory measurements.

Selecting bands configurations for forest coverage
The selection of the spectral bands to be used in the Rikola camera was based on measurements performed over tree leaves from one selected area of tropical forest. Leaves were collected in the study area and stored in proper boxes. Radiometric measurements were performed in the laboratory with a spectroradiometer (ASD FieldSpec HandHeld Pro) six hours after the field collection. The species collected were: Pouteria ramiflora; Croton floribundus; Astronium graveolens; Aspidosperma ramiflorum; Handroanthus avellanedae; Hymenaea courbaril; and Eugenia uniflora. The CCRF (Conical-Conical Reflectance Factor) was obtained and the spectral range was limited to 500 nm to 900 nm, corresponding to the Rikola capability. Figure 2 presents the CCRF curves extracted for some leaf samples. It can be observed that changes in the curve intensities occurred mainly in the near-infrared and yellow spectral bands. The spectral features of those species are useful for future hyperspectral classification (Miyoshi, 2016). The spectral band configurations were defined using the main differences in the CCRF curves and the wavelengths of some vegetation indices (REP, SR, NDVI), bounded by some restrictions of the Rikola camera. Two configurations with 25 bands each were defined for testing. Their features (band number, central wavelength and FWHM) are shown in Tables 2 and 3.
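The vegetation indices mentioned above are simple functions of band reflectances. A minimal sketch follows, using the common normalized-difference and simple-ratio definitions and a linear four-point red-edge interpolation; the function names are illustrative and the exact index formulations used in the study are not stated in the text:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index from two band reflectances
    return (nir - red) / (nir + red)

def simple_ratio(nir, red):
    # SR: ratio of near-infrared to red reflectance
    return nir / red

def red_edge_position(r670, r700, r740, r780):
    # Linear four-point REP: the red-edge inflection reflectance is
    # approximated as the mean of the 670 nm and 780 nm reflectances,
    # then located by linear interpolation between 700 nm and 740 nm.
    r_edge = (r670 + r780) / 2.0
    return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)
```

For a typical healthy-leaf spectrum (low red, high NIR reflectance) these indices are high, which is what makes the red and near-infrared wavelengths natural candidates for the band configurations.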
As can be seen, the wavelengths of the two configurations are similar but do not match, which means that the IOPs for each configuration may be different.
In this work, the terrestrial calibration was performed with configuration 2 (Table 2) and the camera was reconfigured to configuration 1 (Table 3) for the flight, simulating scenarios of real projects. The main question is whether the IOPs derived for configuration 2 can be used for configuration 1, and what results can be achieved after some refinement by on-the-job calibration.

Table 3. Spectral configuration 1 used in flight (cfg. 1).

Terrestrial camera calibration
The first step is the estimation of the camera IOPs by self-calibrating bundle adjustment, using images acquired in a 3D terrestrial calibration field composed of coded targets with the ArUco codification (Garrido-Jurado et al., 2014; Tommaselli et al., 2014). Each target is composed of a rectangular external crown and 5 × 5 internal squares arranged in five rows and five columns (see Figure 3). The automatic processing for locating, identifying and accurately measuring the corners of the crown is described in Tommaselli et al. (2014).
This camera model uses two CMOS sensors. To cope with this geometry, the calibration was performed for both sensors using a reference spectral band for each sensor. First, the camera was configured with cfg. 2 (Table 2) and calibrated considering one reference band for each sensor (band 8 for sensor 2 and band 23 for sensor 1). Then, additional calibration trials were performed for two more bands of sensor 1 (bands 15 and 22). Twelve cubes were acquired with different positions and rotations (Table 4). The reference frame for the self-calibration with bundle adjustment was defined by the 3D coordinates of two points and the Z coordinate (depth) of a third point. The first point was set as the origin of the system. The distance between the two points was measured with a precision calliper, and the Y and Z coordinates of the second point were the same as those of the first. The IOPs, the EOPs and the object coordinates of the tie points were simultaneously estimated in the adjustment based only on internal information (seven constraints), except for the distance measured in object space. The self-calibration adjustment was performed with in-house-developed software, Calibration with Multiple Cameras (CMC) (Tommaselli et al., 2013), which uses the unified approach to least squares (Mikhail and Ackermann, 1976, p. 133).
The standard deviations of the image observations were set to 0.5 pixel for the x and y coordinates and the a priori variance factor was set to 1. The full set of ten parameters was used first in the calibration experiments, but the effects of the affine parameters were found to be insignificant and they were removed. The experiments were then performed with eight parameters (c, x_0, y_0, k_1, k_2, k_3, P_1, P_2). In some cases k_3 was not significant, but it was retained to maintain the same configuration. The quality control of the calibration results was based on the analysis of the a posteriori sigma, the estimated standard deviations of the IOPs and the discrepancies in check distances. These distances were measured directly in the field with a calliper, and the corresponding distances were computed from the coordinates estimated in the bundle adjustment. To assess the equivalence of the sets of IOPs estimated for the several image bands, the ZROT (Zero Rotation) method developed by Habib and Morgan (2006) was used. First, a regular grid is defined in image space. The two sets of IOPs to be compared are used to refine the coordinates of each grid point, thus removing distortions. The differences in principal distance are compensated by projecting the coordinates of the second set onto the image plane of the first set. The RMSE between the two corrected grids is computed and used to assess the similarity of the bundles generated with the two sets of IOPs. If the differences are within the expected standard deviation of the image coordinate measurements, the IOP sets can be considered equivalent (Habib and Morgan, 2006). It can be seen that the RMSEx and RMSEy were smaller than half a pixel for the IOPs generated with images of bands from the same sensor, except for the RMSEx of band 15. However, the IOPs from different sensors presented RMSEs higher than 8 pixels and can be considered different, as was also observed in the analysis of Table 5.
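The grid-comparison idea behind ZROT can be sketched as follows. This is a simplified illustration, not the full Habib and Morgan (2006) formulation: only the principal point, the principal distance and the first radial distortion coefficient k1 are modelled, and the function names are illustrative:

```python
import math

def undistort(x, y, iop):
    # Shift to the principal point and remove first-order radial
    # distortion (a simplification of the full Brown model).
    xc, yc = x - iop["x0"], y - iop["y0"]
    r2 = xc * xc + yc * yc
    return xc * (1.0 - iop["k1"] * r2), yc * (1.0 - iop["k1"] * r2)

def zrot_rmse(iop_a, iop_b, half=5.0, n=11):
    """RMSE (x, y) between a regular grid corrected with IOP set A and
    the same grid corrected with set B, scaled by c_a / c_b to
    compensate the principal-distance difference."""
    scale = iop_a["c"] / iop_b["c"]
    dx2 = dy2 = 0.0
    for i in range(n):
        for j in range(n):
            x = -half + 2.0 * half * i / (n - 1)
            y = -half + 2.0 * half * j / (n - 1)
            xa, ya = undistort(x, y, iop_a)
            xb, yb = undistort(x, y, iop_b)
            dx2 += (xa - xb * scale) ** 2
            dy2 += (ya - yb * scale) ** 2
    return math.sqrt(dx2 / (n * n)), math.sqrt(dy2 / (n * n))
```

Two identical IOP sets give zero RMSE, while a pure principal-point shift appears directly as an RMSE of the same magnitude, which is why sub-pixel RMSE values indicate equivalent parameter sets.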

Validation with images collected from an UAV
A UAV octocopter (Fig. 4.a) was equipped with the hyperspectral camera and a dual-frequency GNSS receiver (Novatel SPAN-IGM) to acquire a set of aerial images over a nearly flat test field (a parking area, see Fig. 4.b). The camera was configured with the spectral set cfg. 1 (Table 3), which is slightly different from cfg. 2 used in the terrestrial calibration, with an integration time of 5 ms. An image block (composed of two flight strips) with a range of approximately 360 m was collected at a flight height of 90 m, producing spectral images with 25 bands and a GSD of 6 cm. Forward overlap was approximately 60% and side overlap varied from 10 to 20%. Nineteen ground points distributed in the block were surveyed with dual-frequency GNSS receivers; six were used as control points and the other thirteen as check points. Two bands of cfg. 1 were selected for this empirical assessment: band 8 (609.79 nm, Sensor 2) and band 23 (786.16 nm, Sensor 1), which do not correspond to the bands with the same IDs in the calibration set, for which cfg. 2 was used.
The GPS time of each cube acquisition event (intervals of 4 s) was grabbed by the Rikola GPS receiver. This time refers to the first band of the cube, and the acquisition times of the remaining bands were estimated from the nominal time differences (22 ms). A spreadsheet (SequenceInfoTool), supplied by Rikola, was used to perform this estimation from the metadata files and the GPS log file stored for each cube. The dual-frequency GNSS receiver of the Novatel SPAN-IGM-S1 grabbed raw data at a frequency of 1 Hz, from which accurate positions were computed. A reference station was set up in the area, and a dual-frequency receiver collected data during the flight. The trajectory was computed with a differential positioning technique in the Inertial Explorer software, achieving standard deviations of 2 cm for the horizontal and vertical components. The position of each image band was interpolated from these data and used as observations in the bundle adjustment with a standard deviation of 20 cm for the XYZ coordinates. Antenna-to-camera offsets were directly measured and used in the trajectory processing. The attitude angles provided by the INS were not used in the experiments.
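The per-band position interpolation described above can be sketched as follows, assuming the trajectory is available as time-sorted (t, X, Y, Z) samples. The 22 ms band offset and 1 Hz trajectory rate match the values in the text; the function names and linear-interpolation choice are assumptions, not the paper's stated implementation:

```python
def interp_position(traj, t):
    # traj: time-sorted list of (time, X, Y, Z); linear interpolation
    # of the position at time t between bracketing samples.
    for (t0, *p0), (t1, *p1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple(a + w * (b - a) for a, b in zip(p0, p1))
    raise ValueError("time outside trajectory span")

def band_positions(traj, cube_time, n_bands=25, dt=0.022):
    # Band k of a cube is exposed k * dt seconds after the cube event
    # logged for the first band (nominal 22 ms spacing).
    return [interp_position(traj, cube_time + k * dt)
            for k in range(n_bands)]
```

With a platform moving at 2 m/s, the 22 ms spacing already shifts each successive band by about 4.4 cm, comparable to the 6 cm GSD, which is why each band needs its own interpolated perspective-centre position.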
Two photogrammetric projects were set up, one for each set of images from bands 8 and 23, to perform the BBA with the Leica Photogrammetry Suite (LPS). The perspective centre coordinates determined by GPS were used as weighted constraints and the attitude angles were treated as unknowns. Tie points were automatically extracted with the LPS software. Several trials were conducted using bundle block adjustment (BBA) to assess the suitability of the previously determined IOPs for indirect image orientation, considering three settings: the first used the calibrated IOPs as fixed values and the EOPs as unknowns; the second used fixed IOPs and weighted constraints on the EOPs; and the third used weighted constraints on both the IOPs and the EOPs to perform on-the-job calibration. Several variations in the weights of the PC coordinates and IOPs were also tested, because there is some uncertainty about the accuracy of the technique used for event synchronisation with two different receivers (single-frequency, used by Rikola, and dual-frequency, used by Novatel). Event logging directly from the Rikola camera to the SPAN-IGM is planned for the near future to provide more accurate event handling.
Only the most significant results are presented in this paper. Tables 8 to 11 present the a posteriori sigma and the RMSE at the check points for the experiments with images of bands 8 and 23. These results were obtained with the LPS Aerial Triangulation module and checked with CMC, which achieved similar values.
The configuration of the constraints on the EOPs and IOPs was the same as presented for band 8 (cfg. in Tables 2 and 3). The results are presented in Table 9.
It can be concluded that, for medium accuracy, the conventional BBA leads to suitable results. For areas with difficult terrestrial access, such as forest areas, direct georeferencing is recommended, and then OJC is required to compensate for these variations in the IOPs.

CONCLUSIONS
The aim of this study was to assess the use of a lightweight hyperspectral camera based on a Fabry-Pérot interferometer (FPI) with photogrammetric techniques. The main concern when using this FPI camera is the determination of the IOPs and their change with different band configurations. Experiments with reference bands of the two sensors were performed with both terrestrial and aerial data.
The configurations were not optimal, mainly regarding overlap. Also, only one flight height was used; it would be better to use two flight heights to minimise correlations. The results have shown that some IOPs have to be estimated on-the-job with bundle adjustment to provide suitable results, especially when using direct georeferencing data. Combining GNSS data is of crucial importance in forest applications because of the difficulty of establishing dense control over the block extent. Further research is needed to assess the stability of the camera's inner orientation and the use of IOP values from reference bands.
(a) Rikola Hyperspectral camera with accessories and (b) diagram showing internal components.
(a) Set of twelve images of band 8 (605.64 nm) used for calibration; (b) example of corners automatically extracted and labelled.
Estimated IOP and a posteriori sigma for image bands of sensors 1 and 2.
(a) UAV with camera and accessories; (b) Parking area used in the experiments (source: Google Earth).

Figure 5. Block configuration used in the experiments.

Table 4. Image bands used for terrestrial calibration and the number of images and points.

The differences observed in Table 5 can be explained by the image quality: images from sensor 2 are more blurred, probably due to the beam-splitting optics. The parameter values are clearly different for images from different sensors. Comparing the results for the bands of sensor 1 (15, 22 and 23), it is clear that they are similar, with differences within the estimated standard deviations.

Table 6. Results obtained for the check distances for each set of band IOPs.
Table 6 presents the results, showing that the RMSE of the check distances was around 4.4 mm for band 8 (Sensor 2) and 4.2 mm for the bands of Sensor 1. These values are expected, since the average pixel footprint is around 4 mm.

Table 7 presents the results of the ZROT method for three pairs of IOP sets: the first pair compares the IOPs generated with images of band 8 (Sensor 2) with those generated for band 23 (Sensor 1), and the other two pairs compare bands of the same sensor (bands 15 and 22 of sensor 1).

Table 7. Results of the ZROT method comparing pairs of IOP sets.

Table 8. Results for aerial images of band 8 with IOPs of band 8 from terrestrial calibration: a posteriori sigma and RMSE in the check point coordinates.

Table 9. Results for aerial images of band 23 with IOPs of band 23 from terrestrial calibration: a posteriori sigma and RMSE in the check point coordinates.

The results with images of band 23 were similar to those of band 8, showing that small corrections to the IOPs are required to improve the results. The experiment was repeated, changing the initial IOPs to those obtained for bands 22 and 15 in the terrestrial calibrations. The results, presented in Tables 10 and 11, are similar.

Table 11. Results for aerial images of band 23 with IOPs of band 15 from terrestrial calibration: a posteriori sigma and RMSE in the check point coordinates.

A further analysis was performed comparing the IOPs (c, x_0, y_0) estimated by on-the-job calibration with aerial images of band 23, using initial values provided by the terrestrial calibration with images from bands 23, 22 and 15. The estimated values are presented in Table 12. It can be seen that the IOPs estimated on-the-job have similar values regardless of the initial values used, and that these values differ from those obtained in the terrestrial calibration, especially the coordinates of the principal point. This can be explained by mechanical instability, or by a delay in the event logging that is absorbed by the estimated coordinates of the principal point.

Table 12. IOPs estimated in terrestrial and on-the-job calibration (OJC) for band 23.