LIDAR ARCHITECTURE FOR HARSH ENVIRONMENT APPLICATIONS

An overview is provided of the obscurant-penetrating OPAL lidar sensor developed for harsh environments, including poor-visibility conditions. The underlying technology and the hardware and software architecture of the sensor are presented, along with examples of applications of its software modules. The paper also discusses the performance of the OPAL in the presence of various types of obscurants.


INTRODUCTION
Automation is part of a rapid technological evolution in many industries. Its benefits include increased productivity, improved workplace safety, reduced maintenance requirements, and the ability to manage potential labour shortfalls [1], [2]. Automation in heavy machinery industries, such as mining and construction, will require vision sensors capable of operating in harsh environments and seeing through obscurants. Neptec Technologies Corp. has developed the OPAL lidar product line to fulfill this requirement.

OPAL Lidar Architecture
The OPAL lidar sensors are designed as compact, rugged, multi-purpose 3D sensors, built specifically for harsh environments and real-time applications. OPAL overcomes two major limitations of traditional lidar operating in harsh environments: object perception in the presence of obscurants, and the ability to withstand severe environmental conditions in the field.
The lidar sensor is environmentally sealed (IP67) with no external moving parts, has an operating temperature range of -40°C to +65°C with no fans or heaters (for improved reliability), and is ruggedized to withstand significant vibration and shock levels while operating. The OPAL comes in two distinct scanning field-of-view (FOV) designs: a panoramic 360° FOV and a conical 60° or 90° FOV.

Lidar Hardware
Figure 1 illustrates an OPAL-120 with the conical FOV along with its main hardware architecture. The lidar operates at a wavelength of 1540 nm and achieves a Pulse Repetition Frequency (PRF) of up to 200 kHz.
The OPAL lidars produce scan patterns using Risley prisms under independent motor control. Figure 2 shows an example of increasing pattern density coverage for scanning periods ranging from 100 ms to one second. The quality of the pattern coverage depends strongly on the selection of the motor speeds: a given motor speed ratio uniquely defines a particular scan pattern. Integer speed ratios typically cause the patterns to repeat at the same locations from one scan to the next. Patterns that incorporate phase variations between scans provide more uniform coverage, since repetitious patterns are minimized. This is shown quantitatively in Figure 3, where a Figure of Merit (FOM) is developed to measure the uniformity of a specific scan pattern; the larger the FOM, the less uniform the pattern. The graph shows that scan patterns corresponding to integer motor speed ratios do not provide good coverage uniformity.

3DRi Software
3DRi (3D Real-time intelligence) is a collection of highly efficient algorithms and software used to extract actionable information from 3D sensor data in real-time and from moving platforms. Combined with the OPAL sensors, this technology gives an autonomous machine the ability to perceive its environment, dynamically respond to changes, and identify and track objects in its field of view in real-time while in motion. 3DRi makes it easy to develop, integrate and support intelligent 3D machine vision systems for machine automation and other industrial applications. The 3DRi modular "plug-ins" are shown in Figure 7 and can be selected based upon the application requirements. An API is typically customized to interface with a specific application target, such as a vehicle or heavy machinery that requires control input.

Some examples of 3DRi processing are shown below. Figure 8 shows raw lidar data of a landing zone where dust has just been stirred up from the soil by an approaching helicopter. Figure 9 shows how the 'Obscurant' plug-in filters out the dust to reveal objects hidden within it. Finally, Figure 10 shows how the 'Segment' plug-in separates out the ground plane to leave only the above-ground objects that represent a potential hazard to the helicopter.
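As a rough illustration of the ground-separation step performed by the 'Segment' plug-in, the sketch below splits a toy point cloud by height above an estimated ground level. This is a deliberately simple stand-in; the actual 3DRi segmentation algorithm is not described in this paper.

```python
def segment_ground(points, clearance=0.3):
    """Split a point cloud into (ground, above_ground) using a height
    threshold above the estimated ground level. Ground level is taken as
    a low percentile of z, which is robust to above-ground objects."""
    zs = sorted(p[2] for p in points)
    ground_z = zs[len(zs) // 20]          # 5th percentile as the ground estimate
    ground = [p for p in points if p[2] <= ground_z + clearance]
    above = [p for p in points if p[2] > ground_z + clearance]
    return ground, above

# Toy landing-zone cloud: flat ground near z = 0 m plus a 2 m obstacle.
cloud = [(x * 0.5, y * 0.5, 0.05) for x in range(10) for y in range(10)]
cloud += [(2.0, 2.0, h * 0.5) for h in range(1, 5)]   # the obstacle
ground, hazards = segment_ground(cloud)
```

On this toy cloud, the 100 ground points are separated from the 4 obstacle points, mirroring how the plug-in leaves only objects that are potential hazards.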

Obscurant Penetration Results
It is important for a lidar sensor operating in harsh environments to be capable of penetrating obscurants to some degree. Dust and smoke are often present in mining operations. Airborne platforms often face rain, fog and snow. Landing helicopters can also generate severe brownouts and whiteouts when landing in unprepared areas.
A series of evaluations of the OPAL was conducted using the aerosol chamber facility at Defence Research and Development Canada (DRDC) Valcartier in Quebec, Canada [3]. The chamber has a length of 22 m. The optical depth (defined as -ln(T), T being the transmittance) of the aerosol clouds generated in the chamber ranges from 0.01 to over 10; the range that can be monitored by a transmissometer is limited to optical depths of 0.01 to 4. The aerosol chamber was used to evaluate laser propagation in dust and water fog.
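The optical-depth definition above is just a logarithmic transform of the transmittance; the correspondence between the two quantities can be made concrete with a pair of helper functions (illustrative only):

```python
import math

def optical_depth(transmittance: float) -> float:
    """Optical depth tau = -ln(T), where T is the transmittance."""
    return -math.log(transmittance)

def transmittance(tau: float) -> float:
    """Inverse relation: T = exp(-tau)."""
    return math.exp(-tau)

# The transmissometer's limit of optical depth 4 corresponds to a
# transmittance of just under 2%; an optical depth of 0.01 corresponds
# to a transmittance of about 99%.
print(optical_depth(0.99))   # ~0.01
print(transmittance(4.0))    # ~0.018
```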
Figure 11 illustrates a few examples of measured obscurant penetration ranges compared with the penetration ranges predicted using the single-scattering lidar equation. The mass extinction coefficients were derived from the Mie scattering model for a laser wavelength of 1540 nm, using particle properties such as the expected size and refractive index.
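The prediction described above can be sketched numerically. The sketch below uses only the two-way Beer-Lambert attenuation term of the single-scattering lidar equation; the mass extinction coefficient, cloud concentration and detection threshold are placeholder values for illustration, not the paper's actual parameters.

```python
import math

# Placeholder values -- assumptions for illustration, not the paper's numbers.
ALPHA_MASS = 0.3    # mass extinction coefficient, m^2/g (from a Mie model, assumed)
P_MIN_RATIO = 1e-3  # minimum detectable fraction of the unobscured return (assumed)

def target_return_ratio(cloud_depth_m, concentration_g_m3):
    """Ratio of the target return with the cloud present to the unobscured
    return. The single-scattering lidar equation attenuates the signal by the
    two-way transmittance T^2 = exp(-2 * alpha_mass * C * L); the 1/R^2
    geometric factor cancels for a target at fixed range."""
    tau = ALPHA_MASS * concentration_g_m3 * cloud_depth_m  # one-way optical depth
    return math.exp(-2.0 * tau)

def max_cloud_depth(concentration_g_m3):
    """Cloud thickness at which the attenuated return falls to the detection
    threshold: L = -ln(P_MIN_RATIO) / (2 * alpha_mass * C)."""
    return -math.log(P_MIN_RATIO) / (2.0 * ALPHA_MASS * concentration_g_m3)
```

With these placeholder numbers, a uniform 0.5 g/m³ cloud could be penetrated to a depth of about 23 m before the target return drops below the threshold; a real prediction would also account for the 1/R² term, receiver parameters, and the non-uniform density noted below.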
For the case of dust, a particle size of 20 µm is used in the model, with the sensor at a standoff distance of 43 m from the dust cloud. Experiments were conducted with MIL-810 dust, which has an average particle size of 20 µm. The agreement between the actual and predicted ranges is fairly good; in reality, however, there is a particle size distribution that affects the penetration characteristics, and this is not accounted for by the current model. In addition, the model assumes an obscurant cloud of constant density, which is rarely the case in reality.
For dense water fog, a particle size of 10 µm is used in the model, with the sensor at a standoff distance of 46 m from the fog cloud. The agreement between the model and the field measurements is good; however, the water fog also has a particle size distribution that affects the penetration characteristics, and this too is not accounted for by the current model.
In the case of snow as an obscurant, the data was gathered using a helicopter hovering over a fresh snow field to create a whiteout scenario [4]. Independent measurements of the snow generation rate were not available. For the model, a snowfall rate of 230 mm/hr was used, corresponding to an extremely heavy snowfall as observed by weather services. There is a significant discrepancy between the model and the measurements, because a true whiteout model is unavailable.
Finally, the measured detection range in clear conditions for wires with a half-inch (½") diameter is shown, in close agreement with the model.

CONCLUSIONS
The hardware and software architectures of the OPAL lidar sensor have been described. The sensor architecture is well optimized for harsh and poor-visibility environments. The OPAL lidar is currently employed successfully to acquire 3D vision data for machinery control in environments such as open-pit mining, heavy construction, ship loading, and military helicopter landing in unprepared areas. The obscurant penetration performance of the sensor is in reasonable agreement with predictions from a single-scattering model, even though the detected light returns include multiple scattering events and, in real situations, the obscurants' density and particle sizes are not uniform.
Further tests with obscurants in controlled environments are planned to better characterize the sensor resolution and accuracy as a function of the obscurant type and density.

Figure 3: Figure of Merit (FOM) of the scan uniformity versus the motor speed ratio.

OPAL sensors incorporate Neptec's patented real-time obscurant-penetrating lidar technology. Originally developed for helicopters landing in desert brownout conditions, OPAL is able to penetrate obscurants such as dust, fog, rain and snow to dynamically image objects engulfed within them. This capability is accomplished, in part, through a unique hardware design in which features of the return waveform are evaluated in real-time without the need for full, processor-intensive digitization. A real-time software filter is also implemented to remove the returns coming from obscurant particles. An example of dust penetration by the OPAL in an open-pit mining area during a shoveling operation is shown in Figures 4, 5 and 6. The bottom image, acquired with OPAL, clearly shows the mine face behind the haul truck, compared to an obscured image captured by conventional lidar.
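The obscurant-return filtering idea can be illustrated with a common multi-return heuristic: a hard target behind a dust cloud usually produces the final return of a pulse, while earlier returns come from obscurant particles. This is a generic sketch of that heuristic, not necessarily Neptec's actual filter.

```python
# Hypothetical last-return obscurant filter (illustrative heuristic only).
def filter_obscurant_returns(pulses):
    """pulses: list of per-pulse return lists, each return a (range_m,
    intensity) tuple ordered by increasing range. Keeps only the last
    (farthest) return of each pulse that produced any return, on the
    assumption that obscurant particles generate the earlier returns."""
    return [returns[-1] for returns in pulses if returns]

# Example: three returns from dust at 12-18 m, one from the mine face at 41 m.
pulse = [(12.3, 0.2), (15.1, 0.1), (18.0, 0.15), (41.2, 0.8)]
print(filter_obscurant_returns([pulse]))  # → [(41.2, 0.8)]
```

A production filter would combine range with waveform features (as the hardware design above does) rather than relying on return order alone, since the last return may itself come from dense obscurant when no hard target is present.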

Figure 4: Mining shovel loading a hauling truck.

Figure 7: 3DRi software architecture.

Figure 8: Landing zone raw lidar data including dust and above-ground objects.

Figure 9: Dust removed, leaving only ground and above-ground objects.

Figure 10: Ground segmented out, leaving only above-ground objects.

Figure 11: Comparison of predicted detection ranges in obscurants with measured detection ranges.