Article

Requirements for Automotive LiDAR Systems

1 Institute of Product Development, Leibniz University Hannover, 30823 Garbsen, Germany
2 Cluster of Excellence PhoenixD (Photonics, Optics, and Engineering-Innovation across Disciplines), Gottfried Wilhelm Leibniz Universität Hannover, Welfengarten 1A, 30167 Hannover, Germany
* Author to whom correspondence should be addressed.
Sensors 2022, 22(19), 7532; https://doi.org/10.3390/s22197532
Submission received: 24 August 2022 / Revised: 23 September 2022 / Accepted: 28 September 2022 / Published: 4 October 2022
(This article belongs to the Section Optical Sensors)

Abstract

Light detection and ranging (LiDAR) systems are fundamental sensors that support driving tasks for autonomous driving at various levels. Commercially available systems follow different specialized design schemes and involve numerous specifications. The literature does not adequately represent the technical requirements for LiDAR systems in the automotive context, such as range, detection quality, resolving power, field of view, and eye safety. For this reason, these requirements need to be derived from ADAS functions. This paper analyzes the requirements for key LiDAR metrics, including detection range, field of view, angular resolution, and laser safety. LiDAR systems are available with various radiation patterns, which significantly impact the detection range. Therefore, the detection range under various radiation patterns is investigated first. Based on ADAS functions, the required detection range and field of view for LiDAR systems are examined, taking into account various travel speeds for collision avoidance and the coverage of the entire lane width. Furthermore, the angular resolution limits are obtained using the KITTI dataset and exemplary 3D detection algorithms. Finally, the maximum detection ranges for the different radiation patterns are compared under consideration of the derived requirements and laser safety.

1. Introduction

Researchers and manufacturers have rapidly developed light detection and ranging (LiDAR) systems to support driving tasks for human drivers and autonomous driving due to their ability to accurately measure distances and detect objects nearly regardless of the lighting conditions [1]. LiDAR systems obtain distance information by measuring the time difference between emitting and receiving optical signals. The systems can be classified as mechanical scanning setups with a collimated beam or flash setups with diverging irradiation [2,3,4,5]. Specifications for some LiDAR metrics, such as detection range, field of view (FOV), and angular resolution, are suggested in many studies [6,7,8,9,10,11,12,13,14]. However, a consistent result cannot be summarized from them. Since the respective solution depends on the boundary conditions and requirements, a detailed consideration of these influencing factors is necessary. For example, LiDAR systems that use collimated beams can achieve a long detection range due to the small energy drop-off from the emitter to the target. However, the collimated beam must be steered at a high pulse repetition rate, which limits the emitted energy with respect to laser safety regulations [15]. This paper evaluates the detection range for various beam radiation patterns, including flash irradiation and various scanning approaches. The criteria for the detection range, FOV, and angular resolution are determined under the consideration of ADAS functions, serving as a benchmark for the specification and design of LiDAR systems.

2. State of the Art

The detection range, FOV, and angular resolution are essential metrics that affect the performance of LiDAR systems. This section introduces the current state of these metrics and the functional principle of LiDAR systems.

2.1. Current Requirements for Automotive LiDAR Systems

So far, the performance requirements for automotive LiDAR have not been unified. The study by Warren [6] presents requirements for automotive LiDAR systems based on discussions with manufacturers, wherein LiDAR systems are categorized as short-range (20 m to 30 m) and long-range (up to 300 m). The angular resolution for short-range LiDAR systems is proposed as 1°, while long-range LiDAR systems require an angular resolution between 0.1° and 0.15° to resolve remote objects in detail. Zhao et al. [9] and Yeong et al. [10] list the current best products of various manufacturers in terms of performance metrics, including detection range, FOV, and angular resolution. The statistics indicate that commercial LiDAR systems have a maximum detection range of up to 300 m for a target reflectivity of 10%. A horizontal FOV of 360° appears only in mechanical LiDAR systems, while other systems cover a horizontal view angle between 60° and 270°. The vertical FOV varies between 4° and 60°, and the finest angular resolution found in commercial products is 0.1° in both directions. In the study by Takashima et al. [13], specifications for LiDAR systems are listed for advanced driver assistance system (ADAS) and robotic car applications based on the sampling rate of current mechanical scanning LiDAR systems. In their study, forward-looking LiDAR systems have to detect objects up to 200 m with a 14° × 3.5° FOV and 0.3° × 0.3° angular resolution, whereas side-looking LiDAR systems require a 30 m range with a 60° × 5.7° FOV and 0.2° × 0.2° angular resolution. Raj et al. state that LiDAR systems for ADAS applications require a minimum detection range between 100 m and 200 m and a horizontal FOV of more than 90° [14]. The studies introduced above indicate that it is difficult to derive a single standard for LiDAR metrics from the literature. Additionally, the particular application scenarios must be taken into account when determining the requirements. For instance, the angular resolution affects the density of the point cloud, which in turn affects the accuracy of object detection via LiDAR systems.

2.2. Functional Principle and Principal Components of LiDAR Systems

LiDAR systems obtain range information based on the time of flight (TOF) of an emitted light signal. Neglecting the distance between the emitter and detector, the distance between a LiDAR system and a target is computed as half the product of the speed of light and the signal's flight time. The TOF refers to the time difference or phase shift between the emitted and received signals, for which the emitted beams are modulated either as pulses or as continuous waves (CW).
In pulsed LiDAR systems, the emitter radiates light pulses with durations of a few nanoseconds [14]. The TOF is directly estimated from the time difference between pulse transmission and reception. This approach allows LiDAR systems to emit a high instantaneous power and thus obtain a long detection range while staying within the limits of eye safety.
The amplitude-modulated continuous wave (AMCW) approach modulates the intensity of a continuous wave at a constant frequency f_A. The object distance R is determined from the phase shift φ between the emitted and received waves and the speed of light c:
R = \frac{c \cdot \varphi}{4 \pi \cdot f_A}    (1)
The AMCW method is often applied in TOF cameras for indoor applications, involving cost-effective LEDs as emitters rather than laser diodes [16]. The available power restriction prompts a low signal-to-noise ratio (SNR) and, consequently, a short detection range [16]. In addition, as the object distance increases, AMCW LiDAR systems cannot receive enough photons to determine the phase difference, which also restricts their applications for long-range detections.
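As a numerical illustration of Equation (1), the following Python sketch computes the AMCW range from a measured phase shift; the 10 MHz modulation frequency and the helper name amcw_range are illustrative assumptions, not values from a specific system.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def amcw_range(phase_shift_rad: float, f_mod_hz: float) -> float:
    """Object distance from the phase shift of an amplitude-modulated CW signal, Eq. (1)."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

# Example with an assumed 10 MHz modulation frequency
f_mod = 10e6
print(amcw_range(math.pi / 2, f_mod))   # ~3.75 m for a 90° phase shift
print(C / (2 * f_mod))                  # unambiguous range of this modulation: ~15 m
```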
Besides the AMCW method, the emitter signal can also be modulated with a periodically shifted frequency in the frequency-modulated continuous wave (FMCW) approach. The received signal is mixed with the emitted wave as a reference, yielding a frequency difference f_F. The object distance is obtained from the bandwidth B and the period T of the frequency modulation:
R = \frac{f_F \cdot c \cdot T}{2 B}    (2)
The wave must interfere with a delayed copy of itself to resolve the frequency difference. Hence, the laser coherence must be maintained over the round trip to the target [17]. Commercially available diode lasers regularly have a coherence length of <1 mm [18]. In contrast, external cavity diode lasers (ECDL) and fiber lasers can be used as emitters to achieve a detection range of over 100 m [18]. However, the external resonators used to stabilize the coherence length increase the size and cost of such LiDAR systems.
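Analogously, a minimal sketch of Equation (2) is given below; the 1 GHz bandwidth and 10 µs sweep period are assumed example values, not parameters of an actual FMCW system.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def fmcw_range(beat_freq_hz: float, bandwidth_hz: float, period_s: float) -> float:
    """Object distance from the beat frequency of an FMCW signal, Eq. (2)."""
    return beat_freq_hz * C * period_s / (2.0 * bandwidth_hz)

# Assumed sweep: 1 GHz bandwidth over 10 µs
B, T = 1e9, 10e-6
print(fmcw_range(beat_freq_hz=6.67e6, bandwidth_hz=B, period_s=T))  # ~10 m
```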
As emitters, diode lasers are preferred in LiDAR systems for automotive applications due to their compact size and low cost. The two most commonly used types are edge-emitting lasers (EEL) and vertical-cavity surface-emitting lasers (VCSEL). Compared with EELs, VCSELs exhibit a smaller spectral shift with temperature (approximately 0.06 nm·K⁻¹ versus 0.25 nm·K⁻¹) [17]. Thus, a narrow bandpass filter can be used to achieve an improved SNR. Besides this, VCSELs can be fabricated as an array so that multiple laser sources can be used in one compact system. In this way, the pulse repetition rate of each laser can be reduced, increasing the detection range, which is limited by the laser safety aspects discussed in Section 4.4.
To detect the reflected photons, either linear-mode avalanche photodiodes (APDs) or single-photon avalanche diodes (SPADs) are used in LiDAR systems. Linear-mode APDs produce a signal amplification with a typical gain of ca. 200, proportional to the incoming light [16]. However, they are not able to detect a single photon. In contrast, SPADs are APDs operating in Geiger mode with a gain greater than 10⁴ once they are triggered, even by a single photon [16]. Hence, they can be used when an extremely high sensitivity for single-photon detection is required. However, one disadvantage of SPADs is the dead time of 400 ns to 1 µs between two adjacent measurements [19]. During this time, the detector is saturated and further photons received by the detector no longer generate an electrical signal. The pulse repetition rate is thus limited. Due to the high sensitivity, the sensor is also easily affected by noise. Therefore, for LiDAR systems using SPAD sensors, one frame usually requires 50 to 100 measurements per pixel to improve the SNR and increase the detection accuracy [20].
As stated above, the detection range is one of the essential metrics for LiDAR systems. AMCW LiDAR systems require a high-power emitter to capture sufficient reflected photons to determine the phase difference. However, the high emitted power can lead to eye hazards with respect to laser safety. For FMCW LiDAR systems, the coherence length of a cost-effective laser source limits the maximum detection range. In pulsed LiDAR systems, short pulse widths and low duty cycles allow a high peak emitted power and thus an increased detection range while limiting the exposure to remain eye-safe. Therefore, pulsed systems are appropriate for automotive applications and are the focus of this paper.

3. Range Equation for Different Radiation Patterns

Different beam shaping methods are available in LiDAR systems, producing various radiation patterns. This section proposes a range equation for LiDAR systems and evaluates the detection range of LiDAR systems with various beam patterns.
According to the usual definition, the detection range of LiDAR systems is the distance at which the reflection of an object with 10% reflectivity and a Lambertian backscatter characteristic can still be detected by a detector pixel [21]. Equation (3) describes the relationship between the emitted power P s and received power P r of LiDAR systems. The main variables in Equation (3) are listed in Table 1.
P_r = \frac{P_s \, T_A \, \eta_t}{A_s} \cdot \Gamma \rho \cdot \frac{T_A}{\pi r^2} \cdot A_R \, \eta_r    (3)
In Equation (3), A_s is the beam spread area of the emitter at the target. T_A and η_t are the atmospheric transmission and the optical efficiency of the emitter. The first term of Equation (3) indicates the optical irradiance (W·m⁻²) incident on the target. In the second term, the target cross-section Γ is the irradiated area of the target and ρ is the reflectance of the target. The product of the first and the second term is equivalent to the total optical power scattered by the target in the detector's direction. In the third term, r is the distance between the LiDAR system and the object. For microwave radar systems that operate at millimeter wavelengths, an omnidirectional (4π) scattering is assumed for small objects. However, for LiDAR systems that operate at shorter wavelengths (near-infrared), the scattering only occurs in the backward direction. Assuming Lambertian scattering, the maximal radiant intensity occurs along the surface normal and amounts to 1/π of the total reflected radiant flux (see Appendix A). The product of the first three terms gives the maximal optical irradiance of the backscatter at the receiver. After multiplying by the optical aperture A_R and the optical efficiency η_r of the detector, the optical power received by the detector is obtained.
In Equation (3), the beam spread area A_s is a function of the distance r and the radiation angles in the horizontal (φ_h) and vertical (φ_v) directions. The cross-section Γ is related to the irradiated area viewed by a single detector pixel, which is a function of the distance r and the opening angle α of a single pixel. Therefore, the ratio of the cross-section Γ to the beam spread area A_s must be investigated for the different radiation patterns with their different radiation angles. Accordingly, Equation (3) can be simplified to:
P_r = P_s \cdot K \cdot \frac{\Gamma}{A_s}    (4)
where the factor K groups the parameters that are independent of the beam pattern and is given by:
K = \frac{T_A^2 \cdot \eta_t \cdot \eta_r \cdot A_R \cdot \rho}{\pi r^2}    (5)
Figure 1 shows three radiation patterns depending on the beam shaping relative to the size of a quadratic-shaped single-pixel image.
In the first case, the laser beam is assumed to be collimated with a circular or elliptical cross-section. Either a single-pixel detector or a 2D array detector can be used in this case. The beam spread area A_s is smaller than the area viewed by a detector pixel (black rectangle in Figure 1a) for all relevant distances. Therefore, the cross-section Γ equals the beam spread area A_s.
In the case of LiDAR systems using blade irradiation, as shown in Figure 1b, the total irradiated area is larger than a pixel image in only one direction. The scanning direction can be horizontal or vertical to cover the entire FOV. In this case, either a 1D linear detector array or a 2D detector array can be used. According to the schematic, the cross-section Γ of a single pixel occupies only a small part of the beam spread area A_s, determined by the ratio between the width of a single pixel image and the width across the total number of pixels. Since α ≪ φ_h, the received energy P_r decreases significantly compared to spot irradiation for a constant emitted optical power P_s and factor K. For vertical blade irradiation, the indices h and v have to be swapped in the corresponding equations of Table 2.
Since the entire FOV is irradiated at once in flash LiDAR systems, only a 2D array detector can be used to generate the required resolution. The cross-section Γ is determined by the size of a pixel image on the irradiated surface (a black unit in Figure 1c), which is much smaller than the beam spread area A_s, leading to a further decrease of the received energy P_r compared to the other two cases. The range equations for all cases are summarized in Table 2. The derivation of the beam spread area A_s and the cross-section Γ for all cases is given in Appendix A.
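To make the impact of the radiation pattern tangible, the following sketch evaluates the geometry factor Γ/A_s from Table 2 and Appendix A for the three patterns. The FOV of 32.6° × 10.2° and the pixel angle of 0.04° anticipate the requirements derived in Section 4.2 and Section 4.3; the function names are our own.

```python
import math

def tan_half(deg: float) -> float:
    return math.tan(math.radians(deg) / 2.0)

def gamma_over_as(pattern: str, alpha_deg: float, fov_h_deg: float, fov_v_deg: float) -> float:
    """Ratio of the single-pixel cross-section to the beam spread area (Table 2 / Appendix A)."""
    if pattern == "spot":              # beam smaller than one pixel image
        return 1.0
    if pattern == "blade_horizontal":  # blade spans the horizontal FOV
        return tan_half(alpha_deg) / tan_half(fov_h_deg)
    if pattern == "flash":             # whole FOV illuminated at once
        return tan_half(alpha_deg) ** 2 / (tan_half(fov_h_deg) * tan_half(fov_v_deg))
    raise ValueError(pattern)

# Assumed values: FOV 32.6° x 10.2°, quadratic pixel angle 0.04°
for p in ("spot", "blade_horizontal", "flash"):
    print(p, gamma_over_as(p, alpha_deg=0.04, fov_h_deg=32.6, fov_v_deg=10.2))
# spot -> 1, horizontal blade -> ~1.2e-3, flash -> ~4.7e-6: the received power drops accordingly
```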
In the discussion above, we focus only on the center pixel and an object directly in front of the LiDAR system. In this case, the maximum reflected radiant flux is received by the pixel when Lambertian backscatter is considered. In addition, it is assumed that a target produces at least one point in a LiDAR frame, i.e., the target is larger than a single pixel image. Special cases, such as a small target at a long distance for which the LiDAR system detects only one point and object detection becomes difficult, are neglected. Among the three equations, the irradiance angles of the emitter (φ_h and φ_v), the angle viewed by a single pixel (α), and the detection range (r) play essential roles in describing the energy flow of LiDAR systems. Moreover, these parameters have significant impacts on the design metrics of FOV, angular resolution, and detection range. Therefore, their requirements are investigated in the following sections.

4. Investigation of the Requirements for LiDAR Systems

Commercially available LiDAR products show a variety of concepts and capabilities. So far, most studies on LiDAR sensors barely discuss the requirements. In contrast, automotive lighting is regulated in detail by standards and regulations [22,23,24,25,26,27,28]. This section investigates the necessary detection range, field of view, and angular resolution of LiDAR systems, considering advanced driver assistance system (ADAS) functions and object detection via LiDAR point clouds.

4.1. Detection Range of the ADAS Applications

ADAS support the driver by assisting in longitudinal and lateral driving tasks, e.g., braking and steering, to increase safety and comfort while driving. Adaptive cruise control (ACC) is a typical ADAS function that adjusts the subject vehicle's speed to ensure a safe distance from a preceding vehicle. Another important ADAS function that focuses on longitudinal driving tasks is the forward vehicle collision warning system (FVCWS), which monitors the speed and distance of the vehicle driving ahead to prevent a collision, or at least reduce its severity. While detecting a non-moving vehicle is optional according to ISO 15623 [29], static obstacles are included in the discussion in this paper.
Since LiDAR systems provide accurate distance information, they are mainly used for ACC and FVCWS applications [30,31]. Because forward detection often requires a more extended range than monitoring the surrounding environment, the following discussions of the requirement are based on the assumption that an individual sensor located in the middle of the vehicle front is used.
The function of ACC is to control the driving speed to adapt to a preceding vehicle [32]. The maximum required detection range d_max of the ACC function is calculated with the maximum driver-selectable set speed v_max and the maximum selectable time gap τ_max by [32]:
d_{\max} = \tau_{\max} \cdot v_{\max}    (6)
The standard ISO 15622 [32] specifies that the minimum selectable time gap for the subsequent control shall be ≥0.8 s, and the system shall provide at least one time gap between 1.5 s and 2.2 s for speeds higher than 8 m·s−1 (30 km·h−1). For example, a detection distance of 85 m is required to have a time gap of 2.2 s at a speed of 38.8 m·s−1 (140 km·h−1).
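A short worked example of Equation (6) with the values quoted in the text (2.2 s time gap, 140 km·h⁻¹); the function name is our own.

```python
def acc_range(time_gap_s: float, speed_kmh: float) -> float:
    """Maximum required ACC detection range, Eq. (6): d_max = tau_max * v_max."""
    return time_gap_s * speed_kmh / 3.6

print(acc_range(2.2, 140))  # ~85.6 m; rounding the speed to 38.8 m/s gives the ~85 m stated above
```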
The function of the FVCWS is to warn the driver when a preceding vehicle appears in the trajectory of the subject vehicle and becomes a potential hazard [29]. The standard ISO 15623 [29] specifies the range requirements regarding scenarios where a warning rises. Among them, three typical scenarios for the FVCWS according to different states of the obstacle vehicle are indicated in Table 3.
In the first scenario, the preceding vehicle is traveling at the same speed as the subject vehicle, where the warning distance is only related to the subject vehicle’s speed v 1 and the driver’s brake reaction time T .
In the second scenario, the preceding vehicle is regarded as a stationary obstacle. The subject vehicle's deceleration a_1 must be taken into account when determining the warning distance.
In the last scenario, the preceding vehicle is decelerating with a relative speed v_rel with respect to the subject vehicle, while the subject vehicle is traveling at v_1. Scenarios 1 and 2 are the two boundary conditions of scenario 3 with v_rel = 0 and v_rel = v_1, respectively.
A longer brake reaction time T and a lower subject vehicle deceleration a_1 increase the required warning distance. ISO 15623 uses the result of the study by Johansson et al. [33] to determine the brake reaction time T. The study indicates that the reaction time of 98% of the tested people is under 1.5 s. Moreover, the minimum subject vehicle deceleration a_1 is specified as 5.3 m·s⁻² in ISO 15623 [29], based on an evaluation of the emergency braking performance of cars on a dry, flat road surface. Different braking capabilities have to be additionally considered for other vehicles such as trucks. The maximum warning distances for the three scenarios are presented in Figure 2.
Although the driving speed on several motorways is not limited in Germany, most countries regulate the maximum driving speed. Typical speed limits are between 90 km·h⁻¹ and 130 km·h⁻¹. According to [34], the maximum limit of the listed countries is 140 km·h⁻¹, chosen as the subject vehicle speed for scenario 3 in Figure 2. Scenario 2 is a boundary condition of scenario 3 when the preceding vehicle decelerates until it stops. Therefore, concerning all scenarios and a subject vehicle speed of 140 km·h⁻¹, a minimum detection range of 200 m is required to avoid a forward collision.
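The following sketch reproduces the order of magnitude of this requirement for the stationary-obstacle case (scenario 2). The stopping-distance form d = v·T + v²/(2a) is our assumption for the corresponding Table 3 formula, with T = 1.5 s and a_1 = 5.3 m·s⁻² as given above.

```python
def warning_distance_stationary(speed_kmh: float,
                                reaction_time_s: float = 1.5,
                                deceleration_ms2: float = 5.3) -> float:
    """Warning distance against a stationary obstacle (scenario 2), assuming the
    usual stopping-distance form d = v*T + v^2 / (2*a); parameter values from Section 4.1."""
    v = speed_kmh / 3.6
    return v * reaction_time_s + v ** 2 / (2.0 * deceleration_ms2)

for speed in (100, 130, 140):
    print(speed, round(warning_distance_stationary(speed), 1))
# 140 km/h -> ~201 m, consistent with the 200 m minimum detection range derived above
```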
Noticeably, overtaking maneuvers on rural roads require an extended detection range to avoid a forward collision when oncoming traffic is considered. For example, German rural roads are classified into four types [35]. Two of them are designed with overtaking lanes, which means the overtaking maneuver can be performed without the influence of oncoming traffic. The other two rural road types have a maximum design speed of 90 km·h⁻¹ and no overtaking lanes. Assuming that both the subject and oncoming vehicles are traveling at 100 km·h⁻¹ (which is the legal maximum and more applicable than the reduced design speed), a warning distance of 180 m is required.

4.2. Field of View Requirements for ADAS

Compared with straight roads, detection in curves requires an increased horizontal FOV. The ACC system shall enable the subject vehicle to follow a preceding vehicle on curves with a minimum radius of 500 m [32]. The geometric relations to cover the complete lane width in a curve are shown in Figure 3a. Equation (7) can be used to determine the half horizontal viewing angle θ to completely cover the width of the own lane up to the apex of the inner curve, assuming that the subject vehicle’s position is in the middle of the lane. A larger horizontal angle is required to capture more distance in a curve.
\theta = \arccos\left(\frac{R - W_L/2}{R}\right)    (7)
In Equation (7), R is the curve radius, and W L is the lane width. The lane width typically varies between 2.75 m and 3.75 m depending on the design speed and the number of parallel lanes [35,36]. Based on the maximum value of 3.75 m, Figure 3b shows the required whole horizontal angle to cover the complete lane width as a function of the curve radius. Accordingly, the minimum horizontal FOV required for the ACC function is approx. ±5° concerning a 500 m curve radius.
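A short evaluation of Equation (7) for the assumed lane width of 3.75 m illustrates these numbers; the function name is our own.

```python
import math

def half_fov_deg(curve_radius_m: float, lane_width_m: float = 3.75) -> float:
    """Half horizontal viewing angle needed to cover the own lane in a curve, Eq. (7)."""
    return math.degrees(math.acos((curve_radius_m - lane_width_m / 2.0) / curve_radius_m))

for radius in (500, 250, 125):
    print(radius, round(half_fov_deg(radius), 2))
# 500 m -> ~4.97° (i.e. ~±5°), 250 m -> ~7.03°, 125 m -> ~9.94° (~20° full angle)
```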
The forward vehicle collision warning system (FVCWS) is classified by its curve radius capability, i.e., the curve radius in the subject vehicle's trajectory within which an obstacle can still be detected. System classes Ⅰ, Ⅱ, and Ⅲ refer to curve radii of >500 m, >250 m, and >125 m, respectively. Detecting a preceding vehicle with a lateral offset of up to 20% of its width must be possible from the minimum distance, which varies between 5 m and 10 m depending on the system class [29]. Figure 4 illustrates these geometric requirements.
The minimum detection height h_1 and the maximum height h from the ground are given in [29] as 0.2 m and 1.1 m, respectively, to determine the vertical FOV for the FVCWS function. Concerning a vehicle width W_v of 2 m [37], the resulting FOVs for the FVCWS function are listed in Table 4.
The results in Table 4 indicate that a smaller curve radius requires a larger FOV in the horizontal and vertical directions to detect potentially dangerous objects in the current lane. Concerning a minimum curve radius of 125 m, the horizontal angle to cover the full lane width must be more than 20° (see Figure 3b), which is smaller than the horizontal FOV required for the Class Ⅲ FVCWS. In summary, ADAS functions that focus on principal longitudinal driving tasks require a FOV of 32.6° × 10.2° for curve radii ≥ 125 m.

4.3. Angular Resolution Requirements

The angular resolution is defined as the minimum angular distance between two objects that can still be resolved by LiDAR systems [31]. According to the description in Section 3, the angular resolution refers to the angle between two adjacent scanning points for scanning LiDAR systems and between two adjacent detector pixels for non-scanning LiDAR systems.
Angular resolutions of 0.1°~0.2° are stated for LiDAR systems [9,10,11,12,13]. Since it directly affects the detection performance, object detection algorithms based on LiDAR point clouds must be considered when determining the required angular resolution.
Table 5 compares the average precision and processing time of commonly used open-source LiDAR-based object detection algorithms. The algorithms are trained and tested on the KITTI validation set [38] via the OpenPCDet toolbox, using an RTX 3080 GPU and an i7-10700K CPU.
The KITTI dataset contains 7481 LiDAR frames (images) for training and 7518 LiDAR frames for testing. Among them, 80,256 objects are labeled in eight categories, e.g., 'Car', 'Truck', 'Pedestrian', and 'Bicyclist'. The statistics of the dataset show that cars, pedestrians, and bicyclists are the predominant categories [38]. Hence, they are the focus of this paper.
The average precision shown in Table 5 is evaluated using the PASCAL metrics [43], resulting from all the 7518 testing LiDAR frames in the KITTI dataset. In object detection tasks, recall is defined as the proportion of detected relevant elements over the total number of the relevant elements. Precision is the proportion of detected relevant elements among all detected elements. The average precision is computed from the precision/recall curve and defined as the mean precision at 40 equally spaced recall levels.
According to the results listed in Table 5, the algorithms differ only slightly in their detection performance for traffic objects. The average precision of all algorithms for identifying cars, pedestrians, and bicyclists has a standard deviation of 0.59%, 3.95%, and 1.77%, respectively. The PointPillars algorithm requires the shortest processing time per image. Under the hardware conditions above, the processing of each image requires 0.027 s with GPU and 0.0286 s with CPU, which leads to a ~35 Hz data processing rate. An automotive sensor has to deal with real-time dynamic information to capture moving objects or obstacles, and a minimum frame rate of 25 Hz has to be considered when selecting the frame rate for an automotive LiDAR [44]. Concerning this requirement, the data processing rate has to be greater than 25 Hz to achieve real-time detection. PointPillars is the only algorithm that achieves a frame rate of over 25 Hz, and thus real-time detection, on the tested system. Although its average precision for pedestrians and bicyclists is slightly lower than that of the other algorithms, it is selected as the example for obtaining the angular resolution requirement in this paper, considering the importance of real-time detection for automotive applications.
Classification and position tasks are usually used to evaluate a detection result, e.g., “is there a car in the scenario?” and “where is it?” [45]. The result is often visualized as a bounding box for each detected object with a confidence score.
The confidence score is the product of the probability that an object is contained in a bounding box (P_object) and the intersection over union (IoU). If no object is contained in a box, the confidence score equals zero. Otherwise, it equals the IoU. The IoU indicates the overlap ratio between the predicted box B_p and the ground truth B_g and is given by [46]:
\mathrm{IoU} = \frac{\mathrm{area}(B_p \cap B_g)}{\mathrm{area}(B_p \cup B_g)}    (8)
Lang et al. [42] first introduced the PointPillars algorithm and set the IoU threshold to 0.5. According to Equation (8), an IoU of 0.5 corresponds to a maximum 33.3% offset between the predicted box and the ground truth in one direction, e.g., a maximum 0.67 m offset for car detections concerning a width of 2 m. The IoU threshold can be set to more than 0.5 to obtain a more accurate location prediction. Since many LiDAR-based object detection algorithms and evaluation metrics apply an IoU threshold of 0.5 to consider a detection correct [39,40,41,45], the angular resolution limits in this paper are acquired with a confidence of 0.5 as a reference.
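The offset statement can be verified with a minimal one-dimensional IoU sketch; the function and the 2 m width are illustrative and not part of the evaluation code used in this paper.

```python
def iou_1d_offset(box_width: float, offset: float) -> float:
    """IoU of two equally sized boxes shifted by `offset` along one axis (Eq. (8) in 1D)."""
    intersection = max(box_width - abs(offset), 0.0)
    union = 2.0 * box_width - intersection
    return intersection / union

w = 2.0                          # assumed car width in metres
print(iou_1d_offset(w, w / 3))   # offset of one third of the width -> IoU = 0.5
print(iou_1d_offset(w, 0.67))    # ~0.67 m offset for a 2 m wide car -> IoU ≈ 0.5
```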
Three of the most common road users are chosen for the detection. For cars and bicyclists, the detection is evaluated from two perspectives due to the different width and depth dimensions of these participants. At an identical distance, the perspective with the larger cross-section is easier to detect, which results in the least demanding angular resolution requirement. Since pedestrians show a similar width and depth, only one perspective is taken into account.
The raw point clouds are downsampled to different angular resolutions to find the limit for the 0.5 confidence score. The Velodyne HDL-64E LiDAR used in the KITTI dataset has an angular resolution of 0.08° in the horizontal and 0.4° in the vertical direction. These values determine the finest possible angular resolution of the image and the minimum interval for downsampling. The upper limits for horizontal and vertical angular resolution are selected as 1.04° and 1.6°, respectively, providing enough tolerance for determining the angular resolution limit. For every object and perspective, 52 downsamplings are executed. The detection results for the three object types are shown in Figure 5.
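One plausible way to perform such an angular downsampling is sketched below. The exact procedure used for Figure 5 is not specified beyond the resolution intervals, so the binning strategy (keeping one point per angular cell) is an assumption.

```python
import numpy as np

def downsample_by_angle(points: np.ndarray, res_h_deg: float, res_v_deg: float) -> np.ndarray:
    """Keep at most one point per (azimuth, elevation) cell of the given angular size.

    `points` is an (N, 3) array of x, y, z coordinates in the sensor frame.
    This is only a sketch; the paper's procedure may differ (e.g., averaging per cell).
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    azimuth = np.degrees(np.arctan2(y, x))
    elevation = np.degrees(np.arctan2(z, np.hypot(x, y)))
    cells = np.stack((np.floor(azimuth / res_h_deg),
                      np.floor(elevation / res_v_deg)), axis=1)
    _, keep = np.unique(cells, axis=0, return_index=True)
    return points[np.sort(keep)]

# Example: reduce a random cloud to a 0.4° x 0.4° grid
cloud = np.random.uniform(-50, 50, size=(10000, 3))
print(downsample_by_angle(cloud, 0.4, 0.4).shape)
```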
The right column in Figure 5 shows the confidence score at various angular resolutions for the scene shown in the corresponding left column. The threshold of the confidence score is set to 0.4. Since detections with a lower confidence score cannot be treated as positive classifications, they are shown as blank fields in the heatmap. The Velodyne LiDAR has different angular resolutions in the horizontal and vertical directions. Since quadratic pixels are more generally realizable than rectangular ones, especially in flash LiDAR systems, the required angular resolution for a confidence score of 0.5 is determined using this pixel shape.
The required number of LiDAR points is determined from the calculated results shown in the right column of Table 6. In combination with the aspect ratio of the detection objects, this table gives the required number of quadratic-shaped pixels (last column in Table 6). The point numbers leading to a score between 0.45 and 0.55 are averaged. With a maximum variance of 5%, the estimated values can be taken to represent a detection confidence of 0.5. The object aspect ratio is determined using the dimensions of the raw point cloud shown in the middle column of Figure 5. Since car windows usually do not appear in LiDAR point clouds (they do not backscatter the LiDAR irradiation), only the lower half of the vehicle body is considered.
Due to the different sizes of the objects, bicyclists and pedestrians require a finer angular resolution than cars for a successful detection at the same distance. In addition, objects are more difficult to detect in the rear view than in the side view. The resulting dependency between object distance and required angular resolution is shown in Figure 6.
As shown in Figure 6, the standard deviation of the required angular resolution for all scenarios reduces with an increasing distance. According to the discussion in Section 4.1, a minimum range of 200 m is able to prevent a collision for traveling speeds of up to 140 km∙h−1. This distance requires an average angular resolution of 0.07° with a standard deviation of 1.8% for all scenarios. The highest criterion among them is bicyclist detection in the rear perspective view. An angular resolution of less than 0.04° is required. This value will be considered for evaluating laser safety in the following section.

4.4. Laser Safety and Comparison of the Detection Range

Besides the influence of the radiation pattern, the output power, which is limited by eye safety, strongly affects the detection range. According to the IEC 60825-1 standard [15], automotive LiDAR systems are certified as laser class 1 and thus safe for the human eye. The following parameters are criteria that influence the determination of the accessible emission limit (AEL) under the consideration of eye safety [15]:
  • Wavelength;
  • Pulse duration;
  • Frame rate;
  • Number of pulses per frame;
  • Laser beam divergence.
The most significant hazard for wavelengths between 400 nm and 1400 nm is thermal damage to the eye’s retina. For pulsed LiDAR systems that operate with a wavelength <1400 nm, the output power limit is determined by applying the most restrictive of these three conditions [15], namely:
  • The maximum AEL for a single pulse (AEL.single);
  • The average power for a pulse train (AEL.s.p.T) of an emission duration T;
  • The AEL for a single pulse multiplied by a correction factor C5 (AEL.s.p.train).
Among them, the emission duration T varies from 10 s to 100 s depending on the divergence angle of the laser beam [15]. The correction factor C5 is determined by the effective number of pulses for a given exposure duration [15]. Only AEL.single and AEL.s.p.T have to be compared for LiDAR systems that operate with a wavelength > 1400 nm, since for these wavelengths the radiation penetrates into the aqueous humour, where the heating effect is dissipated due to water absorption [15]. The following parameters are assumed to compare the limitation of the output power for LiDAR systems with the different radiation patterns indicated in Section 3:
  • Pixel number: 815 × 255;
  • Frame rate: 30 Hz;
  • Beam divergence in spot scanning LiDAR systems: <1.5 mrad;
  • Single pulse duration: 5 ns;
  • Number of light sources for each pattern: 1.
As discussed in Section 4.2 and Section 4.3, LiDAR systems require a FOV of 32.6° × 10.2° for ADAS functions. Detecting a small object at a long distance requires a finer angular resolution than other situations. To detect a bicyclist from the rear perspective at 200 m, an angular resolution of 0.04° × 0.04° is necessary (see Section 4.3). Accordingly, the pixel number of the considered LiDAR systems is 815 × 255. The pulse repetition rate of flash LiDAR systems is equivalent to the system's frame rate, defined here as 30 Hz.
Meanwhile, blade irradiation scanning LiDAR systems only need to scan in one direction. Hence, the pulse repetition rate of blade irradiation scanning LiDAR systems is 815 × 30 Hz for horizontal and 255 × 30 Hz for vertical scanning. For spot scanning LiDAR systems, the angle between two adjacent scanning points is the angular resolution. Thus, the laser's beam divergence must be less than the angular resolution. Concerning a 0.04° angular resolution (see Section 4.3), the beam divergence must be less than 0.70 mrad. Hence, the AEL of the spot scanning LiDAR systems is evaluated according to [15] for beam divergences < 1.5 mrad.
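The pixel count, pulse repetition rates, and divergence limit quoted above follow directly from the FOV, the angular resolution, and the frame rate, as the following sketch shows; the variable names are our own.

```python
import math

fov_h_deg, fov_v_deg = 32.6, 10.2     # FOV from Section 4.2
res_deg = 0.04                        # angular resolution from Section 4.3
frame_rate_hz = 30

pixels_h = round(fov_h_deg / res_deg)         # 815
pixels_v = round(fov_v_deg / res_deg)         # 255
print(pixels_h, pixels_v)

print(pixels_h * pixels_v * frame_rate_hz)    # spot scanning: ~6.2 MHz pulse repetition rate
print(pixels_h * frame_rate_hz)               # horizontal blade scanning: 24,450 Hz
print(pixels_v * frame_rate_hz)               # vertical blade scanning: 7,650 Hz
print(math.radians(res_deg) * 1000)           # 0.04° ≈ 0.70 mrad beam divergence limit
```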
On the contrary, the AEL of blade irradiation scanning and flash LiDAR systems is obtained according to [15] for extended source radiations. Moreover, a typical single pulse duration of 5 ns is assumed [16] to determine AEL.single. Figure 7 shows the maximum accessible exposure limits as a function of the wavelength for the three radiation patterns concerning the retinal hazard region (wavelength up to 1400 nm).
The AEL values in Figure 7 indicate the energy limitation falling into the eye, assuming a 7 mm pupil aperture and a 100 mm measuring distance. For a specific wavelength, the most critical value of the three curves (two curves for wavelengths > 1400 nm) has to be considered. The pupil aperture only covers part of the irradiated area, depending on the distance and the beam divergence angle. Hence, the most restrictive AEL value is not identical to the emitted energy of the LiDAR system; the fraction of the radiation that does not enter the pupil increases the permissible emission of the system.
Equation (9) indicates the calculation of the total accessible emitted energy, denoted by E_p:
E_p = \frac{\mathrm{AEL}}{A_r}    (9)
where AEL is the energy limitation on the retina from Figure 7, and A_r is the ratio of the pupil area A_p to the area A_L irradiated by the laser at a given distance. For A_p > A_L, all emitted energy may fall into the observer's eye, leading to E_p = AEL.
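A minimal sketch of Equation (9), assuming the elliptical footprint model of Appendix A for the irradiated area. The 0.1608 nJ value used in the example is the spot-irradiation limit at 905 nm quoted later in this section; the remaining numbers are illustrative.

```python
import math

PUPIL_DIAMETER_M = 7e-3

def accessible_emission(ael_joule: float, div_h_deg: float, div_v_deg: float, distance_m: float) -> float:
    """Total accessible emitted energy E_p = AEL / A_r, Eq. (9).

    A_r is the fraction of the (elliptical) laser footprint covered by the pupil
    at the given distance, capped at 1 when the footprint fits inside the pupil.
    """
    pupil_area = math.pi * (PUPIL_DIAMETER_M / 2) ** 2
    a = math.tan(math.radians(div_h_deg) / 2) * distance_m   # footprint half-width
    b = math.tan(math.radians(div_v_deg) / 2) * distance_m   # footprint half-height
    footprint_area = math.pi * a * b
    a_r = min(pupil_area / footprint_area, 1.0) if footprint_area > 0 else 1.0
    return ael_joule / a_r

# Assumed example: spot beam (0.04° x 0.04°) with a 0.1608 nJ AEL at 905 nm
for d in (1, 10, 20, 50):
    print(d, accessible_emission(0.1608e-9, 0.04, 0.04, d))
# E_p stays at the AEL up to ~10 m, where the footprint reaches the pupil diameter, then grows
```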
To further compare the detection range of the three radiation patterns mentioned in Section 3, the following assumptions are made:
  • Laser wavelength: 905 nm.
  • Laser divergence angle:
    spot scanning: 0.04° × 0.04° (equals the resolution requirement);
    blade irradiation, horizontal scanning: 0.04° × 10.2°;
    blade irradiation, vertical scanning: 32.6° × 0.04°;
    flash irradiation: 32.6° × 10.2° (equals the FOV).
  • Optical efficiency of the emitter: 90%.
  • Optical efficiency of the detector: 90%.
  • Reflectivity of the object: 10% with Lambertian scattering characteristic (Section 3).
  • Aperture of detector optical system: Ø 25.4 mm (1′′).
  • Pupil diameter: Ø 7 mm.
  • Atmospheric attenuation and scattering: neglected.
  • Pixel gap: neglected.
  • Intensity distribution of the emitter: homogeneous, K = 1.
A typical wavelength of 905 nm can be stated for automotive LiDAR systems [1,9,47,48]. Therefore, a wavelength of 905 nm is considered for exemplary systems. According to Figure 7, the emission energy for a spot scanning LiDAR system is limited to 0.1608 nJ at this wavelength. The limit is 77.49 nJ for vertical and 247.7 nJ for horizontal blade irradiations. In contrast, the emission energy of flash LiDAR systems can be up to 445.3 nJ.
The type of detector determines the required receiving energy of LiDAR systems. The comparison of the maximum detection range for different radiation patterns assumes identical detectors. For example, SPADs are appropriate for LiDAR systems to achieve a long detection range due to their single-photon sensitivity. The energy of a single photon saturates the detector, causing it to enter the dead time (see Section 2.2). During this dead time, further photons received by the detector no longer generate electrical signals. In this case, the required energy for one detection equals the energy of a single photon. Multiple measurements can compensate for the uncertainty caused by single-photon measurements (see Section 2.2). The energy of a single photon E_photon at 905 nm is 2.2 × 10⁻¹⁹ J according to Equation (10), with h as Planck's constant and the speed of light c:
E_{\mathrm{photon}} = \frac{h \cdot c}{\lambda}    (10)
The nominal ocular hazard distance (NOHD) is the distance from the output aperture, beyond which the emitted energy remains below the AEL [15]. For each assumed NOHD, the corresponding maximum energy output can be calculated according to Equation (9). This leads to a detection range (calculated with the equations in Table 2) for each scanning type depending on the chosen NOHD (Figure 8).
For spot irradiation, a laser beam with a divergence angle of 0.04° extends to the pupil diameter of 7 mm in about 10 m. All the emitted energy is able to fall on the retina up to this range. Hence, the total emitted energy equals the AEL value and leads to a constant detection range of ~103 m. As the NOHD increases further, the spot irradiation covers a larger area than the pupil aperture, allowing an increased energy output and an extensive detection range. Concerning a detection range of 200 m, an NOHD of ~20 m is required.
For blade and flash irradiation, the detection range increases significantly with the NOHD. With the given assumptions, horizontal and vertical blade irradiation enables a detection range of the required 200 m with an NOHD of 0.03 m. These distances can easily be maintained with a suitable housing or mounting position of the system. For flash irradiations, the eye-safety distance (NOHD) is 0.17 m.
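The order of magnitude of these ranges can be reproduced by combining the spot-irradiation range equation from Table 2 (where Γ/A_s = 1) with the single-photon threshold of Equation (10) and the assumptions listed above; small deviations from the ~103 m quoted for spot irradiation may stem from rounding of the AEL or of the constants.

```python
import math

H = 6.626e-34            # Planck's constant, J*s
C = 299_792_458.0        # speed of light, m/s

def spot_detection_range(pulse_energy_j: float,
                         wavelength_m: float = 905e-9,
                         eta_t: float = 0.9,
                         eta_r: float = 0.9,
                         aperture_d_m: float = 25.4e-3,
                         reflectivity: float = 0.1) -> float:
    """Maximum range at which a single photon is still received for spot irradiation.

    Uses the Table 2 range equation with Gamma/A_s = 1 and the assumptions from
    Section 4.4 (atmospheric losses neglected, Lambertian target). Only a sketch.
    """
    e_photon = H * C / wavelength_m                  # Eq. (10), ~2.2e-19 J at 905 nm
    a_r = math.pi * (aperture_d_m / 2) ** 2          # receiver aperture area
    received_at_1m = pulse_energy_j * eta_t * eta_r * a_r * reflectivity / math.pi
    return math.sqrt(received_at_1m / e_photon)

print(spot_detection_range(0.1608e-9))   # on the order of 100 m for the eye-safe spot pulse
```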

5. Discussion

In Section 4, we examine the required detection range and FOV for LiDAR systems based on ADAS functions. Using an exemplary algorithm and different object categories, the requirement for the angular resolution is determined via object detection. Furthermore, different beam irradiation patterns are compared with respect to the eye-safety distance (NOHD).
Section 4.1 deduces the required detection range of LiDAR systems based on collision avoidance. Concerning a maximum speed of 140 km∙h⁻¹, LiDAR systems must be able to detect objects at a distance of 200 m. In Germany, several motorways have no speed limit. LiDAR systems require a more extensive detection range for this case, and no current system can guarantee driving safety for an unlimited-speed scenario.
The discussion of the FOV requirement in Section 4.2 focuses on the coverage of the subject lane, which results in a limited horizontal FOV. Therefore, a cut-in vehicle may be detected very late, as illustrated in Figure 9a. A substantial offset of the traffic (sensors capture only a part of the object width) can also lead to detection failures, especially for motorbikes due to their narrow silhouette [49]. A similar situation occurs with bicycles riding on the street (Figure 9b). These issues can be compensated by extending the horizontal FOV of the sensors or by using additional sensors with a large horizontal FOV and possibly a short detection range (Figure 9c). With a lane width of 3.75 m and a minimum distance to a cut-in vehicle of 2 m at low speed [49], an entire horizontal FOV of 140° is required to cover the width of three lanes.
Section 4.3 investigates the angular resolution limits of LiDAR systems based on the PointPillars object detection algorithm. Concerning an angular resolution of 0.1°, as stated in the mentioned literature (see Section 2), a vehicle in the rear view can be detected at a maximum distance of 150 m. This warning distance allows a traveling speed of 100 km∙h⁻¹ to prevent forward collisions. Considering oncoming traffic, the subject and oncoming vehicles must drive at less than 65 km∙h⁻¹ (assuming the same speed for both vehicles).
As the driving speed increases or smaller detection objects have to be considered (such as bicycles or motorbikes), this detection distance and the angular resolution are insufficient. For a speed of 100 km∙h⁻¹ of both approaching vehicles, a detection range of 180 m is required.
Considering a motorway (where the detection of oncoming traffic is not relevant) and a driving speed of 140 km∙h⁻¹, a detection range of 200 m is required. In this case, the minimum angular resolution in the horizontal and vertical directions to identify a stationary passenger car is 0.075°. For smaller objects, like (motor)cycles in the rear view, a minimum angular resolution of 0.04° is required for a confidence score of 0.5. LiDAR systems require sharper lenses and detectors with more pixels to obtain a superior angular resolution, which can raise the manufacturing cost. Instead, improving the accuracy of neural networks, for example by using type-3 fuzzy logic systems [50], can increase the detection precision in low-density point clouds while relaxing the angular resolution requirements for LiDAR systems.
Section 4.4 compares the maximum detection range for three radiation patterns concerning laser safety limitations. With blade and flash irradiation, the required detection range of 200 m is realized with a safety distance (NOHD) of less than 0.2 m. Furthermore, for the very low NOHD of 0.03 m of the blade irradiation in the example presented in this paper, a mechanical solution like an appropriate housing or mounting position of the LiDAR system is sufficient to ensure eye safety.
LiDAR systems that scan with a collimated laser spot have to meet the demand of a high pulse repetition rate, which significantly limits the output power permissible for eye safety. To increase the detection range, the emitter can be designed as an array of lasers to decrease the pulse repetition rate compared with using a single laser diode.
Another possibility for increasing eye safety or the range of LiDAR systems is using lasers with longer wavelengths, for example, 1550 nm. In this way, a significant detection range can be achieved even with the technologically simple approach of flash LiDAR while maintaining eye safety. The tradeoff, however, is the required use of an InGaAs-based detector, which is significantly more expensive than a conventional one for shorter wavelengths.

6. Conclusions

In this paper, the requirements for essential LiDAR metrics are investigated. Concerning a motorway speed limit of 140 km∙h⁻¹, LiDAR systems require a minimum detection range of 200 m to avoid forward collisions in the worst case. Regarding the ACC and FVCWS functions, LiDAR systems must have a FOV of 32.6° × 10.2° to detect preceding vehicles in curves with a minimum radius of 125 m. The angular resolution requirements are obtained based on the PointPillars algorithm with a confidence score of 0.5. A LiDAR system must have an angular resolution of 0.04° × 0.04° to classify (motor)cyclists at long range (200 m), assuming quadratic-shaped detector pixels. Under the consideration of laser safety and using basic assumptions for the variables in the range equations, LiDAR systems using blade irradiation fulfill the requirements with the shortest eye-safety distance (NOHD) of all considered system designs, which is 0.03 m in our example. For spot irradiation LiDAR systems, the maximum accessible emitted power is strongly limited by the high pulse frequency, reducing the detection range. Decreasing the pulse frequency can help obtain a more extended detection range under the premise of eye safety.

Author Contributions

Conceptualization, Z.D., A.W., T.G. and P.-P.L.; methodology, Z.D., A.W. and P.-P.L.; software, Z.D.; validation, Z.D., A.W. and P.-P.L.; formal analysis, Z.D., A.W., M.C.S., T.G. and P.-P.L.; investigation, Z.D., A.W., P.-P.L., M.C.S. and T.G.; resources, R.L.; data curation, Z.D.; writing—original draft preparation, Z.D.; writing—review and editing, A.W., P.-P.L., M.C.S. and T.G.; visualization, Z.D.; supervision, R.L.; project administration, R.L.; funding acquisition, Z.D. and R.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Deutsche Forschungsgemeinschaft (DFG) under Germany’s Excellence Strategy within the Cluster of Excellence PhoenixD (EXC 2122, Project ID 390833453). The publication of this article was funded by the Open Access Fund of Leibniz Universität Hannover.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data analyzed in this study are available via open access (KITTI dataset, https://www.cvlibs.net/datasets/kitti/ (accessed on 22 March 2022)). The downsampled version further analyzed here can be obtained from the authors, upon reasonable request.

Acknowledgments

The authors would like to thank Yang Li for his valuable comments and constructive suggestions to polish this paper to a high quality.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Appendix A.1. Lambertian Backscatter

The Lambertian backscatter is shown in Figure A1.
Figure A1. Lambertian backscatter.
According to Lambert's cosine law, the radiant intensity I (W∙sr⁻¹) of the backscatter is the product of the peak intensity I_max (W∙sr⁻¹) in the normal direction and the cosine of the angle θ between the direction of the reflection and the surface normal:
I = I_{\max} \cdot \cos\theta    (A1)
The total reflected radiant flux φ_total (W) is the integral of the radiant intensity I over the solid angle and is given by:
\phi_{\mathrm{total}} = \int_0^{2\pi} \int_0^{\pi/2} I_{\max} \cdot \cos\theta \, \sin\theta \, d\theta \, d\varphi = \pi\,\mathrm{sr} \cdot I_{\max}    (A2)
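A quick numerical check of Equation (A2); the midpoint integration is only for illustration.

```python
import math

def lambert_total_flux(i_max: float, n: int = 100_000) -> float:
    """Numerically integrate I_max*cos(theta)*sin(theta) over the hemisphere (Eq. (A2))."""
    d_theta = (math.pi / 2) / n
    total = 0.0
    for k in range(n):
        theta = (k + 0.5) * d_theta
        total += i_max * math.cos(theta) * math.sin(theta) * d_theta
    return 2 * math.pi * total   # integration over the azimuth angle phi

print(lambert_total_flux(1.0))   # ~3.14159, i.e. pi * I_max
```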

Appendix A.2. Spot Irradiation

In this case, the beam divergence angle is smaller than the FOV of a single detector pixel in horizontal and vertical directions, as shown in Figure A2.
Figure A2. Spot irradiation.
Concerning a circular or elliptical beam cross-section, the beam spread area of the emitter A_{s1} is given by:
A_{s1} = \pi \cdot a \cdot b    (A3)
where a is the half-width and b the half-height of the beam cross-section.
Considering a distance r and a horizontal beam divergence angle φ_h, the half-width of the beam cross-section a is given by:
a = \tan(\varphi_h / 2) \cdot r    (A4)
Similarly, the half-height of the beam cross-section b is given by:
b = \tan(\varphi_v / 2) \cdot r    (A5)
Substituting Equations (A4) and (A5) into Equation (A3) yields:
A_{s1} = \pi \cdot \tan(\varphi_h / 2) \cdot \tan(\varphi_v / 2) \cdot r^2    (A6)
Because the target is assumed to be larger than a single pixel, the cross-section Γ_1 equals the beam spread area of the emitter A_{s1}:
\Gamma_1 = \pi \cdot \tan(\varphi_h / 2) \cdot \tan(\varphi_v / 2) \cdot r^2    (A7)

Appendix A.3. Blade Irradiation

Using blade illumination, the total beam spread area A_{s2} is larger than a pixel image in only one direction (horizontal or vertical). Here, we assume, for simplicity, a large beam spread in the horizontal direction and a narrow beam in the vertical direction.
As shown in Figure A3, the vertical beam divergence angle φ_v is assumed to be smaller than the opening angle α of a single detector pixel, so A_{s2} can be approximated by the area of a rectangle:
A_{s2} = i \cdot j    (A8)
where i is the width of the rectangle, corresponding to the horizontal FOV φ_h:
i = 2 \cdot \tan(\varphi_h / 2) \cdot r    (A9)
The height of the rectangle j corresponds to the vertical FOV φ_v:
j = 2 \cdot \tan(\varphi_v / 2) \cdot r    (A10)
Substituting Equations (A9) and (A10) into Equation (A8) yields:
A_{s2} = 4 \cdot \tan(\varphi_h / 2) \cdot \tan(\varphi_v / 2) \cdot r^2    (A11)
Concerning a single detector pixel, the cross-section Γ_2 is a part of A_{s2}:
\Gamma_2 = 4 \cdot \tan(\alpha / 2) \cdot \tan(\varphi_v / 2) \cdot r^2    (A12)
where α is the opening angle of the middle detector pixel.
Figure A3. Horizontal blade irradiation.
The derivation of the range distance of vertical blade irradiations is similar to the steps above. Only indices h and v in the equations have to be swapped.

Appendix A.4. Flash Irradiation

The case of flash irradiation is shown in Figure A4.
Figure A4. Flash irradiation.
Similar to the blade illumination, the whole irradiation area of the emitter A_{s3} corresponds to a rectangular area, concerning a FOV of φ_h and φ_v in the horizontal and vertical directions:
A_{s3} = 4 \cdot \tan(\varphi_h / 2) \cdot \tan(\varphi_v / 2) \cdot r^2    (A13)
Similarly, concerning quadratic-shaped pixels, the opening angle of a middle detector pixel can be assumed to be α in both the horizontal and vertical directions. The cross-section Γ_3 is calculated as:
\Gamma_3 = 4 \cdot \tan(\alpha / 2)^2 \cdot r^2    (A14)

References

  1. Hecht, J. LiDAR for self-driving cars. Opt. Photonics News 2018, 29, 26–33. [Google Scholar] [CrossRef]
  2. Stann, B.L.; Dammann, J.F.; Giza, M.M.; Jian, P.S.; Lawler, W.B.; Nguyen, H.M.; Sadler, L.C. MEMS-scanned ladar sensor for small ground robots. In Laser Radar Technology and Applications XV; International Society for Optics and Photonics: Orlando, FL, USA, 2010; Volume 7684, p. 76841E. [Google Scholar] [CrossRef]
  3. Chung, S.; Abediasl, H.; Hashemi, H. A Monolithically Integrated Large-Scale Optical Phased Array in Silicon-on-Insulator CMOS. IEEE J. Solid-State Circuits 2018, 53, 275–296. [Google Scholar] [CrossRef]
  4. Halterman, R.; Bruch, M. Velodyne HDL-64E lidar for unmanned surface vehicle obstacle detection. In Unmanned Systems Technology XII; International Society for Optics and Photonics: Orlando, FL, USA, 2010; Volume 7692, p. 76920D. [Google Scholar] [CrossRef]
  5. Hao, Q.; Tao, Y.; Cao, J.; Cheng, Y. Development of pulsed-laser three-dimensional imaging flash lidar using APD arrays. Microw. Opt. Technol. Lett. 2021, 63, 2492–2509. [Google Scholar] [CrossRef]
  6. Warren, M.E. Automotive LIDAR technology. In Proceedings of the 2019 Symposium on VLSI Circuits, Kyoto, Japan, 9–14 June 2019; IEEE: New York, NY, USA, 2019; pp. C254–C255. [Google Scholar] [CrossRef]
  7. Wang, D.; Watkins, C.; Xie, H. MEMS mirrors for LiDAR: A review. Micromachines 2020, 11, 456. [Google Scholar] [CrossRef]
  8. Poulton, C.V.; Byrd, M.J.; Russo, P.; Timurdogan, E.; Khandaker, M.; Vermeulen, D.; Watts, M.R. Long-range LiDAR and free-space data communication with high-performance optical phased arrays. IEEE J. Sel. Top. Quantum Electron. 2019, 25, 1–8. [Google Scholar] [CrossRef]
  9. Zhao, F.; Jiang, H.; Liu, Z. Recent development of automotive LiDAR technology, industry and trends. In Proceedings of the Eleventh International Conference on Digital Image Processing, Guangzhou, China, 10–13 May 2019; Volume 11179, p. 111794A. [Google Scholar] [CrossRef]
  10. Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and sensor fusion technology in autonomous vehicles: A review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
  11. Xu, F.; Qiao, D.; Xia, C.; Song, X.; Zheng, W.; He, Y.; Fan, Q. A semi-coaxial MEMS LiDAR design with independently adjustable detection range and angular resolution. Sens. Actuators A Phys. 2021, 326, 112715. [Google Scholar] [CrossRef]
  12. Hsu, C.P.; Li, B.; Rivas, B.S.; Gohil, A.; Chan, P.H.; Moore, A.D.; Donzella, V. A review and perspective on optical phased array for automotive LiDAR. IEEE J. Sel. Top. Quantum Electron. 2020, 27, 8300416. [Google Scholar] [CrossRef]
  13. Takashima, Y.; Hellman, B. Review paper: Imaging lidar by digital micromirror device. Opt. Rev. 2020, 27, 400–408. [Google Scholar] [CrossRef]
  14. Raj, T.; Hashim, F.H.; Huddin, A.B.; Ibrahim, M.F.; Hussain, A. A Survey on LiDAR Scanning Mechanisms. Electronics 2020, 9, 741. [Google Scholar] [CrossRef]
  15. IEC 60825-1; Safety of Laser Products—Part 1: Equipment Classification and Requirements. IEC: Geneva, Switzerland, 2014.
  16. Royo, S.; Ballesta-Garcia, M. An Overview of Lidar Imaging Systems for Autonomous Vehicles. Appl. Sci. 2019, 9, 4093. [Google Scholar] [CrossRef] [Green Version]
  17. Dummer, M.; Johnson, K.; Rothwell, S.; Tatah, K.; Hibbs-Brenner, M. The role of VCSELs in 3D sensing and LiDAR. In Optical Interconnects XXI; International Society for Optics and Photonics: Orlando, FL, USA, 2021; Volume 11692, p. 116920C. [Google Scholar] [CrossRef]
  18. Eichler, H.J.; Eichler, J. Laser: Bauformen, Strahlführung, Anwendungen; Springer: Berlin/Heidelberg, Germany, 2015; p. 370. [Google Scholar] [CrossRef]
  19. McManamon, P.F.; Banks, P.S.; Beck, J.D.; Fried, D.G.; Huntington, A.S.; Watson, E.A. Comparison of flash LiDAR detector options. Opt. Eng. 2017, 56, 031223. [Google Scholar] [CrossRef] [Green Version]
  20. McManamon, P.F. Field Guide to LiDAR; SPIE Press: Bellingham, WA, USA, 2015; ISBN 9781628416541. [Google Scholar]
  21. Pulikkaseril, C.; Lam, S. Laser eyes for driverless cars: The road to automotive LIDAR. In Proceedings of the Optical Fiber Communication Conference, San Diego, CA, USA, 3–7 March 2019; Optical Society of America: San Diego, CA, USA, 2019; pp. 1–4. [Google Scholar] [CrossRef]
  22. Villa, F.; Severini, F.; Madonini, F.; Zappa, F. SPADs and SiPMs Arrays for Long-Range High-Speed Light Detection and Ranging (LiDAR). Sensors 2021, 21, 3839. [Google Scholar] [CrossRef] [PubMed]
  23. Kloppenburg, G.; Wolf, A.; Lachmayer, R. High-resolution vehicle headlamps: Technologies and scanning prototype. Adv. Opt. Technol. 2016, 5, 147–155. [Google Scholar] [CrossRef] [Green Version]
  24. Ley, P.P.; Lachmayer, R. Imaging and non-imaging illumination of DLP for high resolution headlamps. In Emerging Digital Micromirror Device Based Systems and Applications XI; International Society for Optics and Photonics: San Francisco, CA, USA, 2019; Volume 10932, p. 109320O. [Google Scholar] [CrossRef]
  25. Kutila, M.; Pyykönen, P.; Ritter, W.; Sawade, O.; Schäufele, B. Automotive LIDAR sensor development scenarios for harsh weather conditions. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 265–270. [Google Scholar] [CrossRef]
  26. Wolf, A.; Kloppenburg, G.; Danov, R.; Lachmayer, R. DMD Based Automotive Lighting Unit. In Proceedings of the DGaO Proceedings, Hannover, Germany, 17–21 May 2016. [Google Scholar] [CrossRef]
  27. Knöchelmann, M.; Wolf, A.; Kloppenburg, G.; Lachmayer, R. Aktiver Scheinwerfer mit DMD-Technologie zur Erzeugung vollständiger Lichtverteilungen. VDI Ber. 2018, 2323, 61–78. [Google Scholar] [CrossRef]
  28. Li, Y.; Wolf, A.; Lachmayer, R. Laser LCoS High-Resolution Headlamp. In Spatially, Temporally and Spectrally Adapted Light for Applications; TEWISS: Garbsen, Germany, 2020; pp. 33–35. ISBN 978-3-95900-424-4. [Google Scholar]
29. ISO 15623:2013; Intelligent Transport Systems—Forward Vehicle Collision Warning Systems—Performance Requirements and Test Procedures. International Organization for Standardization: Geneva, Switzerland, 2013.
  30. Gotzig, H.; Geduld, G.O. LIDAR-sensorik. In Handbuch Fahrerassistenzsysteme; ATZ/MTZ-Fachbuch; Springer Vieweg: Wiesbaden, Germany, 2015; pp. 317–334. [Google Scholar] [CrossRef]
  31. Thakur, R. Scanning LIDAR in Advanced Driver Assistance Systems and Beyond: Building a road map for next-generation LIDAR technology. IEEE Consum. Electron. Mag. 2016, 5, 48–54. [Google Scholar] [CrossRef]
  32. ISO 15622:2018; Intelligent Transport Systems—Adaptive Cruise Control Systems—Performance Requirements and Test Procedures. International Organization for Standardization: Geneva, Switzerland, 2018.
  33. Johansson, G.; Rumar, K. Drivers’ brake reaction times. Hum. Factors 1971, 13, 23–27. [Google Scholar] [CrossRef]
  34. Speed Limits by Country. Available online: https://en.wikipedia.org/wiki/Speed_limits_by_country (accessed on 24 January 2022).
35. Forschungsgesellschaft für Straßen- und Verkehrswesen (FGSV). Richtlinien für die Anlage von Landstraßen (RAL); FGSV Verlag: Köln, Germany, 2012. [Google Scholar]
  36. Forschungsgesellschaft für Straßen- und Verkehrswesen (FGSV). Richtlinien für die Anlage von Stadtstraßen. RASt 06; FGSV Verlag: Köln, Germany, 2008. [Google Scholar]
  37. Gut, C.; Cristea, I.; Neumann, C. High-resolution headlamp. Adv. Opt. Technol. 2016, 5, 109–116. [Google Scholar] [CrossRef]
  38. Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The kitti dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef]
  39. Shi, S.; Guo, C.; Jiang, L.; Wang, Z.; Shi, J.; Wang, X.; Li, H. Pv-rcnn: Point-voxel feature set abstraction for 3d object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 10529–10538. [Google Scholar]
  40. Shi, S.; Wang, X.; Li, H. Pointrcnn: 3D object proposal generation and detection from point cloud. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 770–779. [Google Scholar]
  41. Yan, Y.; Mao, Y.; Li, B. Second: Sparsely embedded convolutional detection. Sensors 2018, 18, 3337. [Google Scholar] [CrossRef] [Green Version]
  42. Lang, A.H.; Vora, S.; Caesar, H.; Zhou, L.; Yang, J.; Beijbom, O. Pointpillars: Fast encoders for object detection from point clouds. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 12697–12705. [Google Scholar]
  43. Simonelli, A.; Bulo, S.R.; Porzi, L.; López-Antequera, M.; Kontschieder, P. Disentangling monocular 3d object detection. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Korea, 27 October–2 November 2019; pp. 1991–1999. [Google Scholar]
  44. Maksymova, I.; Steger, C.; Druml, N. Review of LiDAR Sensor Data Acquisition and Compression for Automotive Applications. Proceedings 2018, 2, 852. [Google Scholar] [CrossRef] [Green Version]
  45. Everingham, M.; Van Gool, L.; Williams, C.K.; Winn, J.; Zisserman, A. The pascal visual object classes (voc) challenge. Int. J. Comput. Vis. 2010, 88, 303–338. [Google Scholar] [CrossRef] [Green Version]
  46. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  47. Kutila, M.; Pyykönen, P.; Holzhüter, H.; Colomb, M.; Duthon, P. Automotive LiDAR performance verification in fog and rain. In Proceedings of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; pp. 1695–1701. [Google Scholar] [CrossRef]
  48. Knigge, A.; Klehr, A.; Wenzel, H.; Zeghuzi, A.; Fricke, J.; Maaßdorf, A.; Tränkle, G. Wavelength-stabilized high-pulse-power laser diodes for automotive LiDAR. Phys. Status Solidi 2018, 215, 1700439. [Google Scholar] [CrossRef]
  49. Winner, H.; Schopper, M. Adaptive Cruise Control. In Handbuch Fahrerassistenzsysteme; ATZ/MTZ-Fachbuch; Springer Vieweg: Wiesbaden, Germany, 2015; pp. 851–890. [Google Scholar] [CrossRef]
  50. Qasem, S.N.; Ahmadian, A.; Mohammadzadeh, A.; Rathinasamy, S.; Pahlevanzadeh, B. A type-3 logic fuzzy system: Optimized by a correntropy based Kalman filter with adaptive fuzzy kernel size. Inf. Sci. 2021, 572, 424–443. [Google Scholar] [CrossRef]
Figure 1. Radiation patterns of LiDAR systems according to [22]: (a) spot irradiation with collimated beam; (b) horizontal blade irradiation; and (c) flash irradiation of the entire FOV.
Figure 2. Warning distance as a function of the subject vehicle speed or relative speed for three typical scenarios; the subject vehicle speed for scenario 3 is 140 km·h⁻¹.
Figure 3. Detection area in a curve: (a) schematic to calculate the horizontal angle to cover the full lane width in a curve with radius R according to [29]; (b) required entire horizontal FOV for different curve radii.
Figure 4. Determination of the FOV for the FVCWS function according to [29].
Figure 5. Detection results for cars, bicyclists, and pedestrians: (a–c) car in rear perspective; (d–f) car in lateral perspective; (g–i) bicyclist in rear perspective; (j–l) bicyclist in lateral perspective; (m–o) pedestrian. The left column shows RGB images of the detection objects from the KITTI dataset [38]; the middle column shows raw point clouds of the detection objects; and the right column shows the distribution of confidence scores varying with different angular resolutions.
Figure 6. Required angular resolution for square pixels and different objects as a function of distance with a confidence score of 0.5.
Figure 7. Maximum accessible emission energy for (a) spot scanning LiDAR systems; (b) blade irradiation horizontal scanning LiDAR systems; (c) blade irradiation vertical scanning LiDAR systems; and (d) flash LiDAR systems.
Figure 8. Detection range as a function of the eye safety distance for different LiDAR systems.
Figure 9. Illustration of exemplary problem situations due to a narrow FOV: (a) late detection of a cut-in vehicle; (b) offset of a motorbike; and (c) detection of three lanes.
Table 1. List of main variables in Equation (3).
Symbol | Quantity | Units
P_r | Received power | W
P_s | Emitted power | W
T_A | Atmospheric transmission | -
η_t | Optical efficiency of the emitter | -
A_s | Beam spread area of the emitter at the target | m²
Γ | Target cross-section | m²
ρ | Reflectance of the target | -
r | Distance between LiDAR and the target | m
A_R | Optical aperture of the receiver | m²
η_r | Optical efficiency of the receiver | -
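For orientation, the quantities in Table 1 combine into the well-known single-pulse range equation. The expression below is a sketch in a common textbook form, assuming a Lambertian target and the one-way atmospheric transmission T_A applied on both the outward and return path; the grouping of terms in Equation (3) of this paper may differ.

```latex
P_\mathrm{r} = P_\mathrm{s}\,\eta_\mathrm{t}\,T_\mathrm{A}\,
\frac{\Gamma}{A_\mathrm{s}}\,\rho\,
\frac{A_\mathrm{R}}{\pi r^{2}}\,T_\mathrm{A}\,\eta_\mathrm{r}
```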
Table 2. Range equations concerning a single pixel for various radiation patterns.
Case | Beam Spread Area A_s | Cross-Section Γ | Range Equation
Spot irradiation | π·tan(φ_h/2)·tan(φ_v/2)·r² | π·tan(φ_h/2)·tan(φ_v/2)·r² | P_r = P_s·K
Blade irradiation | 4·tan(φ_h/2)·tan(φ_v/2)·r² | 4·tan(α/2)·tan(φ_v/2)·r² | P_r = P_s·K·tan(α/2)/tan(φ_h/2)
Flash irradiation | 4·tan(φ_h/2)·tan(φ_v/2)·r² | 4·tan²(α/2)·r² | P_r = P_s·K·[tan(α/2)/tan(φ_h/2)]·[tan(α/2)/tan(φ_v/2)]
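The per-pixel equations in Table 2 differ only in how the emitted energy is shared between receiver pixels. The following is a minimal numeric sketch, assuming K is handed over as a single lumped constant (optics, target reflectance, and range dependence per Equation (3)) and that φ_h, φ_v denote the full divergence of the emitted beam while α denotes the angular extent covered by one receiver pixel.

```python
import math

def received_power(pattern: str, P_s: float, K: float,
                   phi_h: float, phi_v: float, alpha: float) -> float:
    """Single-pixel received power for the radiation patterns of Table 2.

    P_s          -- emitted power (W)
    K            -- lumped system/range constant (assumed given, see Equation (3))
    phi_h, phi_v -- full horizontal/vertical divergence of the emitted beam (rad)
    alpha        -- angular extent covered by one receiver pixel (rad)
    """
    t = math.tan
    if pattern == "spot":    # collimated beam, all energy lands on one pixel
        return P_s * K
    if pattern == "blade":   # energy spread along one angular direction only
        return P_s * K * t(alpha / 2) / t(phi_h / 2)
    if pattern == "flash":   # energy spread over the entire FOV
        return P_s * K * (t(alpha / 2) / t(phi_h / 2)) * (t(alpha / 2) / t(phi_v / 2))
    raise ValueError(f"unknown pattern: {pattern}")

# Illustrative values only: a 0.1° pixel inside a 30° x 10° flash FOV collects
# only about 3e-5 of the power that a spot system would place on the same pixel.
deg = math.radians
print(received_power("flash", P_s=1.0, K=1.0,
                     phi_h=deg(30), phi_v=deg(10), alpha=deg(0.1)))
```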
Table 3. Scenarios in FVCWS according to ISO 15623 [29].
No. | Scenario | Warning Distance
1 | Preceding vehicle travels at ordinary speed | v_1·T
2 | Preceding vehicle is stationary | v_1·T + v_1²/(2·a_1)
3 | Preceding vehicle decelerates with a relative speed v_rel | (T + v_rel/a_1)·v_1 − v_rel²/(2·a_1)
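The warning distances of Table 3 translate directly into a required detection range. The sketch below evaluates the three scenarios; the delay time T and the deceleration a_1 used in the example are illustrative assumptions, not values prescribed by ISO 15623.

```python
def warning_distance(scenario: int, v1: float, T: float, a1: float,
                     v_rel: float = 0.0) -> float:
    """Warning distance (m) for the FVCWS scenarios of Table 3.

    v1    -- subject vehicle speed (m/s)
    T     -- total delay time, e.g. driver reaction plus system latency (s)
    a1    -- achievable deceleration of the subject vehicle (m/s^2)
    v_rel -- closing speed towards the preceding vehicle (m/s), scenario 3 only
    """
    if scenario == 1:   # preceding vehicle travels at ordinary speed
        return v1 * T
    if scenario == 2:   # preceding vehicle is stationary
        return v1 * T + v1 ** 2 / (2 * a1)
    if scenario == 3:   # preceding vehicle decelerates with relative speed v_rel
        return (T + v_rel / a1) * v1 - v_rel ** 2 / (2 * a1)
    raise ValueError(f"unknown scenario: {scenario}")

# Example: subject vehicle at 140 km/h approaching a stationary obstacle,
# with assumed T = 1.2 s and a1 = 6 m/s^2 (illustrative values).
v1 = 140 / 3.6
print(round(warning_distance(2, v1, T=1.2, a1=6.0), 1))   # ~172.7 m
```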
Table 4. The required opening angle for the FVCWS function with different curve radii according to [34].
Curve Class | Curve Radius | Full Horizontal FOV | Full Vertical FOV
Class I | ≥500 m | 12.4° | 5.2°
Class II | ≥250 m | 18.0° | 6.8°
Class III | ≥125 m | 32.6° | 10.2°
Table 5. Comparison of various 3D object detection algorithms based on the KITTI dataset.
Algorithm | Average Precision (Cars) | Average Precision (Pedestrians) | Average Precision (Bicyclists) | Processing Time per Image (GPU/CPU)
PV-RCNN [39] | 94.10% | 66.38% | 75.77% | 0.1837 s / 0.1726 s
PointRCNN [40] | 92.90% | 75.03% | 76.76% | 0.0861 s / 0.0851 s
SECOND [41] | 94.51% | 71.94% | 76.50% | 0.0487 s / 0.0488 s
PointPillars [42] | 93.91% | 65.46% | 72.34% | 0.0270 s / 0.0286 s
Table 6. Minimum pixel number to detect main traffic targets under a confidence score of 0.5.
Object | Perspective | Aspect Ratio (B:H) | Min. Required Points | Min. Pixel Number
Car | Rear | 2:1 | 31 | 8 × 4
Car | Lateral | 4:1 | 25 | 12 × 3
Bicyclist | Rear | 1:3 | 46 | 4 × 12
Bicyclist | Lateral | 7:6 | 48 | 8 × 6
Pedestrian | - | 3:8 | 14 | 3 × 8
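Combining the minimum pixel numbers of Table 6 with the physical extent of an object and its distance yields a required angular resolution of the kind plotted in Figure 6. The sketch below uses this simple geometric relation; the object widths in the example are illustrative assumptions, not values taken from the KITTI annotations.

```python
import math

def required_resolution_deg(extent_m: float, n_pixels: int, distance_m: float) -> float:
    """Angular resolution (deg per pixel) needed so that n_pixels fall across
    an object of the given extent at the given distance."""
    subtended_deg = 2 * math.degrees(math.atan(extent_m / (2 * distance_m)))
    return subtended_deg / n_pixels

# Example: car rear (assumed ~1.8 m wide, 8 horizontal pixels per Table 6) at 100 m,
# and a pedestrian (assumed ~0.5 m wide, 3 horizontal pixels) at 50 m.
print(round(required_resolution_deg(1.8, 8, 100.0), 3))  # ~0.129 deg
print(round(required_resolution_deg(0.5, 3, 50.0), 3))   # ~0.191 deg
```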
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
