Algorithm Fusion for 3D Ground-Penetrating Radar Imaging with Field Examples

Numerous data processing algorithms are available for ground-penetrating radar (GPR) data processing. However, most of the existing processing algorithms are derived from Fourier theory and assume that the system is linear or that data are stationary, which may oversimplify the case. Some nonlinear algorithms are accessible for improvement but generally are for stationary and deterministic systems. To alleviate the dilemma, this study proposes an algorithm fusion scheme that employs standard linear techniques in conjunction with a newer nonlinear and non-stationary method. The linear techniques include linear filtering, migration, and interpolation. The newer method is mainly for nonlinear filtering and image reconstruction. The results can be demonstrated in a two-dimensional single profile (time-distance section) or a 3D visualization if survey lines fulfill the 3D Nyquist sample interval requirement. Two controlled experiments were conducted to justify the proposed scheme. Then, a field study including two examples was carried out to demonstrate the feasibility of practical applications. Compared with conventional methods, the proposed algorithm fusion provides better visualization and integrative interpretation for GPR imaging.


Introduction
In scientific and engineering investigations, various techniques are available for nondestructive testing (NDT) and earth observation (EO). Ground-penetrating radar (GPR) is one of the essential geophysical techniques practiced in these fields [1][2][3][4]. The advantages of using GPR are its versatility in detecting targets, regardless of metallic or nonmetallic nature [3], with low cost, high resolution, and real-time imaging [5,6].
To carry out a successful GPR survey, data processing is crucial, and effective algorithms or appropriate processing schemes certainly help. In the past few decades, numerous data processing algorithms have been proposed to improve outcomes in GPR imaging. They commonly fall into two categories, i.e., data analysis and simulation. This study focuses on data analysis.
As is well known, linearity (the system satisfies the superposition principle) and stationarity (data features are constant over time) are two major concerns when processing real-world data. For linear and stationary processing, Fourier analysis and the Fourier transform are well-developed, robust approaches, and several novel Fourier-based algorithms have therefore become available. Among them, the wavelet transform [7], multiresolution analysis (MRA) [8], curvelet filtering [9], and singular value decomposition (SVD) [10] are popular for noise suppression of non-stationary data, though under fundamental restrictions similar to those of the Fourier transform. For example, the wavelet transform and its related approaches [11,12] postulate that the system is linear and that the data are periodic or stationary [13]. In fact, all the other algorithms mentioned above assume, or at least do not reject, that the system is linear when processing non-stationary data, but this is rarely the case for real data: most systems are nonlinear, and most data are non-stationary. Other than data analysis approaches, an optimized modeling technique, the full waveform inversion [14][15][16] method, has been suggested for improving GPR image resolution. Moreover, innovative machine learning approaches using statistical methods and neural network mathematical models [17][18][19] have evolved in recent years. Optimized modeling and simulation techniques provide alternative solutions, but they are akin to a popular seismological method called the evolutionary spectrum [20], and their application may turn data analysis into statistical data simulation [21], which is outside the scope of this study.
For nonlinear data processing, several nonlinear time-series analysis methods are accessible [22,23] but assume the systems are stationary and deterministic [24,25], which is unrealistic because an a priori basis creates more problems than solutions [26].
Despite these limitations, most existing data analysis methods can deal with either linear non-stationary or nonlinear stationary data processing, but have difficulty performing analysis that is both nonlinear and non-stationary.
In this study, we aim to propose an algorithm fusion scheme to alleviate the dilemma of processing nonlinear and non-stationary data. The conventional (also called "standard") and novel algorithms all have their advantages and shortcomings. Most conventional algorithms are theoretically complete and proven in technical applications, even though they may not be perfect. The novel algorithms can cope with nonlinear and non-stationary problems but usually suffer from incomplete theory, uncertain confidence limits, and high computational cost. Therefore, a judicious fusion of conventional and contemporary algorithms may provide a better solution. For this reason, we designed an algorithm fusion scheme that combines a nonlinear and non-stationary data analysis method with conventional Fourier-based algorithms. The nonlinear and non-stationary algorithm, natural logarithmic transformed (NLT) ensemble empirical mode decomposition (EEMD), is a modified empirical mode decomposition (EMD) method with a noise-assisted extension. The conventional algorithms are standard GPR processing techniques, including background noise removal filters at the beginning of processing, frequency-wavenumber (f-k) migration to focus GPR images, and 2D time-slice interpolation in the pseudo-3D data construction. We designed two controlled experiments, covering 2D profiling and 3D imaging, to assess the proposed method before carrying out a field survey. An archaeological site located in a remote mountainous village was selected for the case study. The proposed method shows significant advantages, both in the experiments and in the case study.

Materials and Methods
Remote Sens. 2023, 15, 2886
GPR is a reflection technique that, since the 1970s, has become increasingly popular for high-resolution near-surface EO in geophysics, with links to NDT [5,27]. However, GPR data usually need skillful processing to develop a meaningful profile (time-distance section) for interpretation. To acquire a desired high-resolution GPR image, signal attenuation and noise interference are the two main issues to solve. The higher-level goal is to achieve full-resolution 3D imaging. Time slices of the 3D data volume allow the interpreter to generate contour maps for different horizons. Therefore, a better image of buried targets can be obtained from the contour maps of a 3D GPR data volume. Although both field techniques and commercial software for 3D GPR surveys have been available for decades, researchers are still making efforts to improve the quality and efficiency of 3D GPR field surveys [28][29][30][31]. A particular focus is on rugged terrain and places with vegetation. The situation can be more difficult when employing 3D techniques in some special environments, such as potential archaeological sites, due to the requirements of cultural heritage preservation. To deal with these challenging tasks and make better GPR imaging possible, we propose an algorithm fusion scheme with experiments and field examples, in the hope of making it a valuable tool for future GPR studies. In the experiments, we separated 2D and 3D considerations into different controlled tests because 2D imaging is still the most practiced technique in the field, and the proposed method is not only effective for 3D imaging. The field study aimed to investigate a culturally protected archaeological site with dense vegetation that might prohibit laying out full-resolution 3D survey lines.
Therefore, we designed a casket model to carry out 2D imaging in the first experiment and tested the 3D imaging of a miniature multilayered archaeological object model with extremely limited survey lines in the second experiment to evaluate the proposed method before conducting the field study. The rest of this section describes the details of the proposed fusion scheme.

Standard Linear Process
Original GPR data may suffer severely from systemic and environmental noise. Before applying the nonlinear method, we adopt common linear filtering techniques for preliminary data processing because these techniques are well developed and effective for simple routine processing. They include a de-pulse filter to mute abnormal peaks, a de-DC (vertical background removal) filter to remove temporally consistent noise, e.g., system noise or locally superposed diffraction energy, and a de-mean (horizontal or sliding background removal) filter to subtract the horizontally coherent noise from the data. All the techniques mentioned are available in common textbooks [32,33] and commercial GPR data processing software, e.g., REFLEXW (K.J. Sandmeier, Zipser Straße 1, D-76227 Karlsruhe, Germany) and Matlab (a product of The MathWorks, Inc., Natick, MA, USA). However, to connect with our nonlinear processing procedure efficiently, we created a Matlab script for the preliminary linear process.
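As a sketch of how these three background-removal filters operate, the following numpy snippet applies them to a profile stored as a (samples × traces) array. The function names and the k-sigma clipping threshold in the de-pulse step are our illustrative choices, not the authors' Matlab implementation.

```python
import numpy as np

def depulse(profile, k=4.0):
    """Mute abnormal peaks: clip samples beyond k standard deviations."""
    limit = k * profile.std()
    return np.clip(profile, -limit, limit)

def dedc(profile):
    """Vertical background removal: subtract each trace's DC (mean) offset."""
    return profile - profile.mean(axis=0, keepdims=True)

def demean(profile):
    """Horizontal background removal: subtract the ensemble-average trace
    so horizontally coherent noise collapses."""
    return profile - profile.mean(axis=1, keepdims=True)
```

Note the distinction: de-DC averages down each trace (axis 0, time) to remove per-trace offsets, while de-mean averages across traces (axis 1) at each time sample to remove horizontally coherent events.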

Nonlinear and Nonstationary NLT EEMD Filter Bank
EEMD, or its earlier version EMD, is an empirically based adaptive nonlinear data-analysis method. The filter bank derived from EMD or EEMD analysis is dyadic, i.e., a collection of band-pass filters that decompose the input data into multiple components, and the spectra of neighboring filters may overlap [34]. The theoretical investigation and applications of EMD or EEMD and related analysis methods are still evolving. Numerous reports are available in the literature [35][36][37][38][39][40][41][42][43][44][45][46][47][48]. We summarize EEMD's technical details in Appendix A for readers who do not specialize in this subject.
Unlike those using mono-frequency and stationary harmonic functions as the basis, the components derived from the EMD/EEMD filter bank are "oscillating functions" called intrinsic mode functions (IMFs). IMFs must satisfy two conditions: (1) The number of extrema and the number of zero crossings must be equal, or differ at most by one; (2) the envelopes defined by the local maxima and local minima are symmetrical to zero [21,25]. IMFs, therefore, represent the significant physical oscillation modes embedded in the data [21,38]. As a consequence, the filter bank provides a new means of nonlinear filtering to extract the signal and eliminate the noise from GPR data [49].
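The two IMF conditions can be checked numerically on a sampled series. The sketch below is our own crude test, in which the zero-mean-envelope condition is approximated by comparing the means of the local maxima and minima rather than by building spline envelopes as a full EMD implementation would:

```python
import numpy as np

def is_imf(x, tol=1, sym_tol=0.1):
    """Crude check of the two IMF conditions for a series with at least
    one interior maximum and minimum:
    (1) #extrema and #zero-crossings equal or differing by at most `tol`;
    (2) envelopes roughly symmetric about zero (surrogate: the means of
        the local maxima and minima nearly cancel)."""
    dx = np.sign(np.diff(x))
    n_ext = int(np.sum(dx[:-1] * dx[1:] < 0))        # slope sign changes
    sx = np.sign(x)
    n_zc = int(np.sum(sx[:-1] * sx[1:] < 0))         # zero crossings
    cond1 = abs(n_ext - n_zc) <= tol
    mid = x[1:-1]
    maxima = mid[(mid > x[:-2]) & (mid > x[2:])]
    minima = mid[(mid < x[:-2]) & (mid < x[2:])]
    cond2 = abs(maxima.mean() + minima.mean()) <= sym_tol * np.abs(x).max()
    return cond1 and cond2
```

A pure sine passes both conditions, while the same sine shifted off zero fails: its extrema no longer straddle zero and it loses its zero crossings.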
EEMD is a noise-assisted EMD algorithm. This approach effectively reduces mode mixing and is a substantial improvement over the EMD method. While EEMD yields an effective way of enhancing the signal and has fewer mode-mixing problems than EMD, the energy dissipation caused by intrinsic attenuation and geometric spreading in GPR data is still a major factor affecting the success of EEMD filtering. One way to avoid distorting the data attributes when compensating for energy dissipation is to apply NLT to the amplitude of the data before EEMD [49]. The conversion simply takes the natural logarithmic transform of the attenuated signal y(t) as

Ly(t) = s · sign(y(t)) · (ln|y(t)| + c),

where Ly(t) is the logarithm of the signal y(t); sign is the sign function for keeping the signal's original polarity, defined as 1 for y(t) > 0, 0 for y(t) = 0, and −1 for y(t) < 0 [45]. The constant c makes the logarithmic value positive; the scaling factor s controls the maximum output value. NLT merely displays signals on a logarithmic scale; the relative amplitude information is intact. This conversion is not a conventional practice in GPR data processing, but it is similar to the well-known dB scale in acoustic engineering and the Richter scale in seismology. NLT resembles the coordinate transformations commonly undertaken in mathematical physics. The proper choice of scale or reference system enhances the signal, and more importantly, the correlations and physical facts of the data do not depend upon which coordinate system or scale we use. For these reasons, the logarithmic scale improves the dynamic range, i.e., permits a wider range of values than the linear scale when displaying data on the amplitude axis, and causes no artificial distortion [49,50].
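A minimal numpy sketch of the transform follows, assuming the form Ly(t) = s · sign(y(t)) · (ln|y(t)| + c) implied by the description above. The exact expression and the default constants are our reconstruction, and amplitudes are assumed pre-scaled so that ln|y| + c stays non-negative.

```python
import numpy as np

def nlt(y, c=1.0, s=1.0):
    """Natural logarithmic transform of an attenuated trace y(t):
    Ly = s * sign(y) * (ln|y| + c).  Zero samples map to zero so the
    sign convention sign(0) = 0 is honored.  c shifts the log values
    positive; s scales the maximum output."""
    out = np.zeros_like(y, dtype=float)
    nz = y != 0
    out[nz] = s * np.sign(y[nz]) * (np.log(np.abs(y[nz])) + c)
    return out
```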
A valuable application of the NLT EEMD filter bank is to reconstruct data. When reconstructing data, investigators should have fundamental knowledge of the signal and noise attributes in the data, and make basic assumptions for selecting components [38]. Since the patterns of signal and noise in reflection data are usually discernable and easy to identify, it is simple to reconstruct the data. The procedure is first to decompose each single GPR trace into a finite number of IMFs, and then to select components with convincing physical significance for 1D trace reconstruction. As a result, an EEMD-filtered 2D GPR profile is a collection of filtered 1D traces acquired from 2D survey lines, and a 3D volume is the combination of filtered 2D profiles with equal spacing. If the 2D survey line spacing and trace spacing are too coarse to achieve the Nyquist sample interval, or the two spacings are not equal, interpolation is required.
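In code, the reconstruction step reduces to summing the chosen components trace by trace. The sketch below assumes the IMFs have already been produced by some EMD/EEMD implementation (a package such as PyEMD is one possibility; it is our example, not the authors' tool):

```python
import numpy as np

def reconstruct(imfs, keep):
    """Rebuild a 1D trace from selected IMFs.
    imfs: array of shape (n_imfs, n_samples) from an EMD/EEMD decomposition;
    keep: indices of the components judged physically significant."""
    return imfs[list(keep)].sum(axis=0)

def filter_profile(profile_imfs, keep):
    """Apply the same component selection trace by trace to form a 2D profile.
    profile_imfs: shape (n_traces, n_imfs, n_samples)."""
    return np.stack([reconstruct(tr, keep) for tr in profile_imfs])
```

Because EMD/EEMD is a complete decomposition, keeping every component returns the (NLT-scaled) input trace; filtering amounts to leaving out the noise-dominated components.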

Algorithm Fusion and Construction of 3D Visualization
There are numerous algorithms available for GPR data processing, and most of them are effective under certain conditions. In this study, we integrate suitable algorithms from various sources to achieve a better methodology for 3D GPR data processing that would be difficult or infeasible within a single processing system. It is well known that filtered 2D data may be used to construct a 3D cube if the original data are well-sampled, i.e., densely deployed parallel 2D lines of equal spacing with a high temporal sampling rate in one direction or, even better, in two perpendicular directions. Certainly, some conventional GPR processing steps (i.e., linear filtering, migration, etc.) are useful before or after 3D data construction. The constructed 3D cube can be further interpolated to increase sampling density when the data density in a certain direction is not high enough to present a satisfactory 3D cube, or to compare two different 3D data files acquired in two different directions. These procedures are vital in depicting the data features. In the proposed algorithm fusion, we apply the conventional fast Fourier transform interpolation (FFTI) approach, which pads zeros at the higher frequencies of the Fourier coefficient vector, thus yielding more points in the time domain and producing the effect of interpolation [51]. Figure 1 is a flowchart indicating both the standard process flow and the proposed algorithm fusion sequence. The fusion strategy in this study is an integration of conventional GPR data processing techniques with the NLT EEMD method. First, standard background noise removal filters (de-pulse, de-DC, de-mean, etc.) are applied to eliminate temporally consistent noise (vertical noise) and spatially coherent noise (horizontal noise) from the data, and then each GPR trace is transformed into log scale by implementing NLT to reduce the exponential decay of the original data.
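The FFTI step can be sketched in a few numpy lines: asking `irfft` for a longer output implicitly zero-pads the high-frequency end of the coefficient vector, and multiplying by the upsampling factor restores the amplitude scale. The function name and factor-based interface are ours.

```python
import numpy as np

def ffti(x, factor):
    """Fast Fourier transform interpolation of a real series: zero-pad the
    spectrum (implicitly, by requesting a longer inverse transform) so the
    series gains `factor` times more points; the `factor` rescale compensates
    for irfft's 1/n normalization."""
    n = len(x)
    return np.fft.irfft(np.fft.rfft(x), factor * n) * factor
```

For a band-limited input, every original sample is preserved at stride `factor`, and the new samples follow the band-limited reconstruction exactly.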
As the major contribution of NLT is to compensate for energy dissipation, it may exaggerate horizontally coherent events. Therefore, it must be performed before EEMD but not necessarily before horizontal background noise removal. In practical GPR data processing, NLT can be performed before or after horizontal background noise removal.

Figure 1. Flowchart of the proposed algorithm fusion. Procedures are the same for the conventional and proposed approach until the completion of background noise removal. After trace editing, the proposed sequence is indicated by the solid red arrow. We use the term pseudo-3D to point out that the original data may not be well-sampled.

EEMD filtering is a key procedure to enhance the signal by removing noise components and reconstructing the data using components of significant events. Depending on the distribution of signal events within the decomposed components, a reconstructed 2D profile could comprise one or several components. The EEMD-filtered 2D profiles are then ready for interpretation or for generating a 3D file directly, as long as the original data are well-sampled. Usually, if the stated condition fails, interpolations are required before 3D data volume construction, and the result is a pseudo-3D image.
When using EEMD techniques, we first decompose each single GPR trace into a finite number of IMFs by using the NLT EEMD filter bank. With basic knowledge of reflection methods (GPR or seismic), one can easily distinguish the noise/signal events and then extract components with convincing physical significance for 1D trace reconstructing and 2D profiling (a good example will be shown in Section 3.1.2). However, some tricky fake events (see Section 3.1.1) may need further analysis such as velocity analysis, modeling, spectral analysis, and so on. We prefer to adopt marginal spectrum analysis, which was successfully applied in several of our studies [6,45,49]. Since this technique has been widely applied in many subjects and the aim of this paper does not focus on this topic, we briefly mention it and give references. Marginal spectrum analysis is also vital for conventional filtering to determine the pass-band of the filter, as will be shown and justified in the later section.
In general, an NLT EEMD-filtered 2D GPR profile is a collection of NLT EEMD-filtered 1D traces acquired from 2D survey lines, and a 3D volume is the integration of NLT EEMD-filtered 2D profiles of equal spacing, with interpolation as needed. NLT EEMD filtering can be multidimensional if the computation cost is affordable.
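As an illustration of combining equally spaced filtered profiles into a pseudo-3D volume, the sketch below stacks them and linearly interpolates extra cross-line slices when the line spacing is coarser than desired. This is a deliberately simple stand-in for the interpolation used in practice; the function name and the linear scheme are ours.

```python
import numpy as np

def build_pseudo_3d(profiles, line_spacing, target_spacing):
    """Stack equally spaced filtered 2D profiles (shape: n_lines, n_samples,
    n_traces) into a volume, linearly interpolating cross-line slices so the
    output spacing becomes `target_spacing`.  Assumes at least two lines."""
    n_lines = profiles.shape[0]
    y_in = np.arange(n_lines) * line_spacing
    y_out = np.arange(0, y_in[-1] + 1e-9, target_spacing)
    vol = np.empty((len(y_out),) + profiles.shape[1:])
    for i, y in enumerate(y_out):
        j = min(int(y // line_spacing), n_lines - 2)   # left neighbor line
        w = (y - y_in[j]) / line_spacing               # interpolation weight
        vol[i] = (1 - w) * profiles[j] + w * profiles[j + 1]
    return vol
```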

Results
We carried out two controlled experiments followed by two field examples for assessing the feasibility of applying the proposed algorithm fusion to GPR field investigation. The experimental site was located within a university recreation park near a riverbank. It was a riverside farm field before it became a park. The field site was on the Chuping archaeological site in central Taiwan (Figure 2), which is also close to a river. Therefore, the sedimentary environments and the soil at both sites are comparable. However, due to construction, the surface at the experimental site is paved with a thin layer of waste dump soil together with the unconsolidated layers of silt, sand, and gravel underneath. In addition, the water table at the experimental site is always higher because of the weather. These factors make the experimental site a place of poor reflection quality [52], and a challenging environment for conducting GPR experiments.


Controlled Experiments
To validate the proposed algorithm fusion, we demonstrate two controlled experiments. The first experiment focuses on a 2D test while the second experiment is to examine 3D imaging.

2D Algorithm Experiment
The survey target was a brick casket model (1.0 m × 0.40 m × 0.35 m), originally buried in the park about 80-85 cm below the surface for geophysical experiments since 2002 (Figure 3a). It has been utilized in magnetic and seismic studies and has shown challenging geophysical attributes [53,54]. The casket model was made of dry-stacked construction bricks (0.20 m × 0.092 m × 0.05 m). The lid and bottom of the casket model were composed of a single layer of 0.05 m thick bricks. The four 0.092 m thick walls were stacked up with the bricks laid horizontally. Previously excavated soil materials (silt, sand, and gravel) were mixed to fill up the space inside the casket model and then also used for backfilling. The disturbed soil materials were initially less dense and gradually became more compact over time. Moreover, the park was re-landscaped by the university after the model was buried. Therefore, the depth of the casket model could have been altered. Another interfering factor was that the experiment was conducted soon after a severe flooding event. The water table was close to the ground surface due to heavy rainfall in the previous days. Although the environment was not suitable for GPR surveying, it was a stringent test for the proposed method. In this experiment, we acquired data from several E-W survey lines using a shielded 500 MHz GPR system. The acquisition parameters were set to a 0.052 ns temporal sample interval for 1024 samples per trace, and a 0.018 m spatial sample interval on the survey line to avoid in-line aliasing. A typical survey line, EW09 (Figure 3b), was selected for demonstrating the proposed processing technique. Soil auger-pricking probing (Figure 3c) after GPR imaging confirmed that the depth was between 0.8 m and 0.85 m.
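A quick back-of-the-envelope check of these acquisition parameters can be written out; the soil velocity here is our assumption for a wet, fine-grained site, not a measured value from the survey.

```python
# Sanity-check the acquisition parameters quoted above.
dt_ns = 0.052          # temporal sample interval, ns
n_samples = 1024       # samples per trace
dx = 0.018             # in-line trace spacing, m
f_c = 500e6            # antenna centre frequency, Hz
v = 0.06e9             # ASSUMED wave speed for wet soil, m/s (0.06 m/ns)

time_window_ns = dt_ns * n_samples        # ~53.2 ns record length
f_nyquist = 1.0 / (2 * dt_ns * 1e-9)      # ~9.6 GHz, far above 500 MHz
wavelength = v / f_c                      # 0.12 m at the centre frequency
quarter_wave = wavelength / 4             # common anti-aliasing criterion

# The 0.018 m trace spacing satisfies the quarter-wavelength rule.
assert dx < quarter_wave
```

Under this assumed velocity, the quarter-wavelength criterion is 0.03 m, so the 0.018 m in-line spacing leaves a comfortable margin against aliasing.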
Figure 4a shows the original profile of EW09 after the removal of vertical noise (abnormal pulses and the DC bias). In this profile, the airwave, direct wave, and strong water table reflection with multiples masked the deeper reflections. The penetration depth only reached about 5 ns. After the NLT process, the penetration depth extended to about 14 ns, but no significant reflections were present (Figure 4b). Apparently, the interference of strong horizontally coherent noise generated by the water table was a major factor. Therefore, we employed the linear horizontal filtering technique, which applies a de-mean filter to subtract the horizontally coherent noise from the data. After de-mean filtering, the horizontally coherent noise was collapsed to reveal the deeper image, and the resolution of the NLT profile was significantly improved (Figure 4c). The next procedure was to carry out the nonlinear filtering technique (EEMD) vertically, i.e., to decompose each trace of the de-mean + NLT profile shown in Figure 4c. In this case, each GPR trace in the profile of Figure 4c was decomposed into eleven IMF components of different frequency bands to establish the trace filter bank (traces of different frequency bands). After examination, we selected the four most meaningful IMF components, C3, C4, C5, and C6.
The next step was to try out the best combination of the four components.
Through comparing the results for using each component and different combinations of various components for reconstruction, we determined that using only component C4 to reconstruct the trace would obtain the best NLT EEMD filtered profile as a result (Figure 5b). Figure 5a,c, and d are the NLT EEMD filtered profiles using components C3, C5, and C6, respectively. Note that the model reflection energy is concentrated in component C4; the higher frequency component C3 (Figure 5a) has no event correlated to the model, while the lower frequency component C5 shows a blank window at the inferred buried casket model position (Figure 5c). The profile constructed using C6 is dominated by low-frequency noises. In this study, it happened that a single-component reconstruction outperformed multi-component combinations. Many cases that we have dealt with in other fields of study are the combination of more than one component. A single-component reconstruction is not a typical case. One component taking it all happened accidentally in this study. A possible reason is that the feature of the signal is relatively unique and resides in only one component. As EEMD effectively removes mode mixing, a single-component reconstruction is conceivable. For comparison, the original profile ( Figure 4a) was also processed using a conventional processing scheme. We employed a gain function to compensate for energy loss, background removal to suppress horizontal coherent noise, and a band-pass filter to enhance the signal. The result is shown in Figure 6c. Before processing, we performed a model study on the shielded 500 MHz GPR system that we used in the experiment. The gain function g(t) consists of a linear and an exponential part: We assumed the attenuation function for the model trace to be t e 20

−
[49] and inputted initial values of α = 1 and β = 10 to invert the final values for g(t). Of course, we could use a stronger gain function or use the AGC (auto gain control) commonly adopted in seismic data processing, but this could distort the original signal and cause a spectral shift, resulting in fake events [55].
It is also important to note that we determined the optimal pass-band of the conventional band-pass filter by analyzing the spectrograms of the EEMD components [56]. Comparing Figure 6a with Figure 6c, the proposed scheme outperformed the conventional method. For comparison, the original profile (Figure 4a) was also processed using a conventional processing scheme. We employed a gain function to compensate for energy loss, background removal to suppress horizontal coherent noise, and a band-pass filter to enhance the signal. The result is shown in Figure 6c. Before processing, we performed a model study on the shielded 500 MHz GPR system that we used in the experiment. The gain function g(t) consists of a linear and an exponential part: Remote Sens. 2023, 13, x FOR PEER REVIEW 10 of 25

3D Imaging Experiment
Non-destructive 3D GPR imaging is vital in detecting shallow targets. Even though both innovative field techniques and software for real 3D surveys are available in reflection seismology for exploring deep structures, such advanced methods may not be appro- We assumed the attenuation function for the model trace to be e −20t [49] and inputted initial values of α = 1 and β = 10 to invert the final values for g(t). Of course, we could use a stronger gain function or use the AGC (auto gain control) commonly adopted in seismic data processing, but this could distort the original signal and cause a spectral shift, resulting in fake events [55].

3D Imaging Experiment
Non-destructive 3D GPR imaging is vital in detecting shallow targets. Even though both innovative field techniques and software for real 3D surveys are available in reflection seismology for exploring deep structures, such advanced methods may not be appropriate for shallow subsurface imaging in GPR investigation [29,56-58], particularly in rugged terrain and places with dense vegetation. Besides, the extra effort required for collecting full-resolution 3D field data [29] and the much higher costs of handling large 3D data volumes may discourage GPR investigators. To confront this dilemma, we designed an experiment to show the possibility of keeping the acquisition effort at a reasonable level. The experimental target was a miniature model containing two buried DVD drives of different lengths to simulate multilayered historical objects or stacked tombs that may exist in our predetermined field area, the Chuping archaeological site. The longer DVD drive (20 cm × 14.2 cm × 4 cm) was buried right above the shorter one (16 cm ...). Figure 7 is demonstrated as an example. Following the processing sequence stated earlier, the original profile was decomposed into 9 components using the EEMD method. The major components (C1 to C6) are shown in Figure 8.
Figure 8. Original data, NLT data, and EEMD components (C1 to C6) of the GPR profile of E-W survey line T12. The reflection events revealed in the EEMD component C2 profile clearly indicate two separate reflections from two buried targets.
Background noise filtering was not applied at this stage. Note that the font of the component number is replaced by a subscript to distinguish it from the symbols used in the 2D algorithm.
In this 3D imaging section, we changed the font of the component number to a subscript to avoid confusion with the symbols used in the 2D algorithm. With the help of logarithmic transform, the decomposed components were easy to classify as noise or signal. Despite the interference of background noise, the C 2 component still accurately resolved the two buried targets of different lengths at different depths. Thus, in this case, only C 2 of the EEMD decomposed components was recognized as the signal and selected for reconstructing the 2D profile. The events at 7 ns and 11 ns indicate the two buried DVD targets. Alternative reconstruction of the data can be performed by adding one more component, but no significant difference was observed in this case.
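Component selection and reconstruction reduce to summing the chosen IMFs. A sketch, assuming the IMFs are stacked in a 2D array with one row per component, might look like:

```python
import numpy as np

def reconstruct(imfs, keep):
    """Rebuild a trace from selected EEMD components.

    imfs : array of shape (n_components, n_samples), C1 in row 0, etc.
    keep : indices of the components judged to be signal, e.g. (1,) for C2.
    """
    return imfs[np.asarray(keep)].sum(axis=0)

# Toy example: three "components" of a 4-sample trace.
imfs = np.array([[1., 0., 1., 0.],    # C1: high-frequency noise
                 [0., 2., 0., 2.],    # C2: signal component
                 [5., 5., 5., 5.]])   # C3: low-frequency background
trace = reconstruct(imfs, keep=(1,))  # keep only C2, as in the text
```

Adding a second component to `keep` is the alternative reconstruction mentioned above; here it simply sums the corresponding rows.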
For comparison, Figure 9a,b show 2D profiles obtained by the conventional method (background noise removal plus band-pass filtering) and the proposed algorithm fusion method (background removal and NLT EEMD), respectively. The typical bandwidth of a 1200 MHz center-frequency GPR system is between 600 MHz and 1800 MHz based on the −3 dB criterion. Optimum band-pass filtering in conventional processing can be achieved by simple tryout (iteration) or by spectral analysis. The 400-1000 MHz window simply reflects the results of spectral analysis. It is lower than the center frequency of the radar, which is possible because we dug a hole to bury the model and the previously excavated soil materials (silt, sand, and gravel) were used for backfilling. The unconsolidated dump materials attenuate much of the energy of higher-frequency radar waves, resulting in lower-frequency reflections. Nevertheless, the results shown in Figure 9 effectively demonstrate the good performance of EEMD compared to conventional processing.
With success in processing 2D profiles, we extended the image to a 3D display. An image constructed using all 21 N-S survey lines with the conventional processing sequence is shown in Figure 10a. To test the possibility of minimizing the field survey lines, we then took the data of just three N-S survey lines (L1, L11, and L21 in Figure 7) in this model study to construct the 3D image. Notice that only L11 traverses the subsurface target. With high interpolation, ten equally spaced lines were interpolated between the two survey lines, and the EEMD-processed pseudo-3D data set revealed a more convincing image (Figure 10b) of the two buried DVD targets than that constructed from all 21 N-S survey lines with the conventional processing sequence. This outcome indicates that the proposed approach may significantly reduce field endeavors. The purpose of trying extremely few scans is to cope with very difficult environments such as dense vegetation, rugged terrain, and the most troublesome sites, such as cultural heritage areas where investigators are not allowed to lay out their survey lines however they need. Therefore, using as few scans as possible is necessary, although it may be problematic, particularly if the target does not lie below one of the scans. Using extremely few scans is thus a compromise when an acceptable field layout is impossible.
Examining our results, as long as one survey line crosses the target, a difficult environment does not seem too limiting. With more effort, better solutions should be accessible.
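The cross-line interpolation step can be sketched as zero-padding the spatial spectrum along the line axis, which is one simple reading of FFTI; the authors' implementation details, such as end-effect handling, are not reproduced here.

```python
import numpy as np

def fft_interpolate(lines, factor):
    # lines: (n_lines, n_samples) parallel profiles; returns n_lines*factor
    # profiles by zero-padding the cross-line wavenumber spectrum.
    n, m = lines.shape
    spec = np.fft.fft(lines, axis=0)
    out = np.zeros((n * factor, m), dtype=complex)
    half = (n + 1) // 2
    out[:half] = spec[:half]                         # positive wavenumbers
    out[out.shape[0] - (n - half):] = spec[half:]    # negative wavenumbers
    return np.real(np.fft.ifft(out, axis=0)) * factor

# Three sparse lines densified by a factor of 11 (ten new lines per gap).
sparse = np.random.default_rng(0).normal(size=(3, 8))
dense = fft_interpolate(sparse, 11)
```

The interpolation is exact at the original line positions, so the acquired profiles are preserved while new profiles fill the gaps between them.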


Field Example from the Chuping Archaeological Site
Our field study area, the Chuping archaeological site, is located in a remote mountainous village. It is probably the most comprehensive high-altitude (924 m) prehistoric archaeological site documented in Taiwan to date. The site was a terrace farm before archaeologists developed an interest in this area, and it has been known to be a terrain of archaeological importance since the early 1980s when prehistoric stoneware was exposed on the hill slope [59]. Initiated in 1981, the Chuping archaeological site was excavated and investigated for seven years (1981-1987) by archaeologist Chung-Yu Chen to study the buried remains of a prehistoric settlement of highland Taiwanese aborigines in the upper valley of the Choshiu River. The excavated archaeological site was backfilled to preserve cultural heritage after the project was terminated, but the complete features of the Chuping archaeological site are still unclear. Thus, the history of this broad-scale prehistoric heritage is an interesting topic for further investigation. In addition, the Chuping archaeological site is probably multilayered and extends outside the excavated area due to its long period of settlement history. Therefore, assessment for deeper and broader re-excavation via a nondestructive method is still a research topic worth investigating further. Figure 11a is a sketch that indicates the survey locations on the Chuping archaeological site where the backfilled area is surrounded by metallic grid fences. Two GPR survey squares shown in this subplot, the backfilled survey square (red square inside the fence) and the incomplete excavation survey square (outside the north end of the fenced area), were the places selected to test our proposed technique in practical archaeological investigation. Within the fenced area, the remains of ancient house foundations were frequently discovered along with slate caskets in the subsurface during the 1980s excavation [59]. 
This finding is quite unusual, judging from the present customs of Taiwanese indigenous peoples. One possible explanation is that the customs changed over time. Many of the ancient aboriginal tribes in Taiwan retained the custom of burying naturally deceased family members in the house to respect their souls and to continue family life with them [60]. At the time of the survey, the entire area was a field without any indication of the subsurface foundations and caskets. Figure 11b shows the landscape of the area where the discovered remains were reburied. The backfilled survey square is a 15 m × 3 m block located inside the fenced excavated region, close to the east fence. The incomplete excavation survey square is a 17.5 m × 11 m area situated in a cherry orchard outside the north fence (Figure 11c), where only a small area was excavated and the presence of unexcavated house foundations has long been speculated by previous investigators using conventional archaeological techniques and suggested for further study [59,61]. We believe that our algorithm fusion technique might help solve the mystery.
Before carrying out the field survey, we noticed that at the incomplete excavation survey square, part of a backfilled house foundation (the one crossing under the north fence) could extend into this square, as referred to in the 1980s excavation report [59]. In addition, another foundation system may exist in this square. We carefully verified the positions of our survey squares against the documented excavation data; however, uncertainties may still exist due to insufficient documented position data and land deformation caused by frequent earthquakes and landslides in this area.

Backfilled Survey Square
According to the report of Chen [59] and our in-situ positioning, four separate room foundations (Figure 11a) and seven caskets of various sizes (Figure 12a), together with a fire pit, may exist underneath this square. The structures of these houses are studs and slate walls upheld by cornerstones in a T-shaped feature that constitutes the basis of the house and reinforces the connection between interior and exterior. The remains of the slate walls, studs or pillars, and cornerstones were recovered at the excavation scene and documented. Because of the complexity and interesting objects found in this square, we selected it as a field site for the 3D imaging experiment to assess the effectiveness and limitations of the proposed method. A shielded 500 MHz GPR system was employed to collect the data. We deployed four 15 m survey lines with a line spacing of 1 m at this square (Figure 12a). Each survey line was acquired with a 0.18 ns temporal sample interval for 512 samples per trace and a 0.017 m spatial sample interval along the survey line. The same processing sequence used in the previous model study was applied to each of the survey lines. A typical 2D profile (profile of L1 in Figure 12a) processed using the conventional method is demonstrated in Figure 12b and compared with the result processed using the proposed method (Figure 12c). In conventional processing, background noise removal and a band-pass filter with a 100-600 MHz pass-band were applied, while data from the same survey line with background noise removal and NLT EEMD were employed to carry out the proposed algorithm fusion. Similar to the control experiment, only one component was selected to reconstruct the data, probably because this site was excavated and then backfilled similarly to the control models.
Compared with conventional processing, the coherent horizontal noise in the first 10 ns (Figure 12b) is greatly suppressed using the proposed algorithm fusion, and finer events and deeper reflections are revealed in Figure 12c. Therefore, the proposed method was selected for 3D imaging. However, the difficulty is that the wide spacing between parallel acquisitions and the lack of transversal scans make a reliable 3D interpretation hard to expect. To achieve the best imaging, we tried to rotate the 3D image based on the potential positions of buried remains indicated on the map (Figure 12a). The result is shown in Figure 13.
The result (Figure 13) indicates that the positions of subsurface features (slate caskets, rocks, wall footings) were generally consistent with the excavation data [59]. Some small caskets and fire pits were not well mapped in Figure 13 because the spatial sample interval (cross-line spacing) did not fulfill full-resolution sampling criteria. More specifically, the maximum spatial sample interval should not exceed 1/4 of the apparent wavelength (about 0.04 m in this case) of the GPR signal [29]. Therefore, this survey is spatially aliased in the cross-line direction and should be described as pseudo-3D imaging.
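As a quick numeric check of this criterion, assuming a near-surface velocity of about 0.08 m/ns (an illustrative value consistent with the ~0.04 m figure quoted above):

```python
def max_crossline_spacing(velocity_m_per_ns, center_freq_mhz):
    # Full-resolution criterion: spacing <= 1/4 of the apparent wavelength.
    wavelength_m = velocity_m_per_ns / (center_freq_mhz * 1e-3)  # v / f
    return wavelength_m / 4.0

# 500 MHz antenna, assumed 0.08 m/ns velocity -> 0.04 m maximum spacing,
# far smaller than the 1 m line spacing used, hence the spatial aliasing.
spacing = max_crossline_spacing(0.08, 500.0)
```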

Figure 13. 3D representation of the interpolated image of the backfilled survey square. The data are processed using the proposed method.

Incomplete Excavation Survey Square
The incomplete excavation survey square is located outside the north fence of the excavated Chuping archaeological site. This area is at present a cherry orchard (Figure 11c). Referring to the Chuping archaeological site map and our on-site landscape survey, Figure 14a indicates that some reburied excavated house remains and a speculative unexcavated house foundation may exist in this square. According to pictures taken at the time of excavation [59], the floor of this reburied house was paved with small slate fragments (from approximately 4 cm to 30 cm in diameter); therefore, it is an ideal indication for data correlation and interpretation. We targeted this area to find the undiscovered remains and, if possible, image a multilayered internal structure of the Chuping archaeological site. The field data were acquired by deploying eleven survey lines with 1 m line spacing. The line length was about 17.5 m, the temporal sample interval was 0.052 ns for 1024 samples per trace, and the spatial sample interval along the survey line was 0.017 m. One concern for data quality is the interference of vegetation (Figure 11c); it not only hinders the field survey but also may generate pitfall events in data interpretation.
To provide an objective presentation, we still chose the first survey line (L1) in the incomplete excavation survey square as an example. Figure 14b shows the 2D profile of L1 (Figure 14a) processed using the conventional method, and the result of applying the NLT EEMD method is demonstrated in Figure 14c. The pass-band of the linear filter used in the conventional method was determined by referring to the marginal spectrum [6,21,45,49,62] of the selected component. The bandwidth of a 500 MHz center-frequency GPR system is between 250 MHz and 750 MHz. We applied a 400-1000 MHz band-pass filter in conventional processing to enhance ultra-shallow targets (less than 0.5 m deep) at the Chuping archaeological site.
Because this square was only excavated in a small area, we aimed to provide an unbiased assessment for a complete, deeper, and broader excavation. Considering this point, we eliminated FFTI from the processing sequence to minimize aliasing and artifacts. A pseudo-3D cube constructed by integrating the NLT EEMD processed 2D profiles without interpolation is shown in Figure 15. The dense distribution of spotty events on the southeast side should be the reflections of slate fragments from the excavated house floor, as stated previously. With reasonable inference, the same reflection pattern found on the west side could be an unexcavated house foundation, as expected. Although interpretation could vary, we may conclude that more remains are highly likely to be found in this area.


Discussion
All required Matlab code related to EMD, EEMD, and other advanced algorithms is available on the inventors' websites. For the conventional methods, the algorithms and source code are accessible in the literature and in popular commercial software. Usually, investigators prefer to write their own code because it offers more flexibility in dealing with various fusion conditions. However, an argument may arise that neither the conventional nor the newer nonlinear algorithms within the proposed algorithm fusion scheme are frontier techniques. In that context, the major difference lies in how the research issue is handled: the value of the scheme comes from combining the techniques rather than from any single algorithm. From a practical point of view, the techniques in our algorithm fusion are robust, easy to apply, and, most importantly, have low computational costs. These advantages should be encouraging. It is also worth noting that a GPR profile is an assembly of single traces representing cumulative reflections over the recording time. Since the original GPR traces acquired in the field are one-dimensional, it may not be practical to employ multidimensional EEMD [39,41] filtering techniques. In light of our previous research [6,41,45,49,55,62] and personal experience, we performed a technical survey and compared the test results of the advanced multidimensional EEMD technique and the proposed methodology before carrying out this research. The algorithm fusion technique had a much lower computational cost with an acceptable outcome. Applying a multidimensional algorithm to data with a 1D feature may achieve only marginal improvements at substantial extra effort and, at worst, may generate unexpected artifacts.
The NLT EEMD is the highlight of the proposed algorithm fusion. There are several advanced or alternative EMD algorithms available nowadays. They put forward more considerations and diversify the applications of EMD. One remarkable achievement is Holo-Hilbert spectral analysis, which can precisely determine the linearity and nonlinearity of the data to be analyzed, and yields a full spectral representation for nonlinear and non-stationary data in both amplitude and frequency modulation modes [42]. This novel method has been successfully applied to 1D data series with acceptable computing efficiency, and it is possible to process GPR data. However, the complexity of the algorithm and the computation cost for the volume of GPR data could be challenging. In any case, this progress on an EMD algorithm is inspiring and should be a future research topic in GPR data processing.
From the examples presented, it can be seen that interpolation has a significant impact on constructing a 3D GPR dataset, particularly between parallel profiles. We used FFTI to increase sampling density. The advantages of this method are that it generates a stable approximation and enhances the signal of poorly sampled data; the downsides are that it may produce aliasing events and unrealistic values if the data sampling density is too low for 3D data construction. However, some improvements are possible if more 2D survey lines traverse the subsurface targets. Another concern when using FFTI is the end effect (data drift or disturbance at the window boundaries when performing the transformation). Thus, preserving the original file coordinates, making the end slope more uniformly distributed, and mitigating the disturbance at the end parts are necessary. Admittedly, a more convincing interpolation technique such as shape-preserving spline interpolation [48], which has been successfully applied to improve the EMD algorithm, may alleviate the risk.

Conclusions
The proposed algorithm fusion scheme provides several advantages and renders important contributions to GPR field investigation such as minimizing disturbance at the study relic site, reducing labor in the field, providing high-resolution subsurface images, and improving post-processing flexibility.
Since we adopted the interpolation technique for 3D imaging, our approach should be categorized as pseudo-3D GPR imaging. Owing to the different techniques used in data acquisition, seismologists seldom accept GPR 3D surveys as genuine 3D; they call them 2.5D or, at most, pseudo-3D, because without swath shooting in the field and adequate 3D migration in post-processing, any 3D image may be problematic. What we can do is use the limited resources and support available to present a better result. However, from the experimental results and field examples presented, we may conclude that the proposed methodology is useful in increasing the accuracy of a 3D GPR image, particularly in areas where real 3D surveys are difficult to employ, as we experienced at the incomplete excavation survey area. Although it is unlikely that the pseudo-3D method demonstrated in this study will replace real 3D techniques, the simple processing sequence and relatively straightforward algorithm make it likely to be welcomed by investigators not specializing in geophysics. Moreover, when preservation and environmental issues are considered in conducting a GPR investigation, the proposed method is a feasible and environmentally friendly option. Extended excavation at the Chuping archaeological site may not be possible at this stage; however, detailed GPR mapping to reveal the unexcavated remains in the subsurface of the Chuping area is still valuable for archaeologists seeking to reconstruct the entire prehistoric settlement of this unique highland relic in Taiwan. We believe that such mapping could be obtained with reasonable labor by employing the proposed method. In addition to detecting undiscovered relics, the method is a cost-effective tool for precisely locating spots for re-excavating a backfilled archaeological site if a routine assessment of the advantages of backfilling in preserving cultural heritage is required.
We believe that the proposed method is ready for upgrading to real 3D investigation if dense cross-line data are acquired. Although this study presents the advantages of the proposed algorithm, we never disregard conventional methods. As the saying goes, "no method is perfect". The proposed method is just an alternative. We hope that some investigators may be interested in this topic and improve it.

Appendix: The EMD and EEMD Algorithms
The EMD sifting procedure consists of the following steps:
(a) Determine the upper and lower envelopes encompassing all data y(t) using a cubic spline.
(b) Calculate the mean m1(t) of the two envelopes.
(c) Subtract the mean m1(t) from the data y(t) to obtain the first component h1(t), which is the prototype of IMF C1.
(d) Test whether h1(t) is an IMF. If it is not, h1(t) is treated as the data, and steps (a) to (c) are repeated.
(e) Repeat the sifting procedure j times until the obtained h1j(t) satisfies the conditions of an IMF.
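A compact sketch of steps (a)-(c), assuming SciPy for the extrema and spline fits (boundary conditions and end effects, which matter in practice, are ignored here):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(y, t):
    # One sifting pass: cubic-spline envelopes through the local extrema,
    # then subtract the envelope mean m1(t) from the data y(t).
    maxima = argrelextrema(y, np.greater)[0]
    minima = argrelextrema(y, np.less)[0]
    if len(maxima) < 2 or len(minima) < 2:
        return None          # too few extrema: treat as a residue
    upper = CubicSpline(t[maxima], y[maxima])(t)
    lower = CubicSpline(t[minima], y[minima])(t)
    return y - (upper + lower) / 2.0

t = np.linspace(0.0, 1.0, 1000)
y = np.sin(2*np.pi*15*t) + 2.0*t          # oscillation riding on a trend
h1 = sift_once(y, t)                      # prototype of IMF C1
```

One pass already strips most of the slow trend from the oscillation; a full EMD would iterate this until the IMF conditions are met.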
A criterion for stopping the sifting process is a Cauchy-type convergence test, which guarantees that enough physical sense is retained in the IMF components. The convergence test is performed by calculating the size of the standard deviation, SD, computed from two consecutive sifting results. For a given data set with S + 1 samples, the size of the standard deviation after j siftings, SD_j, is

SD_j = Σ_{t=0}^{S} [ |d_{j−1}(t) − d_j(t)|^2 / d_{j−1}^2(t) ],

where d_j is the difference between the input data and the mean of the two envelopes at the j-th sifting. Suggested values of SD_j are between 0.2 and 0.3 [21]; however, this is an empirical criterion, and a rigorous justification is needed. When h1j(t) satisfies the conditions of an IMF, it is designated as IMF c1, the first IMF decomposed from the data. Component c1 can then be separated from the rest of the data by subtracting it from the original data, leaving a residue r1. The residue may contain lower-frequency components and is treated as the new data for generating the next level's IMF. The decomposition process stops when the residue r_n is less than a predetermined stoppage value, or when the residue becomes a monotonic function from which no more IMFs can be extracted. Consequently, the original data set is iteratively decomposed into a set of IMFs and a residue as

y(t) = Σ_{i=1}^{n} c_i(t) + r_n(t),

where c_i, i = 1, ..., n, are the n IMF components of the different frequency bands, and r_n is the residue after repeating the sifting procedure n times. A frequently encountered difficulty in EMD is mode mixing (i.e., signals of different scales are mixed in a single IMF, or a signal of the same scale resides in different IMF components). Mode mixing may cause the decomposition to be unstable and lack physical uniqueness when applying EMD. In addition, mode mixing is particularly serious when the data are attenuated or contain intermittent signals, and it can disturb the physical meaning and uniqueness of an IMF [6].
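The SD stopping criterion can be computed directly from two consecutive sifting results; the form used here (sum of squared differences normalized by the previous result) follows Huang's standard definition, and the small epsilon guard against division by zero is our addition, not part of the original formula.

```python
import numpy as np

def sift_sd(d_prev, d_curr, eps=1e-12):
    # Cauchy-type convergence measure between consecutive sifting results;
    # sifting stops once this drops into the suggested 0.2-0.3 range.
    d_prev = np.asarray(d_prev, dtype=float)
    d_curr = np.asarray(d_curr, dtype=float)
    return float(np.sum((d_prev - d_curr) ** 2 / (d_prev ** 2 + eps)))

a = np.array([1.0, -2.0, 3.0, -1.0])
identical = sift_sd(a, a)            # siftings agree exactly
changed = sift_sd(a, a * 0.9)        # siftings still differ
```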
Wu and Huang [34] proposed a noise-assisted data analysis method, EEMD, to overcome the mode mixing problem. EEMD derives each IMF component from the mean of an ensemble of the same level of IMFs, where each member of the ensemble consists of the signal plus white noise of finite amplitude. The EEMD algorithm contains the following steps:
(a) Add a white noise series w(t) of finite amplitude to the original data y(t); the noise-added data set Y(t) is then

Y(t) = y(t) + R·w(t),

where R is the ratio of the standard deviation of the added noise amplitude to that of the data y(t).
(b) Decompose the white noise-added data into IMFs.
(c) Repeat step (a) and step (b) k times with a different white noise series of the same amplitude each time.
(d) Obtain the (ensemble) means of the corresponding IMFs of the decompositions as

c̄_j(t) = (1/k) Σ_{i=1}^{k} c_{j,i}(t),

where c_{j,i}(t) represents the j-th level IMF decomposed from the i-th noise-added data set, and c̄_j(t) is the ensemble mean of the j-th level IMF. Because different white noise series do not correlate with one another, they cancel each other out when taking the ensemble mean. Only the signal remains as k (the number of ensemble members) approaches infinity. The added white noise series therefore provide uniform reference scales in the time-frequency domain for the signal but cause no interference. This approach effectively reduces mode mixing and is a substantial improvement over the EMD method.
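Steps (a)-(d) amount to the loop sketched below; `emd` stands for any decomposition routine returning a fixed-shape IMF array (a placeholder for illustration, not a specific library call):

```python
import numpy as np

def eemd(y, emd, R=0.2, k=100, seed=0):
    # Ensemble EMD sketch: decompose k noise-added copies of y and average
    # the IMFs level by level; R scales the noise to the data's std.
    rng = np.random.default_rng(seed)
    sigma = R * np.std(y)
    trials = [emd(y + rng.normal(0.0, sigma, len(y))) for _ in range(k)]
    return np.mean(trials, axis=0)    # ensemble mean per IMF level

# Demonstration with a trivial stand-in "emd" that returns its input as a
# single component: the ensemble mean then converges back to the signal,
# illustrating how the added noise cancels out for large k.
y = np.sin(np.linspace(0.0, 6.28, 256))
imfs = eemd(y, emd=lambda x: x[np.newaxis, :], R=0.2, k=400)
```

In a real application the `emd` callable would be a full sifting routine, and the IMF arrays of every trial must share the same number of levels for the averaging to be well defined.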