Novel Bayesian Track-Before-Detect for Drones Based on a VB-Multi-Bernoulli Filter and a GIGM Implementation

Joint detection and tracking of drones is a challenging radar problem, especially when their states must be estimated under unknown measurement variances. The Bayesian track-before-detect (TBD) approach is an efficient way to detect low-observable targets. In this paper, we propose a new variational Bayesian (VB) TBD technique for drones based on the multi-Bernoulli filter that operates with unknown measurement variances. The implementation includes an analytical Gaussian Inverse-Gamma mixture (GIGM) solution, which is applied to estimate the augmented kinematic drone state under the same circumstances. The results demonstrate that the proposed filter is more accurate than other multi-Bernoulli filters in cardinality estimation. The proposed algorithm estimates the fluctuating parameters of each drone and has no difficulty handling crossings of multiple drones. The Optimal Subpattern Assignment (OSPA) distances of the proposed algorithm at different SNR levels are smaller than those of the other filters: at SNR = -5 dB, the proposed algorithm and the two reference filters settle to errors of 51 m, 125 m and 200 m, respectively.


Introduction
In recent years, the cardinality-balanced multi-target multi-Bernoulli (CBMeMBer) filter has been widely used for multi-target tracking with excellent performance [1][2][3][4][5]. The CBMeMBer recursion is a tractable approximation of the Bayesian multi-target recursion under low clutter density, and it directly propagates an approximate posterior density of the targets. The key advantage of the multi-Bernoulli representation lies in extracting reliable state estimates inexpensively. However, the algorithm only performs well under given detection parameters; otherwise, its tracking performance may degrade greatly. In that setting, the common approach applies a threshold to the raw data and processes only the cells exceeding it, which is acceptable if the signal-to-noise ratio (SNR) is high. The suitability of conventional radar systems to detect and identify micro-drones is a matter of significant importance to national security, and it is being investigated as the use of such platforms becomes more and more widespread. This is challenging because micro-drones have a low radar cross-section (RCS) and fly at low altitude and low speed in comparison with more conventional military targets [6][7][8][9][10]. For low-observable targets such as drones, there is a considerable advantage in using unthresholded data for simultaneous detection and tracking, an approach known as track-before-detect (TBD). Many methods have been proposed to deal with multi-target filtering under given parameters at low SNR [2], [3], [9]. However, they all share the same problems of heavy computation and slow adaptation to the measurement variances, which in real small-target tracking scenarios such as drones are normally unknown and time-varying. In [2], the authors proposed a multi-Bernoulli TBD filtering solution that can accommodate a linear Gaussian model and an unknown detection profile, but with a given measurement variance.
Recently, tracking filters based on the variational Bayesian (VB) approximation method [11][12][13][14][15][16][17] have been applied to estimate the state of linear Gaussian processes whose measurement variances are unknown. However, these algorithms rely on a constant detection probability, which is unsuitable for drone tracking.
In this paper, a new VB-CBMeMBer filter with Bayesian TBD is proposed to cope with jointly unknown detection probability and measurement variances. The detection profile is a priori unknown owing to the variation of drone parameters such as the small RCS [8][9][10]. Therefore, we introduce a VB approximation into the framework of the CBMeMBer-TBD filter instead of the recent filters [2], [11][12][13][14][15][16][17]. Further, we derive a closed-form solution for tracking time-varying drones under jointly unknown detection parameters. Moreover, a Gaussian Inverse-Gamma mixture (GIGM) implementation is applied to estimate the augmented kinematic state. The rest of the paper is organized as follows. Section 2 formulates the drones' TBD problem, and Section 3 presents the update formulas of the new recursions. The analytical GIGM implementation is given in Section 4, and the numerical results and conclusions are given in Sections 5 and 6.

Problem Formulation of Drones' TBD
The dynamics of drone target ℓ are modelled by the linear state equation

x_{k,ℓ} = F_k x_{k-1,ℓ} + w_k, (1)

where x_{k,ℓ} = [x_{k,ℓ}, ẋ_{k,ℓ}, y_{k,ℓ}, ẏ_{k,ℓ}, I_{k,ℓ}]^T is the state of drone target ℓ, comprising the position, velocity and intensity (RCS) of the target at instant k, F_k denotes the state transition matrix and w_k is the process noise. We now describe the drones' observation model for TBD. Drone target returns measured by the radar are assumed to fluctuate according to return-amplitude fluctuation models [13]. The 2D image of the surveillance region (with n_x × n_y resolution cells) provided by the sensor can be denoted as [4]:

z_k^{(i,j)} = Σ_ℓ h_k^{(i,j)}(x_{k,ℓ}) + ν_k^{(i,j)} if targets are present, z_k^{(i,j)} = ν_k^{(i,j)} otherwise, (2)

where h_k^{(i,j)}(x_{k,ℓ}) represents the contribution of drone target ℓ to the intensity of cell (i,j), ν_k^{(i,j)} ~ N(0, R_k) is Gaussian measurement noise, and the covariance matrix R_k is diagonal. If a point drone is assumed, the drone contribution can be approximated according to the point spread function as [2]:

h_k^{(i,j)}(x_{k,ℓ}) = (Δ_x Δ_y I_{k,ℓ} / (2πΣ²)) exp(-((iΔ_x - x_{k,ℓ})² + (jΔ_y - y_{k,ℓ})²) / (2Σ²)), (3)

where Δ_x and Δ_y are the cell sizes and Σ is the amount of blurring introduced by the sensor. The complete set of measurements at instant k is denoted Z_k = {z_k^{(i,j)} : i = 1,…,n_x, j = 1,…,n_y}, and the measurement noise is assumed independent from cell to cell. The fluctuation models are incorporated into the likelihood function to account for the target return fluctuations. Under the further assumption of Gaussian background noise, the likelihoods of a cell containing a target contribution and of a pure-noise cell are, respectively [2]:

φ^{(i,j)}(z_k^{(i,j)} | x) = N(z_k^{(i,j)}; h_k^{(i,j)}(x), R_k),  φ^{(i,j)}(z_k^{(i,j)}) = N(z_k^{(i,j)}; 0, R_k).

The drone state at time k is naturally modelled by a Random Finite Set (RFS) X_k = {x_{k,1}, …, x_{k,N(k)}} ⊂ χ, χ ⊆ ℝ^n. The surveillance region is divided into D resolution cells V_1, V_2, …, V_D ⊂ ℝ^{n/2}, and a single drone target with state x occupies the set of resolution cells denoted by U(x).
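As an illustration, the observation model (2)-(3) can be simulated directly. The sketch below is a minimal illustration, not the authors' code; the grid size, cell sizes and blurring Σ are hypothetical placeholder values.

```python
import numpy as np

def target_contribution(x, y, intensity, nx=20, ny=20, dx=1.0, dy=1.0, psf_sigma=0.7):
    """Point-spread contribution h^(i,j)(x) of one drone to every cell, Eq. (3) form."""
    ci = (np.arange(1, nx + 1) * dx)[:, None]   # cell centres along x
    cj = (np.arange(1, ny + 1) * dy)[None, :]   # cell centres along y
    return (dx * dy * intensity / (2.0 * np.pi * psf_sigma**2)
            * np.exp(-((ci - x)**2 + (cj - y)**2) / (2.0 * psf_sigma**2)))

def simulate_image(targets, noise_std=1.0, nx=20, ny=20, rng=None):
    """Superimpose the returns of all drones and add cell-wise Gaussian noise, Eq. (2)."""
    rng = np.random.default_rng(rng)
    z = np.zeros((nx, ny))
    for (x, y, intensity) in targets:
        z += target_contribution(x, y, intensity, nx=nx, ny=ny)
    return z + noise_std * rng.standard_normal((nx, ny))
```

At low SNR the peak produced by a single drone is comparable to the noise floor, which is exactly the regime where thresholded detection fails and TBD pays off.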
Here, we only consider the case where drones are rigid bodies, so that the regions affected by different drones do not overlap, i.e., U(x) ∩ U(x') = ∅ if x ≠ x'. Assume that the measurements in different cells are mutually independent conditioned on the drone state X_k. The likelihood of the measurement data Z_k conditioned on the drone state X_k at time k is then given by the product of the two cell likelihoods:

p(Z_k | X_k) = Π_{x ∈ X_k} Π_{(i,j) ∈ U(x)} φ^{(i,j)}(z_k^{(i,j)} | x) · Π_{(i,j) ∉ U(X_k)} φ^{(i,j)}(z_k^{(i,j)}).

The drone target state prior and process noise are assumed known, while the measurement variance R_k and the probability of detection p_{D,k}(x) are assumed unknown due to the fluctuation of the drone RCS. In addition, the process noise and R_k, as functions of the initial target state, are independent of each other. The objective of this paper is to propose a state estimation algorithm for this nonlinear system with jointly unknown R_k. Unlike conventional state estimation, we infer sequentially the unobserved state x_k together with the jointly unknown measurement variance statistics R_k from the set of measurements. The variational Bayesian (VB) approximation method has been applied to state estimation of nonlinear systems with unknown R_k; its main idea is to approximate the joint posterior distribution of the state and R_k by a factorised fixed-form distribution and to construct recursive update expressions.
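Because the two per-cell likelihoods are Gaussians differing only in their mean, the TBD update depends only on the likelihood ratio over the affected cells U(x). A minimal sketch with illustrative helper names, assuming a scalar known variance for clarity (whereas the paper treats R as unknown):

```python
def log_likelihood_ratio(z, h, var):
    """log N(z; h, var) - log N(z; 0, var): evidence that a cell with return z
    contains a target contribution h rather than pure noise.
    Simplifies to h(2z - h) / (2 var)."""
    return h * (2.0 * z - h) / (2.0 * var)

def image_log_likelihood_ratio(z_cells, h_cells, var):
    """Sum the per-cell ratios over the cells U(x) affected by the target;
    unaffected cells cancel, so only U(x) needs to be visited."""
    return sum(log_likelihood_ratio(z, h, var) for z, h in zip(z_cells, h_cells))
```

This cancellation is why the separable-likelihood TBD update scales with the target footprint |U(x)| rather than with the whole image.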

Variational Bayesian Multi-Bernoulli-TBD Recursions
The basic idea of this approach is to join the unknown drone parameters (i.e., the measurement variance R) and the unknown detection probability (a) of the TBD scheme into a single drone target state. Estimating the augmented state of the drones yields the information of interest about each particular drone target, such as the number of drones and the individual kinematic states, jointly with the unknown R. In the following, we derive the VB multi-Bernoulli TBD recursion for the augmented state model, which propagates the cardinality distribution of the drones and the intensity function of the augmented state. The modified filter with unknown parameters, based on the VB and TBD algorithms, can be derived from the original filter in a specially chosen state space. The drones' profiles are accommodated by augmenting the kinematic state variable x with the unknown R into an augmented state (x, R). The filter then estimates the unknown drone parameters according to how well the measurements fit the underlying drone target model. An explicit specification of the single drone target model on the new augmented state space is given by the transition density f_{k|k-1}(x_k, R_k | x_{k-1}, R_{k-1}) at time k for the augmented drone target state with unknown parameters, given the previous value.

VB-CBMeMBer-TBD Prediction
A multi-Bernoulli RFS [1][2][3] is the union of a fixed number M of independent Bernoulli RFSs, each with existence probability r^{(i)} ∈ (0,1) and probability density p^{(i)}, i = 1,…,M. Thus a multi-Bernoulli RFS is completely described by the parameter set {(r^{(i)}, p^{(i)})}_{i=1}^{M}. Assume that the state vector x and the measurement covariance R are independent and that the survival probability is independent of both, so p_{S,k}(x, R) = p_{S,k}. We also assume that the posterior multi-target density at time step k - 1 is represented by the multi-Bernoulli parameter set [16]:

π_{k-1} = {(r_{k-1}^{(i)}, p_{k-1}^{(i)})}_{i=1}^{M_{k-1}},

where r_{k-1}^{(i)} is the existence probability and p_{k-1}^{(i)}(·) is the state distribution of the i-th Bernoulli component, and M_{k-1} denotes the number of posterior hypothesized tracks at time k - 1. The predicted multi-target density is then a multi-Bernoulli union of birth and surviving (predicted) components [16]:

π_{k|k-1} = {(r_{P,k|k-1}^{(i)}, p_{P,k|k-1}^{(i)})}_{i=1}^{M_{k-1}} ∪ {(r_{Γ,k}^{(i)}, p_{Γ,k}^{(i)})}_{i=1}^{M_{Γ,k}},

where the two sets are the multi-Bernoulli parameters of the surviving targets and the spontaneous births, respectively, and M_{k-1} and M_{Γ,k} denote the numbers of predicted hypothesized tracks of surviving targets and spontaneous births. The surviving components are

r_{P,k|k-1}^{(i)} = r_{k-1}^{(i)} ⟨p_{k-1}^{(i)}, p_{S,k}⟩,  p_{P,k|k-1}^{(i)}(x, R) = ⟨f_{k|k-1}(x, R | ·), p_{k-1}^{(i)} p_{S,k}⟩ / ⟨p_{k-1}^{(i)}, p_{S,k}⟩, (10)

where ⟨·,·⟩ represents the inner product operation. The total number of predicted hypothesized tracks is M_{k|k-1} = M_{k-1} + M_{Γ,k}.
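At the level of Bernoulli components, the prediction (10) amounts to thinning each existence probability by p_S, pushing each density through the transition kernel, and appending the birth components. A schematic sketch, assuming a hypothetical dict-based component representation with the density prediction abstracted into a callable:

```python
def mb_predict(components, births, p_survive=0.98, predict_pdf=None):
    """Multi-Bernoulli prediction: each surviving Bernoulli component keeps its
    (predicted) density and has its existence probability thinned by p_S;
    birth components are appended unchanged."""
    predict_pdf = predict_pdf or (lambda pdf: pdf)  # Chapman-Kolmogorov step, abstracted
    survived = [{"r": c["r"] * p_survive, "pdf": predict_pdf(c["pdf"])}
                for c in components]
    return survived + list(births)
```

With a state-independent p_S the inner products in (10) collapse to the scalar multiplication shown here; a state-dependent survival probability would require the full inner-product computation instead.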

VB-CBMeMBer-TBD Update
If at time k the predicted multi-target density is multi-Bernoulli, π_{k|k-1} = {(r_{k|k-1}^{(i)}, p_{k|k-1}^{(i)})}_{i=1}^{M_{k|k-1}}, the update is computed from the measurement image Z_k through the per-component posteriors P_{D,k}^{(i)}(x, R | Z_k). Note that R and the detection profile are unknown. Moreover, when the latest measurement Z_k is available, the joint posterior distribution P_{D,k}^{(i)}(x, R | Z_k) is analytically intractable and no closed-form solution can be reached directly. In order to calculate the posterior distribution with unknown R_k, the VB approximation method [16] is used to obtain an approximate solution. Assume that P_{D,k}^{(i)}(x, R | Z_k) can be approximated by the product of two factors Q_x(x) and Q_R(R), where Q_x(x) and Q_R(R) are the approximate posteriors of x and R, respectively:

P_{D,k}^{(i)}(x, R | Z_k) ≈ Q_x(x) Q_R(R).

In order to obtain the best approximation, variational calculus is employed to minimize the following Kullback-Leibler (KL) divergence [12][13][14], [16]:

KL(Q_x(x) Q_R(R) ‖ P_{D,k}^{(i)}(x, R | Z_k)) = ∫∫ Q_x(x) Q_R(R) log( Q_x(x) Q_R(R) / P_{D,k}^{(i)}(x, R | Z_k) ) dx dR.

The solution is obtained iteratively by optimizing with respect to one of the multiplicative factors while fixing the other at its last estimated value. The resulting stationarity conditions are [13]:

log Q_x(x) = E_{Q_R}[ log P_{D,k}^{(i)}(x, R, Z_k) ] + c_x,
log Q_R(R) = E_{Q_x}[ log P_{D,k}^{(i)}(x, R, Z_k) ] + c_R, (17)

where c_x and c_R are constants with respect to x and R, respectively. The expected values on the right-hand sides of (17) are taken using the last estimated versions of Q_x(x) and Q_R(R) to obtain the new values, which yields a convergent recursion [13]. Owing to the coupling of Q_x(x) and Q_R(R), they cannot be solved directly. However, similarly to the derivation of variational calculus in [12], the factors take the conjugate forms

Q_x(x) = N(x; m_k, P_k),  Q_R(R) = Π_{l=1}^{d} Inv-Gamma(σ_l²; α_{k,l}, β_{k,l}), (16)

where the superscript d denotes the dimension of the measurement noise and Inv-Gamma(·; α, β) denotes the inverse Gamma distribution with degrees-of-freedom parameter α and scale parameter β.
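For a scalar state observed directly, the coordinate-ascent recursion implied by (16)-(17) reduces to a few lines: a Kalman-style update using the current noise estimate β/α, followed by a refresh of the Inverse-Gamma scale with the posterior moments. This is a sketch under those simplifying assumptions (scalar state, identity measurement), not the full per-mixture-component update of the paper:

```python
def vb_update(m_pred, P_pred, alpha_pred, beta_pred, z, n_iter=10):
    """One VB measurement update for a scalar state x observed as z = x + noise,
    with unknown noise variance R ~ Inv-Gamma(alpha, beta)."""
    alpha = alpha_pred + 0.5          # one half per scalar measurement
    beta = beta_pred
    m, P = m_pred, P_pred
    for _ in range(n_iter):           # fixed-point iteration of (17)
        R_hat = beta / alpha          # current noise-variance estimate
        K = P_pred / (P_pred + R_hat)             # Kalman gain with R_hat
        m = m_pred + K * (z - m_pred)             # posterior mean of Q_x
        P = (1.0 - K) * P_pred                    # posterior variance of Q_x
        beta = beta_pred + 0.5 * ((z - m) ** 2 + P)  # refresh Inv-Gamma scale
    return m, P, alpha, beta
```

A handful of iterations usually suffices: each pass tightens Q_x given the current noise estimate and then re-estimates the noise from the residual, converging to a fixed point of (17).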

GIGM Implementation Based TBD
In the following, the Gaussian Inverse-Gamma mixture (GIGM) implementation of the VB-CBMeMBer-TBD recursion is described. The solution is built on GIGM-TBD distributions; each single-target density is represented by

p_k^{(i)}(x, R) = Σ_{j=1}^{J_k^{(i)}} w_k^{(i,j)} N(x; m_k^{(i,j)}, P_k^{(i,j)}) Π_{l=1}^{d} Inv-Gamma(σ_l²; α_{k,l}^{(i,j)}, β_{k,l}^{(i,j)}),

where J_k^{(i)} is the number of mixture components of the i-th drone; w_k^{(i,j)} is the weight of the j-th component; α_{k,l}^{(i,j)} and β_{k,l}^{(i,j)} are the Inverse-Gamma parameters used for estimating the unknown R_k; and m_k^{(i,j)} and P_k^{(i,j)} are the Gaussian mean and covariance of the j-th component, respectively. Furthermore, suppose that at time k - 1 the CBMeMBer posterior can be approximated by an unnormalized mixture of GIGM-TBD distributions of the same form, with α_{k-1,l}^{(i,j)} and β_{k-1,l}^{(i,j)} the Inverse-Gamma parameters used for estimating the unknown R_{k-1}.
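The mixture representation above suggests a simple component record. The sketch below uses hypothetical field names; it stores one GIGM term and exposes the posterior-mean noise variance E[σ_l²] = β_l / (α_l - 1):

```python
from dataclasses import dataclass

@dataclass
class GIGMComponent:
    """One Gaussian Inverse-Gamma mixture term: a Gaussian over the kinematic
    state and independent Inverse-Gamma factors over each diagonal entry of
    the unknown measurement covariance R."""
    w: float        # mixture weight w_k^(i,j)
    m: list         # Gaussian mean m_k^(i,j) (kinematic state)
    P: list         # Gaussian covariance P_k^(i,j) (rows as lists)
    alpha: list     # Inv-Gamma shapes, one per measurement dimension l
    beta: list      # Inv-Gamma scales, one per measurement dimension l

    def expected_R(self):
        # Posterior mean of each diagonal noise variance: E[sigma_l^2] = beta_l / (alpha_l - 1)
        return [b / (a - 1.0) for a, b in zip(self.alpha, self.beta)]
```

Keeping α and β per measurement dimension matches the diagonal-R assumption of the observation model, so each variance can be estimated independently.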

GIGM VB CBMeMBer-TBD Prediction
Suppose that the intensity of the birth RFS is an unnormalized mixture of GIGM-TBD distributions. Thus, the birth model of the i-th drone target is a Bernoulli distribution with density

p_{Γ,k}^{(i)}(x, R) = Σ_j w_{Γ,k}^{(i,j)} N(x; m_{Γ,k}^{(i,j)}, P_{Γ,k}^{(i,j)}) Π_{l=1}^{d} Inv-Gamma(σ_l²; α_{Γ,k,l}^{(i,j)}, β_{Γ,k,l}^{(i,j)}),

where w_{Γ,k}^{(i,j)}, α_{Γ,k,l}^{(i,j)} and β_{Γ,k,l}^{(i,j)} denote the weight and parameters of the j-th component of the birth model corresponding to the i-th drone target. The survival probability is state independent, i.e. p_{S,k}(x) = p_{S,k}. The predicted (multi-Bernoulli) multi-target posterior density is then computed from the birth model and the surviving components, with the Inverse-Gamma parameters predicted as

α_{k|k-1,l} = ρ_l α_{k-1,l},  β_{k|k-1,l} = ρ_l β_{k-1,l},  l = 1,…,d,

where ρ_l ∈ (0,1] is a fading-factor parameter that affects the estimation of the drone target variance. When ρ_l is large, it represents a gentle drone variance fluctuation from time k - 1 to k; in contrast, when ρ_l is small, the variances are treated as nonstationary with strong fluctuations [11][12][13][14][15].
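The fading-factor prediction of the Inverse-Gamma parameters is a one-liner: scaling both α and β by ρ_l keeps the noise estimate's location roughly in place while inflating its spread, modelling a slow drift of R between scans. A sketch of this standard VB fading rule:

```python
def predict_noise_params(alpha, beta, rho=0.95):
    """Fading-factor prediction of Inverse-Gamma parameters: rho < 1 discounts
    old evidence (widening the distribution over R), rho = 1 keeps R static."""
    return rho * alpha, rho * beta
```

Since the ratio β/α (the working variance estimate in the VB update) is unchanged by this scaling, the prediction only forgets confidence, not the current estimate itself.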

GIGM VB-CBMeMBer-TBD Updating
Suppose that the predicted VB-CBMeMBer-TBD posterior density at time k is given and that each drone target probability density p_{k|k-1}^{(i)}, i = 1,…,M_{k|k-1}, is a GIGM-TBD mixture of the form above; the updated multi-Bernoulli parameters then follow by applying the VB update of Section 3 to each mixture component.

Numerical Results
This section presents numerical results for the new VB-CBMeMBer-GIGM-TBD filter. Linear TBD scenarios are used to examine its performance against the recent VB-CBMeMBer [15] and CBMeMBer-TBD [2] filters. In [6], the authors proposed a practical scenario to simulate the radar backscattering of moving drone targets; the goal was to collect and process RCS data of drones and birds at high frequencies (K-band and W-band). The experimental setup was intentionally chosen to correspond to a realistic scenario for a drone/bird detection radar system. A complex RCS is normally computed by coherently combining the cross-sections of simple target shapes. In [8], [9], the authors proposed a facet model to simulate the radar backscattering of a moving complex target based on the POFACET program. Three drones of different sizes (DJI Phantom 3 Standard, DJI Inspire 1 and DJI S900 Hexacopter) are shown in Fig. 1. The statistical distribution of the RCS of each target falls within a certain range, which is useful for predicting the performance of a drone detection radar. The real drone RCS values can be used to reconstruct the drones' signatures at any aspect within the angular extent summarized in Tab. 1. The RCS data of the different drone target models are used to predict the true probability density function (pdf); Figure 1 also shows the Cumulative Density Function (CDF) obtained by the statistical histogram method for the different statistical models. Note that the true parameters of the actual drones are unknown to the filter. Thus, the measurement data are simulated according to a state-dependent profile for each drone target model, whose measurement variance peaks at the edge of the surveillance region and tapers off at the origin. The drone trajectories are shown in Fig. 2. A sample image of the simulation scenario at times k = 20, 60 s is shown in Fig. 3; it depicts the simulated noisy data at various SNR levels based on the RCS with mid-sized point drones. Three targets appear at the positions marked with circles, but they are buried in the background noise.
In the dynamical and measurement models that follow, the single-drone target transition model is linear Gaussian, with state x_k = [x_k, ẋ_k, y_k, ẏ_k, I_k]^T as defined in Section 2. For a fair comparison with the CBMeMBer-TBD [2] filter, the same pruning and merging of Gaussian components are performed on the posterior intensity. The selection of the fading factor ρ_l is difficult, because it controls the stability of the noise estimation; we find that CBMeMBer-TBD performs best with ρ_l = 0.95 and R = 5 m [2], while VB-CBMeMBer uses a = 0.7 for all drones [16]. Figure 4 shows the three drone trajectories (x and y coordinates vs. time) estimated in a single run by the different filters with unknown detection probability and measurement variance. The results indicate that the proposed filter provides accurate tracking performance, whereas the other filters suffer occasional false tracks and track drops, because the detection probability is not known to the VB-CBMeMBer filter and the measurement variances are not known to the CBMeMBer-TBD filter. These results demonstrate that the proposed filter is able to correctly track the motion of each individual drone target and to identify the various births and deaths throughout. The filter has no difficulty handling the crossing of two drones (DJI Inspire 1, DJI Phantom 3) at time t = 47 s, nor that of two drones (DJI S900, DJI Inspire 1) at time t = 56 s. The proposed algorithm estimates the fluctuating parameters of each drone target and adapts to the jointly unknown drone parameters better than the other algorithms, as shown in Fig. 5. Figure 6 shows the number of drone targets estimated by the VB-CBMeMBer, CBMeMBer-TBD and proposed algorithms. It is clear that the proposed algorithm estimates the drone target number better than CBMeMBer-TBD and VB-CBMeMBer under unknown a and R. This is because both a and R are correctly estimated by the proposed algorithm.
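The OSPA metric used for these comparisons penalises both localisation and cardinality errors. A brute-force sketch for small target sets (illustrative only; cut-off c and order p follow the standard OSPA definition):

```python
import numpy as np
from itertools import permutations

def ospa(X, Y, c=100.0, p=2):
    """OSPA distance of order p with cut-off c between small point sets
    X (m points) and Y (n points); brute-force assignment, fine for a few targets."""
    m, n = len(X), len(Y)
    if m > n:                       # keep m <= n (the metric is symmetric)
        X, Y, m, n = Y, X, n, m
    if n == 0:
        return 0.0
    if m == 0:
        return c                    # pure cardinality error
    # best assignment of the m smaller-set points to distinct points of Y
    best = min(
        sum(min(float(np.linalg.norm(np.asarray(X[i]) - np.asarray(Y[j]))), c) ** p
            for i, j in enumerate(perm))
        for perm in permutations(range(n), m)
    )
    # unassigned points of the larger set each cost the cut-off c
    return ((best + (n - m) * c ** p) / n) ** (1.0 / p)
```

A missed or spurious track therefore contributes up to c to the average, which is why the cardinality errors of the reference filters in Fig. 6 translate directly into larger OSPA distances.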
The performance of the other algorithms degrades greatly with erroneous σ and a values: the assumed value σ = 5 m makes CBMeMBer-TBD inaccurate, and VB-CBMeMBer is likewise inaccurate at a = 0.7. Figure 7 compares the OSPA distances of the filters under different SNR levels; the proposed algorithm yields smaller OSPA distances than the other filters at every SNR, settling at SNR = -5 dB to an error of 51 m, versus 125 m and 200 m for the other filters.

Conclusion
This work considered a multi-Bernoulli TBD filter for unknown drone detection parameters. To tackle the problems of drone tracking, an improved VB-CBMeMBer-TBD algorithm was proposed and approximated by a GIGM implementation, making it capable of estimating the drone kinematic state augmented by the unknown parameters. The simulation results show that the proposed algorithm outperforms recent tracking filters.