Estimation of Regression Function in Multi-Response Nonparametric Regression Model Using Smoothing Spline and Kernel Estimators

The functions that describe the relationship between more than one response variable, observed at several values of the predictor variables and correlated across responses, can be estimated using a multi-response nonparametric regression model approach. In this study, we discuss how to estimate the regression function of the multi-response nonparametric regression model using both smoothing spline and kernel estimators. The principal objective is to determine the smoothing spline and kernel estimators of the regression function of the multi-response nonparametric regression model. The results show that the regression function estimates obtained with the smoothing spline and kernel estimators differ mathematically only in their smoother matrices. In addition, both estimators are linear in the observations and are biased.


Introduction
When we consider a function that describes the relationship between more than one response variable observed at several values of the predictor variables, we cannot avoid a common model known as the regression model. Statistical analysis that applies the regression model approach always faces the main statistical problem of how to estimate the regression function of the model. There are two main model approaches in regression analysis. We can apply the parametric regression approach when the pattern of the regression function follows a specific form, for example, linear, quadratic, or cubic. On the other hand, when its pattern does not indicate a specific form, we must use the nonparametric regression approach. Several estimators can be used to estimate the regression function of the nonparametric regression model, including the kernel estimator, spline estimator, local polynomial estimator, and wavelet estimator. Among these, the spline is the most flexible estimator of the nonparametric regression function. The spline estimator for estimating the regression function of the nonparametric regression model has been discussed by many researchers. Estimation of the regression function of nonparametric regression for smooth data using the original spline was discussed by [1] and [2]. In [3], the authors compared the generalized cross validation (GCV) and generalized maximum likelihood (GML) methods for selecting the smoothing parameter in the generalized spline smoothing problem. The use of the M-type spline for handling outliers in nonparametric regression was proposed by [4] and [5]. In [6], the author used a Bayesian method to construct confidence intervals for the original spline model. Relaxed spline and quantile spline estimators were used by [7] and [8], respectively, to estimate regression functions.
In [9], the authors estimated the regression function of a nonparametric regression model with heteroscedastic error variances using a weighted spline estimator. The smoothing spline estimator for nonparametric regression models with correlated random errors was discussed by [10]. In [11], the author used the reproducing kernel Hilbert space (RKHS) concept to develop techniques for building spline statistical models. In [12], the authors investigated the asymptotic properties of spline estimators for functional linear regression with errors-in-variables. In [13], the authors estimated variance functions using a smoothing spline estimator. In addition, several studies have discussed the kernel estimator. In [14], the author showed that the spline estimator outperforms the kernel estimator in estimating a nonparametric regression model for gross national product data. A weighted average of the raw data was used by [15] to estimate the regression function. In [16] and [17], the authors used kernel estimators to estimate the regression function and stated that the kernel function should be symmetric. Note that the researchers mentioned above discussed spline and kernel estimators only for single-response nonparametric regression models; they did not consider the multi-response nonparametric regression model.
The model discussed in this study provides a powerful tool for modeling the function that describes the relationship between more than one response variable observed at several values of the predictor variables when the responses are correlated. Nonparametric models for multi-response data have been studied by several researchers. Spline smoothing algorithms were developed by [18], [19] and [20]. Estimation of multivariate functions using smoothing splines and RKHS was developed by [21]. In [22] and [23], the authors estimated the regression functions of nonparametric regression models with serially and spatially correlated errors, respectively. In [24], the authors estimated a biresponse nonparametric regression function with equally correlated errors using spline smoothing. In [25] and [26], the authors determined spline estimators for the multi-response nonparametric regression model with equal and unequal error correlations, respectively. In [27], the authors applied the multi-response nonparametric regression approach to design a child growth chart. In [28], the authors estimated a multi-response nonparametric regression model with heteroscedastic variances using a spline estimator. Estimation of the homoscedastic multi-response nonparametric regression model with unbalanced numbers of observations was discussed by [29]. Estimation of the covariance matrix using splines was studied by [30] and [31]. However, these researchers discussed only the use of the spline estimator for estimating the multi-response nonparametric regression model; they did not discuss estimation of the regression function using a kernel estimator. In addition, although [14] discussed both smoothing spline and kernel regression estimation techniques, that work considered only the uniresponse nonparametric regression model, not the multi-response model.
In this study, we build the multi-response nonparametric regression model by extending the biresponse nonparametric model proposed by [24] to a model with more than two responses. We then determine the smoothing spline and kernel estimators for estimating the regression function of the multi-response nonparametric regression model.

Results and Discussion
In this section, we present results and discussion on the estimation of the regression function in the multi-response nonparametric regression model using smoothing spline and kernel estimators. First, consider a paired data set $(y_{ki}, t_{ki})$ that follows the multi-response nonparametric regression model

$$y_{ki} = f_k(t_{ki}) + \varepsilon_{ki}, \quad i = 1, 2, \ldots, n_k; \ k = 1, 2, \ldots, p, \qquad (1)$$

where $k$ indexes the responses, $f_1, f_2, \ldots, f_p$ are unknown regression functions, and the $\varepsilon_{ki}$ are zero-mean independent random errors with variance $\sigma_{ki}^2$ ([19]). The main objective of nonparametric regression analysis is to estimate the unknown functions. In parametric regression, $f$ is a known, smooth function whose suitable form must be determined; in the nonparametric regression model, by contrast, $f$ is an unknown, smooth function that we do not specify in advance. Next, writing $\boldsymbol{y}_k = (y_{k1}, \ldots, y_{kn_k})'$, $\boldsymbol{f}_k = (f_k(t_{k1}), \ldots, f_k(t_{kn_k}))'$ and $\boldsymbol{\varepsilon}_k = (\varepsilon_{k1}, \ldots, \varepsilon_{kn_k})'$, we can write equation (1) in vector form as

$$\boldsymbol{y} = \boldsymbol{f} + \boldsymbol{\varepsilon} \qquad (2)$$

([32]).

Estimation of Regression Function Using Smoothing Spline Estimator
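To make model (1) concrete, here is a minimal simulation sketch with $p = 2$ responses. The specific regression functions and the error covariance values are our own illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration of model (1): y_ki = f_k(t_ki) + e_ki with
# p = 2 responses observed at the same n design points.
n, p = 100, 2
t = np.linspace(0.0, 1.0, n)

# "True" regression functions (unknown in practice; chosen arbitrarily here).
f1 = np.sin(2 * np.pi * t)
f2 = np.cos(2 * np.pi * t)

# Zero-mean errors that are correlated across the two responses.
cov = np.array([[0.04, 0.02],
                [0.02, 0.04]])
errors = rng.multivariate_normal(np.zeros(p), cov, size=n)

y1 = f1 + errors[:, 0]
y2 = f2 + errors[:, 1]
```

The positive off-diagonal entry of `cov` induces the correlation among responses that motivates the multi-response treatment.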
The smoothing spline estimate of the functions $f_k$ in (2) arises as the solution to a penalized weighted least-squares (PWLS) minimization problem: determine $f$ that minimizes

$$\sum_{k=1}^{p} w_k \sum_{i=1}^{n_k} \left(y_{ki} - f_k(t_{ki})\right)^2 + \sum_{k=1}^{p} \lambda_k \int \left(f_k''(t)\right)^2 \, dt \qquad (3)$$

for a pre-specified value of $\lambda = (\lambda_1, \lambda_2, \ldots, \lambda_p)$. Note that in equation (3) the first term is the sum of squared errors and penalizes lack of fit, while the second term, weighted by $\lambda_k$, is the roughness penalty: it penalizes the curvature of $f_k$. In equation (3), $\lambda_k$ is called the smoothing parameter. The solution varies from an interpolant to a linear fit as $\lambda_k$ varies from $0$ to $\infty$. If $\lambda_k \to \infty$, the roughness penalty dominates in (3) and the smoothing spline estimate is forced toward a linear fit; if $\lambda_k \to 0$, the roughness penalty vanishes and the spline estimate interpolates the data. Thus $\lambda_k$ controls the trade-off between the goodness of fit, measured by the first term of (3), and the smoothness of the estimate, measured by the second term. Based on model (1), and by the Riesz representation theorem ([33]), since the evaluation functional $L_{t_{ki}}$ on $\mathcal{H}_k$ is a bounded linear functional, there exists a representer $\varphi_{ki} \in \mathcal{H}_k$ of $L_{t_{ki}}$ such that

$$L_{t_{ki}} f_k = \langle \varphi_{ki}, f_k \rangle = f_k(t_{ki}), \qquad (4)$$

where $\langle \cdot, \cdot \rangle$ denotes an inner product.
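The fit-versus-roughness trade-off in the PWLS criterion can be illustrated numerically. The sketch below discretizes the roughness penalty with second differences (the discretization is ours, for illustration only):

```python
import numpy as np

def second_diff_matrix(n):
    """Second-difference operator: (D @ f)[i] approximates f'' at interior points."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def pwls(f, y, lam, w=None):
    """Discrete analogue of the PWLS criterion (3): weighted lack-of-fit
    plus lam times a roughness (curvature) penalty."""
    w = np.ones(len(y)) if w is None else np.asarray(w, float)
    r = second_diff_matrix(len(y)) @ f
    return float(np.sum(w * (y - f) ** 2) + lam * (r @ r))

y = np.array([1.0, 2.0, 1.5, 3.0, 2.5])
interp = y.copy()                 # interpolant: zero lack-of-fit, but rough
flat = np.full(5, y.mean())       # constant fit: zero roughness

# lam = 0: only the fit term matters, so the interpolant wins.
assert pwls(interp, y, 0.0) == 0.0
# Large lam: the roughness penalty dominates, so the smooth fit wins.
assert pwls(flat, y, 1e6) < pwls(interp, y, 1e6)
```

The two assertions mirror the limiting behaviors described above: interpolation as $\lambda_k \to 0$ and a smooth fit as $\lambda_k \to \infty$.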
Based on (4) and the properties of the inner product, each $f_k \in \mathcal{H}_k$ can be decomposed into a component in the null space of the roughness penalty and a component spanned by the representers, so that for every observation

$$f_k(t_{ki}) = \langle \varphi_{ki}, f_k \rangle = \boldsymbol{\theta}_k(t_{ki})' \boldsymbol{d}_k + \boldsymbol{\varphi}_k(t_{ki})' \boldsymbol{c}_k,$$

where $\boldsymbol{d}_k$ and $\boldsymbol{c}_k$ are coefficient vectors. Proceeding in the same way for $k = 1, 2, \ldots, p$ and collecting terms, model (2) can be written as

$$\boldsymbol{y} = T\boldsymbol{d} + V\boldsymbol{c} + \boldsymbol{\varepsilon}, \qquad (10)$$

where $T$ and $V$ are the design matrices built from the null-space basis functions and the representers, respectively. We use the RKHS method to obtain the estimate of $\boldsymbol{f}$ by minimizing the norm of the penalized component subject to the data constraint; solving this constrained optimization is equivalent to solving the PWLS problem

$$\min_{\boldsymbol{c}, \boldsymbol{d}} \; Q(\boldsymbol{c}, \boldsymbol{d}) = (\boldsymbol{y} - T\boldsymbol{d} - V\boldsymbol{c})' W (\boldsymbol{y} - T\boldsymbol{d} - V\boldsymbol{c}) + \boldsymbol{c}' \Lambda V \boldsymbol{c}, \qquad (13)$$

where $W$ is a symmetric positive definite weight matrix and $\Lambda = \mathrm{diag}(\lambda_1 I_{n_1}, \ldots, \lambda_p I_{n_p})$ collects the smoothing parameters $\lambda_k$, $k = 1, 2, \ldots, p$, which control the trade-off between goodness of fit and smoothness. To obtain the solution of (13), we take the partial derivatives of $Q(\boldsymbol{c}, \boldsymbol{d})$ with respect to $\boldsymbol{d}$ and $\boldsymbol{c}$ and set them equal to zero:

$$T' W (\boldsymbol{y} - T\boldsymbol{d} - V\boldsymbol{c}) = \boldsymbol{0}, \qquad (18)$$
$$V W (\boldsymbol{y} - T\boldsymbol{d} - V\boldsymbol{c}) - V \Lambda \boldsymbol{c} = \boldsymbol{0}. \qquad (19)$$

Solving (18) and (19) jointly yields

$$\hat{\boldsymbol{d}} = (T' \Lambda M^{-1} T)^{-1} T' \Lambda M^{-1} \boldsymbol{y}, \quad \hat{\boldsymbol{c}} = M^{-1} \left(I - T (T' \Lambda M^{-1} T)^{-1} T' \Lambda M^{-1}\right) \boldsymbol{y}, \qquad (20)$$

where $M = V + W^{-1} \Lambda$. Finally, based on (10) and (20), we obtain the smoothing spline estimator

$$\hat{\boldsymbol{f}}_\lambda = T\hat{\boldsymbol{d}} + V\hat{\boldsymbol{c}} = A(\lambda)\, \boldsymbol{y}, \qquad (21)$$

where $A(\lambda) = T (T' \Lambda M^{-1} T)^{-1} T' \Lambda M^{-1} + V M^{-1} \left(I - T (T' \Lambda M^{-1} T)^{-1} T' \Lambda M^{-1}\right)$ is the spline smoother (hat) matrix, which depends on the design points and the smoothing parameters but not on the observations.
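The key structural facts about the smoothing spline estimator, that it is a linear and generally biased function of the observations, can be checked numerically. The sketch below uses a simplified discrete hat matrix built from a finite-difference roughness penalty rather than the exact reproducing-kernel construction, so it illustrates the structure of (21) but is not the paper's estimator:

```python
import numpy as np

def smoother_matrix(n, lam, w=None):
    """Hat matrix A(lam) of a discretized PWLS problem, so that
    f_hat = A(lam) @ y minimizes the weighted fit plus lam * roughness."""
    w = np.ones(n) if w is None else np.asarray(w, float)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    W = np.diag(w)
    # Normal equations: (W + lam * D'D) f_hat = W y.
    return np.linalg.solve(W + lam * (D.T @ D), W)

rng = np.random.default_rng(1)
n, lam = 50, 1.0
A = smoother_matrix(n, lam)
y1, y2 = rng.normal(size=n), rng.normal(size=n)

# Linearity in the observations: A(2*y1 + 3*y2) = 2*A*y1 + 3*A*y2.
assert np.allclose(A @ (2 * y1 + 3 * y2), 2 * (A @ y1) + 3 * (A @ y2))

# Bias: E[f_hat] = A @ f, which generally differs from f itself.
f = np.sin(np.linspace(0.0, 3.0, n)) ** 2
assert not np.allclose(A @ f, f)
```

The matrix `A` depends only on the design and on `lam`, never on `y`, which is exactly why the estimator is linear in the observations.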

Estimation of Regression Function Using Kernel Estimator
In nonparametric regression, the kernel estimator of the regression function $f$ is essentially a weighted average of the raw data, with weights that decrease with distance in the $t$-space. For the uniresponse nonparametric regression model, [15] proposed a weighted average of the raw data that associates the observations $y_j$ with the prediction at $t_i$:

$$\hat{f}(t_i) = \frac{\sum_{j=1}^{n} K\!\left(\frac{t_i - t_j}{h}\right) y_j}{\sum_{j=1}^{n} K\!\left(\frac{t_i - t_j}{h}\right)}, \qquad (22)$$

where $K(u)$ is a decreasing function of $|u|$ called a kernel function, and $h > 0$ is the bandwidth or smoothing parameter.
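The weighted-average scheme in (22) can be sketched directly; here we use the Gaussian density as the kernel, and the simulated sine data are our own illustrative choice:

```python
import numpy as np

def gaussian_kernel(u):
    """A symmetric kernel: the standard normal density."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def nw_estimate(t0, t, y, h):
    """Kernel estimate of f at t0 as in (22): a weighted average of the
    y_j, with weights that decay with the scaled distance |t0 - t_j| / h."""
    k = gaussian_kernel((t0 - t) / h)
    return float(np.sum(k * y) / np.sum(k))

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=200)

# Fitted curve at all design points.
fhat = np.array([nw_estimate(ti, t, y, h=0.05) for ti in t])
```

Because the weights are normalized to sum to one, a constant data vector is reproduced exactly, and a smaller `h` concentrates the weights on nearby observations.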
$K(u)$ should be symmetric and is usually taken to be a probability density function, such as the Gaussian density ([16] and [17]).
Next, based on equation (22) and considering the model given in (1), the vector of fitted values at the design points can be written as

$$\hat{\boldsymbol{y}} = V(h)\, \boldsymbol{y}, \qquad (25)$$

where $V(h)$ denotes the kernel hat matrix, or kernel smoother matrix, whose $(i, j)$ entry is the normalized kernel weight $K\!\left(\frac{t_i - t_j}{h}\right) \big/ \sum_{l} K\!\left(\frac{t_i - t_l}{h}\right)$; it transforms the $y_j$'s into the $\hat{y}_i$'s and plays the same role as the hat matrix in ordinary least squares. The kernel prediction at any point $t_{ki}$ is obtained from equation (25) by evaluating the corresponding row of kernel weights at $t_{ki}$. As discussed above, and analogously to the smoothing spline estimator given in (21), the kernel estimator of the regression function of model (1) is therefore

$$\hat{\boldsymbol{f}}_h = V(h)\, \boldsymbol{y},$$

which is linear in the observations and differs from the smoothing spline estimator only through its smoother matrix.
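The kernel smoother matrix described above can be sketched as follows; the design points, bandwidth, and simulated data are illustrative assumptions:

```python
import numpy as np

def kernel_smoother_matrix(t, h):
    """Kernel hat matrix V: row i holds the normalized kernel weights at t_i,
    so y_hat = V @ y gives the kernel fit at every design point."""
    U = (t[:, None] - t[None, :]) / h
    K = np.exp(-0.5 * U ** 2)   # Gaussian kernel; the normalizing constant cancels
    return K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 80)
y = np.cos(2 * np.pi * t) + rng.normal(scale=0.1, size=80)

V = kernel_smoother_matrix(t, h=0.05)
y_hat = V @ y

# Each row of V sums to one, and the fit is linear in the observations;
# as with the spline estimator, only the smoother matrix differs.
assert np.allclose(V.sum(axis=1), 1.0)
assert np.allclose(V @ (2 * y), 2 * y_hat)
```

Comparing this `V` with a spline hat matrix makes the paper's central point concrete: both estimators have the form (smoother matrix) times (observation vector).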