Erratum to: Neural Comput & Applic, DOI 10.1007/s00521-008-0179-1

Unfortunately, an error occurred in the text of this article. One formula was missing in the first paragraph of Chap. 4. The complete paragraph is given below.

4 Experimental results

In order to evaluate the performance of the proposed SSVM_NDR algorithm, we compared it with two algorithms, SVM and SSVM_Zhu, on six real UCI datasets [9], namely Tae, Sonar, New-thyroid, BUPA, Ionosphere and Pima, and on the Vehicle dataset from the LibSVM datasets [10]. Table 1 lists the main characteristics of the seven datasets used in the experiments. On each dataset, we employed the Gaussian kernel function K(x_i, x_j) = exp(−δ·||x_i − x_j||²) for SSVM_Zhu and SSVM_NDR, and K(x_i, x_j) = exp(−0.5·||x_i − x_j||²/δ²) for SVM. Tenfold cross-validation was applied to estimate the generalization accuracies and the average CPU time elapsed in the training and test phases of SSVM_Zhu, SSVM_NDR and SVM. In the cross-validation, we ensured that each training set and each test set were the same for all algorithms. Because the same parameter setup could lead to results that favor one algorithm over another, we compared the results of SSVM_Zhu, SSVM_NDR and SVM using the best kernel parameter δ, the regularization parameter C and the preferable parameters θ, obtained by several random selections of these three parameters tuned with cross-validation. The values of these parameters are listed in Table 4. Moreover, the experiments were carried out in Matlab on the Windows XP operating system, running on a personal computer with a Pentium 1.6 GHz CPU and 512 MB RAM. SPRTool [11] was used to train the multi-class SVM classifier by one-versus-all decomposition, and Matlab's quadprog function was used to solve the quadratic optimization problems for SSVM_Zhu and SSVM_NDR.