Kernel logistic regression using truncated Newton method

  • Original Paper
  • Published in Computational Management Science

Abstract

Kernel logistic regression (KLR) is a powerful nonlinear classifier. Combining KLR with the truncated-regularized iteratively re-weighted least-squares (TR-IRLS) algorithm yields an effective classification method for small-to-medium-sized data sets, which we call truncated-regularized kernel logistic regression (TR-KLR). On twelve publicly available benchmark data sets, the proposed TR-KLR algorithm is as accurate as, and much faster than, support vector machines (SVM), and more accurate than TR-IRLS. TR-KLR also has the advantage of providing direct prediction probabilities.
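
To make the idea concrete, the sketch below shows one way to train a regularized KLR model with an IRLS/Newton outer loop whose linear systems are solved inexactly by truncated conjugate gradients, which is the general structure the abstract describes. It is a minimal Python illustration, not the authors' TR-KLR code: the RBF kernel, the regularization weight lam, the iteration limits, and the exact form of the regularizer are assumptions made for this example.

# Illustrative sketch only (not the paper's TR-KLR implementation):
# kernel logistic regression trained with a truncated-Newton / IRLS scheme.
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X1 and X2 (assumed kernel choice).
    sq = (X1 ** 2).sum(1)[:, None] + (X2 ** 2).sum(1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-gamma * sq)

def truncated_cg(hess_vec, grad, max_iter=30, tol=1e-4):
    # Approximately solve H d = -grad by conjugate gradients, truncated
    # after max_iter iterations (the "truncated Newton" inner solver).
    d = np.zeros_like(grad)
    r = -grad.copy()                  # residual for the starting point d = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Hp = hess_vec(p)
        step = rs_old / (p @ Hp)
        d += step * p
        r -= step * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d

def fit_klr(K, y, lam=1.0, newton_iter=20, cg_iter=30):
    # Regularized kernel logistic regression on labels y in {0, 1}:
    # minimize sum_i [log(1 + exp(f_i)) - y_i f_i] + (lam/2) * a' K a, with f = K a,
    # using Newton (IRLS) steps whose systems are solved inexactly by truncated CG.
    n = K.shape[0]
    a = np.zeros(n)
    for _ in range(newton_iter):
        f = K @ a
        prob = 1.0 / (1.0 + np.exp(-f))        # fitted probabilities
        w = prob * (1.0 - prob)                # IRLS weights
        grad = K @ (prob - y) + lam * (K @ a)  # gradient of the objective
        if np.linalg.norm(grad) < 1e-5:
            break
        # Hessian-vector product (K diag(w) K + lam K) v, formed matrix-free.
        hess_vec = lambda v: K @ (w * (K @ v)) + lam * (K @ v)
        a += truncated_cg(hess_vec, grad, max_iter=cg_iter)
    return a

def predict_proba(K_test_train, a):
    # Direct class-1 probabilities for test points (one of KLR's advantages).
    return 1.0 / (1.0 + np.exp(-(K_test_train @ a)))

A typical call would be a = fit_klr(rbf_kernel(X, X), y) followed by predict_proba(rbf_kernel(X_new, X), a). Truncating the inner CG loop keeps each Newton step cheap, which is the usual motivation for truncated Newton methods.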

References

  • Asuncion A, Newman DJ (2007) UCI machine learning repository. University of california, irvine, School of information and computer sciences. http://www.ics.uci.edu/~mlearn/MLRepository.html

  • Canu S, Smola A (2006) Kernel methods and the exponential family. Neurocomputing 69(7–9): 714–720

    Article  Google Scholar 

  • Chang CC, Lin CJ (2001) LIBSVM: a library for support vector machines. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm

  • Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press, London

    Google Scholar 

  • Garthwaite P, Jolliffe I, Jones B (2002) Statistical inference. Oxford University Press, London

    Google Scholar 

  • Gunn SR (1998) MATLAB support vector machine toolbox. Software available at http://www.isis.ecs.soton.ac.uk/isystems/kernel/

  • Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning. Springer, Berlin

    Google Scholar 

  • Hosmer DW, Lemeshow S (2000) Applied logistic regression, 2nd edn. Wiley, London

    Book  Google Scholar 

  • Jaakkola TS, Haussler D (1999) Probabilistic kernel regression models. In: Proceedings of the 1999 conference on AI and statistics. Morgan Kaufmann, Cambridge

  • Karsmakers P, Pelckmans K, Suykens JAK (2007) Multi-class kernel logistic regression: a fixed-size implementation. In: IJCNN: Proceedings of the international joint conference on neural networks, IEEE, pp 1756–1761

  • Keerthi SS, Duan KB, Shevade SK, Poo AN (2005) A fast dual algorithm for kernel logistic regression. J Mach Learning 61(1–3): 151–165. doi:10.1007/s10994-005-0768-5

    Article  Google Scholar 

  • Kressel UHG (1999) Pairwise classification and support vector machines. In: Advances in kernel methods: support vector learning. MIT Press, Cambridge, pp 255–268

  • Komarek P, Moore A (2005) Making LR a core data mining tool with TR-IRLS. In: ICDM: proceedings of the fifth IEEE international conference on data mining, IEEE Computer Society, Washington, USA, pp 685–688

  • Koh K, Kim S, Boyd S (2007) An interior-point method for large-scale ℓ1-regularized logistic regression. J Mach Learn Res 8: 1519–1555

    Google Scholar 

  • Komarek P, Moore A (2005) Making logistic regression a core data mining tool: a practical investigation of accuracy, speed, and simplicity. Tech. Rep. CMU-RI-TR-05-27, Robotics Institute, Carnegie Mellon University, Pittsburgh, PA

  • Lin CJ, Weng RC, Keerthi SS (2007) Trust region newton methods for large-scale logistic regression. In: ICML ’07 proceedings of the 24th international conference on machine learning, ACM, New York, pp 561–568

  • Malouf R (2002) A comparison of algorithms for maximum entropy parameter estimation. In: COLING-02 proceeding of the 6th conference on natural language learning. Association for Computational Linguistics, Morristown, USA, pp 1–7. doi:10.3115/1118853.1118871

  • Maalouf M, Trafalis TB (2011) Robust weighted kernel logistic regression in imbalanced and rare events data. Comput Stat Data Anal 55(1): 168–183

    Article  Google Scholar 

  • Minka TP (2003) A comparison of numerical optimizers for logistic regression. Tech rep, Deptartment of Statistics, Carnegie Mellon University

  • Platt JC, Cristianini N, Shawe-taylor J (2000) Large margin DAGs for multiclass classification. In: Advances in neural information processing systems, MIT Press, Cambridge, pp 547–553

  • Rifkin R, Klautau A (2004) In defense of one-vs-all classification. J Mach Learning Res 5: 101–141

    Google Scholar 

  • Roth V (2001) Probabilistic discriminative kernel classifiers for multi-class problems. In: Proceedings of the 23rd DAGM-symposium on pattern recognition. Springer, London, pp 246–253

  • Shawe-Taylor J, Cristianini N (2004) Kernel methods for pattern analysis. Cambridge University Press, London

    Book  Google Scholar 

  • Vapnik V (1995) The Nature of Statistical Learning. Springer, New York

    Google Scholar 

  • Zhu J, Hastie T (2005) Kernel logistic regression and the import vector machine. J Comput Graphic Stat 14: 185–205

    Article  Google Scholar 

Download references

Author information

Corresponding author

Correspondence to Maher Maalouf.

About this article

Cite this article

Maalouf, M., Trafalis, T.B. & Adrianto, I. Kernel logistic regression using truncated Newton method. Comput Manag Sci 8, 415–428 (2011). https://doi.org/10.1007/s10287-010-0128-1
