
Bayesian Regularization of Neural Networks

  • Protocol
Artificial Neural Networks

Part of the book series: Methods in Molecular Biology™ (MIMB, volume 458)

Abstract

Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N²) in normal regression methods such as back-propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since the evidence procedure provides an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on an effective number of network parameters (weights), effectively switching off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, allowing the network to "estimate" the importance of each input. The ARD method ensures that irrelevant or highly correlated descriptors used in the modeling are neglected, and shows which variables are most important for modeling the activity data.
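The chapter develops the full equations; as a minimal sketch of the evidence-style re-estimation described above, the fragment below applies MacKay-type hyperparameter updates to a linear model (a deliberate simplification of the neural-network case, where the same updates use the network Hessian). The regularizer α, the noise precision β, and the effective number of parameters γ are re-estimated iteratively; the data and variable names are synthetic, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a linear model with noise (stand-in for a QSAR descriptor matrix).
N, W = 50, 5
Phi = rng.normal(size=(N, W))                 # descriptor/design matrix
w_true = np.array([1.0, -2.0, 0.0, 0.5, 0.0])
t = Phi @ w_true + 0.1 * rng.normal(size=N)

alpha, beta = 1.0, 1.0                        # prior (regularizer) and noise precisions
for _ in range(100):
    A = alpha * np.eye(W) + beta * Phi.T @ Phi        # posterior precision matrix
    m = beta * np.linalg.solve(A, Phi.T @ t)          # posterior mean weights
    gamma = W - alpha * np.trace(np.linalg.inv(A))    # effective number of parameters
    alpha = gamma / (m @ m)                           # re-estimate the regularizer
    beta = (N - gamma) / np.sum((t - Phi @ m) ** 2)   # re-estimate the noise level

print(f"effective parameters gamma = {gamma:.2f} of {W} weights")
```

Because α and β are set by the evidence updates rather than by cross-validation, no separate validation set is consumed; γ reports how many weight directions the data actually determine.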

This chapter outlines the equations that define the BRANN method, together with a flowchart for producing a BRANN-QSAR model. Results of applying BRANNs to a number of data sets are illustrated and compared with other linear and nonlinear models.
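The ARD scheme mentioned in the abstract gives each input its own prior precision, so the precisions of irrelevant inputs grow large and those inputs are effectively switched off. A toy linear-model sketch (again a simplification of the network case, with hypothetical synthetic data) is:

```python
import numpy as np

rng = np.random.default_rng(1)
N, W = 100, 4
Phi = rng.normal(size=(N, W))
w_true = np.array([2.0, 0.0, -1.0, 0.0])      # inputs 1 and 3 are irrelevant
t = Phi @ w_true + 0.1 * rng.normal(size=N)

alphas = np.ones(W)                            # one prior precision per input
beta = 1.0
for _ in range(50):
    A = np.diag(alphas) + beta * Phi.T @ Phi
    Ainv = np.linalg.inv(A)
    m = beta * Ainv @ Phi.T @ t
    gammas = 1.0 - alphas * np.diag(Ainv)      # per-input well-determinedness
    # Re-estimate each precision; clip to keep pruned inputs numerically finite.
    alphas = np.clip(gammas / np.maximum(m ** 2, 1e-12), None, 1e6)
    beta = (N - gammas.sum()) / np.sum((t - Phi @ m) ** 2)

print(np.round(alphas, 2))   # large alpha => input judged irrelevant
```

Ranking the converged precisions (small α = important input) gives the variable-importance estimate that ARD provides for the descriptors.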



Author information

Correspondence to Frank Burden, BSc(Hons), PhD, MRACI.


Copyright information

© 2008 Humana Press, a part of Springer Science + Business Media, LLC

About this protocol

Cite this protocol

Burden, F., Winkler, D. (2008). Bayesian Regularization of Neural Networks. In: Livingstone, D.J. (eds) Artificial Neural Networks. Methods in Molecular Biology™, vol 458. Humana Press. https://doi.org/10.1007/978-1-60327-101-1_3


  • DOI: https://doi.org/10.1007/978-1-60327-101-1_3

  • Publisher Name: Humana Press

  • Print ISBN: 978-1-58829-718-1

  • Online ISBN: 978-1-60327-101-1

