
Entropy and Information in Biology

Chapter in Biophysics (Springer, Cham)

Abstract

With the work of Leo Szilárd and Claude Shannon, we now realize that the information stored in a life system can be measured, and that the loss of that information produces entropy. In fact, there is a universality to open systems having an energy flow through them, making internal ordering a natural process, including evolution of new structures and principles derived from optimization.

“Animals have genes for altruism, and those genes have been selected in the evolution of many creatures because of the advantage they confer for the continuing survival of the species.” — Lewis Thomas


Notes

  1.

    L. Szilárd, Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen, Z. Phys. 53, 840–856 (1929).

  2.

    C.E. Shannon, A mathematical theory of communication, Bell System Technical Journal 27, 379–423 and 623–656 (July and October 1948). A good exposition can be found in L. Brillouin, Science and Information Theory, Academic Press (1962).

  3.

    To prove this statement, use the Lagrange-multiplier method (described in Appendix I.1) to maximize \( -\sum_i p_i \log_2 p_i \) under the constraint \( \sum_i p_i = 1 \), i.e., set \( \partial/\partial p_i \left( -\sum_i p_i \log_2 p_i + \alpha \left( \sum_i p_i - 1 \right) \right) = 0 \). This gives \( -\log_2 p_i - 1/\ln 2 + \alpha = 0 \) for each \( i \), so every \( p_i \) equals the same constant, which by the constraint must be \( 1/N \).
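The result of this note can be checked numerically: a minimal Python sketch (the distributions are illustrative values, not from the text) showing that the uniform distribution attains the maximum entropy \( \log_2 N \).

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits: H = -sum p_i log2 p_i (terms with p_i = 0 contribute 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

N = 4
uniform = [1 / N] * N
skewed = [0.7, 0.1, 0.1, 0.1]

# The uniform distribution attains the maximum, H = log2 N = 2 bits;
# any skewed distribution over the same N outcomes falls below it.
print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(skewed))   # about 1.36 bits
```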

  4.

    L. Szilárd, On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings, Z. Phys. 53, 840 (1929).

  5.

    R. Landauer, Irreversibility and heat generation in the computing process, IBM J. Res. Dev. 5(3), 183 (1961); see also C.H. Bennett, Demons, engines and the Second Law, Sci. Am. 257(5), 108–116 (1987).

  6.

    D.A. Huffman, A method for the construction of minimum-redundancy codes, Proceedings of the I.R.E., September 1952, pp. 1098–1102.
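The construction Huffman describes — repeatedly merging the two least-frequent subtrees — can be sketched compactly with a heap; a minimal illustration (the sample string is hypothetical):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a minimum-redundancy (Huffman) code for the symbols in text."""
    # Heap entries: (frequency, tie-breaker, {symbol: codeword-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two least-frequent subtrees, prefixing 0 and 1.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code("abracadabra")
# Frequent symbols get shorter codewords: 'a' (5 of 11) is shortest.
assert len(code["a"]) <= min(len(w) for w in code.values())
```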

  7.

    B. Strait and T. Dewey, The Shannon information entropy of protein sequences, Biophysical Journal 71, 148–155 (1996).
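The quantity analyzed in this reference can be estimated directly from a sequence's empirical residue frequencies; a minimal sketch (the fragment below is hypothetical, chosen for illustration, not taken from the paper):

```python
import math
from collections import Counter

def sequence_entropy(seq):
    """Shannon entropy (bits per residue) of a symbol sequence,
    estimated from its empirical amino-acid frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical protein fragment, for illustration only.
fragment = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
h = sequence_entropy(fragment)
# A 20-letter amino-acid alphabet caps the entropy at log2(20) = 4.32 bits/residue.
assert 0 < h <= math.log2(20)
```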

  8.

    The first to propose how the DNA base-pair sequence might encode protein structures was George Gamow, a professor at The George Washington University, three months after Watson and Crick published their description of the helical structure of DNA in April 1953.

  9.

    There are indications that the information needed is low because the developing embryo uses a strategy of repeating basic themes and then adding variations.

  10.

    In fact, information and storage systems are a class of ordering processes described in Chap. 15.

  11.

    Noise can arise from many sources, including thermal fluctuations and interference from spurious, uncontrollable external interactions.

  12.

    C.E. Shannon, Communication in the presence of noise, Proc. IRE 37, 10–21 (January 1949).

  13.

    G. Raisbeck, Information Theory, MIT Press (1963).

  14.

    Chaos may develop in non-linear systems. See Chap. 15.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Parke, W.C. (2020). Entropy and Information in Biology. In: Biophysics. Springer, Cham. https://doi.org/10.1007/978-3-030-44146-3_12
