Abstract
With the work of Leo Szilárd and Claude Shannon, we now realize that the information stored in a living system can be measured, and that the loss of that information produces entropy. In fact, open systems with an energy flow through them show a universal tendency toward internal ordering, including the evolution of new structures and of principles derived from optimization.
“Animals have genes for altruism, and those genes have been selected in the evolution of many creatures because of the advantage they confer for the continuing survival of the species.” — Lewis Thomas
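The link between lost information and entropy stated in the abstract is the Szilárd–Landauer bound (notes 4 and 5 below): erasing one bit dissipates at least \(k_B T \ln 2\) of heat. A minimal numerical sketch, with the temperature chosen here only as an illustrative, roughly physiological value:

```python
import math

# Szilárd-Landauer bound: erasing one bit of stored information dissipates
# at least k_B * T * ln(2) of heat, i.e. the environment's entropy rises
# by at least k_B * ln(2) per erased bit.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact 2019 SI value)
T = 310.0           # assumed temperature in kelvin (about body temperature)

delta_S_per_bit = k_B * math.log(2)    # minimum entropy increase, J/K
q_min_per_bit = k_B * T * math.log(2)  # minimum heat dissipated, J

print(f"entropy per erased bit: {delta_S_per_bit:.3e} J/K")
print(f"minimum heat at {T} K:  {q_min_per_bit:.3e} J")
```

At 310 K this comes to roughly 3 × 10⁻²¹ J per bit, tiny on everyday scales but a hard lower limit on the thermodynamic cost of discarding information.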
Notes
- 1.
L. Szilárd, Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen, Z. Phys. 53, 840–856 (1929).
- 2.
C.E. Shannon, A mathematical theory of communication, Bell System Technical Journal 27, 379–423 and 623–656 (July and October, 1948). A good exposition can be found in L. Brillouin, Science and Information Theory, Academic Press (1962).
- 3.
To prove this statement, use the Lagrange-multiplier method (described in Appendix I.1) to maximize \( -\sum_i p_i \log_2 p_i \) under the constraint \( \sum_i p_i = 1 \), i.e., set \( \partial/\partial p_i \left( -\sum_j p_j \log_2 p_j + \alpha \left( \sum_j p_j - 1 \right) \right) = 0 \). This gives \( -\log_2 p_i - 1/\ln 2 + \alpha = 0 \), so every \( p_i \) equals the same constant, which by normalization must be \( 1/N \).
- 4.
L. Szilárd, On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings, Z. Phys. 53, 840 (1929).
- 5.
R. Landauer, Irreversibility and heat generation in the computing process, IBM J. Res. Dev. 5 (3), 183 (1961); see also C.H. Bennett, Demons, engines and the Second Law, Sci. Am. 257 (5), 108–116 (1987).
- 6.
D.A. Huffman, A method for the construction of minimum-redundancy codes, Proceedings of the I.R.E., September 1952, pp. 1098–1102.
- 7.
B.J. Strait and T.G. Dewey, The Shannon information entropy of protein sequences, Biophysical Journal 71, 148–155 (1996).
- 8.
The first to propose how the DNA base-pair sequence might code for protein structures was George Gamow, a professor at The George Washington University, three months after Watson and Crick published their description of the helical structure of DNA in April of 1953.
- 9.
There are indications that the information needed is low because the developing embryo uses a strategy of repeating basic themes and then adding variations.
- 10.
In fact, information and storage systems are a class of ordering processes described in Chap. 15.
- 11.
Noise can arise from many sources, including thermal fluctuations and interference from spurious uncontrollable external interactions.
- 12.
C.E. Shannon, Communication in the presence of noise, Proc. IRE 37, 10–21 (January 1949).
- 13.
G. Raisbeck, Information Theory, MIT Press (1963).
- 14.
Chaos may develop in non-linear systems. See Chap. 15.
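The maximization argument in note 3 — that the uniform distribution \(p_i = 1/N\) maximizes the Shannon entropy — can be checked numerically. A small sketch (the distributions sampled here are illustrative, not from the text):

```python
import math
import random

def shannon_entropy(p):
    """Shannon entropy in bits: -sum_i p_i log2 p_i (terms with p_i = 0 contribute 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

N = 8
uniform = [1.0 / N] * N

# The uniform distribution attains the maximum value log2(N).
assert abs(shannon_entropy(uniform) - math.log2(N)) < 1e-12

# Any other normalized distribution over N outcomes has strictly lower entropy.
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(N)]
    total = sum(weights)
    p = [w / total for w in weights]
    assert shannon_entropy(p) <= math.log2(N) + 1e-12

print(f"H(uniform over {N}) = {shannon_entropy(uniform):.4f} bits")
```

Every randomly drawn distribution lands at or below \(\log_2 N\) bits, consistent with the Lagrange-multiplier result of note 3.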
© 2020 Springer Nature Switzerland AG
Cite this chapter
Parke, W.C. (2020). Entropy and Information in Biology. In: Biophysics. Springer, Cham. https://doi.org/10.1007/978-3-030-44146-3_12
Print ISBN: 978-3-030-44145-6
Online ISBN: 978-3-030-44146-3
eBook Packages: Physics and Astronomy (R0)