Abstract
This paper shows how the maximum (information) entropy principle allows the derivation of the short-time propagator from experimental data, provided the process is Markovian. From the propagator, the Fokker-Planck equation can be derived. The Lagrange parameters used in the maximum information entropy principle can be determined by minimizing the Kullback information.
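The short-time propagator of a Markov process is characterized by its conditional moments, whose leading terms give the drift and diffusion coefficients of the Fokker-Planck equation. As a rough illustration of the data-driven side of this idea (a minimal sketch, not the chapter's maximum-entropy construction; all names and parameter values below are illustrative), one can estimate these moments directly from a simulated Markovian time series:

```python
import numpy as np

# Sketch: recover drift and diffusion of a Markovian process from data
# via short-time conditional moments -- the ingredients of the short-time
# propagator and hence of the Fokker-Planck equation.
# Test case: Ornstein-Uhlenbeck process dx = -gamma*x dt + sqrt(2D) dW.

rng = np.random.default_rng(0)
gamma, D, dt, n = 1.0, 0.5, 1e-3, 1_000_000

# Simulate a long trajectory (Euler-Maruyama).
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(2.0 * D * dt), n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - gamma * x[i] * dt + noise[i]

# Short-time conditional moments, binned over start points x:
#   drift(x)     ~ <dx | x> / dt
#   diffusion(x) ~ <dx^2 | x> / (2 dt)
dx = np.diff(x)
bins = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x[:-1], bins)
centers, drift_est, diff_est = [], [], []
for b in range(1, len(bins)):
    sel = idx == b
    if sel.sum() > 1000:  # require enough samples per bin
        centers.append(0.5 * (bins[b - 1] + bins[b]))
        drift_est.append(dx[sel].mean() / dt)
        diff_est.append((dx[sel] ** 2).mean() / (2.0 * dt))

centers = np.array(centers)
drift_est = np.array(drift_est)
diff_est = np.array(diff_est)
# For this process the estimates should track drift(x) = -gamma*x
# and diffusion(x) = D across the grid.
```

The chapter's point is subtler: the maximum entropy principle fixes the functional form of the propagator from such data, with the Lagrange parameters determined by minimizing the Kullback information; the sketch above only shows where the raw conditional moments come from.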
References
1. Jaynes, E.T.: Phys. Rev. 106, 620 (1957)
2. Jaynes, E.T.: in Complex Systems: Operational Approaches, H. Haken (ed.), Springer, Berlin (1985)
3. Haken, H.: Information and Self-Organization, Sect. 9.2. Springer, Berlin (1988)
4. This section generalizes Sect. 9.3 of [3] from the one-dimensional to the multidimensional case.
5. This section generalizes the one-dimensional case treated in [6] to the multidimensional case.
6. Borland, L., Haken, H.: Z. Physik B: Condensed Matter 88, 95 (1992)
7. Borland, L., Haken, H.: Ann. Physik 1, 452 (1992)
8. Borland, L., Haken, H.: Rev. Math. Phys. 33, 35 (1993)
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this chapter
Haken, H. (2003). Application of the Maximum (Information) Entropy Principle to Stochastic Processes far from Thermal Equilibrium. In: Karmeshu (eds) Entropy Measures, Maximum Entropy Principle and Emerging Applications. Studies in Fuzziness and Soft Computing, vol 119. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36212-8_3
DOI: https://doi.org/10.1007/978-3-540-36212-8_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-05531-7
Online ISBN: 978-3-540-36212-8