Entropy production and extraction in dynamical systems and turbulence

In this paper we consider systems driven out of equilibrium by external factors and discuss the internal entropy production and the entropy extraction by the environment. For a system moving away from equilibrium, we express the entropy extraction via a two-point correlation function, valid at any time and any distance from equilibrium. The long-time limit gives the sum of the Lyapunov exponents, expressed via a formula of Green–Kubo type. We then discuss what is known about the entropy production for deviations away from equilibrium, for relaxation back to equilibrium, and for turbulent states. In particular, we show that the entropy production is due to the degrees of freedom participating in a direct cascade but not in an inverse cascade.


Introduction
Our goal here is to draw the attention of researchers who study turbulence and far-from-equilibrium systems to the classical subject of entropy. We believe that one can get important insights into physics far from equilibrium by studying how the entropy changes. For a closed Hamiltonian system, the Liouville theorem guarantees that the Gibbs entropy stays constant. The entropy change can come from the following sources. First, the external environment may provide for non-Hamiltonian terms in the equations of motion and so violate the Liouville theorem. Such an 'external' entropy change is always negative relative to the equilibrium, which corresponds to the entropy maximum. Furthermore, one can show that if there is a non-equilibrium steady state then this rate of entropy change is non-positive. For these reasons, we say that the environment provides for entropy extraction. This type of entropy extraction is discussed in section 2. In sections 3 and 4 of the paper, we discuss situations where the entropy extraction by the environment takes place at the boundaries. Entropy extraction is accompanied by entropy production within the system, which takes place due to coarse-graining (spatial or temporal) that entails loss of information. Indeed, coarse-graining turns dynamical equations into kinetic equations, which satisfy the respective H-theorems. Note that the proofs of the positivity of production and the negativity of extraction are independent. In a steady state, production and extraction are equal to each other; the time evolution of the two quantities generally proceeds in different ways.
We first discuss how systems move away from equilibrium under the action of external factors and describe entropy extraction. We then describe the entropy production, particularly in turbulence.

Entropy extraction
An appropriate way to study the statistical properties of dynamical systems is to follow the evolution of the density n, which satisfies the Liouville (continuity) equation
∂n/∂t + div(nv) = 0. (2.1)
Here n(t, r) is either a phase-space density or a plain density in physical space, and v(t, r) is, correspondingly, either the velocity field defining the phase-space dynamics or a velocity field in physical space. An important characteristic of the dynamics governed by the Liouville equation is the Gibbs entropy, defined by
S = −∫ n(t, r) ln n(t, r) dr. (2.2)
The time derivative of the entropy contained within a volume V is given by
dS_V/dt = ∫_V ω n dr + ∮_{∂V} n ln n v · df, (2.3)
where ω ≡ div v. The first term represents the entropy change within the volume, while the second corresponds to the entropy flux through the boundary. In this section and the next one we consider the case where the volume V under consideration is invariant with respect to the flow.
We assume that at any t the Lagrangian map q(t, r), defined by q̇ = v(t, q) and q(0, r) = r, is a smooth one-to-one transformation of V. In this case, we have
dS/dt = ∫ ω(t, r) n(t, r) dr = ⟨ω⟩. (2.4)
It is remarkable that the entropy production rate is given by an average (i.e. an integral over space with the density) of a dynamical variable; the entropy itself is not representable in this way. For closed Hamiltonian systems, the equilibrium is given by the microcanonical distribution n_eq = 1/V, where now V is the volume of the energy shell. A uniform density is stationary for velocity fields having ω = 0, which is the case for Hamiltonian systems. Note that a constant n maximizes S over all normalized distributions. We now assume that non-Hamiltonian forces with ω ≠ 0 appear at t = 0 and the system begins to move away from the equilibrium. The density field n(t, r) becomes inhomogeneous and the Gibbs entropy starts decreasing. This decrease can be interpreted as an entropy flux from the system to the environment that provides for the compressibility of the velocity field. Our main result is the formula for the entropy extraction rate at an arbitrary time in terms of the two-point correlation function of ω along the Lagrangian trajectories (see the details of the derivation in [1]):
dS/dt = −∫_0^t dt′ ⟨ω(0, r) ω(t′, q(t′, r))⟩, (2.5)
where the angular brackets denote averaging over the uniform measure (and, where relevant, over the velocity ensemble). The formula holds both for a time-independent v(r) and for a statistically steady v(t, r). For a time-independent velocity it takes the form
dS/dt = −∫_0^t dt′ ⟨ω(r) ω(q(t′, r))⟩, (2.6)
which is an example of the Kawasaki formula [2], giving the result in terms of the transient-time correlation function defined in [3]. Note the remarkable similarity to the Green–Kubo formula, which holds near equilibrium. Although the averaging measure is the same in both cases (i.e. uniform), the dynamics used to calculate q(t, r) is different: here the full velocity field must be used, whereas in the Green–Kubo case one uses the trajectories at ω = 0. Clearly, near equilibrium the two formulae coincide.
For short time intervals, equation (2.6) gives dS/dt ≈ −t⟨ω²(0)⟩ < 0, which is physically quite natural since, starting from the entropy maximum, the entropy has nowhere to go but down. When the velocity field is steady (or statistically steady), the density field may have a limit at t → ∞ (see [4, 5] and references therein). Examples of such non-equilibrium limiting states are Sinai–Ruelle–Bowen measures in hyperbolic dynamical systems [6, 7]. Such a state is characterized by the limiting entropy extraction rate lim_{t→∞} Ṡ = Σ_i λ_i. Here Σ_i λ_i is the sum of the Lyapunov exponents, if they exist. This sum is generally non-positive for the simple reason that contracting regions acquire a higher measure than expanding ones [8, 9]. Its non-positivity is equivalent to the statement that the Gibbs entropy is maximal for a constant distribution [1]. Therefore, our formula (2.6) at t → ∞ gives the sum of the Lyapunov exponents for a general dynamical system and for a random flow with stationary statistics:
Σ_i λ_i = −∫_0^∞ dt ⟨ω(r) ω(q(t, r))⟩. (2.7)
It looks exactly like the Green–Kubo formula. Note that Σ_i λ_i is the unique combination of the Lyapunov exponents that can be represented as the time average of a function on the phase space. For the particular case of particles in an electromagnetic field, (2.7) was derived in [10].
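As a minimal numerical sketch of these statements, one can take a hypothetical one-dimensional flow v(x) = sin x on the circle (chosen only because its single attracting stagnation point makes the answer checkable by hand), evolve a uniform ensemble, and measure the entropy extraction rate, which for a uniform initial density equals the average of ω along the Lagrangian trajectories:

```python
import numpy as np

# Sketch: 1D compressible flow v(x) = sin(x) on [0, 2*pi); omega = div v = cos(x).
# Starting from the uniform (equilibrium) density, dS/dt equals the average of
# omega along the Lagrangian trajectories q(t, x).
def v(x):
    return np.sin(x)

def omega(x):
    return np.cos(x)

x = np.linspace(0.0, 2 * np.pi, 20000, endpoint=False)  # uniform ensemble
q = x.copy()                                            # Lagrangian map q(t, x)
dt = 1e-3
t = 0.0

def advance(q, t, t_target):
    while t < t_target - 1e-12:
        q = q + dt * v(q + 0.5 * dt * v(q))  # midpoint (RK2) step of dq/dt = v(q)
        t += dt
    return q, t

q, t = advance(q, t, 0.1)
rate_short = omega(q).mean()   # short times: dS/dt ~ -t<omega^2> = -t/2
q, t = advance(q, t, 10.0)
rate_long = omega(q).mean()    # long times: dS/dt -> sum of Lyapunov exponents

print(rate_short, rate_long)
```

Almost all trajectories converge to the stagnation point x = π, where v′ = −1, so the long-time rate approaches the (single) Lyapunov exponent −1, while the short-time slope reproduces −⟨ω²⟩ = −1/2 per unit time.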
Although the total (mean) entropy decreases under the action of ω, one encounters fluctuations: the local entropy density s(t, r) = −ln n(t, q(t, r)) may grow. Our Lagrangian approach allows us to generalize (to time-dependent flows) the so-called Evans–Searles formula for the probabilities of opposite-sign entropy changes [1, 11]. The measure of Δs(t, r) ≡ s(t, r) − s(0, r) obeys
∫ dr δ(Δs(t, r) − A) = e^{−A} ∫ dr δ(Δs(−t, r) + A). (2.8)
In the case of time-reversible dynamics, ∫ dr δ(Δs(−t, r) + A) = ∫ dr δ(Δs(t, r) + A) and
∫ dr δ(Δs(t, r) − A) = e^{−A} ∫ dr δ(Δs(t, r) + A), (2.9)
which says that, starting from equilibrium, the probability to have the entropy change −A is by a factor e^A larger than the probability to have the entropy change A. The relation between (2.9) and the Gallavotti–Cohen formula [12] is discussed in [13].
If the non-equilibrium limiting density µ = lim_{t→∞} n is smooth then its Gibbs entropy S = −∫ µ ln µ dr is finite, and necessarily lim_{t→∞} Ṡ = Σ_i λ_i = 0. It is often the case that µ is singular; the simplest example is a steady v(r) in one dimension, where n accumulates at the stagnation points.
Generally, for such states, Ṡ = Σ_i λ_i < 0, which does not contradict stationarity since S = −∞ where the measure is singular. A constant flux of entropy from the system to the environment is an important feature of these states. We see that the entropy extraction has the same sign both at the beginning and at the end of the evolution; how it behaves in between may depend on the system. One can imagine all possible ways of driving systems away from equilibrium, among them deviations for which the entropy extraction changes non-monotonically (again, a simple example is a flow having a one-dimensional component close to periodic and without stagnation points). We believe that the formula (2.5) presents an important tool for the classification of compressible flows and non-equilibrium states, in particular for establishing whether they satisfy any extremum of the entropy extraction. It follows from (2.5) that when the two-point correlation function of the velocity divergence does not change sign, the rate of entropy change Ṡ monotonically decreases from 0 in equilibrium to its (negative) minimum, realized in the non-equilibrium limiting state with the singular density µ.

Gaussian thermostats
Extraction of entropy from the system (described by ω) is generally associated with the dissipation of energy by a thermostat. External agents sustain a non-equilibrium steady state by providing both sources and sinks of energy. Here one may either consider a truly open system with a non-compact phase space or a system constrained to some compact region. This is very much like the canonical and microcanonical approaches at equilibrium, and it is unclear which results are sensitive to the choice of approach far from equilibrium. Assuming that the main feature of the interaction with the thermostat is the preservation of the total energy in the presence of external forces, it is natural to consider the minimal dynamics having this conservation property. Such a dynamics is provided by Gauss's principle of least constraint and is associated with the names of Nosé, Hoover, Evans, Morriss and others; see [3, 5] and the references therein. Here we consider it for the class of systems whose Hamiltonian is the sum of potential and kinetic energies. From the formal viewpoint, the constraint of constant total energy confines the motion to a compact manifold, thus making a steady state possible (we assume that the co-ordinates are confined). Other constraints can be considered as well, see [4]. In addition to the isoenergetic ensemble, constrained to have constant total energy, we shall consider the isokinetic ensemble, which conserves only the kinetic part of the total energy. In both cases the equations read
q̇_i = p_i/m, ṗ_i = F_i + F_e − α p_i, (3.1)
where F_i = −∂U/∂q_i is the potential force, F_e represents the external force and α the thermostat. We have chosen a simple form of the external force. For the conserved quantities under consideration, the role of the Gaussian thermostat is to provide an effective friction coefficient which is in tune with the external force:
α_en = Σ_i F_e · p_i / Σ_i p_i², α_kin = Σ_i (F_i + F_e) · p_i / Σ_i p_i², (3.2)
where α_en and α_kin represent the isoenergetic and isokinetic ensembles respectively.
For the divergence of the velocity field in phase space and for the heat transferred by the thermostat we have
ω = −(dN − 1)α, dq/dt = −α Σ_i p_i²/m, (3.3)
where d is the space dimension, N the number of particles and q the amount of heat transferred to the system by the thermostat. It follows from (3.3) and (2.4) that −Ṡ is determined by the friction coefficient averaged over the trajectories. At t = 0, the system is still at equilibrium, so that Ṡ = 0. At later times one expects that there are more trajectories for which the external force increases the energy, so that the average friction coefficient (and hence −Ṡ) is positive. Note, however, that there is generally no direct proportionality between the energy absorbed by the thermostat and the entropy extraction rate, since ⟨α Σ_i p_i²⟩ ≠ ⟨α⟩⟨Σ_i p_i²⟩. We note that, to avoid problems with the definition of S due to the δ-functions appearing in the density n, one can work either with a reduced phase space or with the energy shell. A remarkable property of the isokinetic ensemble is that the rates of entropy extraction and energy dissipation are proportional [10]:
Ṡ = (dN − 1)(Σ_i p_i²/m)^{−1} dq/dt. (3.4)
Since dN − 1 is the number of degrees of freedom associated with the momenta in the isokinetic ensemble, equation (3.4) looks like an equilibrium relation between the entropy and the heat, provided one uses ⟨p_i²/m⟩ = Td (which holds on average in the Gibbs ensemble of classical statistical mechanics). For further properties of the isokinetic ensemble see, e.g., [4]. In contrast, within the isoenergetic ensemble energy dissipation and entropy extraction are not proportional in general, and one can only ask about their proportionality (properly understood) in the steady state.
Let us now apply the result of the previous section. We have
Ṡ(t) = −(dN − 1)² ∫_0^t dt′ ⟨α(x) α(q(t′, x))⟩, (3.5)
where q(t, x) is the time-t dynamical map of the phase space. This map can be found explicitly only in the simplest cases. Yet we expect that progress is possible in the analysis of such questions as the sign of the pair-correlation function, its behaviour at large times, and whether Σ_i λ_i = 0 or Σ_i λ_i < 0. We describe schematically how one studies non-equilibrium statistical mechanics within the framework of the isoenergetic ensemble. One starts from a constant distribution on the energy shell in phase space. Under the action of external forces, evolution generally brings the thermostatted system to a non-equilibrium steady state in the limit of large time. The evolution is characterized by the pair-correlation function of the 'friction coefficient', which gives the entropy extraction on the way to the new state. Finally, note that the above scheme is a non-equilibrium analogue of the microcanonical description in equilibrium statistical mechanics, since it produces a steady state which is a function of the energy and the number of particles. Transformations from the microcanonical to canonical and grand canonical ensembles are a subject of future work. Let us also recall that in this section we do not consider the entropy extraction due to boundary conditions, described by the second term in (2.3).
We conclude the section on entropy extraction in thermostatted dynamical systems with a simple example where one can complete the calculation. Consider a thermostatted ideal gas in a constant electric field E:
ṗ_i = eE − α p_i, α = eE · Σ_i p_i / Σ_i p_i². (3.6)
Note that the isokinetic and isoenergetic ensembles coincide here. We observe that α satisfies a closed dynamical equation, α̇ = α*² − α², thus making the problem effectively one-dimensional. The qualitative picture of the dynamics is very simple in this case: at large times the friction coefficient saturates at the constant value α* = (Ne²E²/U)^{1/2}, where U = Σ_i p_i² is conserved, except for the initial conditions satisfying α(t = 0) = −α*, which can be neglected as having zero measure. The momentum of each particle saturates at the same value eE/α*, the magnitude of which is necessarily (U/N)^{1/2}. Let us now describe the dynamics quantitatively and calculate the entropy extraction. We denote cos θ(t) ≡ α(t)/α* and write
cos θ(t) = (tanh(α*t) + cos θ₀)/(1 + tanh(α*t) cos θ₀), θ₀ ≡ θ(0). (3.7)
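A direct numerical integration offers a quick consistency check; in the sketch below (units e = m = 1; the values of N, d, the field and the seed are arbitrary choices) the measured α(t)/α* is compared with the composition formula for cos θ(t) above, and the conservation of U = Σ_i p_i² is monitored:

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 50, 2
E = np.array([1.0, 0.0])          # constant field eE (units e = m = 1; values arbitrary)
p = rng.normal(size=(N, d))       # initial momenta
U = np.sum(p**2)                  # conserved quantity: U = sum_i p_i^2
alpha_star = np.sqrt(N * (E @ E) / U)

def alpha(p):                     # Gaussian-thermostat friction coefficient
    return (p.sum(axis=0) @ E) / np.sum(p**2)

cos0 = alpha(p) / alpha_star
dt, T = 1e-4, 5.0
for _ in range(round(T / dt)):
    # midpoint (RK2) step of dp_i/dt = E - alpha p_i
    pm = p + 0.5 * dt * (E - alpha(p) * p)
    p = p + dt * (E - alpha(pm) * pm)

th = np.tanh(alpha_star * T)
cos_pred = (th + cos0) / (1 + th * cos0)   # composition formula for cos(theta(t))
cos_num = alpha(p) / alpha_star
print(cos_num, cos_pred, np.sum(p**2) / U)
```

The friction coefficient indeed relaxes to α* while U stays constant, as the closed equation α̇ = α*² − α² requires.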
To find ⟨α(0)α(t)⟩, we need the probability distribution of θ₀. It is given by the microcanonical distribution, which can be found by considering P as a vector in dN dimensions with components P_{ij} = (p_i)_j. Introducing a vector A with components A_{ij} = E_j, one has cos θ = P · A/(|P||A|), so that θ is the spherical angle φ₁ in dN dimensions, with the 'z' axis determined by the direction of A. Fixing the proportionality constant by normalization, we find
ρ(θ₀) = sin^{dN−2} θ₀ / B(µ, 1/2), (3.8)
where µ = (dN − 1)/2 and B(x, y) is the beta function. Note that in terms of the variable x defined by cos θ = tanh x the dynamics is a mere shift by α*t; the calculation of the averaging measure is, however, more intuitive in terms of θ. It is clear from (3.8) that ⟨ω(0)ω(t)⟩ monotonically decreases to zero as t grows. In the simplest case of a single particle in two dimensions the average can be evaluated explicitly; using [19], one can calculate ⟨ω(0)ω(t)⟩ for general N as a finite sum of exponentially decaying terms (3.10). We observe that the structure of the correlation function is rather complicated even in the simplest case of the ideal gas. In the limit of large N, the main correlation at short times is given by the terms with the largest l, but they decrease fast with time: ⟨ω(0)ω(t)⟩ ∝ exp(−α*dNt). At large times, the term with l = 1 gives the main contribution and ⟨ω(0)ω(t)⟩ ∝ exp(−2α*t). As a subject of future research, it would be interesting to study the explicit solution (3.10) in the thermodynamic limit N → ∞, when the series becomes infinite. It follows from (3.8) and (2.6) that the entropy extraction rate changes monotonically, reaching its largest magnitude in the steady state:
lim_{t→∞} Ṡ = −∫_0^∞ ⟨ω(0)ω(t)⟩ dt = −(dN − 1)α*, (3.11)
where the integral can be calculated directly using (3.8). This result is of course obvious from the qualitative picture of the dynamics: the steady-state entropy extraction rate is minus the number of degrees of freedom, dN − 1, times the average friction coefficient, which at large times equals α* for almost every initial condition. The steady-state measure is singular because the momenta take definite values in the steady state.

Entropy production
Let us now consider entropy production due to coarse-graining. Entropy defined with the help of a non-singular coarse-grained density is finite and has zero time derivative in the non-equilibrium steady state. The entropy change due to the environment (2.3) must then be balanced by the entropy production. In particular, when either the bulk term has a non-zero limit (equal to Σ_i λ_i) or the boundary term does not vanish, the entropy production must stay finite as the scale of coarse-graining goes to zero. When the entropy extraction comes from the bulk, a general chaotic system has Σ_i λ_i ≠ 0 and the steady-state entropy production is determined by the dynamical properties. This should be contrasted with relaxation to equilibrium, where the entropy production rate generally depends on the scale of coarse-graining.
We now put ω ≡ 0, i.e. we consider either relaxation to equilibrium or evolution to a non-equilibrium state caused by the boundary term (as is customary in turbulence and other hydrodynamic problems). The entropy production in the bulk is non-negative, vanishing only on the equilibrium solution. Does the entropy production evolve monotonically when the system goes to the steady state (equilibrium or non-equilibrium)? Here we consider a density n that corresponds to a reduced description of the system and satisfies a kinetic equation,
∂n/∂t = I{n}. (4.1)
Kinetic equations generally satisfy an H-theorem, which states that dS/dt = −∫ I ln n dr ⩾ 0. If one were able to show that d²S/dt² = −∫ (İ ln n + I²/n) dr ⩽ 0, then relaxation to both equilibrium and non-equilibrium steady states would be monotonic. It would also mean that the entropy production is minimal (under given boundary conditions) in the non-equilibrium steady state.
Such an extremum principle can be established generally only close to thermodynamic equilibrium, when n = n₀ + δn and S − S₀ = −∫ (δn)² dr/(2n₀). The linearized kinetic equation, δṅ = L̂δn, gives dS/dt = −∫ δn L̂δn dr/n₀, which is non-negative by virtue of the negativity of the operator L̂. The second time derivative of the entropy, d²S/dt² = −∫ [δn L̂²δn + (L̂δn)²] dr/n₀, is non-positive when L̂ is symmetric, which is generally the case for kinetic equations. This consideration is, of course, valid for a general set of macroscopic co-ordinates a_i and forces X_i = ∂S/∂a_i = −G_{ij} a_j which satisfy ȧ_i = L_{ij} X_j near equilibrium. The entropy extremum and the Onsager reciprocity relations guarantee that the matrices L and G are symmetric and positive definite, so that dⁿS/dtⁿ = X L(−2GL)^{n−1} X has the sign of (−1)^{n+1}. We see that, near equilibrium, the entropy production rate is always positive and monotonically decreases to zero as one approaches equilibrium. Note that the symmetry of the linearized operator is sufficient but not necessary for d²S/dt² ⩽ 0.
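The alternating signs of the time derivatives of S near equilibrium are easy to verify directly; the sketch below (dimension, seed and matrix values are arbitrary choices) evaluates X L(−2GL)^{n−1} X for the first few n with random symmetric positive-definite L and G:

```python
import numpy as np

# d^n S/dt^n = X L (-2 G L)^(n-1) X must have the sign of (-1)^(n+1)
# for symmetric positive-definite matrices L and G.
rng = np.random.default_rng(1)
m = 6
A = rng.normal(size=(m, m))
G = A @ A.T + m * np.eye(m)       # symmetric positive definite
B = rng.normal(size=(m, m))
L = B @ B.T + m * np.eye(m)       # symmetric positive definite
a = rng.normal(size=m)            # macroscopic co-ordinates
X = -G @ a                        # thermodynamic forces X = -G a

signs = []
M = L.copy()                      # M accumulates L (-2 G L)^(n-1)
for n in range(1, 6):
    signs.append(int(np.sign(X @ M @ X)))
    M = M @ (-2.0 * G @ L)
print(signs)
```

Each quadratic form X L(GL)^{n−1} X is a positive-definite sandwich, so the prefactor (−2)^{n−1} produces the alternation [1, −1, 1, −1, 1] for any choice of a ≠ 0.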
Attempts to consider far-from-equilibrium steady states must be made separately for different kinetic equations. In the simplest case of scale separation, integrating over the small scales produces diffusion, I = div(κ∇n). The bulk entropy production by diffusion is positive:
dS/dt = ∫ κ(∇n)²/n dr ⩾ 0. (4.2)
However, the second time derivative of the entropy is generally not negative. For example, for κ = const = 1,
d²S/dt² = −∫ [Δ²n ln n + (Δn)²/n] dr, (4.3)
which has no definite sign. This is not surprising, since it is the rate of change of E = ∫ n² dr/2 that has an extremum. Indeed, one can find (again for bulk dissipation)
dE/dt = −∫ κ(∇n)² dr, d²E/dt² = 2∫ (div κ∇n)² dr ⩾ 0. (4.4)
The monotonic decrease of −Ė upon relaxation leads to the extremum principle that the steady state, div(κ∇n) = 0, realizes the minimum of −dE/dt with respect to variations of n that vanish at the boundaries (this is the Kirchhoff formulation of Ohm's law, with n the electrostatic potential and κ the conductivity). Again we see that only near equilibrium, when S ≈ −E/n₀, is the entropy production guaranteed to have an extremum in the steady state.
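A quick numerical sketch of the two monotonicity statements (κ = 1 on a periodic unit interval; the initial density below is an arbitrary smooth positive profile) confirms that S grows monotonically during relaxation while −dE/dt = ∫ (∇n)² dr decays monotonically:

```python
import numpy as np

# Relaxation of n by diffusion (kappa = 1) on a periodic grid:
# S = -int n ln n dr must grow, while -dE/dt = int (grad n)^2 dr must decay.
M = 256
dx = 1.0 / M
x = (np.arange(M) + 0.5) * dx
n = 1.0 + 0.8 * np.sin(2 * np.pi * x) + 0.3 * np.sin(6 * np.pi * x)  # positive density

def lap(f):                        # periodic Laplacian
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

dt = 0.2 * dx**2                   # stable explicit time step
S, minus_dEdt = [], []
for _ in range(400):
    S.append(-np.sum(n * np.log(n)) * dx)
    grad = (np.roll(n, -1) - np.roll(n, 1)) / (2 * dx)
    minus_dEdt.append(np.sum(grad**2) * dx)    # -dE/dt for kappa = 1
    n = n + dt * lap(n)

S = np.array(S)
minus_dEdt = np.array(minus_dEdt)
print(S[0], S[-1], minus_dEdt[0], minus_dEdt[-1])
```

Both sequences are strictly monotone at every step, in line with (4.2) and (4.4); the sign of d²S/dt², by contrast, is not constrained.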
For developed turbulence, external action provides for excitation and dissipation regions separated by the inertial interval in wavenumber space. Entropy is added in the excitation region and extracted in the dissipation region. Entropy production can be readily considered for wave turbulence, where the reduced description is in terms of the spectral density of waves or quasi-particles with occupation numbers N_k. In the inertial interval, N_k satisfies the kinetic equation
Ṅ_k = I(k, {N}). (4.5)
Here I(k, {N}) is the collision integral, describing either three-wave or four-wave processes; its form can be found, e.g., in [15]. The entropy of the quasi-particles (per unit volume) is S = ∫ ln(N_k) dk, and the collision integral satisfies the H-theorem dS/dt = ∫ (I(k)/N_k) dk ⩾ 0 [15]. The second time derivative of the entropy is given in this case by d²S/dt² = −∫ [I²/N_k² − İ/N_k] dk. At the moment, we cannot make a general statement about the sign of this quantity. The collision integral linearized near equilibrium corresponds to a symmetric linear operator, so that the entropy production changes monotonically; at the near-equilibrium steady state, the entropy production is minimal under the given boundary conditions. Far-from-equilibrium states of wave turbulence carry fluxes of energy or wave action and are called Kolmogorov–Zakharov spectra. The collision integral linearized near such spectra is non-symmetric, so that the approach to them can proceed in a non-monotonic way [15]. If, however, the Kolmogorov–Zakharov spectra are established by self-similar propagating fronts, either N_k(t) = t^{−q} f(kt^{−p}) (for spectra with infinite capacity) or N_k(t) = (t₀ − t)^q f(k(t₀ − t)^p) (for finite capacity) [15, 16], then one can show that dS/dt is either A/t or A/(t₀ − t) with some positive A, i.e. the entropy production by the self-similar part of the solution either decays monotonically or grows monotonically with time.
For spectra with infinite capacity, one thus cannot at the moment say anything definite about whether the entropy production is at its maximum. Although an explosive formation of Kolmogorov–Zakharov spectra with finite capacity was observed in some cases, there are also cases (described by peculiar kinetic equations) where the formation of the spectrum proceeds in two stages, and one cannot yet say anything definite about the evolution of the entropy production there either [17].
If the turbulent steady state is established by a front propagating from the excitation region to the dissipation region, then one may expect the entropy production to grow monotonically, reaching its maximum at the steady state. Note that we speak of a maximum of entropy production under a prescribed conservation law, in the spirit of [14]. A rigorous description of the formation of turbulence spectra and of the evolution of the entropy production remains a task for the future.

Entropy production within turbulent cascades
Let us discuss how the entropy production in a turbulent steady state is distributed over scales. Consider first the direct energy cascade, where the rate of energy input per unit volume, P, is determined by a large-scale pumping. As already mentioned, the rate of entropy production must be determined by the dissipation region, i.e. by large wavenumbers. Consider, for instance, the kinetic equation (4.5). The Kolmogorov–Zakharov turbulence spectrum N_k corresponds to a constant energy flux in k-space: ω_k I_k k^d ≃ P = const. On such a spectrum, the integral for the entropy production is as follows:
dS/dt = ∫ (I_k/N_k) k^{d−1} dk ≃ ∫ P dk/(ω_k N_k k). (5.1)
The main contribution always comes from large k (the dissipative end of the inertial interval), because N_k decays faster with k than ω_k^{−1}. The reason is simple: the turbulence spectrum must be steeper than equipartition for the energy flux to flow towards large k [15]. In other words, the ratio between the energy and entropy changes at a given k is proportional to the energy density ω_k N_k, which is a decreasing function of k for a direct energy cascade. Let us reiterate that we are speaking here of the positive entropy production by wave interactions which, in a steady state, is compensated by the external factors that remove entropy from the wave system.
On the other hand, for inverse cascades the situation is different. Consider, for instance, an inverse cascade carrying the flux Q of wave action (number of particles): I_k k^d ≃ Q = const. In this case,
dS/dt ≃ ∫ Q dk/(N_k k). (5.2)
The maximum is again achieved, generally, at the largest wavenumber of the inertial interval, but here that corresponds to the pumping scale. Recall that it is necessary to dissipate both integrals of motion, so that direct and inverse cascades always co-exist. Matching at the pumping scale and using the result for the direct cascade, we conclude that the entropy production is due to the direct cascade and takes place at large k. The vanishing entropy production in the inverse cascade may be related to the equilibrium-like properties of the inverse energy cascade in two-dimensional hydrodynamics described in [18].
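The distribution of the entropy production over scales can be illustrated with pure power-law scalings; the exponents below are hypothetical placeholders (ω_k ∝ k and N_k ∝ k^{−s} with s = 1.5, steeper than equipartition N_k ∝ ω_k^{−1}), for which the direct-cascade integrand scales as k^{s−2} and the inverse-cascade one as k^{s−1}, so that both integrals are dominated by the large-k end of the inertial interval:

```python
import numpy as np

# Hypothetical power-law scalings: omega_k = k and N_k = k^(-s) with s = 1.5 > 1.
# Direct-cascade integrand ~ P/(k omega_k N_k) = k^(s-2);
# inverse-cascade integrand ~ Q/(k N_k) = k^(s-1).
s = 1.5
k = np.logspace(0.0, 3.0, 3001)              # inertial interval 1 <= k <= 1000

def trapezoid(y, x):                         # simple trapezoidal quadrature
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def top_decade_fraction(integrand):
    mask = k >= 100.0                        # last decade of the interval
    return trapezoid(integrand[mask], k[mask]) / trapezoid(integrand, k)

frac_direct = top_decade_fraction(k ** (s - 2.0))
frac_inverse = top_decade_fraction(k ** (s - 1.0))
print(frac_direct, frac_inverse)
```

For s > 1 both fractions are dominated by the last decade of k, even though the direct-cascade integrand itself decays; the largest wavenumber is the dissipative end for the direct cascade but the pumping scale for the inverse one.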
To avoid misunderstanding, let us stress that (5.1) and (5.2) refer to the entropy of the subsystem of waves or quasi-particles in the inertial intervals. If one wishes to consider the total entropy of the medium, then (assuming that every piece of it is close to thermal equilibrium despite the turbulence) the rate of change is equal to the dissipated heat (i.e. the energy flux) divided by the temperature. If the occupation numbers at the dissipative scale far exceed the thermal level, N_k ≫ N_k^{(0)} = T/ω_k, then P/ω_k N_k ≪ P/T. Non-equilibrium steady solutions describing cascades are known also for the Boltzmann kinetic equation for particles (they describe power-law spectra in cosmic rays and in emission currents from metals irradiated by laser pulses) [15]. In this case, S = −∫ N_p ln(N_p/e) dp and the entropy production by the energy cascade, dS/dt = −∫ dp ln(N_p) P/(ε_p p^d), where ε_p is the particle energy, is determined by the lowest momenta, i.e. by the same particles that participate in the energy pumping. Note that in this case dS/dt ∝ P = dE/dt (up to a logarithmic factor) even far from equilibrium.