Information geometry, trade-off relations, and generalized Glansdorff-Prigogine criterion for stability

We discuss a relationship between information geometry and the Glansdorff-Prigogine criterion for stability. For the linear master equation, we find a relation between the line element and the excess entropy production rate. This relation leads to a new perspective on stability in a nonequilibrium steady-state. We also generalize the Glansdorff-Prigogine criterion for stability based on information geometry. Our information-geometric criterion for stability works well for the nonlinear master equation, where the Glansdorff-Prigogine criterion for stability does not work well. We derive a trade-off relation among the fluctuation of an observable, the mean change of the observable, and the intrinsic speed. We also derive a novel thermodynamic trade-off relation between the excess entropy production rate and the intrinsic speed. These trade-off relations provide a physical interpretation of our information-geometric criterion for stability. We illustrate our information-geometric criterion for stability by an autocatalytic reaction model, where the dynamics are driven by a nonlinear master equation.


Introduction
The behavior of a system around a nonequilibrium steady-state has been well discussed in nonequilibrium thermodynamics. Around 1970, Glansdorff and Prigogine proposed a criterion for stability based on a generalization of the entropy production rate for the steady-state, namely the excess entropy production rate [1][2][3]. This criterion is called the Glansdorff-Prigogine criterion for stability. They proposed the positivity of the excess entropy production rate as a simple sufficient condition for the stability of the steady-state. The range of validity of this criterion has sometimes been criticized [4][5][6]. Nowadays, this criterion is regarded as the Lyapunov criterion for a particular class of systems [7,8]. For example, in the linear master equation, a relationship between the Lyapunov criterion and the Glansdorff-Prigogine criterion for stability has been discussed based on linear irreversible thermodynamics [9,10]. Recently, nonequilibrium thermodynamics for the linear master equation has been intensively studied in terms of stochastic thermodynamics [11,12]. The idea of the excess entropy production rate has been widely used in stochastic thermodynamics [13][14][15][16], and the Glansdorff-Prigogine criterion for stability has recently been revisited [17]. While the Glansdorff-Prigogine criterion for stability is reasonable for the linear master equation, it does not work well for the nonlinear master equation [5,8].
In recent years, thermodynamic trade-off relations have been well discussed in terms of thermodynamic uncertainty relations [52][53][54][55]. The thermodynamic uncertainty relations are based on the mathematical properties of the Fisher information matrix [46,56,57]. This Fisher information matrix is a Riemannian metric in information geometry [32], and several thermodynamic trade-off relations have been derived from the mathematical properties of information geometry [45][46][47][49][50][51]. These thermodynamic trade-off relations mainly focus on a trade-off between the fluctuation of an observable and the entropy production rate. The concept of thermodynamic trade-off relations is close in spirit to the Glansdorff-Prigogine criterion for stability because the excess entropy production rate determines the stability of the steady-state.
This article clarifies a relationship between the Glansdorff-Prigogine criterion for stability and information geometry and discusses the stability of the steady-state based on thermodynamic trade-off relations. We show that the excess entropy production rate is given by the time derivative of the Lyapunov candidate function for the linear master equation, and this Lyapunov candidate function is related to the square of the line element in information geometry. This relation gives a geometric interpretation of the Onsager matrix under the near-equilibrium condition. We discuss the stability of the steady-state in terms of the monotonicity of the Kullback-Leibler divergence.
We also generalize the Glansdorff-Prigogine criterion for stability based on information geometry. Our information-geometric criterion for stability focuses on the intrinsic speed on the manifold of the probability simplex. If the intrinsic speed is decreasing in time around the steady-state, the steady-state is stable. Our generalized criterion for stability works well even for the nonlinear master equation, where the Glansdorff-Prigogine criterion for stability does not work well. We also derive trade-off relations among the fluctuation of an observable, the mean change of the observable, and the intrinsic speed, as well as a novel thermodynamic trade-off relation between the excess entropy production rate and the intrinsic speed. We discuss a physical interpretation of our information-geometric criterion for stability based on these trade-off relations. The speed of an observable decreases around the steady-state if the steady-state is stable. We illustrate the merit of our information-geometric criterion for stability by a model of an autocatalytic reaction driven by a nonlinear master equation.

The entropy production rate and the Lyapunov candidate function
We first explain a relation between the entropy production rate and the Lyapunov candidate function for the relaxation to equilibrium [10]. We start with the master equation

$$\frac{dp_i}{dt} = \sum_j \left[ W_{j \to i} p_j - W_{i \to j} p_i \right], \tag{1}$$

where $W_{i \to j}$ is the transition rate from the state $i$ to the state $j$, and $p_i$ is the probability of the state $i$. We introduce the vector notation of the probability $\boldsymbol{p} = \{p_i \mid i = 1, \dots\}$. In this relaxation process, we assume that the transition rate does not depend on time and the detailed balance condition

$$W_{j \to i} p^{\mathrm{eq}}_j = W_{i \to j} p^{\mathrm{eq}}_i \tag{2}$$

is satisfied for any pair $(i, j)$. The detailed balance condition implies the existence of an equilibrium state $\boldsymbol{p}^{\mathrm{eq}} = \{p^{\mathrm{eq}}_i \mid i = 1, \dots\}$. The equilibrium state is the steady-state, that is,

$$\frac{dp^{\mathrm{eq}}_i}{dt} = 0. \tag{3}$$
It implies that the equilibrium state is a fixed point of the master equation. We now introduce the entropy production rate $\sigma(\boldsymbol{p})$ for the state $\boldsymbol{p}$. We here assume that the transition rate is non-zero, $W_{j \to i} \neq 0$, and the Boltzmann constant is set to unity, $k_{\mathrm{B}} = 1$, to simplify the discussion. The entropy production rate $\sigma(\boldsymbol{p})$ is defined as the product of the force $F_{j \to i}(\boldsymbol{p})$ and the flux $J_{j \to i}(\boldsymbol{p})$,

$$\sigma(\boldsymbol{p}) = \frac{1}{2} \sum_{i,j} J_{j \to i}(\boldsymbol{p}) F_{j \to i}(\boldsymbol{p}),$$

where the flux and the force are defined as

$$J_{j \to i}(\boldsymbol{p}) = W_{j \to i} p_j - W_{i \to j} p_i, \qquad F_{j \to i}(\boldsymbol{p}) = \ln \frac{W_{j \to i} p_j}{W_{i \to j} p_i},$$

respectively. Its nonnegativity $\sigma(\boldsymbol{p}) \geq 0$ is well known as the second law of thermodynamics. This nonnegativity can be derived from the fact that the flux $J_{j \to i}(\boldsymbol{p})$ and the force $F_{j \to i}(\boldsymbol{p})$ have the same sign. In equilibrium, the entropy production rate is zero, $\sigma(\boldsymbol{p}^{\mathrm{eq}}) = 0$, because of the detailed balance condition equation (2).
The entropy production rate can be the time derivative of a Lyapunov candidate function if we assume the relaxation to equilibrium. In this relaxation process, the entropy production rate can be rewritten as

$$\sigma(\boldsymbol{p}) = -\frac{d}{dt} D_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}}),$$

where we used the detailed balance condition equation (2), the master equation (1), the equilibrium property equation (3), and the normalization of the probability $d(\sum_i p_i)/dt = 0$. The quantity

$$D_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}}) = \sum_i p_i \ln \frac{p_i}{p^{\mathrm{eq}}_i}$$

is the Kullback-Leibler divergence between the two probabilities $\boldsymbol{p}$ and $\boldsymbol{p}^{\mathrm{eq}}$. This quantity is nonnegative, and zero if and only if $\boldsymbol{p} = \boldsymbol{p}^{\mathrm{eq}}$. Because the equilibrium state $\boldsymbol{p}^{\mathrm{eq}}$ is a fixed point of the master equation, the Kullback-Leibler divergence $D_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}})$ can be a Lyapunov candidate function. In terms of the Lyapunov criterion, the second law of thermodynamics provides the condition of stability,

$$\frac{d}{dt} D_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}}) \leq 0.$$
It implies that the equilibrium state is always stable.
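To make this relation concrete, the following minimal Python sketch (not part of the original article; the energy levels, rates, and initial state are illustrative choices) integrates a three-state linear master equation satisfying detailed balance and checks numerically that $\sigma(\boldsymbol{p}) = -dD_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}})/dt \geq 0$ along the relaxation.

```python
# Minimal sketch: 3-state linear master equation with detailed balance.
# We verify sigma(p) = -dD_KL(p||p_eq)/dt >= 0 along the relaxation.
import numpy as np

E = np.array([0.0, 1.0, 2.0])                  # illustrative energy levels
p_eq = np.exp(-E) / np.exp(-E).sum()            # equilibrium (Boltzmann) state

# W[i, j] is the rate j -> i; this choice obeys W[i,j] p_eq[j] = W[j,i] p_eq[i]
base = np.ones((3, 3)) - np.eye(3)
W = base * np.sqrt(np.outer(p_eq, 1.0 / p_eq))
mask = ~np.eye(3, dtype=bool)

def drift(p):
    """Master equation: dp_i/dt = sum_j [W_{j->i} p_j - W_{i->j} p_i]."""
    return W @ p - W.sum(axis=0) * p

def sigma(p):
    """Entropy production rate (1/2) sum_{i,j} J_{j->i} F_{j->i}."""
    J = W * p[None, :] - W.T * p[:, None]       # J[i, j] = net flux j -> i
    F = np.zeros_like(W)
    F[mask] = np.log((W * p[None, :])[mask] / (W.T * p[:, None])[mask])
    return 0.5 * np.sum(J * F)

def D_KL(p, q):
    return np.sum(p * np.log(p / q))

p, dt = np.array([0.7, 0.2, 0.1]), 1e-4
for _ in range(3):
    minus_dD = (D_KL(p, p_eq) - D_KL(p + dt * drift(p), p_eq)) / dt
    print(f"sigma = {sigma(p):.6f}   -dD_KL/dt = {minus_dD:.6f}")
    p = p + dt * drift(p)
```

The two printed columns agree up to the finite-difference error of order $\Delta t$, and both stay nonnegative, as the second law requires.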

The excess entropy production rate and the Glansdorff-Prigogine criterion for stability
We next revisit the Glansdorff-Prigogine criterion for stability [10]. We consider a steady-state $\bar{\boldsymbol{p}} = \{\bar{p}_i \mid i = 1, \dots\}$ as a fixed point of the master equation, that is,

$$\frac{d\bar{p}_i}{dt} = 0.$$

From the master equation (1), this condition implies $\sum_j J_{j \to i}(\bar{\boldsymbol{p}}) = 0$ for any $i$. The Glansdorff-Prigogine criterion determines the stability of a fixed point $\bar{\boldsymbol{p}}$ based on a generalization of the entropy production rate, namely the excess entropy production rate. To define the excess entropy production rate, we introduce the excess flux and the excess force defined as

$$\delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = J_{j \to i}(\boldsymbol{p}) - J_{j \to i}(\bar{\boldsymbol{p}}), \qquad \delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = F_{j \to i}(\boldsymbol{p}) - F_{j \to i}(\bar{\boldsymbol{p}}),$$

respectively. The excess entropy production rate $\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ is given by the product of the excess flux and the excess force,

$$\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \frac{1}{2} \sum_{i,j} \delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \, \delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}).$$

If the steady-state is given by the equilibrium distribution $\bar{\boldsymbol{p}} = \boldsymbol{p}^{\mathrm{eq}}$, the detailed balance condition $J_{j \to i}(\bar{\boldsymbol{p}}) = J_{j \to i}(\boldsymbol{p}^{\mathrm{eq}}) = 0$ (and $F_{j \to i}(\bar{\boldsymbol{p}}) = F_{j \to i}(\boldsymbol{p}^{\mathrm{eq}}) = 0$) holds, and we obtain $\delta^2 \sigma(\boldsymbol{p} \| \boldsymbol{p}^{\mathrm{eq}}) = \sigma(\boldsymbol{p})$. Thus, the excess entropy production rate is a generalization of the entropy production rate to the case where the detailed balance condition is violated.
In the Glansdorff-Prigogine criterion for stability [2,3], we discuss the sign of the excess entropy production rate to detect the stability of the steady-state $\bar{\boldsymbol{p}}$. If it is nonnegative, the steady-state $\bar{\boldsymbol{p}}$ is stable. If it is negative, the steady-state $\bar{\boldsymbol{p}}$ is unstable. We discuss a relationship between the Glansdorff-Prigogine criterion for stability and the Lyapunov criterion in the case of the linear master equation, where the transition rate does not depend on the probability, $dW_{i \to j}/dp_k = 0$. To discuss the nonnegativity of the excess entropy production rate, we introduce the difference between the probability distribution and the steady-state distribution,

$$\delta p_i = p_i - \bar{p}_i.$$

Up to the order $\delta p_i$, the excess flux and the excess force are calculated as

$$\delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = W_{j \to i} \delta p_j - W_{i \to j} \delta p_i, \tag{17}$$

$$\delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \frac{\delta p_j}{\bar{p}_j} - \frac{\delta p_i}{\bar{p}_i} + O(|\delta \boldsymbol{p}|^2), \tag{18}$$

respectively, where $O(|\delta \boldsymbol{p}|^2)$ is the Landau notation. We find that the time evolution of $\delta p_i$ is given by the sum of the excess fluxes,

$$\frac{d}{dt} \delta p_i = \sum_j \delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}). \tag{19}$$

From equations (17)-(19), the excess entropy production rate is calculated as

$$\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = -\frac{d}{dt} \delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(|\delta \boldsymbol{p}|^3).$$

The quantity

$$\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \frac{1}{2} \sum_i \frac{(\delta p_i)^2}{\bar{p}_i}$$

is a Lyapunov candidate function because it is nonnegative, $\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \geq 0$, and zero if and only if the probability $\boldsymbol{p}$ is the fixed point $\bar{\boldsymbol{p}}$. Then, the Glansdorff-Prigogine criterion for stability can be considered as the Lyapunov criterion [7] as follows,

$$\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \geq 0 \iff \frac{d}{dt} \delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \leq 0.$$

We compare this Lyapunov candidate function with the Kullback-Leibler divergence. The Kullback-Leibler divergence between $\boldsymbol{p}$ and $\bar{\boldsymbol{p}}$ is equal to this Lyapunov candidate function $\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ up to the second order of $\delta p_i$,

$$D_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(|\delta \boldsymbol{p}|^3),$$

where we used the normalization of the probability $\sum_i \delta p_i = 0$. Thus, this Lyapunov candidate function $\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ can be regarded as the Kullback-Leibler divergence if $\delta p_i$ is small.
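The following numerical sketch (with illustrative rates chosen here, not taken from the article) makes the relation $\delta^2 \sigma = -(d/dt)\delta^2 L$ tangible for a three-state linear master equation without detailed balance.

```python
# Sketch: 3-state linear master equation *without* detailed balance.
# Near the steady state pbar we check delta2_sigma ~ -(d/dt) delta2_L.
import numpy as np

W = np.array([[0.0, 2.0, 0.5],    # W[i, j] = rate j -> i (illustrative)
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])
G = W - np.diag(W.sum(axis=0))     # generator: dp/dt = G p
evals, evecs = np.linalg.eig(G)
pbar = np.real(evecs[:, np.argmin(np.abs(evals))])
pbar = pbar / pbar.sum()           # steady state (null vector of G)

mask = ~np.eye(3, dtype=bool)
def flux(p):
    return W * p[None, :] - W.T * p[:, None]
def force(p):
    F = np.zeros_like(W)
    F[mask] = np.log((W * p[None, :])[mask] / (W.T * p[:, None])[mask])
    return F

def delta2_sigma(p):               # (1/2) sum dJ * dF
    return 0.5 * np.sum((flux(p) - flux(pbar)) * (force(p) - force(pbar)))
def delta2_L(p):                   # (1/2) sum (dp_i)^2 / pbar_i
    return 0.5 * np.sum((p - pbar) ** 2 / pbar)

p, dt = pbar + np.array([1e-3, -4e-4, -6e-4]), 1e-6
dL_dt = (delta2_L(p + dt * (G @ p)) - delta2_L(p)) / dt
print(f"delta2_sigma     = {delta2_sigma(p):.6e}")
print(f"-(d/dt) delta2_L = {-dL_dt:.6e}")   # agree up to O(|dp|^3) corrections
```

Because $\delta \boldsymbol{p}$ is taken small, the two printed values agree up to the stated higher-order corrections.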

The stability for the linear master equation and the monotonicity of the Kullback-Leibler divergence
We discuss the stability for the linear master equation based on the monotonicity of the Kullback-Leibler divergence. We now consider the situation where the transition rate does not depend on time, $dW_{i \to j}/dt = 0$. The difference equation corresponding to the master equation (1) is given by

$$p_i(t + \Delta t) = \sum_j T^{\Delta t}_{ij} p_j(t),$$

where the transition matrix $T^{\Delta t}_{ij}$ is defined as

$$T^{\Delta t}_{ij} = \delta_{ij} + \Delta t \left[ W_{j \to i} - \delta_{ij} \sum_k W_{i \to k} \right],$$

with the Kronecker delta $\delta_{ij}$. In the vector notation, we write the above difference equation as $\boldsymbol{p}(t + \Delta t) = \mathsf{T}^{\Delta t} \boldsymbol{p}(t)$. We can easily check that the steady-state distribution $\bar{\boldsymbol{p}}$ satisfies $\bar{\boldsymbol{p}} = \mathsf{T}^{\Delta t} \bar{\boldsymbol{p}}$. By using this transition matrix $\mathsf{T}^{\Delta t}$, the excess entropy production rate around the steady-state $\boldsymbol{p}(t) \simeq \bar{\boldsymbol{p}}$ is calculated as

$$\delta^2 \sigma(\boldsymbol{p}(t) \| \bar{\boldsymbol{p}}) = \lim_{\Delta t \to 0} \frac{D_{\mathrm{KL}}(\boldsymbol{p}(t) \| \bar{\boldsymbol{p}}) - D_{\mathrm{KL}}(\mathsf{T}^{\Delta t} \boldsymbol{p}(t) \| \mathsf{T}^{\Delta t} \bar{\boldsymbol{p}})}{\Delta t} + O(|\delta \boldsymbol{p}|^3).$$

Because the monotonicity of the Kullback-Leibler divergence [11],

$$D_{\mathrm{KL}}(\mathsf{T}^{\Delta t} \boldsymbol{p} \| \mathsf{T}^{\Delta t} \boldsymbol{p}') \leq D_{\mathrm{KL}}(\boldsymbol{p} \| \boldsymbol{p}'),$$

holds, we obtain the nonnegativity of the excess entropy production rate,

$$\delta^2 \sigma(\boldsymbol{p}(t) \| \bar{\boldsymbol{p}}) \geq 0,$$

which implies that the steady-state $\bar{\boldsymbol{p}}$ is stable from the viewpoint of the Glansdorff-Prigogine criterion for stability. This fact can be regarded as the principle of minimum entropy production. We remark that the monotonicity can be violated in the case of the nonlinear master equation. In the case of the nonlinear master equation, the transition matrix generally depends on the probability, $\mathsf{T}^{\Delta t}(\boldsymbol{p})$, and the steady-state distribution is given by $\bar{\boldsymbol{p}} = \mathsf{T}^{\Delta t}(\bar{\boldsymbol{p}}) \bar{\boldsymbol{p}}$. Thus, the quantity

$$D_{\mathrm{KL}}(\boldsymbol{p}(t) \| \bar{\boldsymbol{p}}) - D_{\mathrm{KL}}(\mathsf{T}^{\Delta t}(\boldsymbol{p}(t)) \boldsymbol{p}(t) \| \mathsf{T}^{\Delta t}(\bar{\boldsymbol{p}}) \bar{\boldsymbol{p}})$$

is not always nonnegative because the transition matrix $\mathsf{T}^{\Delta t}(\boldsymbol{p}(t))$ is different from the transition matrix $\mathsf{T}^{\Delta t}(\bar{\boldsymbol{p}})$.
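The monotonicity used here is the data-processing inequality for the Kullback-Leibler divergence. The following quick sketch (random distributions and an arbitrary column-stochastic matrix, chosen for illustration) checks it numerically.

```python
# Sketch of the monotonicity D_KL(Tp || Tq) <= D_KL(p || q) for a
# column-stochastic matrix T (random illustrative instances).
import numpy as np

rng = np.random.default_rng(0)
D_KL = lambda p, q: np.sum(p * np.log(p / q))

for _ in range(3):
    T = rng.random((4, 4))
    T /= T.sum(axis=0, keepdims=True)       # columns sum to 1 (stochastic map)
    p = rng.random(4); p /= p.sum()
    q = rng.random(4); q /= q.sum()
    print(f"D_KL(p||q) = {D_KL(p, q):.4f} >= "
          f"D_KL(Tp||Tq) = {D_KL(T @ p, T @ q):.4f}")
```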

Information geometry and the Lyapunov candidate function
We here discuss a relationship between the Glansdorff-Prigogine criterion for stability and information geometry. Information geometry is a differential geometry where the Riemannian metric tensor is given by the Fisher information matrix for a set of parameters $\boldsymbol{\theta} = (\theta^1, \theta^2, \dots)$,

$$g_{\mu\nu}(\boldsymbol{\theta}) = \sum_i p_i(\boldsymbol{\theta}) \frac{\partial \ln p_i(\boldsymbol{\theta})}{\partial \theta^\mu} \frac{\partial \ln p_i(\boldsymbol{\theta})}{\partial \theta^\nu}.$$

The square of the line element $ds^2(\boldsymbol{p})$ is given by

$$ds^2(\boldsymbol{p}) = \sum_{\mu,\nu} g_{\mu\nu}(\boldsymbol{\theta}) \, d\theta^\mu d\theta^\nu.$$

The square of the line element does not depend on the choice of parameters because we obtain the following identity,

$$ds^2(\boldsymbol{p}) = \sum_i \frac{(dp_i)^2}{p_i}.$$

Thus, the square of the line element can be regarded as the Hessian of the Kullback-Leibler divergence,

$$ds^2(\boldsymbol{p}) = 2 D_{\mathrm{KL}}(\boldsymbol{p} + d\boldsymbol{p} \| \boldsymbol{p}) + O(|d\boldsymbol{p}|^3),$$

and the Lyapunov candidate function $\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ can be regarded as the square of the line element around the steady-state, $\delta \boldsymbol{p} = d\boldsymbol{p}$, as follows,

$$\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \frac{1}{2} ds^2(\bar{\boldsymbol{p}}). \tag{36}$$

This fact implies that the Glansdorff-Prigogine criterion for stability is the Lyapunov criterion on the information-geometric manifold.
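A one-line numerical check of this identity (illustrative numbers): the squared line element $\sum_i (dp_i)^2/p_i$ matches $2 D_{\mathrm{KL}}(\boldsymbol{p} + d\boldsymbol{p} \| \boldsymbol{p})$ up to $O(|d\boldsymbol{p}|^3)$.

```python
# Sketch: ds^2 = sum_i (dp_i)^2 / p_i equals 2 D_KL(p + dp || p) to leading order.
import numpy as np

p = np.array([0.5, 0.3, 0.2])
dp = 1e-4 * np.array([1.0, -0.4, -0.6])    # sum(dp) = 0 preserves normalization
ds2 = np.sum(dp ** 2 / p)
two_DKL = 2.0 * np.sum((p + dp) * np.log((p + dp) / p))
print(f"ds^2 = {ds2:.6e}   2 D_KL = {two_DKL:.6e}")  # differ at O(|dp|^3)
```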

The Fisher metric and the Onsager matrix under the near-equilibrium condition
We introduce the setup of linear irreversible thermodynamics [10], which explains the behavior of the system under the near-equilibrium condition. We now start with the equilibrium distribution $\boldsymbol{p}^{\mathrm{eq}}$. The detailed balance condition

$$W^{\mathrm{eq}}_{j \to i} p^{\mathrm{eq}}_j = W^{\mathrm{eq}}_{i \to j} p^{\mathrm{eq}}_i$$

holds for any pair $(i, j)$, where $W^{\mathrm{eq}}_{j \to i}$ is the transition rate for the equilibrium state. We discuss the transition rate $W_{j \to i} = W^{\mathrm{eq}}_{j \to i} + O(\epsilon)$, where the order $O(\epsilon)$ implies a small change of the transition rate. This transition rate $W_{j \to i}$ leads to a steady-state $\bar{\boldsymbol{p}}$ around the equilibrium state $\boldsymbol{p}^{\mathrm{eq}}$, where the steady-state $\bar{\boldsymbol{p}}$ is given by

$$\bar{p}_i = p^{\mathrm{eq}}_i + O(\epsilon).$$

In this setup, the steady flux and the steady force are small,

$$J_{j \to i}(\bar{\boldsymbol{p}}) = O(\epsilon), \qquad F_{j \to i}(\bar{\boldsymbol{p}}) = O(\epsilon).$$

The steady force is proportional to the steady flux up to the order $O(\epsilon^2)$,

$$F_{j \to i}(\bar{\boldsymbol{p}}) = \alpha_{j \to i} J_{j \to i}(\bar{\boldsymbol{p}}) + O(\epsilon^2).$$

The coefficient $\alpha_{j \to i}$ is determined by the parameters in the equilibrium state, $W^{\mathrm{eq}}_{i \to j} p^{\mathrm{eq}}_i$. The coefficient is given by

$$\alpha_{j \to i} = \frac{1}{W^{\mathrm{eq}}_{i \to j} p^{\mathrm{eq}}_i} + O(\epsilon) \tag{41}$$

up to the order $O(\epsilon)$. We introduce the cycle force $F_\mu(\bar{\boldsymbol{p}})$ and the cycle flux $J_\mu(\bar{\boldsymbol{p}})$ defined as

$$F_\mu(\bar{\boldsymbol{p}}) = \sum_{i > j} S_{j \to i}(\mu) F_{j \to i}(\bar{\boldsymbol{p}}), \qquad J_\mu(\bar{\boldsymbol{p}}) = \sum_{i > j} S_{j \to i}(\mu) J_{j \to i}(\bar{\boldsymbol{p}}),$$

where $S_{j \to i}(\mu)$ is the cycle matrix [10]. The cycle matrix satisfies $S_{j \to i}(\mu) = -S_{i \to j}(\mu) = 1$ if the $\mu$th cycle includes the edge $j \to i$, and $S_{j \to i}(\mu) = S_{i \to j}(\mu) = 0$ if the $\mu$th cycle does not include the edge $j \to i$. The $\nu$th cycle force is given by

$$F_\nu(\bar{\boldsymbol{p}}) = \sum_\mu M_{\nu\mu} J_\mu(\bar{\boldsymbol{p}}),$$

where $M_{\mu\nu}$ is the inverse of the Onsager matrix defined as

$$M_{\mu\nu} = \sum_{i > j} S_{j \to i}(\mu) \, \alpha_{j \to i} \, S_{j \to i}(\nu).$$

We can check that the Onsager reciprocal relations $M_{\nu\mu} = M_{\mu\nu}$ hold. If we use a fundamental set of cycles [10], the matrix $M_{\mu\nu}$ is regular, and the relation

$$J_\nu(\bar{\boldsymbol{p}}) = \sum_\mu M^{-1}_{\nu\mu} F_\mu(\bar{\boldsymbol{p}})$$

also holds for the inverse matrix $M^{-1}_{\mu\nu}$. The matrix $M^{-1}_{\mu\nu}$ is known as the Onsager matrix. The Onsager reciprocal relations $M^{-1}_{\nu\mu} = M^{-1}_{\mu\nu}$ also hold because of $M_{\nu\mu} = M_{\mu\nu}$. We now generalize the above discussion to a probability $\boldsymbol{p}$ away from the steady-state, where $\delta \boldsymbol{p}$ is small enough, $\delta \boldsymbol{p} \simeq O(\epsilon)$. The excess force is proportional to the excess flux as follows,

$$\delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \alpha_{j \to i} \, \delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(\epsilon^2),$$

where we used equations (17), (18) and (41). If we do not assume the steady-state, the cycle matrix is not regular in general. Thus, we replace the cycle matrix with a regular matrix $S_{j \to i}(\mu)$. A $\mu$-mode is given by a set of edges, which is not necessarily a cycle. We define the $\mu$th mode of the force and the flux as

$$F_\mu(\boldsymbol{p}) = \sum_{i > j} S_{j \to i}(\mu) F_{j \to i}(\boldsymbol{p}), \qquad J_\mu(\boldsymbol{p}) = \sum_{i > j} S_{j \to i}(\mu) J_{j \to i}(\boldsymbol{p}),$$

respectively, where the regular matrix $S_{j \to i}(\mu)$ also satisfies the condition that $S_{j \to i}(\mu) = -S_{i \to j}(\mu) = 1$ if the $\mu$th mode includes the edge $j \to i$, and that $S_{j \to i}(\mu) = S_{i \to j}(\mu) = 0$ if the $\mu$th mode does not include the edge $j \to i$. In a similar way, we define the $\mu$th mode of the excess force and the excess flux as

$$\delta F_\mu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \sum_{i > j} S_{j \to i}(\mu) \, \delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}), \qquad \delta J_\mu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \sum_{i > j} S_{j \to i}(\mu) \, \delta J_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}).$$

Thus, the $\nu$th mode of the excess force is given by

$$\delta F_\nu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \sum_\mu M_{\nu\mu} \, \delta J_\mu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(\epsilon^2),$$

where $M_{\nu\mu}$ is the inverse of the Onsager matrix defined as

$$M_{\nu\mu} = \sum_{i > j} S_{j \to i}(\nu) \, \alpha_{j \to i} \, S^{-1}_{j \to i}(\mu).$$

We also consider the inverse matrix $S^{-1}_{j \to i}(\mu)$, which satisfies

$$\sum_{i > j} S^{-1}_{j \to i}(\mu) S_{j \to i}(\nu) = \delta_{\mu\nu}.$$

Thus, the $\nu$th mode of the excess flux is given by

$$\delta J_\nu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \sum_\mu M^{-1}_{\nu\mu} \, \delta F_\mu(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(\epsilon^2),$$

where $M^{-1}_{\nu\mu}$ is the Onsager matrix defined as

$$M^{-1}_{\nu\mu} = \sum_{i > j} S_{j \to i}(\nu) \, \frac{1}{\alpha_{j \to i}} \, S^{-1}_{j \to i}(\mu).$$

By definition, the Onsager reciprocal relations $M_{\nu\mu} = M_{\mu\nu}$ and $M^{-1}_{\nu\mu} = M^{-1}_{\mu\nu}$ hold. Therefore, the excess entropy production rate is written as a quadratic form of $\delta F_\nu(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ and $\delta J_\nu(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ with the inverse of the Onsager matrix $M_{\nu\mu}$ and the Onsager matrix $M^{-1}_{\nu\mu}$, respectively,

$$\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \sum_{\mu,\nu} M_{\nu\mu} \, \delta J_\mu \, \delta J_\nu + O(\epsilon^3) = \sum_{\mu,\nu} M^{-1}_{\nu\mu} \, \delta F_\mu \, \delta F_\nu + O(\epsilon^3).$$

In terms of the Glansdorff-Prigogine criterion for stability, we can discuss the stability of the steady-state $\bar{\boldsymbol{p}}$ based on the Onsager matrix $M^{-1}_{\nu\mu}$ and the inverse of the Onsager matrix $M_{\nu\mu}$ under the near-equilibrium condition. We remark that the entropy production rate is also given by

$$\sigma(\bar{\boldsymbol{p}}) = \sum_{\mu,\nu} M_{\nu\mu} J_\mu(\bar{\boldsymbol{p}}) J_\nu(\bar{\boldsymbol{p}}) + O(\epsilon^3) = \sum_{\mu,\nu} M^{-1}_{\nu\mu} F_\mu(\bar{\boldsymbol{p}}) F_\nu(\bar{\boldsymbol{p}}) + O(\epsilon^3).$$

Because of the second law of thermodynamics, $\sigma(\bar{\boldsymbol{p}}) \geq 0$, the Onsager matrix $M^{-1}_{\nu\mu}$ and the inverse of the Onsager matrix $M_{\nu\mu}$ are nonnegative-definite matrices, which satisfy

$$\sum_{\mu,\nu} M_{\nu\mu} x_\mu x_\nu \geq 0, \qquad \sum_{\mu,\nu} M^{-1}_{\nu\mu} x_\mu x_\nu \geq 0,$$

for any real vector $\boldsymbol{x}$. Under the near-equilibrium condition, the Fisher metric is related to the inverse of the Onsager matrix and the Onsager matrix. The quantity $\delta p_i / \bar{p}_i$ is directly related to the excess force through equation (18), $\delta F_{j \to i}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \delta p_j / \bar{p}_j - \delta p_i / \bar{p}_i + O(|\delta \boldsymbol{p}|^2)$. Thus, the square of the line element around the steady-state, $\delta \boldsymbol{p} = d\boldsymbol{p}$, is given by

$$ds^2(\bar{\boldsymbol{p}}) = \sum_{\mu,\nu} g^J_{\mu\nu}(\bar{\boldsymbol{p}}) \, \delta J_\mu \, \delta J_\nu, \qquad ds^2(\bar{\boldsymbol{p}}) = \sum_{\mu,\nu} g^F_{\mu\nu}(\bar{\boldsymbol{p}}) \, \delta F_\mu \, \delta F_\nu.$$

Here, $g^J_{\mu\nu}(\bar{\boldsymbol{p}})$ and $g^F_{\mu\nu}(\bar{\boldsymbol{p}})$ are the Fisher information matrices in the coordinates of the flux modes and the force modes, respectively. The matrices $g^J_{\mu\nu}(\bar{\boldsymbol{p}})$ and $g^F_{\mu\nu}(\bar{\boldsymbol{p}})$ can be interpreted as metric tensors for the steady-state $\bar{\boldsymbol{p}}$ under the near-equilibrium condition. From the relation between the line element and the excess entropy, equation (36), we obtain information-geometric interpretations of the Onsager matrix and the inverse of the Onsager matrix as follows,

$$\sum_{\mu,\nu} M_{\nu\mu} \, \delta J_\mu \, \delta J_\nu = -\frac{1}{2} \frac{d}{dt} \left[ \sum_{\mu,\nu} g^J_{\mu\nu}(\bar{\boldsymbol{p}}) \, \delta J_\mu \, \delta J_\nu \right], \qquad \sum_{\mu,\nu} M^{-1}_{\nu\mu} \, \delta F_\mu \, \delta F_\nu = -\frac{1}{2} \frac{d}{dt} \left[ \sum_{\mu,\nu} g^F_{\mu\nu}(\bar{\boldsymbol{p}}) \, \delta F_\mu \, \delta F_\nu \right].$$
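The near-equilibrium proportionality at the heart of this section can be checked numerically. The sketch below (illustrative equilibrium rates and a hypothetical $O(\epsilon)$ perturbation, not parameters from the article) verifies $F_{j \to i}(\bar{\boldsymbol{p}}) \approx \alpha_{j \to i} J_{j \to i}(\bar{\boldsymbol{p}})$ with $\alpha_{j \to i} = 1/(W^{\mathrm{eq}}_{i \to j} p^{\mathrm{eq}}_i)$.

```python
# Sketch: near-equilibrium linear response F ~ alpha * J, alpha = 1/(W_eq p_eq).
import numpy as np

p_eq = np.array([0.5, 0.3, 0.2])
base = np.ones((3, 3)) - np.eye(3)
W_eq = base * np.sqrt(np.outer(p_eq, 1.0 / p_eq))   # detailed balance w.r.t. p_eq

eps = 1e-3                                           # strength of the perturbation
dW = np.array([[0.0, 0.3, -0.2],                     # arbitrary O(eps) change
               [0.1, 0.0, 0.4],
               [-0.3, 0.2, 0.0]])
W = W_eq + eps * dW

G = W - np.diag(W.sum(axis=0))                       # generator of dp/dt = G p
evals, evecs = np.linalg.eig(G)
pbar = np.real(evecs[:, np.argmin(np.abs(evals))])
pbar /= pbar.sum()                                   # steady state near p_eq

mask = ~np.eye(3, dtype=bool)
J = W * pbar[None, :] - W.T * pbar[:, None]          # steady flux, O(eps)
F = np.zeros_like(W)
F[mask] = np.log((W * pbar[None, :])[mask] / (W.T * pbar[:, None])[mask])
alpha = np.zeros_like(W)
alpha[mask] = 1.0 / (W_eq.T * p_eq[:, None])[mask]   # 1 / (W_eq_{i->j} p_eq_i)
print(np.max(np.abs(F - alpha * J)[mask]))           # residual is O(eps^2)
```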

Disadvantage of the Glansdorff-Prigogine criterion for stability and new criterion for stability based on information geometry
Based on the above discussion, we discuss a disadvantage of the Glansdorff-Prigogine criterion for stability for the master equation [5,8]. For the linear master equation, the Glansdorff-Prigogine criterion for stability is equal to the Lyapunov criterion, but it always claims that the steady-state is stable. On the other hand, the Glansdorff-Prigogine criterion for stability might not be regarded as the Lyapunov criterion for the nonlinear master equation. Thus, the Glansdorff-Prigogine criterion for stability is not so useful for the master equation. We now propose a new Lyapunov criterion based on information geometry, which is the main claim of this article. Based on the relationship between the Lyapunov candidate function and the square of the line element, we generalize the Lyapunov criterion even for the nonlinear master equation. We here introduce the intrinsic speed in information geometry as a Lyapunov candidate function. The square of the intrinsic speed is defined as the Fisher information of time,

$$\left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 = \sum_i \frac{1}{p_i} \left( \frac{dp_i}{dt} \right)^2.$$

[Figure 1. Schematic of our information-geometric criterion for stability. In the relaxation process to the stable steady-state, the square of the intrinsic speed $(ds(\boldsymbol{p})/dt)^2$ is decreasing in time. The color of each arrow indicates the amount of the intrinsic speed, i.e. red (fast), yellow (middle), and blue (slow). Speed decay around the steady-state $\bar{\boldsymbol{p}}$ indicates that the steady-state $\bar{\boldsymbol{p}}$ is stable in our information-geometric criterion for stability. Around an unstable steady-state, the probability $\boldsymbol{p}$ moves away from the steady-state $\bar{\boldsymbol{p}}$ and the intrinsic speed is accelerated; speed acceleration around the steady-state $\bar{\boldsymbol{p}}$ indicates that the steady-state $\bar{\boldsymbol{p}}$ is unstable in our information-geometric criterion for stability.]
Thus, the square of the intrinsic speed can be a Lyapunov candidate function around the steady-state. This intrinsic speed is well defined even for the nonlinear master equation. We propose a new stability criterion based on information geometry: if

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 \leq 0$$

around the steady-state $\bar{\boldsymbol{p}}$, then the steady-state $\bar{\boldsymbol{p}}$ is stable (see figure 1). If the transition rate does not depend on time, $dW_{i \to j}/dt = 0$, we can prove that the Fisher information of time $(ds(\boldsymbol{p})/dt)^2$ is monotonically decreasing in time for the linear master equation. We here introduce the matrix-valued generator $G_{ij} = W_{j \to i} - \delta_{ij} \sum_k W_{i \to k}$, which satisfies $dp_i/dt = \sum_j G_{ij} p_j$. We explicitly calculate $\frac{d}{dt} (ds(\boldsymbol{p})/dt)^2$ as follows [46],

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 = 2 \sum_{i,j} \frac{1}{p_i} \frac{dp_i}{dt} \frac{dG_{ij}}{dt} p_j - \sum_{i,j} W_{j \to i} p_j \left( \frac{1}{p_i} \frac{dp_i}{dt} - \frac{1}{p_j} \frac{dp_j}{dt} \right)^2.$$

Because the second term is always nonpositive, we obtain the inequality

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 \leq 2 \sum_{i,j} \frac{1}{p_i} \frac{dp_i}{dt} \frac{dG_{ij}}{dt} p_j. \tag{73}$$

For the linear master equation with $dW_{i \to j}/dt = 0$, the right-hand side vanishes and the Fisher information of time is monotonically decreasing. For the nonlinear master equation, the generator depends on the probability, $G_{ij}(\boldsymbol{p})$, and the inequality equation (73) is given by

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 \leq 2 \sum_{i,j,k} \frac{1}{p_i} \frac{dp_i}{dt} \frac{\partial G_{ij}(\boldsymbol{p})}{\partial p_k} \frac{dp_k}{dt} p_j.$$

Because the right-hand side can be positive, the steady-state can be unstable in our information-geometric criterion for stability.
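The monotone decay in the linear case can be observed directly. The sketch below (with illustrative rates) evaluates the Fisher information of time $(ds/dt)^2 = \sum_i (dp_i/dt)^2 / p_i$ along an exactly propagated trajectory and confirms that it never increases.

```python
# Sketch: for a linear master equation with time-independent rates,
# (ds/dt)^2 = sum_i (dp_i/dt)^2 / p_i decreases monotonically in time.
import numpy as np

W = np.array([[0.0, 2.0, 0.5],
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])
G = W - np.diag(W.sum(axis=0))
evals, V = np.linalg.eig(G)
V_inv = np.linalg.inv(V)

def propagate(p0, t):                     # exact solution p(t) = exp(G t) p0
    return np.real(V @ (np.exp(evals * t) * (V_inv @ p0)))

def speed2(p):                            # Fisher information of time
    v = G @ p
    return np.sum(v ** 2 / p)

p0 = np.array([0.8, 0.15, 0.05])
vals = [speed2(propagate(p0, t)) for t in np.linspace(0.0, 3.0, 200)]
print("monotonically decreasing:", bool(np.all(np.diff(vals) <= 1e-10)))
```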

The Cramér-Rao inequality and trade-off relations
Based on the intrinsic speed $ds(\boldsymbol{p})/dt$, we can obtain a trade-off relation between speed and the fluctuation of an observable. This fact is known as the Cramér-Rao inequality [29,32] in information theory. Because thermodynamic trade-off relations such as the thermodynamic uncertainty relations are mathematically based on the Cramér-Rao inequality for the path probability [56,57], the result in this section can be interpreted as a variant of thermodynamic trade-off relations. We interpret the Cramér-Rao inequality as a trade-off relation. We consider an observable $R_i$, which is a function of the microscopic state $i$. The fluctuation of the observable for the probability $\boldsymbol{p}$ is expressed as the standard deviation

$$\Delta R(\boldsymbol{p}) = \sqrt{\mathbb{E}_{\boldsymbol{p}}[R^2] - (\mathbb{E}_{\boldsymbol{p}}[R])^2},$$

where $\mathbb{E}_{\boldsymbol{p}}[\,\cdot\,]$ denotes the expected value for the probability $\boldsymbol{p}$. Let $\tau_R(\boldsymbol{p})$ be the time scale which satisfies

$$\frac{1}{\tau_R(\boldsymbol{p})} = \frac{|d \mathbb{E}_{\boldsymbol{p}(t)}[R]/dt|}{\Delta R(\boldsymbol{p})} =: v_R(\boldsymbol{p}).$$

The quantity $v_R(\boldsymbol{p})$ implies the speed of the observable $R_i$. The Cramér-Rao inequality indicates that the upper bound on the speed $v_R(\boldsymbol{p})$ is the intrinsic speed $ds(\boldsymbol{p})/dt := \sqrt{(ds(\boldsymbol{p})/dt)^2}$ [46],

$$v_R(\boldsymbol{p}) \leq \frac{ds(\boldsymbol{p})}{dt}.$$

In terms of a trade-off relation, the Cramér-Rao inequality can be rewritten as

$$\Delta R(\boldsymbol{p}) \, \frac{ds(\boldsymbol{p})}{dt} \geq \left| \frac{d}{dt} \mathbb{E}_{\boldsymbol{p}(t)}[R] \right|. \tag{81}$$

This is a trade-off relation among the fluctuation of the observable, the mean change of the observable, and the intrinsic speed. The Cramér-Rao inequality provides upper bounds on the excess entropy production rate and the entropy production rate for the relaxation process driven by the linear master equation. If we consider the observable

$$\Delta \ln p_i := \ln p_i - \ln \bar{p}_i,$$

the expected value of $\Delta \ln p_i$ is the Kullback-Leibler divergence,

$$\mathbb{E}_{\boldsymbol{p}}[\Delta \ln p] = D_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}}).$$

Thus, the Cramér-Rao inequality for the observable $\Delta \ln p_i$ is given by

$$\left| \frac{d}{dt} D_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \right| \leq \Delta (\Delta \ln p) \, \frac{ds(\boldsymbol{p})}{dt}. \tag{84}$$

Around the steady-state, $\delta \boldsymbol{p} = O(\epsilon)$, the inequality equation (84) can also be written as

$$\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \leq \sqrt{2 \, \delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}})} \, \frac{ds(\boldsymbol{p})}{dt} \tag{85}$$

up to the order $O(\epsilon^2)$ for the linear master equation. These results imply a novel thermodynamic trade-off relation between the intrinsic speed $ds(\boldsymbol{p})/dt$ and the excess entropy production rate $\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}})$. This trade-off relation also implies that the excess entropy production rate should be zero, $\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = 0$, if the intrinsic speed is zero, $ds(\boldsymbol{p})/dt = 0$, because the excess entropy production rate is nonnegative, $\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) \geq 0$, for the linear master equation. Thus, the excess entropy production rate $\delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ should be decreasing if $ds(\boldsymbol{p})/dt$ approaches zero around the steady-state. The inequality equation (84) also gives the upper bound on the entropy production rate in the case of $\bar{\boldsymbol{p}} = \boldsymbol{p}^{\mathrm{eq}}$,

$$\sigma(\boldsymbol{p}) \leq \Delta (\Delta \ln p) \, \frac{ds(\boldsymbol{p})}{dt}. \tag{86}$$

We remark that a corresponding result of equation (86) for chemical thermodynamics has already been derived in reference [49]. The Cramér-Rao inequality provides a physical interpretation for our information-geometric criterion for stability. From the Cramér-Rao inequality, the intrinsic speed implies the speed of an optimal observable $R^* := \operatorname{argmax}_R v_R(\boldsymbol{p})$,

$$\frac{ds(\boldsymbol{p})}{dt} = v_{R^*}(\boldsymbol{p}).$$

In this sense, our information-geometric criterion for stability means that

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 \leq 0 \iff \frac{d}{dt} v_{R^*}(\boldsymbol{p}) \leq 0, \tag{88}$$

$$\frac{d}{dt} \left( \frac{ds(\boldsymbol{p})}{dt} \right)^2 > 0 \iff \frac{d}{dt} v_{R^*}(\boldsymbol{p}) > 0, \tag{89}$$

when $\delta \boldsymbol{p}$ is small. Thus, the speed of the observable $R^*$ decreases around the steady-state $\bar{\boldsymbol{p}}$ if the steady-state $\bar{\boldsymbol{p}}$ is stable in our information-geometric criterion for stability.
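A direct numerical illustration of the bound $v_R(\boldsymbol{p}) \leq ds(\boldsymbol{p})/dt$ (illustrative rates, randomly drawn observables; the specific numbers are not from the article):

```python
# Sketch: the speed v_R = |dE_p[R]/dt| / (std of R) never exceeds ds/dt.
import numpy as np

rng = np.random.default_rng(1)
W = np.array([[0.0, 2.0, 0.5],
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])
G = W - np.diag(W.sum(axis=0))

p = np.array([0.6, 0.3, 0.1])
v = G @ p                                   # dp/dt at the current state
ds_dt = np.sqrt(np.sum(v ** 2 / p))         # intrinsic speed

for _ in range(5):
    R = rng.normal(size=3)                   # a random observable R_i
    dER_dt = np.sum(v * R)                   # d/dt E_p[R] = sum_i (dp_i/dt) R_i
    std_R = np.sqrt(np.sum(p * R ** 2) - np.sum(p * R) ** 2)
    print(f"v_R = {abs(dER_dt) / std_R:.4f} <= ds/dt = {ds_dt:.4f}")
```

The bound follows from the Cauchy-Schwarz inequality applied to $\sum_i (dp_i/dt)(R_i - \mathbb{E}_{\boldsymbol{p}}[R])$, which is exactly what the printed values illustrate.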

Remarks on our information-geometric criterion for stability
First, we compare our information-geometric criterion for stability with the Lyapunov criterion for the Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$, which is the Kullback-Leibler divergence between $\boldsymbol{p}$ and $\bar{\boldsymbol{p}}$. This Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ is nonnegative, and zero if and only if $\boldsymbol{p} = \bar{\boldsymbol{p}}$. This Lyapunov criterion can work for the nonlinear master equation because we do not use the property of the master equation to define the Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$. For the linear master equation, this Lyapunov criterion is approximately equal to the Glansdorff-Prigogine criterion for stability around the steady-state because the following relation holds,

$$-\frac{d}{dt} L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}}) = \delta^2 \sigma(\boldsymbol{p} \| \bar{\boldsymbol{p}}) + O(|\delta \boldsymbol{p}|^3).$$

We remark that the quantity $-\frac{d}{dt} L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ is called the boundary part of the nonadiabatic entropy production rate [58] or the Hatano-Sasa excess entropy production rate [13] in stochastic thermodynamics. Thus, equation (84) can be regarded as a trade-off relation between the intrinsic speed and the boundary part of the nonadiabatic entropy production rate. If we consider the situation $\boldsymbol{p}(0) \simeq \bar{\boldsymbol{p}}$ for an unstable fixed point $\bar{\boldsymbol{p}}$, the square of the intrinsic speed is given by

$$\left( \frac{ds(\boldsymbol{p}(0))}{dt} \right)^2 \simeq \frac{2 L_{\mathrm{KL}}(\boldsymbol{p}(\Delta t) \| \bar{\boldsymbol{p}})}{(\Delta t)^2}$$

with a small time $\Delta t$. If we consider the situation $\boldsymbol{p}(\tau) \simeq \bar{\boldsymbol{p}}$ for a stable fixed point $\bar{\boldsymbol{p}}$, the square of the intrinsic speed is given by

$$\left( \frac{ds(\boldsymbol{p}(\tau - \Delta t))}{dt} \right)^2 \simeq \frac{2 L_{\mathrm{KL}}(\boldsymbol{p}(\tau - \Delta t) \| \bar{\boldsymbol{p}})}{(\Delta t)^2}$$

with a small time $\Delta t$. Thus, the Lyapunov candidate function of our information-geometric criterion for stability is the Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ divided by $(\Delta t)^2/2$. For the nonlinear master equation, our information-geometric criterion for stability might have an advantage compared to the Lyapunov criterion for the Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$: for the nonlinear master equation, the steady-state solution $\bar{\boldsymbol{p}}$ is not unique. While the Lyapunov candidate function $L_{\mathrm{KL}}(\boldsymbol{p} \| \bar{\boldsymbol{p}})$ is a two-point function of $\boldsymbol{p}$ and $\bar{\boldsymbol{p}}$, the square of the intrinsic speed $(ds(\boldsymbol{p})/dt)^2 = \mathbb{E}_{\boldsymbol{p}}[(d \ln p_i/dt)^2]$ is a one-point function of $\boldsymbol{p}$ in the case of our information-geometric criterion for stability. This fact might be an advantage of our information-geometric criterion for stability when the steady-state solution $\bar{\boldsymbol{p}}$ is not uniquely determined.
We remark on the scope of application. Our information-geometric criterion for stability is based only on the Lyapunov candidate function $(ds(\boldsymbol{p})/dt)^2$. This Lyapunov candidate function $(ds(\boldsymbol{p})/dt)^2$ is physically meaningful based on the Cramér-Rao inequality as a trade-off relation. If we can introduce the probability distribution, the Cramér-Rao inequality for the Lyapunov candidate function $(ds(\boldsymbol{p})/dt)^2$ generally holds regardless of the type of stochastic dynamics. Thus, our information-geometric criterion for stability might work for general Markov processes and non-Markov processes. As shown in the example, we can apply our information-geometric criterion for stability to the deterministic dynamics of the autocatalytic reaction because we can introduce the probability distribution from the conservation law.
Our information-geometric criterion for stability cannot be simply applied if it is difficult to introduce the probability distribution in dynamics. For example, it might be difficult to apply our information-geometric criterion for stability to fluid dynamics. On the other hand, the original Glansdorff-Prigogine criterion for stability has been discussed for fluid dynamics [3]. Even though the general approach for dynamics without the probability distribution is unclear, we might generalize our approach for particular deterministic dynamics, such as the deterministic dynamics on chemical reaction networks. On chemical reaction networks, the intrinsic speed and the Cramér-Rao inequality can be generalized [49]. Thus we may use the square of the generalized intrinsic speed as the Lyapunov candidate function for the deterministic dynamics on chemical reaction networks.
We discuss the measurability of the intrinsic speed. The intrinsic speed is experimentally measurable if we can observe the dynamics of the probability distribution $\boldsymbol{p}(t)$. For example, the intrinsic speed can be measured in single-cell experiments [50]. On the other hand, the excess entropy production rate is relatively hard to measure, because obtaining it experimentally requires measuring the flux $J_{i \to j}(\boldsymbol{p})$ and the force $F_{i \to j}(\boldsymbol{p})$ at the current state $\boldsymbol{p}$ as well as the steady flux $J_{i \to j}(\bar{\boldsymbol{p}})$ and the steady force $F_{i \to j}(\bar{\boldsymbol{p}})$ at the steady-state $\bar{\boldsymbol{p}}$.
The intrinsic speed $ds(\boldsymbol{p})/dt$ is also experimentally accessible via the measurement of the expected value of an observable $R_i$, i.e. $\mathbb{E}_{\boldsymbol{p}}[R] = \sum_i p_i R_i$. If the speed of the optimal observable $v_{R^*}(\boldsymbol{p})$ is decreasing in time, it implies a relaxation to the stable steady-state. Because $|d \mathbb{E}_{\boldsymbol{p}(t)}[R^*]/dt|$ quantifies the time response of the efficient observable due to the change of the probability $\boldsymbol{p}$, equations (88) and (89) can be regarded as a relation between the stability and the response of the efficient observable.

Example of the nonlinear master equation: autocatalytic reaction
We here discuss a simple example to show an advantage of our information-geometric criterion for stability. To discuss the difference between the Glansdorff-Prigogine criterion for stability and our information-geometric criterion for stability, we consider a two-state autocatalytic reaction model, where the dynamics of the probability $\boldsymbol{p} = (p_1, p_2)$ are driven by a nonlinear master equation. We illustrate the trade-off relation equation (81) by this autocatalytic reaction. We consider an observable $R_i$ of the two states. The expected value of the observable is given by

$$\mathbb{E}_{\boldsymbol{p}}[R] = p_1 R_1 + p_2 R_2,$$

and the standard deviation of the observable is given by

$$\Delta R(\boldsymbol{p}) = \sqrt{p_1 p_2} \, |R_1 - R_2|.$$

The comparison between the speed $v_R(\boldsymbol{p})$ and the intrinsic speed $ds(\boldsymbol{p})/dt$ is shown in figure 4. We also show that the following equality holds,

$$v_R(\boldsymbol{p}) = \frac{ds(\boldsymbol{p})}{dt},$$

and thus the trade-off relation $v_R(\boldsymbol{p}) \leq ds(\boldsymbol{p})/dt$ is saturated. Indeed, we can analytically derive the above equality as follows. For the two-state case, the normalization implies $dp_1/dt = -dp_2/dt$, so that

$$v_R(\boldsymbol{p}) = \frac{|dp_1/dt| \, |R_1 - R_2|}{\sqrt{p_1 p_2} \, |R_1 - R_2|} = \sqrt{\left( \frac{dp_1}{dt} \right)^2 \left( \frac{1}{p_1} + \frac{1}{p_2} \right)} = \frac{ds(\boldsymbol{p})}{dt}.$$

Thus, any observable $R_i$ is efficient, $v_R = v_{R^*}$, for the two-dimensional case. This fact implies that $dv_R/dt \leq 0$ in the region where $\frac{d}{dt} (ds(\boldsymbol{p})/dt)^2 \leq 0$ around the stable steady-state $\bar{\boldsymbol{p}}$, and $dv_R/dt > 0$ in the region where $\frac{d}{dt} (ds(\boldsymbol{p})/dt)^2 > 0$ around the unstable steady-state $\bar{\boldsymbol{p}}^*$. Therefore, our information-geometric criterion for stability is meaningful because this trade-off relation gives the behavior of the observable, while the Glansdorff-Prigogine criterion for stability gives an inconclusive result.
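As a complement, here is a hedged numerical sketch of a two-state autocatalytic master equation in the same spirit (the rates below are illustrative choices and not necessarily the article's exact model): an autocatalytic birth rate $W_{2 \to 1}(\boldsymbol{p}) = a p_1$ and a constant decay rate $W_{1 \to 2} = c$ give $dp_1/dt = a p_1 (1 - p_1) - c p_1$, so $p_1 = 0$ is an unstable fixed point and $p_1 = 1 - c/a$ is a stable one for $a > c$. The code tracks $(ds/dt)^2$ along the trajectory and verifies $v_R = ds/dt$ for an arbitrary observable.

```python
# Hedged sketch of a two-state nonlinear (autocatalytic) master equation:
# W_{2->1}(p) = a p_1, W_{1->2} = c  =>  dp_1/dt = a p_1 (1 - p_1) - c p_1.
# Illustrative parameters; not necessarily the article's exact model.
import numpy as np

a, c = 2.0, 1.0                      # stable fixed point at p_1 = 1 - c/a = 0.5

def drift(p1):
    return a * p1 * (1.0 - p1) - c * p1

def speed2(p1):                      # (ds/dt)^2 = (dp_1/dt)^2 (1/p_1 + 1/p_2)
    v = drift(p1)
    return v ** 2 * (1.0 / p1 + 1.0 / (1.0 - p1))

R = np.array([3.0, -1.0])            # an arbitrary observable on the two states

def v_R(p1):                         # |dE[R]/dt| / (std of R) for two states
    num = abs(drift(p1)) * abs(R[0] - R[1])
    den = np.sqrt(p1 * (1.0 - p1)) * abs(R[0] - R[1])
    return num / den

p1, dt = 0.01, 1e-3                  # start near the unstable fixed point p_1 = 0
for step in range(4001):
    if step % 1000 == 0:
        print(f"p1 = {p1:.3f}  (ds/dt)^2 = {speed2(p1):.4f}  "
              f"v_R = {v_R(p1):.4f}  ds/dt = {np.sqrt(speed2(p1)):.4f}")
    p1 += dt * drift(p1)
# (ds/dt)^2 first grows while the state leaves the unstable fixed point, then
# decays to zero as p_1 approaches the stable fixed point: speed acceleration
# near the unstable steady-state, speed decay near the stable one.
```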

Conclusion
We find a relation between the excess entropy production rate and the square of the line element in information geometry. Around the steady-state $\bar{\boldsymbol{p}}$, the Lyapunov candidate function of the Glansdorff-Prigogine criterion for stability, $\delta^2 L(\boldsymbol{p} \| \bar{\boldsymbol{p}})$, implies the square of the line element $ds^2(\bar{\boldsymbol{p}})$ in information geometry. Based on this relation, we generalize the Glansdorff-Prigogine criterion for stability. Our generalized criterion is more reasonable than the original criterion because our criterion can be the Lyapunov criterion for the Lyapunov candidate function $(ds(\delta \boldsymbol{p} + \bar{\boldsymbol{p}})/dt)^2$, which works well even for the nonlinear master equation. Thus, our information-geometric criterion for stability is useful for detecting the stability of the steady-state in the nonlinear master equation. Our criterion is related to the decay of the speed, $\frac{d}{dt} (ds(\delta \boldsymbol{p} + \bar{\boldsymbol{p}})/dt)^2$, on the manifold of the probability simplex, and is naturally understandable from the viewpoint of information geometry. Moreover, the trade-off relation between the intrinsic speed $ds(\boldsymbol{p})/dt$ and the speed of the observable $v_R(\boldsymbol{p})$ provides a novel thermodynamic trade-off relation between the intrinsic speed and the excess entropy production rate, and a physical interpretation of our information-geometric criterion for stability.
Our result gives a new information-geometric perspective on the nonlinear master equation. Thermodynamics for nonlinear dynamics has recently been discussed in terms of stochastic thermodynamics and chemical thermodynamics [49][59][60][61][62]. Our result might be useful to understand the stability of such nonlinear dynamics. It might be interesting to consider the bifurcation theory for nonlinear dynamics from the viewpoint of our information-geometric criterion for stability. Because we can discuss critical phenomena based on the bifurcation theory in statistical physics, our approach might give a new information-geometric perspective on critical phenomena.