Information Geometry of κ-Exponential Families: Dually-Flat, Hessian and Legendre Structures

In this paper, we present a review of recent developments in κ-deformed statistical mechanics in the framework of information geometry. Three different geometric structures are introduced in the κ-formalism, obtained starting from three non-equivalent divergence functions corresponding to the κ-deformed versions of the Kullback–Leibler, “Kerridge” and Brègman divergences. The first statistical manifold, derived from the κ-Kullback–Leibler divergence, forms an invariant geometry with a positive curvature that vanishes in the κ→0 limit. The other two statistical manifolds are related to each other by means of a scaling transform and are both dually-flat. They have a dualistic Hessian structure endowed with a deformed Fisher metric and an affine connection that are consistent with a statistical scalar product based on the κ-escort expectation. These flat geometries admit dual potentials corresponding to the thermodynamic Massieu and entropy functions, which induce a Legendre structure of κ-thermodynamics in the picture of information geometry.


Introduction
In the study of complex systems, long-tailed probability distributions are often encountered. Anomalous statistical behavior is widely observed in physical systems affected by long-range interactions or long-time memory effects, as well as in non-physical systems, such as biological, social and economic ones. In such systems, families of probability distributions characterized by deformed exponentials with asymptotic power-law tails play a relevant role.
To deal with these new phenomenologies, in recent decades a generalized version of statistical mechanics has been formulated in which power-law distributions are expressed in terms of a generalized exponential function and maximize suitable entropic forms.
In recent years, there has been growing interest in studying statistical physics, including its generalized versions, by means of information geometry [1,2]. In this framework, the methods of differential geometry are applied to study the properties of statistical manifolds constructed starting from a given parameterized probability distribution function.
The usefulness of the geometric approach in the study of the foundations of statistical mechanics and thermodynamics has been clear since the early works of Gibbs and Carathéodory [3,4].
On the thermodynamic side, after the pioneering works of Ruppeiner (see [5] for a review), who introduced a Riemannian metric in the thermodynamic parameter space starting from potentials like entropy and free energy, a rich literature has developed to study the implications that geometry might have for thermostatistics, in particular concerning its contact structure [6,7], the relationships between physical observables [8,9] and the thermodynamic transformations in a physical system [10,11].
Information geometry, by contrast, gives priority to the probability distributions. In this framework, one investigates the geometric structures of the statistical manifold generated by parametric families of distributions, by introducing a Riemannian metric in the probability distribution space [12], identified with the Fisher metric [13]. After that, many efforts have been made to better understand the role of geometry in statistics; to cite a few: Čencov [14], who introduced the affine connection in a statistical manifold; Csiszár [15], who introduced the concept of f-divergence; Efron [16], who studied the role of curvature in a statistical model; Eguchi [17], who related the affine connection to a divergence function or relative entropy; and Amari [18][19][20], who further developed the differential geometry of statistical models by elucidating its dualistic nature and introducing the Pythagorean theorem and the projection theorem in the framework of information geometry.
More recently, Amari and co-workers showed the existence of a rich geometry in the probability space, the so-called α-geometry, consisting of a dually affine connection endowed with a Hessian structure [21], which is particularly pertinent to the study of deformed exponential families [22][23][24][25].
In recent years, a certain interest has been devoted to the study of probability distributions generated by deformed exponential families in which the exponential function is replaced by its generalized version. For instance, in [26] a family of generalized exponentials was introduced by means of a positive increasing function and the corresponding thermostatistics was studied. Belonging to this family, the κ-exponential [27,28], at the heart of the κ-deformed statistical mechanics introduced by Kaniadakis in [29,30], is useful for studying anomalous systems characterized by non-Gibbsian distributions with an asymptotic scale-free behavior.
In this paper, inspired by recent works of Zhang and Naudts [66,67], we study the κ-distribution and its associated statistical manifold in the framework of information geometry, revisiting some results already known [68][69][70] and deriving new geometrical structures obtained from three different, non-equivalent divergence functions corresponding to the κ-deformed versions of the Brègman, "Kerridge" and Kullback–Leibler divergences.
It is known that, starting from a deformed exponential family, one can introduce several different kinds of statistical manifolds [71][72][73]. These can be derived from a hierarchical structure of escort expectations [74,75]. In particular, a Fisher metric can be obtained by introducing an expectation based on the first-order escort distribution, while a cubic form can be obtained by introducing an expectation based on the second escort distribution.
Escort averages can be useful to overcome several problems plaguing power-law distributions. For instance, such distributions do not have finite moments of every order: in κ-statistics, ⟨x^n⟩ < ∞ only for n < 1/κ − 1 [76]. An expectation defined by means of an escort distribution might be more appropriate to solve these and other questions, although other types of nonlinear expectations can also be introduced [77].
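The divergence of higher moments can be seen directly from the asymptotic power-law tail of the κ-exponential. The following numerical sketch (Python/NumPy; function names are ours, and the closed form of expκ is the standard one from the κ-literature) checks that expκ(−x) ~ (2κx)^(−1/κ) for large x, which is why ⟨x^n⟩ exists only for sufficiently small n:

```python
import numpy as np

def exp_k(x, k=0.3):
    # standard closed form of the kappa-exponential (assumed, see literature)
    return (k * x + np.sqrt(1 + (k * x)**2))**(1 / k)

k = 0.3
x = 1e6
tail = exp_k(-x, k)
# asymptotic power-law tail: exp_k(-x) ~ (2*k*x)**(-1/k) for x -> infinity,
# so densities built on exp_k decay like x**(-1/k) instead of exponentially
asymptote = (2 * k * x)**(-1 / k)
assert abs(tail / asymptote - 1) < 1e-2
```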
In the framework of the κ-deformed formalism, escort distributions were previously introduced in [68] to investigate a dually-flat geometry in the κ-distribution space, and then a double escort distribution was defined in [70] to study a second dually-flat geometry, which is based on the escort expectation instead of the standard expectation. In both these geometric structures, a kind of fluctuation-response relation, which could be relevant in the framework of non-equilibrium κ-deformed statistical mechanics, has been deduced by using, respectively, escort and double escort expectations. In addition, escort distributions [78] appear recurrently in the research areas of multi-fractals and generalized statistical mechanics.
In this paper, we show that within the κ-deformed exponential formalism one can introduce three kinds of geometrical structures. The first is an invariant geometry where the Fisher information is the unique Riemannian metric, together with a dual pair of invariant affine connections. The other two are dually-flat geometries that introduce a Legendre structure and a Hessian structure on the corresponding statistical manifold [70], where some potential functions, like the Massieu function ψ_κ and the entropic form S_κ, are related in a way formally similar to what is done in Kählerian geometry.
The main novelty of this work is to highlight the existence of an invariant geometry related to the κ-deformed formalism, never explored in the existing literature, as well as the existence of another dually-flat structure, different from but related to the one already studied in [70], which is characterized by scaled potentials, named para-potentials, and by a slimmer expression for the probability distribution.
The structure of the paper is as follows. In Section 2 we summarize the main aspects of the κ-deformed functions exp_κ(x), ln_κ(x) and u_κ(x), which represent the bricks constituting the entire κ-deformed formalism. In Section 3, for the convenience of the reader, we present an overview of the (by now) classical information geometry. The three statistical manifolds derived from different versions of the divergence function are studied in Section 4: in Section 4.1 we introduce the statistical manifold derived from the κ-Kullback–Leibler divergence, which has an invariant geometry with positive curvature, while in Sections 4.2 and 4.3 we introduce two dually-flat geometries with their Hessian structures obtained, respectively, from the κ-Kerridge and κ-Brègman divergences. Our conclusions are reported in Section 5.

κ-Deformed Functions
To begin with, let us review some preliminary facts concerning the κ-deformed functions, referring to the relevant literature for the details [26,27,37].
Like ln_κ(x), the deformed function u_κ(x) satisfies the relation λ u_κ(ε) = 1 as well as u_κ(1/x) = u_κ(x). The two functions ln_κ(x) and u_κ(x) are closely related to each other in several ways. For instance, from calculus we have d ln_κ(x)/dx = u_κ(x)/x and d u_κ(x)/dx = κ² ln_κ(x)/x. These two relations can be rewritten in the integral form ln_κ(x) = ∫_1^x u_κ(s)/s ds and u_κ(x) = 1 + κ² ∫_1^x ln_κ(s)/s ds, which can be assumed as the defining relations for these κ-deformed functions. A very general approach to introducing deformed logarithms by means of an integral relation has been formulated in [26]. Let φ(s) be a strictly increasing function; we can define a deformed logarithm, namely the φ-logarithm, as ln_φ(x) = ∫_1^x ds/φ(s). When we choose φ(s) = s, the φ-logarithm reduces to the standard logarithm ln(x). Otherwise, by posing φ(s) = s/u_κ(s), Equation (5) is just the first of Equations (4), and the φ-logarithm reproduces the κ-logarithm (1).
It is interesting to note that, according to Equation (4), the function u_κ(x) can be introduced in the same way as the κ-logarithm, by posing φ(s) = s/(κ² ln_κ(s)). However, it should be noted that u_κ(x) is not a deformed logarithm, since s/(κ² ln_κ(s)) is not a monotonically increasing function.
The φ-logarithm admits an inverse function, namely the φ-exponential, defined in implicit form by ln_φ(exp_φ(x)) = x. In the κ-formalism, this equation can be solved explicitly, so that the κ-deformed exponential takes the expression exp_κ(x) = (κ x + √(1 + κ² x²))^{1/κ}. Its analytic properties follow from those of the κ-logarithm: for κ² < 1, exp_κ(x) = exp_{−κ}(x) is a continuous, monotonic, increasing and convex function, with exp_κ(R) ⊆ R⁺ and exp_κ(0) = 1. Moreover, it satisfies the relation exp_κ(−x) exp_κ(x) = 1, like the standard exponential does.
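The basic identities above are easy to verify numerically. The sketch below (Python/NumPy; naming is ours) implements ln_κ, u_κ and exp_κ using the standard closed forms from the κ-literature and checks the inversion and product identities together with the κ → 0 limit:

```python
import numpy as np

# Standard closed forms of the kappa-deformed functions (Kaniadakis):
#   ln_k(x)  = (x**k - x**(-k)) / (2*k)
#   u_k(x)   = (x**k + x**(-k)) / 2
#   exp_k(x) = (k*x + sqrt(1 + (k*x)**2))**(1/k)

def ln_k(x, k=0.3):
    return (x**k - x**(-k)) / (2 * k)

def u_k(x, k=0.3):
    return (x**k + x**(-k)) / 2

def exp_k(x, k=0.3):
    return (k * x + np.sqrt(1 + (k * x)**2))**(1 / k)

x, k = 2.5, 0.3
# exp_k inverts ln_k
assert abs(exp_k(ln_k(x, k), k) - x) < 1e-9
# exp_k(-x) * exp_k(x) = 1, as for the ordinary exponential
assert abs(exp_k(-x, k) * exp_k(x, k) - 1) < 1e-9
# u_k(1/x) = u_k(x)
assert abs(u_k(1 / x, k) - u_k(x, k)) < 1e-12
# kappa -> 0 limit recovers the ordinary logarithm
assert abs(ln_k(x, 1e-6) - np.log(x)) < 1e-6
```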
On the physical ground, the equilibrium distribution of a system described by an entropic form S(p), constrained by a given observable U (for instance, the internal energy), can be obtained from the optimization problem (8), where β is the Lagrange multiplier associated with the constraint U = E_p[·], with E_p[x] = ∑_µ p_µ x_µ the standard linear average, and where the outcomes of the observable correspond, for instance, to the available energy levels of the system (hereinafter, Greek indices run from 0 to n; Latin indices run from 1 to n).
In Equation (8) we assumed, a priori, the normalization of the pdf, so that we consider p_0 = 1 − ∑_i p_i as a function of the other p_i, avoiding in this way the introduction of the normalization as a further constraint. In the κ-formalism, the entropy assumes the trace-form expression S_κ(p) = −∑_µ p_µ ln_κ(p_µ) [29,30], which reduces to the Boltzmann–Gibbs–Shannon entropy S_BGS(p) = −∑_µ p_µ ln(p_µ) in the κ → 0 limit. The optimization problem (8) then becomes the κ-deformed problem (10) which, solved for p_i, gives the κ-distribution, where we have posed θ ≡ {θ_i}, defined in Equation (12). Here, γ(θ) is a function of the parameters θ_i and is fixed by the normalization condition. Strictly related to a given entropic form, the divergence function plays a relevant role in information theory. In a sense, it measures the "distance" between an arbitrary distribution p_µ and a reference distribution q_µ, although it is not rigorously a "distance" function since, in general, it is not symmetric in its arguments and does not satisfy the triangle inequality.
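As an illustration of the κ → 0 limit mentioned above, the following snippet evaluates the trace-form κ-entropy numerically (assuming the standard form S_κ(p) = −∑_µ p_µ ln_κ(p_µ)) and compares it with the Boltzmann–Gibbs–Shannon entropy:

```python
import numpy as np

def ln_k(x, k):
    return (x**k - x**(-k)) / (2 * k)

def S_kappa(p, k):
    # trace-form kappa-entropy, assuming S_k(p) = -sum_mu p_mu ln_k(p_mu)
    return -np.sum(p * ln_k(p, k))

p = np.array([0.5, 0.3, 0.2])
S_BGS = -np.sum(p * np.log(p))
# for small kappa the kappa-entropy approaches the BGS entropy
assert abs(S_kappa(p, 1e-5) - S_BGS) < 1e-6
# for finite kappa it stays positive but differs from BGS
assert S_kappa(p, 0.4) > 0
assert abs(S_kappa(p, 0.4) - S_BGS) > 1e-3
```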
Within the Shannon entropy, the Kullback–Leibler divergence D[p; q] [79] can be written in equivalent forms: one that we will call, with an abuse of language, the "Kerridge" divergence, because its cross term is known in information theory as the Kerridge inaccuracy [80], and another known as the Brègman divergence [81]. However, the situation is more complicated in the κ-formalism [44] since, in this case, the above expressions, rewritten in their κ-deformed versions, turn out to be no longer equivalent, nor related to each other in a simple manner, as they are in the κ = 0 case. The functionals (16)–(18) are non-negative and vanish iff p = q, a property that follows from the concavity of ln_κ(x). In particular, the κ-Brègman divergence (18) has been introduced and studied in [37].
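A numerical sketch can make the loss of equivalence concrete. The forms below are plausible κ-analogues of the three divergences (the exact conventions of Eqs. (16)–(18) may differ; the Brègman form here is the generic Bregman divergence of the negentropy with a numerical gradient, not the paper's closed form):

```python
import numpy as np

def ln_k(x, k):
    return (x**k - x**(-k)) / (2 * k)

def D_kl(p, q, k):
    # kappa-Kullback-Leibler-type form (assumed convention)
    return np.sum(p * ln_k(p / q, k))

def D_ker(p, q, k):
    # kappa-"Kerridge"-type form: entropy term plus cross term (assumed)
    return np.sum(p * (ln_k(p, k) - ln_k(q, k)))

def negent(p, k):
    # convex negentropy -S_k(p)
    return np.sum(p * ln_k(p, k))

def D_breg(p, q, k, h=1e-6):
    # generic Bregman divergence of the negentropy, numerical gradient
    grad = np.array([(negent(q + h * e, k) - negent(q - h * e, k)) / (2 * h)
                     for e in np.eye(len(q))])
    return negent(p, k) - negent(q, k) - grad @ (p - q)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])
k = 0.4
# each functional vanishes at p = q ...
for D in (D_kl, D_ker, D_breg):
    assert abs(D(p, p, k)) < 1e-8
# ... the Bregman form is non-negative by convexity of the negentropy ...
assert D_breg(p, q, k) > 0
# ... and for kappa != 0 the three forms are no longer equivalent
vals = [D_kl(p, q, k), D_ker(p, q, k), D_breg(p, q, k)]
assert max(vals) - min(vals) > 1e-3
# in the kappa -> 0 limit all three reduce to the standard KL divergence
kl0 = np.sum(p * np.log(p / q))
assert all(abs(D(p, q, 1e-6) - kl0) < 1e-3 for D in (D_kl, D_ker, D_breg))
```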

Statistical Manifold and Its Hessian Structure
In this section, we summarize the main aspects of a statistical manifold [1,2]. Let θ ≡ {θ^1, . . . , θ^m} ∈ Ξ be a set of m real parameters, where Ξ is an open subset of R^m. We introduce a discrete statistical model S, the (n + 1)-dimensional simplex of discrete probability functions on the sample space Ω, given in Equation (19), where µ = 0, . . . , n labels the n + 1 possible outcomes with probability p_µ. Under suitable conditions, S can be regarded as a manifold with local coordinate system θ, endowed with the metric tensor (20), with the cubic form (21), a third-order completely symmetric tensor, and with the Riemannian (Levi-Civita) connection (22). Starting from definitions (21) and (22), we can introduce a family of affine connections, named α-connections. In particular, the 1-connection and the −1-connection play a special role in information geometry. They are called, respectively, the exponential connection ∇^(e) ≡ ∇^(1) and the mixture connection ∇^(m) ≡ ∇^(−1), and in a local representation they are given explicitly. We recall that the metric (20) is known in statistics as the Fisher information and is the only Riemannian metric invariant under sufficient statistics on the manifold of probability distributions. For this reason, the geometry defined by Equations (20)–(22) is called the invariant geometry.
More generally, starting from an affine connection ∇, we can introduce, with respect to g, a dual connection ∇*, defined by means of the duality relation. We say the triplet (Ξ, g, ∇) is a statistical manifold if ∇g is totally symmetric; in addition, if ∇ is torsion-free, then ∇*g is also totally symmetric. Therefore, (Ξ, g, ∇*) is again a statistical manifold, dual to (Ξ, g, ∇). We say the statistical manifold (Ξ, g, ∇) is flat if ∇ is curvature-free. In this case, there exists locally on Ξ a coordinate system θ, named an affine coordinate system, such that the connection coefficients Γ_{ij,l} vanish on its coordinate neighbourhood.
It can be shown that Γ^(α) and Γ^(−α) are dual to each other, and this holds also for the exponential and mixture connections. In particular, as is well known, the statistical model defined by the exponential family (27), where c_j(χ) are given functions of a random variable χ and γ(θ) is the normalization factor, admits a set of affine coordinates such that the exponential connection vanishes; the corresponding statistical manifold is said to be exponential-flat. In the same way, the statistical model defined by the mixture family (28), where r(χ) = {r_µ(χ)} are given distribution functions of a random variable χ, with ∑_µ η_µ = 1, admits a set of affine coordinates such that the mixture connection vanishes; the corresponding statistical manifold is said to be mixture-flat. If ∇ is flat on Ξ, we say that the statistical manifold (Ξ, g, ∇) forms a Hessian structure on Ξ if there exists a function ψ ≡ ψ(θ) such that, at least locally, g_{ij} = ∂_i ∂_j ψ(θ). In this case, let θ be a ∇-affine coordinate system on Ξ; then there exists a dual ∇-affine coordinate system η ≡ {η_j} on the dual space Ξ, with ∂^j = ∂/∂η_j, such that the relation g^{ij} = ∂^i ∂^j ϕ(η) holds for some function ϕ, where g^{ij} is the inverse of the Riemannian metric g_{ij}; the cubic form follows accordingly. It can be shown that the two dual affine coordinate systems can be introduced according to η_i = ∂_i ψ(θ) and θ^i = ∂^i ϕ(η), where the two functions, named respectively the ψ-potential and the ϕ-potential, are related by ψ(θ) + ϕ(η) − ∑_i θ^i η_i = 0, which introduces a Legendre structure on the statistical manifold.
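As a concrete undeformed instance of the Legendre structure just described, the sketch below (Python/NumPy; names are ours) works with the discrete categorical exponential family: ψ(θ) = ln(1 + ∑_i e^{θ_i}) plays the role of the ψ-potential, its gradient gives the dual coordinates η_i = p_i, and the Legendre transform recovers the negentropy as the ϕ-potential, while the Hessian of ψ gives the Fisher metric:

```python
import numpy as np

def psi(theta):
    # Massieu-type potential of the categorical family (p_0 as reference)
    return np.log(1 + np.sum(np.exp(theta)))

theta = np.array([0.4, -0.7])
expt = np.exp(theta)
p = expt / (1 + expt.sum())     # p_1, p_2 = dual coordinates eta_i
p0 = 1 / (1 + expt.sum())
eta = p

# phi-potential from the Legendre transform: phi = sum theta*eta - psi
phi = theta @ eta - psi(theta)
# phi coincides with the negentropy  sum_mu p_mu log(p_mu)
negent = p0 * np.log(p0) + np.sum(p * np.log(p))
assert abs(phi - negent) < 1e-12

# the Hessian of psi is the Fisher metric  g_ij = delta_ij p_i - p_i p_j
g = np.diag(p) - np.outer(p, p)
ginv = np.linalg.inv(g)
# its inverse has the simplex form  g^ij = delta_ij / p_i + 1 / p_0
ginv_expected = np.diag(1 / p) + 1 / p0
assert np.allclose(ginv, ginv_expected)
```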
Finally, let us define on Ξ × Ξ the canonical divergence of the Hessian manifold (Ξ, g, ∇), D[p, q] = ψ(θ_p) + ϕ(η_q) − ∑_i θ^i_p η_i^q, which is a Brègman divergence [81]. It introduces an asymmetric pseudo-distance, with D[p, q] ≥ 0, where equality holds iff p = q, in agreement with Equation (35). If (Ξ, g, ∇) forms a Hessian structure, it is easy to verify the corresponding canonical relations, where ∂^p_i ≡ ∂/∂θ^i(p), ∂_p^i ≡ ∂/∂η_i(p), and similarly for q. More generally, following [82], any positive definite contrast function ρ[p, q], that is, a divergence compatible with the structure of the statistical manifold, induces a Riemannian geometry on Ξ, not necessarily flat, whose metric and connections are given by Equations (41), and similarly for the lower indices.
Remarkably, different contrast functions that share the same cross term induce the same geometric structure since, according to the above equations, the cross derivative vanishes for terms depending only on p or q.

Statistical Manifolds in κ-Formalism
As is known, any discrete distribution belongs to an exponential family and to a mixture family at the same time [2]. In fact, starting from the statistical model (19), we can define the distribution (42), where ϑ_i(ν) = 1 when the discrete random variable ν = i and zero otherwise, which clearly belongs to the mixture family (28), with r_µ(x_ν) ≡ ϑ_µ(ν) and η_i ≡ p_i. Otherwise, by applying the logarithm to Equation (42), we obtain a distribution that belongs to the exponential family (27), with θ^i ≡ ln(p_i) − ln(p_0), c_ν(x_i) = ϑ_i(ν) and γ(θ) ≡ − ln(p_0).

More generally, given a generalized logarithm function ln_φ(x), from Equation (42) we can introduce the distribution
which is in a discrete generalized exponential family with parameters θ^i ≡ ln_φ(p_i) − ln_φ(p_0), c_ν(x_i) = ϑ_i(ν) and γ(θ) ≡ − ln_φ(p_0). Therefore, the manifold of discrete probability distributions is a super-manifold in which any statistical model of a discrete random variable, including the φ-exponential family à la Naudts, is embedded as a submanifold. Different embeddings of the same distribution induce different geometric structures, which are related to non-equivalent divergence functionals. In the following, we explore these alternatives starting from the divergence functionals introduced in Equations (16)–(18).
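The generalized-exponential embedding just described can be checked numerically in the κ-case: given a hypothetical discrete distribution p, the parameters θ^i = ln_κ(p_i) − ln_κ(p_0) and γ(θ) = −ln_κ(p_0) reconstruct p exactly through exp_κ (a sketch with our naming):

```python
import numpy as np

def ln_k(x, k=0.3):
    return (x**k - x**(-k)) / (2 * k)

def exp_k(x, k=0.3):
    return (k * x + np.sqrt(1 + (k * x)**2))**(1 / k)

# hypothetical 4-outcome distribution, p_0 as the reference outcome
p = np.array([0.4, 0.3, 0.2, 0.1])
k = 0.3

# natural coordinates of the kappa-exponential embedding
theta = ln_k(p[1:], k) - ln_k(p[0], k)
gamma = -ln_k(p[0], k)

# reconstruct: p_nu = exp_k(theta_nu - gamma), and p_0 = exp_k(-gamma)
p_rec = exp_k(np.concatenate(([0.0], theta)) - gamma, k)
assert np.allclose(p_rec, p)
```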
It is worth noticing that, in the above arguments, no hypothesis has been made on the origin of the distribution p. However, if this distribution comes from an optimization problem, as in Equation (10), the coordinates θ_i coincide with those introduced in Equation (12) and are related to the Lagrange multipliers of the corresponding constraints.

First Statistical Structure
The first geometric structure we introduce comes from the κ-deformed version of the Kullback–Leibler divergence (16). It belongs to the family of f-divergences studied by Csiszár [15], where, in general, f(x) is any convex differentiable function satisfying the condition f(1) = 0. Clearly, Equation (16) follows from (46) for f(x) = − ln_κ(x).
As is known, any f-divergence induces a geometry invariant under sufficient statistics and, except for the Kullback–Leibler divergence, the corresponding statistical manifold does not have a Hessian structure, nor is it dually flat, and hence it is characterized by a non-vanishing curvature [2].
In this case, the embedding we implement is given by the standard logarithmic embedding, where the natural coordinates are those of the exponential family, η ≡ {p_i} and θ ≡ {θ^i}. By posing ρ[p, q] ≡ D_κ[q; p], from Equations (41) we straightforwardly derive the metric and connections in the Ξ space, according to Equations (48)–(50), where δ_{ijl} = 1 for i = j = l and zero otherwise; they coincide, as expected, with the definitions of the statistical Fisher information matrix and the α-connection.
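The recipe of Equations (41), g_ij = −∂_i ∂'_j ρ[p, q]|_{q=p}, can be checked numerically in the undeformed κ → 0 case, where the κ-Kullback–Leibler divergence reduces to the ordinary one. The sketch below (illustrative, our naming) differentiates the KL divergence of a categorical family by central differences and recovers the Fisher metric:

```python
import numpy as np

def p_of(theta):
    # categorical family with theta_0 = 0 as reference
    z = np.concatenate(([0.0], theta))
    w = np.exp(z)
    return w / w.sum()

def kl(theta_p, theta_q):
    p, q = p_of(theta_p), p_of(theta_q)
    return np.sum(p * np.log(p / q))

theta = np.array([0.4, -0.7])
h = 1e-5
n = len(theta)
g = np.zeros((n, n))
# g_ij = - mixed second derivative of rho[p(theta), q(theta')] at theta' = theta
for i in range(n):
    for j in range(n):
        ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
        g[i, j] = -(kl(theta + ei, theta + ej) - kl(theta + ei, theta - ej)
                    - kl(theta - ei, theta + ej) + kl(theta - ei, theta - ej)) / (4 * h * h)

# analytic Fisher metric of the categorical family: diag(p) - p p^T
p = p_of(theta)[1:]
fisher = np.diag(p) - np.outer(p, p)
assert np.allclose(g, fisher, atol=1e-4)
```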
The inverse relations are given accordingly, while from Equation (23) we can derive the Levi-Civita connection, which does not depend on the deformation parameter, and the cubic form. These relations confirm that the κ-Kullback–Leibler divergence actually introduces the α-geometry in the framework of the κ-formalism, with α = 1 + 2κ², in agreement with the general relation α = 3 + 2 f‴(1) [23]. Remark that this correspondence holds only for α ≥ 1, considering that the exponential family follows in the κ → 0 limit. This should be compared with the results discussed in [25], where an invariant geometry has been derived in the framework of the Tsallis formalism of statistical mechanics starting from the corresponding relative entropy. In that case, the analogous relationship between the parameters is given by α = 1 − 2q, which is consistent for α < 1.
The statistical manifold introduced in Equations (48)–(50) does not have a dually-flat structure, nor a Hessian structure, but it has constant curvature. As expected, its Riemann–Christoffel tensor R^(κ)_{ijls} vanishes in the κ → 0 limit, where the whole geometric structure collapses to the dually-flat 1-geometry.
The Ricci curvature tensor and the scalar curvature are readily evaluated; they are positive definite quantities for κ² < 1. Finally, in spite of the lack of an entropic form related to this invariant geometry, we can derive a particular family of discrete distributions within the κ-formalism by adopting an optimization procedure based on the κ-deformed Kullback–Leibler divergence.
In this way, the equilibrium distribution achieving the minimum of the divergence function (16) is obtained from an optimization problem in which we assume, a priori, the normalization for p, that is, p_0 = 1 − ∑_i p_i, and similarly for the target distribution q. We get the κ-distribution in a form with γ(θ) = −λ ln_κ(ε p_0 q_0), where θ̃, defined in a similar manner as in Equation (12), is related to the natural coordinates θ. Therefore, let q be a reference probability distribution on S; the sub-manifold of the statistical family that optimizes the divergence function (16) admits a Riemannian structure described by the Fisher metric (48), the dual connections (49) and (50), and constant positive curvature (59). This geometric structure collapses, in the κ → 0 limit, into the dually-flat geometry of the exponential distribution families.

Second Statistical Structure
The next geometric structure we introduce comes from the κ-deformed version of the "Kerridge" divergence (17). It is strictly related to a statistical model S_κ^(2) consistent with the κ-logarithmic embedding. To introduce a manifold structure on S_κ^(2), we pose η ≡ {p_i} so that, from relations (41) applied to ρ[p, q] ≡ D_κ[q‖p], we obtain the metric (66) and the connections (67). These quantities define a dually-flat structure on S_κ^(2) compatible with a Hessian geometry. In fact, given the ϕ-potential, where S̃_κ(η) is the parentropy introduced in [34] and related to the κ-entropy, the metric and connections follow straightforwardly from Equations (31) and (33), respectively. Equation (68) states that the coordinate η is ∇-affine on Ξ, whereas its dual coordinate system is given with γ(θ) = − ln_κ(η_0). The ψ-potential can be obtained from the Legendre transform (35) of ϕ(η), where the pull-back of Ĩ_κ(θ) on Ξ, denoted by Ĩ*_κ(η), is defined accordingly. The function Ĩ*_κ(η) is related to I*_κ(η), usually introduced in κ-deformed statistical mechanics [35], through a factor that reduces to unity in the κ → 0 limit. In [34], the functional I*_κ(η) has been related to the parentropy S̃_κ[η]. As will be highlighted in Section 4.3, the ψ-potential is related, on the physical ground, to the κ-logarithm of the partition function and to the scaled free energy, the para-free energy, of a physical system described by the parentropy (70). By using the expression (72) we can verify the first of Equations (34), so that the metric and connections (79)–(81) on the Ξ-space follow, confirming the dually-flat structure of S_κ^(2), with θ the ∇-affine coordinate on Ξ. It is worthwhile to derive the canonical divergence for S_κ^(2), which is a Brègman-like divergence obtainable from Equation (36).
In fact, by using the κ-parentropy, it takes the form (82) and, as expected, although it differs from the divergence (17), the two share the same cross term, so that both divergences (17) and (82) give rise to the same geometric structure. Finally, it is easy to verify that the exponential family S_κ^(2) can be derived from the optimization problem (8), where the entropic form is given by the parentropy S̃_κ(η) and the dual coordinates are related to the Lagrange multipliers according to Equation (12).
To proceed further in the study of this geometry, we show that the Hessian structure on Ξ is consistent with a scalar product on S_κ^(2) induced by the κ-escort probability distribution P^(1). We employ an embedding ℓ̃^(κ) such that E_{P^(1)}[∂_i ℓ̃^(κ)] = 0, which is our bias condition [69].
Introducing the tangent vector T_i at the point θ, the metric follows from the standard definition and is conformally equivalent to the metric (66) according to [73]. In the same manner, recalling that the connection is related to the variation of the tangent vectors T_j under an infinitesimal change of the parameters θ → θ + dθ, we consider the quantities ∂_i T_j ≡ ∂_{ij} ℓ̃^(κ), which, projected onto the base vectors T_l, give a quantity related to Equation (67) by means of the conformal factor U_κ. It is important to observe that ∂_i T_j is actually unbiased; therefore, the right way to obtain the connection elements is by biasing these quantities and then projecting them onto the base vectors T_l, which nevertheless reproduces the result (90) due to our bias condition. Finally, Equation (68) follows from relation (25) and the results (66) and (67).
In a similar way, we can derive the expressions (79)–(81) from the definitions of g_{ij}, Γ_{ij,l} and Γ*_{ij,l}, here rewritten in the κ-escort formalism, where we introduced the double-escort probability P̃ and used the relation obtained from the condition ∂_i(∑_µ p_µ) = 0.
Remarkably, in the escort formalism, the κ-Brègman divergence can be expressed in the language of the escort distributions, which admits a very straightforward interpretation.

Third Statistical Structure
The last geometric structure we introduce comes from the Brègman-like divergence and has already been studied from a different perspective in [68][69][70].
It is consistent with a statistical model S_κ^(3) derived from the κ-exponential embedding. To introduce a manifold structure on S_κ^(3), let us identify the η-coordinate as η ≡ {p_i} and apply relations (41) to ρ[p, q] ≡ D_κ[q‖p]. We obtain the metric and connections accordingly. Clearly, S_κ^(3) has a dually-flat structure compatible with the Hessian geometry related to the ϕ-potential ϕ(η) ≡ −S_κ(η), that is, the negentropy of S_κ(η) given in Equation (9), so that the dual ∇-affine coordinates on Ξ follow, while the ψ-potential is obtained from the Legendre transform of ϕ(η) and reads as in Equation (105). Remark that for positive measures, not constrained by ∑_µ p_µ = 1, I_κ is just the full Legendre transform of S_κ in the probability space [68].
On the physical ground, the potential (105) corresponds to the κ-logarithm of the partition function and is related to the free energy of the system. Therefore, the third statistical structure, as well as the second one in its scaled formulation, reproduces and confirms the Legendre structure of the κ-deformed thermostatistics previously derived by using standard arguments of statistical mechanics [35].
By using the expression of ψ(θ), it is easy to verify the first of Equations (34), so that the metric and connections on the Ξ-space follow, confirming the dually-flat structure of S_κ^(3). It is trivial to verify that the canonical divergence D[p, q] on the statistical manifold S_κ^(3) coincides, as expected, with the κ-Brègman divergence D_κ[q‖p], according to [68]. Finally, let us observe that, in complete analogy with the discussion in Section 4.2, the Hessian structure on S_κ^(3) is also compatible with a scalar product defined through the escort average, with the κ-escort distribution P^(1). Again, in the escort formalism, the κ-Brègman divergence can be written in the straightforward manner given by Equation (115).

Concluding Remarks
In the framework of the κ-formalism, we have derived several geometric structures starting from non-equivalent κ-divergence functions corresponding, in the κ → 0 limit, to the Kullback–Leibler, "Kerridge" and Brègman divergences, respectively.
The main results that characterize the emerging geometries, such as metric, connection, θ-coordinate and ϕ-and ψ-potentials, when available, are reported in the final Table 1, for an easy comparison. In particular, the statistical manifold derived from the κ-Kullback-Leibler divergence has an invariant geometry with a dual structure and constant curvature. This geometric structure corresponds to the well-known α-geometry [1,2], with α = 1 + 2 κ 2 , and is related to the natural embedding of the standard logarithm.
The other two statistical manifolds, derived from the κ-Kerridge and κ-Brègman divergences, have a dually-flat structure with dual potentials related to each other by the corresponding Legendre transform. Their respective potentials are given by the scaled entropy (or parentropy) and scaled free energy in the first case, and by the entropic form and free energy in the latter case. Both these geometries are compatible with an embedding based on the κ-logarithm and a scalar product based on the escort probability average. Moreover, the κ-Brègman functional turns out to be the natural canonical divergence in its own statistical manifold, while the κ-Kerridge functional shares the same cross term with the corresponding canonical divergence and is therefore equivalent to it. These dually-flat structures are related to each other by a mere scaling transformation of the potentials and the distributions. In this way, the structure based on the κ-Brègman functional has simpler expressions for the potentials, while the structure based on the κ-Kerridge functional has a slimmer expression for the distribution.
Table 1 (fragment). θ-coordinate: θ_i = λ lnκ(ε p_i) + lnκ(ε p_0); ϕ-potential: ϕ(η) = −Sκ[η]; ψ-potential: ψ(θ) = Iκ(θ) + γ(θ).

Author Contributions: A.M.S. designed the main part of the work with the help of the other authors and mainly wrote the manuscript. H.M. and T.W. commented on the manuscript at all stages. All authors equally performed the research and discussed the results. All authors have read and approved the final manuscript.

Conflicts of Interest:
The authors declare no conflict of interest.