
The Augustin Capacity and Center

Problems of Information Transmission

Abstract

For any channel, the existence of a unique Augustin mean is established for any positive order and probability mass function on the input set. The Augustin mean is shown to be the unique fixed point of an operator defined in terms of the order and the input distribution. The Augustin information is shown to be continuously differentiable in the order. For any channel and convex constraint set with finite Augustin capacity, the existence of a unique Augustin center and the associated van Erven-Harremoës bound are established. The Augustin-Legendre (A-L) information, capacity, center, and radius are introduced, and the latter three are proved to be equal to the corresponding Rényi-Gallager quantities. The equality of the A-L capacity to the A-L radius for arbitrary channels and the existence of a unique A-L center for channels with finite A-L capacity are established. For all interior points of the feasible set of cost constraints, the cost constrained Augustin capacity and center are expressed in terms of the A-L capacity and center. Certain shift-invariant families of probabilities and certain Gaussian channels are analyzed as examples.
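For orientation, the following is a minimal sketch of the standard definitions behind these statements, written here for finite input and output alphabets; the symbols W, p, q below are generic notation rather than the article's, the article works with arbitrary channels, and its exact conventions and normalizations may differ. For a channel W, an order α in (0,1)∪(1,∞), and an input distribution p, the order-α Rényi divergence and the Augustin information are

\[
D_{\alpha}(w \,\|\, q) = \frac{1}{\alpha - 1} \ln \sum_{y} w(y)^{\alpha}\, q(y)^{1-\alpha},
\qquad
I_{\alpha}(p; W) = \inf_{q} \sum_{x} p(x)\, D_{\alpha}\bigl(W(\cdot \mid x) \,\|\, q\bigr),
\]

and the minimizing output distribution $q_{\alpha,p}$, the Augustin mean, satisfies a fixed-point equation of the form

\[
q_{\alpha,p}(y) = \sum_{x} p(x)\,
\frac{W(y \mid x)^{\alpha}\, q_{\alpha,p}(y)^{1-\alpha}}
{\sum_{y'} W(y' \mid x)^{\alpha}\, q_{\alpha,p}(y')^{1-\alpha}}.
\]

The Augustin capacity of a constraint set of input distributions is the supremum of $I_{\alpha}(p;W)$ over that set, and the Augustin center is the corresponding optimal output distribution.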


References

  1. Nakiboğlu, B., The Augustin Center and the Sphere Packing Bound for Memoryless Channels, Proc. 2017 IEEE Int. Sympos. on Information Theory (ISIT’2017), Aachen, Germany, June 25–30, 2017, pp. 1401–1405.

  2. Csiszár, I., Generalized Cutoff Rates and Rényi’s Information Measures, IEEE Trans. Inform. Theory, 1995, vol. 41, no. 1, pp. 26–34.

  3. Dalai, M., Some Remarks on Classical and Classical-Quantum Sphere Packing Bounds: Rényi vs. Kullback-Leibler, Entropy, 2017, vol. 19, no. 7, pp. 355 (11 pp.).

  4. Mosonyi, M. and Ogawa, T., Divergence Radii and the Strong Converse Exponent of Classical-Quantum Channel Coding with Constant Compositions, arXiv:1811.10599v4 [quant-ph], 2018.

  5. Csiszár, I. and Körner, J., Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge, UK: Cambridge Univ. Press, 2011, 2nd ed. First edition translated under the title Teoriya informatsii: teoremy kodirovaniya dlya diskretnykh sistem bez pamyati, Moscow: Mir, 1985.

  6. Augustin, U., Noisy Channels, Habilitation Thesis, Universität Erlangen-Nürnberg, 1978. Available at http://bit.ly/2ID8h7m.

  7. Nakiboğlu, B., The Sphere Packing Bound for Memoryless Channels, arXiv:1804.06372 [cs.IT], 2018.

  8. van Erven, T. and Harremoës, P., Rényi Divergence and Kullback-Leibler Divergence, IEEE Trans. Inform. Theory, 2014, vol. 60, no. 7, pp. 3797–3820.

  9. Shayevitz, O., A Note on a Characterization of Rényi Measures and Its Relation to Composite Hypothesis Testing, arXiv:1012.4401v2 [cs.IT], 2010.

  10. Shayevitz, O., On Rényi Measures and Hypothesis Testing, in Proc. 2010 IEEE Int. Sympos. on Information Theory (ISIT’2010), Austin, Texas, USA, June 13–18, 2010, pp. 894–898.

  11. Verdú, S., α-Mutual Information, in Proc. 2015 Information Theory and Applications Workshop (ITA’2015), San Diego, CA, USA, Feb. 1–6, 2015, pp. 1–6. Available at http://ita.ucsd.edu/workshop/15/files/paper/paper_374.pdf.

  12. Kemperman, J.H.B., On the Shannon Capacity of an Arbitrary Channel, Indag. Math., 1974, vol. 77, no. 2, pp. 101–115.

  13. Nakiboğlu, B., The Rényi Capacity and Center, IEEE Trans. Inform. Theory, 2019, vol. 65, no. 2, pp. 841–860.

  14. Gallager, R.G., A Simple Derivation of the Coding Theorem and Some Applications, IEEE Trans. Inform. Theory, 1965, vol. 11, no. 1, pp. 3–18.

  15. Gallager, R.G., Information Theory and Reliable Communication, New York: Wiley, 1968.

  16. Ebert, P.M., Error Bounds For Parallel Communication Channels, Tech. Rep. of Research Lab. of Electronics, MIT, Cambridge, MA, USA, 1966, no. 448. Available at https://dspace.mit.edu/handle/1721.1/4295.

  17. Richters, J.S., Communication over Fading Dispersive Channels, Tech. Rep. of Research Lab. of Electronics, MIT, Cambridge, MA, USA, 1967, no. 464. Available at https://dspace.mit.edu/handle/1721.1/4279.

  18. Haroutunian, E.A., Bounds for the Exponent of the Probability of Error for a Semicontinuous Memoryless Channel, Probl. Peredachi Inf., 1968, vol. 4, no. 4, pp. 37–48 [Probl. Inf. Transm. (Engl. Transl.), 1968, vol. 4, no. 4, pp. 29–39].

  19. Poltyrev, G. Sh., Random Coding Bounds for Discrete Memoryless Channels, Probl. Peredachi Inf., 1982, vol. 18, no. 1, pp. 12–26 [Probl. Inf. Transm. (Engl. Transl.), 1982, vol. 18, no. 1, pp. 9–21].

  20. Dudley, R.M., Real Analysis and Probability, Cambridge: Cambridge Univ. Press, 2002.

  21. Bogachev, V.I., Measure Theory, Berlin: Springer, 2007.

  22. Nakiboğlu, B., The Augustin Capacity and Center, arXiv:1803.07937 [cs.IT], 2018.

  23. Fano, R.M., Transmission of Information: A Statistical Theory of Communications, New York: M.I.T. Press, 1961.

  24. Arimoto, S., Computation of Random Coding Exponent Functions, IEEE Trans. Inform. Theory, 1976, vol. 22, no. 6, pp. 665–671.

  25. Oohama, Y., The Optimal Exponent Function for the Additive White Gaussian Noise Channel at Rates above the Capacity, in Proc. 2017 IEEE Int. Sympos. on Information Theory (ISIT’2017), Aachen, Germany, June 25–30, 2017, pp. 1053–1057.

  26. Oohama, Y., Exponent Function for Stationary Memoryless Channels with Input Cost at Rates above the Capacity, arXiv:1701.06545v3 [cs.IT], 2017.

  27. Vazquez-Vilar, G., Martinez, A., and Guillén i Fàbregas, A., A Derivation of the Cost-Constrained Sphere-Packing Exponent, in Proc. 2015 IEEE Int. Sympos. on Information Theory (ISIT’2015), Hong Kong, China, June 14–19, 2015, pp. 929–933.

  28. Rényi, A., On Measures of Entropy and Information, Proc. 4th Berkeley Sympos. on Mathematical Statistics and Probability, Berkeley, CA, USA, June 20–July 30, 1960, Neyman, J., Ed., Berkeley: Univ. of California Press, 1961, vol. 1: Contributions to the Theory of Statistics, pp. 547–561.

  29. Csiszár, I., Information-type Measures of Difference of Probability Distributions and Indirect Observations, Studia Sci. Math. Hungar., 1967, vol. 2, no. 3–4, pp. 299–318.

  30. Gilardoni, G.L., On Pinsker’s and Vajda’s Type Inequalities for Csiszár’s f-Divergences, IEEE Trans. Inform. Theory, 2010, vol. 56, no. 11, pp. 5377–5386.

  31. Shiryaev, A.N., Probability, New York: Springer, 1995.

  32. Polyanskiy, Y. and Verdú, S., Arimoto Channel Coding Converse and Rényi Divergence, in Proc. 48th Annual Allerton Conf. on Communication, Control, and Computing, Allerton, IL, USA, Sept. 29–Oct. 1, 2010, pp. 1327–1333.

  33. Kolmogorov, A.N. and Fomin, S.V., Elementy teorii funktsii i funktsional’nogo analiza (Basics of Function Theory and Functional Analysis), Moscow: Nauka, 1968. Translated under the title Introductory Real Analysis, New York: Dover, 1975.

  34. Csiszár, I., A Class of Measures of Informativity of Observation Channels, Period. Math. Hungar., 1972, vol. 2, no. 1–4, pp. 191–213.

  35. Sibson, R., Information Radius, Z. Wahrsch. Verw. Gebiete, 1969, vol. 14, no. 2, pp. 149–160.

  36. Blahut, R.E., Hypothesis Testing and Information Theory, IEEE Trans. Inform. Theory, 1974, vol. 20, no. 4, pp. 405–417.

  37. Kostina, V. and Verdú, S., Channels with Cost Constraints: Strong Converse and Dispersion, IEEE Trans. Inform. Theory, 2015, vol. 61, no. 5, pp. 2415–2429.

  38. Nakiboğlu, B., The Sphere Packing Bound via Augustin’s Method, IEEE Trans. Inform. Theory, 2019, vol. 65, no. 2, pp. 816–840.

Acknowledgment

The author would like to thank Fatma Nakiboğlu and Mehmet Nakiboğlu for their hospitality; this work would not have been possible without it. The author would like to thank Marco Dalai for informing him about Fano’s implicit assertion of the fixed point property in [23] and Gonzalo Vazquez-Vilar for pointing out Poltyrev’s paper [19] on the random coding bound. The author would also like to thank the reviewer for his meticulous report, which allowed the author to correct a number of inaccurate and/or imprecise statements in the original manuscript.

Author information


Corresponding author

Correspondence to B. Nakiboğlu.

Additional information

Russian Text © The Author(s), 2019, published in Problemy Peredachi Informatsii, 2019, Vol. 55, No. 4, pp. 3–51.

About this article

Cite this article

Nakiboğlu, B. The Augustin Capacity and Center. Probl Inf Transm 55, 299–342 (2019). https://doi.org/10.1134/S003294601904001X
