Uniform Estimates on Length of Programs and Computing Algorithmic Complexities for Quantitative Information Measures

Verma, Rohit Kumar and Laxmi, M. Bhagya (2024) Uniform Estimates on Length of Programs and Computing Algorithmic Complexities for Quantitative Information Measures. Journal of Advances in Mathematics and Computer Science, 39 (5). pp. 44-50. ISSN 2456-9968

Full text: Verma3952024JAMCS115410.pdf (Published Version, 251kB)

Abstract

Shannon entropy and Kolmogorov complexity are two conceptually distinct information measures: the former is based on probability distributions, while the latter is based on program size. However, for every recursive probability distribution, the expected value of the Kolmogorov complexity is known to equal the Shannon entropy, up to a constant that depends only on the distribution. We investigate whether a comparable correspondence exists for the Rényi and Havrda-Charvát entropies of order α, and show that it holds only for the Rényi and Havrda-Charvát entropies of order 1.
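For orientation (a sketch of the standard formulation, not quoted from the paper), the relation alluded to above can be written for a recursive distribution P over strings, with K denoting prefix Kolmogorov complexity and H(P) the Shannon entropy; the order-α entropies in question are shown in one common normalization, which may differ from the one used in the article:

\[
0 \;\le\; \sum_x P(x)\,K(x) \;-\; H(P) \;\le\; c_P,
\qquad
H(P) \;=\; -\sum_x P(x)\log_2 P(x),
\]
\[
H_\alpha(P) \;=\; \frac{1}{1-\alpha}\,\log_2 \sum_x P(x)^{\alpha},
\qquad
H^{HC}_\alpha(P) \;=\; \frac{1}{2^{1-\alpha}-1}\Big(\sum_x P(x)^{\alpha} - 1\Big),
\]

where c_P is a constant depending only on P, and both H_α and H^{HC}_α tend to the Shannon entropy H(P) as α → 1, which is why the correspondence singles out the case of order 1.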

Kolmogorov observed that Shannon entropy and algorithmic complexity have comparable properties. We examine one facet of this resemblance, namely the linear inequalities that hold for Shannon entropy and for Kolmogorov complexity. It turns out that: (1) every linear inequality that holds for Kolmogorov complexity also holds for Shannon entropy, and vice versa; (2) every linear inequality that holds for Shannon entropy also holds for ranks of finite subsets of linear spaces; and (3) the converse of (2) is not true. An illustrative inequality is given below.
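As an illustration (an example chosen here, not reproduced from the article), the submodularity inequality is one linear inequality that holds in all three settings: exactly for the Shannon entropy of random variables X, Y, Z; up to logarithmic terms for the Kolmogorov complexity of strings x, y, z of length at most n; and exactly for ranks of finite subsets A, B, C of a linear space:

\[
H(X,Y) + H(Y,Z) \;\ge\; H(X,Y,Z) + H(Y),
\]
\[
K(x,y) + K(y,z) \;\ge\; K(x,y,z) + K(y) - O(\log n),
\]
\[
\operatorname{rank}(A \cup B) + \operatorname{rank}(B \cup C) \;\ge\; \operatorname{rank}(A \cup B \cup C) + \operatorname{rank}(B).
\]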

Item Type: Article
Subjects: Eprint Open STM Press > Mathematical Science
Depositing User: Unnamed user with email admin@eprint.openstmpress.com
Date Deposited: 13 Apr 2024 07:59
Last Modified: 13 Apr 2024 07:59
URI: http://library.go4manusub.com/id/eprint/2126
