Entropy Measures vs. Kolmogorov Complexity
dc.contributor.author | André Souto | en |
dc.contributor.author | Luís Filipe Antunes | en |
dc.contributor.author | Andreia Teixeira | en |
dc.contributor.author | Armando Matos | en |
dc.date.accessioned | 2017-11-16T14:21:34Z | |
dc.date.available | 2017-11-16T14:21:34Z | |
dc.date.issued | 2011 | en |
dc.description.abstract | Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies. | en |
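For context, the α = 1 boundary in the abstract reflects the standard limit behaviour of these measures: both the Rényi and Tsallis entropies reduce to the Shannon entropy as α → 1. A minimal numerical sketch in Python (not taken from the paper; textbook definitions under a natural-log convention, with a hypothetical example distribution p):

import numpy as np

def shannon(p):
    # Shannon entropy H(P) = -sum p ln p, in nats
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    # Renyi entropy H_alpha(P) = ln(sum p^alpha) / (1 - alpha), for alpha != 1
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

def tsallis(p, alpha):
    # Tsallis entropy T_alpha(P) = (1 - sum p^alpha) / (alpha - 1), for alpha != 1
    return (1 - np.sum(p ** alpha)) / (alpha - 1)

# Hypothetical distribution, for illustration only
p = np.array([0.5, 0.25, 0.125, 0.125])
print("Shannon:", shannon(p))
for a in (0.9, 0.99, 1.01, 1.1):
    # Both values approach the Shannon entropy as alpha -> 1
    print(f"alpha={a}: Renyi={renyi(p, a):.4f}  Tsallis={tsallis(p, a):.4f}")

Running this shows both quantities closing in on the Shannon value from either side of α = 1, which is the classical case where the expected-Kolmogorov-complexity correspondence described in the abstract is known to hold.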
dc.identifier.uri | http://repositorio.inesctec.pt/handle/123456789/2961 | |
dc.identifier.uri | http://dx.doi.org/10.3390/e13030595 | en |
dc.language | eng | en |
dc.relation | 5649 | en |
dc.rights | info:eu-repo/semantics/openAccess | en |
dc.title | Entropy Measures vs. Kolmogorov Complexity | en |
dc.type | article | en |
dc.type | Publication | en |