Abstract (EN):
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals the distribution's Shannon entropy, up to an additive constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order alpha, showing that it holds only for alpha = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the time-bounded universal distribution m^t(x), the Tsallis and Rényi entropies converge if and only if alpha is greater than 1. We also establish the uniform continuity of these entropies.
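The entropies named in the abstract can be illustrated numerically. The following is a minimal sketch (not from the paper itself) of the standard definitions of Shannon, Rényi, and Tsallis entropies, showing that for a fixed finite distribution both order-alpha entropies approach the Shannon entropy as alpha approaches 1; the example distribution is an arbitrary choice for illustration. Natural logarithms are used throughout so that all three quantities share the same units (nats).

```python
import math

def shannon(p):
    # Shannon entropy H(P) = -sum_i p_i ln p_i (in nats)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    # Renyi entropy of order alpha != 1:
    #   H_alpha(P) = ln(sum_i p_i^alpha) / (1 - alpha)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, alpha):
    # Tsallis entropy of order alpha != 1:
    #   S_alpha(P) = (1 - sum_i p_i^alpha) / (alpha - 1)
    return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

# Arbitrary example distribution (an assumption for illustration only)
p = [0.5, 0.25, 0.125, 0.125]

print(shannon(p))
print(renyi(p, 1 + 1e-6))    # close to shannon(p) as alpha -> 1
print(tsallis(p, 1 + 1e-6))  # close to shannon(p) as alpha -> 1
```

Note that this convergence at alpha = 1 is for a fixed finite distribution; it is distinct from the paper's result, which concerns the convergence of the entropy series for the time-bounded universal distribution m^t(x) when alpha > 1.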
Language:
English
Type (Professor's evaluation):
Scientific
Contact:
andreiasofia@ncc.up.pt; acm@dcc.fc.up.pt; andresouto@dcc.fc.up.pt; lfa@dcc.fc.up.pt
No. of pages:
17