This is a discussion post. Please feel free to comment and contribute to the discussion even if you are not directly involved in the development of inform or its wrapper libraries.
Premise
Claude Shannon introduced his measure of entropy in his 1948 paper A Mathematical Theory of Communication. Since then, several new measures of entropy have been developed (see Rényi, 1961 and Tsallis, 1988 for notable examples). Each of these measures is actually a family of entropy measures parameterized by at least one continuous parameter, and tends toward Shannon's measure in some limit of that parameter. They also admit divergences which tend toward the Kullback–Leibler divergence in the same limit.
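For concreteness, here are the standard definitions (in nats) and the limits in which each family recovers Shannon's entropy:

```latex
% Rényi (1961): recovers Shannon entropy H(X) = -\sum_i p_i \log p_i
% in the limit \alpha \to 1
H_\alpha(X) = \frac{1}{1 - \alpha} \log \sum_i p_i^{\alpha}

% Tsallis (1988): recovers Shannon entropy in the limit q \to 1
S_q(X) = \frac{1}{q - 1} \left( 1 - \sum_i p_i^{q} \right)
```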
Question
These alternative entropy measures do have a place in our toolbox. The question is: is it worthwhile to put some effort into implementing them? To my knowledge, Rényi and Tsallis entropies are not implemented in any of the standard information-theory toolkits. This could be because they are not useful, in which case implementing them would waste a lot of thought and energy and drive us a little closer to carpal tunnel. Or it could be that no one has considered using them, and treasures are waiting to be uncovered.
Let us know what you think!
Thanks @dglmoore for starting this conversation. I'm inclined to request Tsallis entropies because of their apparent use in non-equilibrium (non-ergodic) stat mech. However, at the moment I don't have any specific measurements or systems I'd like to test these entropies on yet.
I would love it if we could come up with an API that would let the user decide which class of measure they would like to use. The C API would probably be hideous; maybe something like the sketch below (every name here is hypothetical, not part of inform's actual API):
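```c
/* Hypothetical sketch only -- none of these names exist in inform today.
 * One possible shape: an enum selects the entropy family, and the family's
 * continuous parameter is passed alongside the distribution. Plain
 * probability arrays are used here to keep the sketch self-contained. */
#include <math.h>
#include <stdio.h>

typedef enum {
    ENTROPY_SHANNON, /* no parameter */
    ENTROPY_RENYI,   /* parameter: order alpha (alpha != 1) */
    ENTROPY_TSALLIS, /* parameter: index q (q != 1) */
} entropy_measure;

/* Compute the entropy (in nats) of a normalized discrete distribution
 * `p` of length `n`, using the requested measure. */
double generalized_entropy(const double *p, size_t n,
                           entropy_measure measure, double param)
{
    double sum = 0.0;
    switch (measure) {
    case ENTROPY_SHANNON:
        /* H = -sum_i p_i log p_i */
        for (size_t i = 0; i < n; ++i)
            if (p[i] > 0.0) sum -= p[i] * log(p[i]);
        return sum;
    case ENTROPY_RENYI:
        /* H_a = log(sum_i p_i^a) / (1 - a) */
        for (size_t i = 0; i < n; ++i)
            if (p[i] > 0.0) sum += pow(p[i], param);
        return log(sum) / (1.0 - param);
    case ENTROPY_TSALLIS:
        /* S_q = (1 - sum_i p_i^q) / (q - 1) */
        for (size_t i = 0; i < n; ++i)
            if (p[i] > 0.0) sum += pow(p[i], param);
        return (1.0 - sum) / (param - 1.0);
    }
    return 0.0;
}

int main(void)
{
    const double p[] = { 0.5, 0.25, 0.25 };
    printf("Shannon:     %f\n", generalized_entropy(p, 3, ENTROPY_SHANNON, 0.0));
    printf("Renyi(2):    %f\n", generalized_entropy(p, 3, ENTROPY_RENYI, 2.0));
    printf("Tsallis(2):  %f\n", generalized_entropy(p, 3, ENTROPY_TSALLIS, 2.0));
}
```

The upside of a single dispatching function is one entry point for every measure; the downside is that the meaning of `param` changes with the enum value, which is exactly the sort of hideousness I had in mind.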