
Alternative Entropies #22

Open
dglmoore opened this issue Jul 29, 2016 · 2 comments

Comments

@dglmoore
Contributor

This is a discussion post. Please feel free to comment and contribute to the discussion even if you are not directly involved in the development of inform or its wrapper libraries.

Premise

Claude Shannon introduced his measure of entropy in his 1948 paper A Mathematical Theory of Communication. Since then, several new measures of entropy have been developed (see Rényi, 1961 and Tsallis, 1988 for notable examples). Each of these measures is actually a family of entropy measures parameterized by at least one continuous parameter, and tends toward Shannon's measure in some limit of that parameter. They also admit divergences which tend toward the Kullback–Leibler divergence in the same limit.
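To make the limiting behavior concrete, here is a small self-contained sketch (not part of inform; the function names are illustrative) of the Rényi and Tsallis families, showing that Rényi entropy approaches Shannon entropy as its order approaches 1, while Tsallis entropy approaches the Shannon entropy in nats:

```python
import math

def shannon(p):
    """Shannon entropy in bits: -sum p_i * log2 p_i."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha != 1, in bits:
    log2(sum p_i^alpha) / (1 - alpha)."""
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy of index q != 1:
    (1 - sum p_i^q) / (q - 1); tends to -sum p_i * ln p_i as q -> 1."""
    return (1 - sum(x ** q for x in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
print(shannon(p))         # 1.5 bits
print(renyi(p, 1.001))    # close to 1.5 as alpha -> 1
print(tsallis(p, 1.001))  # close to shannon(p) * ln(2) as q -> 1
```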

Question

These alternative entropy measures do have a place in our toolbox. The question is: is it worthwhile to put some effort into implementing them? To my knowledge, Rényi and Tsallis entropies are not implemented in any of the standard information toolkits. This could be because they are not useful, in which case implementing them would waste a lot of thought and energy, and drive us a little closer to carpal tunnel. Or it could be that no one has considered using them, and treasures are waiting to be uncovered.

Let us know what you think!

@colemathis

Thanks @dglmoore for starting this conversation. At the moment I'm inclined to request Tsallis entropies, because of their apparent use in non-equilibrium (non-ergodic) statistical mechanics. However, I don't yet have any specific measurements or systems I'd like to test with these entropies.

@dglmoore
Contributor Author

dglmoore commented Aug 5, 2016

I would love it if we could come up with an API that would let the user decide which class of measure they would like to use. The C API would probably be hideous, maybe something like

typedef enum inform_entropy_class
{
    INFORM_SHANNON = 0,
    INFORM_RENYI   = 1,
    INFORM_TSALLIS = 2,
} inform_entropy_class;

double inform_active_info(int const *series, size_t n, size_t m, int b, size_t k, inform_entropy_class h, inform_error *err);

The corresponding Python wrapper might look something like

from enum import Enum

class EntropyClass(Enum):
    shannon = 0
    renyi   = 1
    tsallis = 2

def active_info(series, k, b=2, local=True, eclass=EntropyClass.shannon):
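For a sense of how the eclass parameter could dispatch internally, here is a hypothetical sketch (this is not inform's actual implementation; the entropy function and the order parameter are illustrative assumptions):

```python
from enum import Enum
import math

class EntropyClass(Enum):
    shannon = 0
    renyi   = 1
    tsallis = 2

def entropy(p, eclass=EntropyClass.shannon, order=2.0):
    """Hypothetical dispatch on the entropy class.

    `order` is the continuous family parameter (alpha for Renyi,
    q for Tsallis); it is ignored for Shannon.
    """
    if eclass is EntropyClass.shannon:
        return -sum(x * math.log2(x) for x in p if x > 0)
    elif eclass is EntropyClass.renyi:
        return math.log2(sum(x ** order for x in p)) / (1 - order)
    elif eclass is EntropyClass.tsallis:
        return (1 - sum(x ** order for x in p)) / (order - 1)
    raise ValueError(f"unknown entropy class: {eclass}")
```

A measure like active_info could then take the same eclass (and order) arguments and route every internal entropy computation through one function, rather than duplicating the measure for each entropy family.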

There are a couple of other ways we could go about this, but I think they may be less efficient.

Thoughts?
