
Negative entropy #55

Open
andresilvapimentel opened this issue Jun 17, 2021 · 3 comments
Labels: question (Further information is requested)

Comments

@andresilvapimentel

I performed the calculation of the entropy of a protein region (five amino acids). Some of the values in the entropy array turn out to be negative. What does that mean?

@jokr91
Collaborator

jokr91 commented Jun 18, 2021

Hi Andre,

This is not really a big issue: the values for the dihedral entropy range from negative infinity (only one state is populated) to R * ln(2 pi) (all states are equally likely).
It is more interesting to compare the values between different parts of the molecule or between different molecules.
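
As a rough numerical illustration of these bounds (a minimal sketch, not part of the package itself; the von Mises distribution as a stand-in for a dihedral angle distribution and R in J/(mol K) are assumptions made for the example):

import numpy as np
from scipy.stats import vonmises

R = 8.314  # gas constant, J/(mol K) (assumed units for this illustration)

# The dihedral entropy S = -R * integral p(phi) ln p(phi) dphi spans
# (-inf, R*ln(2*pi)]: a flat distribution gives the maximum, and an
# increasingly sharp peak drives the differential entropy toward -infinity.
for kappa in (0.01, 1.0, 10.0, 100.0):
    S = R * vonmises(kappa).entropy()  # differential entropy in nats, scaled by R
    print(f"kappa = {kappa:6.2f}  ->  S = {S:8.3f} J/(mol K)")

print("upper bound R*ln(2*pi) =", round(R * np.log(2 * np.pi), 3), "J/(mol K)")

For small kappa (nearly flat distribution) the value approaches R*ln(2*pi), roughly 15.3 J/(mol K); for large kappa (a single sharp state) it becomes negative and keeps decreasing.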

@andresilvapimentel
Author

Thanks, but I did not quite understand your explanation.

I calculated the dihedral entropy of the region of a normal protein (5 amino acids) using the code:
S = dih_ent.entropy
The result was: array([ 1.02632743, 0.27675774, 5.29546661, -2.06808451, 3.92356854,
-1.25778001, -0.2144733 , 9.55031112, 1.43636179, 6.05720575])

Then, I calculated the dihedral entropy of the region of a mutated protein (5 amino acids) using the same code:
The result was: array([-0.85348536, -1.25256182, -2.80944595, -0.0278988 , -0.68257193,
-1.39297199, 0.484963 , -1.88314858])

How do I compare these two entropies? I emphasize that I still think the negative entropies are strange and make no sense physically!
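
One possible way to compare them (a hedged sketch, not an official recommendation from the maintainers) is to sum the per-dihedral values into a total for each variant and look at the difference; note that the two arrays above have different lengths (10 vs. 8 dihedrals), so an element-by-element comparison would not line up:

import numpy as np

# Per-dihedral entropies copied from the output above (units are whatever
# dih_ent.entropy returns; only relative values are interpreted here).
S_normal = np.array([ 1.02632743,  0.27675774,  5.29546661, -2.06808451,  3.92356854,
                     -1.25778001, -0.2144733 ,  9.55031112,  1.43636179,  6.05720575])
S_mutant = np.array([-0.85348536, -1.25256182, -2.80944595, -0.0278988 , -0.68257193,
                     -1.39297199,  0.484963  , -1.88314858])

# Totals per variant and the change upon mutation, assuming both runs used
# the same settings so the values are directly comparable.
delta_S = S_mutant.sum() - S_normal.sum()
print(f"S_normal total = {S_normal.sum():9.3f}")
print(f"S_mutant total = {S_mutant.sum():9.3f}")
print(f"delta S (mutant - normal) = {delta_S:9.3f}")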

@Clownshift added the "question" (Further information is requested) label on Oct 17, 2022
@Clownshift
Collaborator

In case other users stumble upon the question:
"Can entropy be negative?"

We do not want to address this question here in detail, as it is (at least in our eyes) not actually related to our implementation. However, we want to point out that, from the mathematical definition of entropy (according to, e.g., Shannon), it may very well be negative, for example for very narrow probability distributions.
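
For illustration (a minimal sketch assuming the Shannon differential entropy of a Gaussian; not specific to this package): the differential entropy of a Gaussian with standard deviation sigma is 0.5 * ln(2*pi*e*sigma^2), which becomes negative as soon as the distribution is narrow enough.

import numpy as np

# Differential (Shannon) entropy of a Gaussian with standard deviation sigma:
# h = 0.5 * ln(2 * pi * e * sigma^2), in nats.
for sigma in (1.0, 0.5, 0.1, 0.01):
    h = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    print(f"sigma = {sigma:5.2f}  ->  h = {h:7.3f} nats")

# The value drops below zero once sigma < 1/sqrt(2*pi*e), roughly 0.242:
# a sufficiently narrow probability distribution has negative entropy.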
