I hope that this is not trivial. I was confused by it for a while so I thought that I should bring it up.
Normally, in engineering, when I see log without an explicit base, it is assumed to be base 10. Consider the following line from the loss computation in Chapter 3:
first_summation = torch.log(positive_pred).sum()
Printing this first summation, I noticed the value was tensor(-0.1054).
I was going through the math and realized that this is not log base 10 of 0.9, which is approximately -0.046.
Going to the PyTorch documentation, I saw that torch.log "returns a new tensor with the natural logarithm of the elements of input."
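A quick check confirms this (a minimal sketch; I'm assuming positive_pred holds a single probability of 0.9 here, which reproduces the printed value):

```python
import torch

# Stand-in for the book's positive_pred: assuming a single predicted
# probability of 0.9 for the positive class
positive_pred = torch.tensor([0.9])

first_summation = torch.log(positive_pred).sum()
print(first_summation)                   # tensor(-0.1054), i.e. ln(0.9)
print(torch.log10(positive_pred).sum())  # tensor(-0.0458), i.e. log10(0.9)
```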
Of course, the "From Logits to Probabilities" section shows a relationship that hints at natural logarithms (base e), but the whole confusion could be avoided by using the symbol "ln" instead of "log".
Do you agree?
Thank You
Tom
I see you're moving quite fast :-)
Thanks for the feedback - you do have a point - I will put this in my to-do list for the final revision.
I guess each field has its own default for log... in ML, I've never seen log base 10 or log base 2; it is always the natural log.
Since I come from a CS background (and it is always base 2 in information theory), it bugs me a bit to see 0.69 for log(2) instead of 1 bit :-)
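For example (just a quick sketch of the three log variants PyTorch exposes):

```python
import torch

x = torch.tensor(2.0)
print(torch.log(x))    # tensor(0.6931) -> natural log, ln(2)
print(torch.log2(x))   # tensor(1.)     -> base 2: log2(2) is the "1 bit"
print(torch.log10(x))  # tensor(0.3010) -> base 10, the engineering default
```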