optimize_weight_bits inflating weight values? #305
Michaeljurado42 started this conversation in General (1 comment, 2 replies)
-
Hi @Michaeljurado42, I think this is a bug. From a cursory look, it should have been |
-
I initially reported this as a bug, but I am now thinking there is actually a reason behind this behavior.
Basically, I have been inspecting some of the synapse weight values of an exported netx model. Surprisingly, the weight values are scaled much higher than I expected. I traced the issue back to optimize_weight_bits, which produces results like this:
Original Weights: [-1 1]
Optimized Weights: [-128 128]
Number of Weight Bits: 2
Weight Exponent: -7
Sign Mode: 1
It seems like a better way to encode this would be to leave the weight values alone and set weight_exponent = 0. So my question is: what is the reason behind this behavior?
I'll update the discussion if I figure out what the answer is.
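For reference, here is a minimal sketch of how the reported output still decodes back to the original weights, assuming the usual Loihi-style fixed-point convention where the effective weight is the stored integer times 2**weight_exponent (the specific values below are the ones observed above, not something taken from the lava-dl source):

```python
import numpy as np

# Values as reported by optimize_weight_bits in this thread.
optimized_weights = np.array([-128, 128])
weight_exponent = -7

# Assumed decoding convention: effective weight = stored integer * 2**exponent.
effective_weights = optimized_weights * 2.0**weight_exponent
print(effective_weights)  # → [-1.  1.]
```

So the scaling is lossless in value terms; the open question is why the integers are pushed to the top of the range (note that 128 does not even fit in a signed 8-bit weight) rather than stored as-is with a zero exponent.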