Hi, I am training models with custom entities, and I was wondering if I can get the "confidence" level when the model does not predict an expected entity. When it does predict an entity, it reports a confidence score (e.g. 0.993424), so I was hoping I could set a confidence threshold, say 0.85 and above, so that the entity is predicted instead of being left unrecognized.
Hi @pganesh
I don't think that is easy to do.
Because of the way the Viterbi algorithm works, you only get the most likely sequence out. If you want to score an entity candidate that is not part of the most likely sequence, you would have to do some different calculations. You can do this by calling `tagger.predict(..., force_token_predictions=True)` and then using the per-token scores to calculate everything on your own, but I am not sure what the exact formula for this would be.
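To illustrate the idea, here is a minimal sketch of what thresholding per-token predictions could look like. The Flair calls shown in the comments (model path, sentence text) are placeholders, and the token/tag/score tuples below are hypothetical data standing in for what you would extract from a predicted `Sentence` — this is not a definitive recipe, just one way to apply a cutoff once you have token-level scores:

```python
# Sketch: thresholding token-level predictions by confidence.
# Assumes Flair-style output where each token carries a predicted tag
# and a score in [0, 1]. In Flair you would obtain such data roughly
# like this (not run here; model path and text are placeholders):
#
#   from flair.data import Sentence
#   from flair.models import SequenceTagger
#   tagger = SequenceTagger.load("path/to/your-model")
#   sentence = Sentence("some text")
#   tagger.predict(sentence, force_token_predictions=True)
#   token_scores = [(t.text, t.get_label().value, t.get_label().score)
#                   for t in sentence]

def filter_by_confidence(token_scores, threshold=0.85):
    """Keep token predictions at or above the threshold;
    anything below falls back to the 'O' (outside) tag."""
    return [
        (text, tag if score >= threshold else "O", score)
        for text, tag, score in token_scores
    ]

# Hypothetical per-token scores for illustration only:
token_scores = [
    ("Acme",  "B-ORG", 0.91),
    ("Corp",  "I-ORG", 0.72),  # below threshold -> demoted to "O"
    ("hired", "O",     0.99),
    ("Jane",  "B-PER", 0.88),
]

print(filter_by_confidence(token_scores))
```

Note that demoting a single token inside a multi-token span (as with "Corp" above) can leave a dangling `B-` tag, so in practice you may want to apply the threshold to whole spans rather than individual tokens.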