Feature request: cutoff/threshold value for classification #1267
Indeed a good option for optimizing CA: a calibrated classifier widget.
Improved calibration plot (#3881) now allows for exploring different thresholds and their effect on CA, F1, sens, spec, ppv ... It can also output a model with the user-defined threshold -- if the data doesn't come from cross-validation but from a single training/testing (otherwise there is no single model to output). I initially planned to add a widget that would take a learning algorithm as input and output a learning algorithm that calls the wrapped algorithm to compute a model and then imposes a threshold that optimizes CA, F1 or MCC (I wouldn't include anything exotic). The widget wouldn't be complicated: a single input and output, and a single combo box for choosing the score to optimize. But I am no longer convinced that such a widget would actually make sense.
So, a threshold-optimization widget wouldn't be very useful and it would be rather incompatible with other widgets. We can discuss it some more (I would appreciate @BlazZupan's opinion), but I lean towards considering this issue closed when #3881 is merged.
I discussed this with @BlazZupan. We'll add a "calibrator" that will take a learning algorithm and optimize the threshold as well as calibrate probabilities; especially the latter. Of course on training data (I don't know what I was thinking yesterday :)). The widget can be connected to Test and Score, which is useful enough.
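The threshold-optimization part of the idea above can be sketched independently of any widget: fit the wrapped model, then scan candidate thresholds on the training data and keep the one that maximizes the chosen score (CA, F1 or MCC). This is only an illustrative sketch, not Orange's implementation; all function names here are hypothetical.

```python
import numpy as np

# Hypothetical score functions for binary targets (0/1); none of these names
# come from Orange -- they are plain NumPy re-implementations for the sketch.
def ca(y, pred):
    return np.mean(y == pred)

def f1(y, pred):
    tp = np.sum((pred == 1) & (y == 1))
    fp = np.sum((pred == 1) & (y == 0))
    fn = np.sum((pred == 0) & (y == 1))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def mcc(y, pred):
    tp = np.sum((pred == 1) & (y == 1))
    tn = np.sum((pred == 0) & (y == 0))
    fp = np.sum((pred == 1) & (y == 0))
    fn = np.sum((pred == 0) & (y == 1))
    denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def best_threshold(y_true, probs, score=ca):
    """Scan each distinct predicted probability as a candidate cutoff
    and return the one whose induced 0/1 predictions score best."""
    candidates = np.unique(probs)
    scores = [score(y_true, (probs >= t).astype(int)) for t in candidates]
    return candidates[int(np.argmax(scores))]
```

For example, with `y_true = [0, 0, 1, 1]` and `probs = [0.1, 0.4, 0.6, 0.9]`, `best_threshold` returns 0.6, the cutoff that classifies all four instances correctly. A wrapping learner would then store this threshold and apply it in place of the default 0.5.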
When using classification, a probability is usually calculated to determine the class of the target; by default Orange uses 0.5 as the cutoff, or threshold value if you like. Depending on this cutoff value, the model performs differently.
- For example, in the learner
- Or in Test and Score
- or ..
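The point that performance depends on the cutoff can be seen with a tiny made-up example: the same predicted probabilities yield different classification accuracy (CA) at different thresholds. The data below is invented purely for illustration.

```python
import numpy as np

# Invented example: true classes and the model's predicted P(class == 1).
y_true = np.array([0, 0, 0, 1, 1, 1])
probs = np.array([0.2, 0.45, 0.55, 0.35, 0.7, 0.9])

# Apply three different cutoffs and compare classification accuracy.
for threshold in (0.3, 0.5, 0.7):
    pred = (probs >= threshold).astype(int)
    accuracy = np.mean(pred == y_true)
    print(f"threshold={threshold}: CA={accuracy:.2f}")
```

Here the default 0.5 gives CA 0.67, while moving the cutoff to 0.7 raises it to 0.83, which is exactly why a user-adjustable threshold is useful.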