Emmanuel confusion matrix #833
base: develop
Conversation
Added the feature in the library and in the webapp, so that when we test from the webapp it returns the predicted element and whether it was correct.
This commit adds:
- the validator now returns you the correct prediction
- the confusion matrix (for the moment with the labels as numbers, and without light mode) in the test page statistics
- the confusion matrix is extensible: it works not only for binary but also for multi-class classification
Follow-up changes:
- Move the confusion matrix into its own space
- Add dark mode support for the confusion matrix
- Use the labels as the titles of the confusion matrix, instead of "1" or "2" (the labels have to be predicted at least once; a sketch of such a label-indexed matrix follows below)
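As an illustration of how an extensible, label-indexed matrix could be built (a minimal sketch only: the `ConfusionMatrix` type, the `buildConfusionMatrix` name, and the result shape are assumptions, not the PR's actual code; `labels` is assumed to cover every class):

```ts
// Sketch of a label-indexed confusion matrix, not the PR's actual code.
// Rows are the true labels, columns the predicted labels, so it extends
// naturally from binary to multi-class classification.
type ConfusionMatrix = Record<string, Record<string, number>>

function buildConfusionMatrix(
  results: { label: string; predicted: string }[],
  labels: string[],
): ConfusionMatrix {
  // Initialize every cell to 0 so no row or column is ever undefined.
  const matrix: ConfusionMatrix = {}
  for (const truth of labels) {
    matrix[truth] = {}
    for (const predicted of labels) {
      matrix[truth][predicted] = 0
    }
  }
  // Count each (true label, predicted label) pair.
  for (const { label, predicted } of results) {
    matrix[label][predicted] += 1
  }
  return matrix
}
```

Pre-filling all cells from the label list also avoids the "labels have to be predicted at least once" limitation mentioned above.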
Hi @tomasoignons! Thanks for the PR! I'm super busy these days so I can maybe review it next week.
Thanks for the nicer test visualisation, love a table filling up with data!
I've got a few comments on how it can be tidier but the overall feature is great!
I also got a bunch of error toasts at the start of the test: `TypeError: matrix[output.predicted] is undefined`
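That error typically means a cell is incremented before its row exists. A hedged sketch of a guard that would avoid it (the `matrix` and `output` shapes here are assumed for illustration, matching only the field name in the error, not the webapp's actual code):

```ts
// Hypothetical guard: create the row lazily before incrementing,
// so a previously unseen predicted label does not crash the test run.
function increment(
  matrix: Record<string, Record<string, number>>,
  output: { truth: string; predicted: string },
): void {
  if (matrix[output.predicted] === undefined) {
    matrix[output.predicted] = {}
  }
  matrix[output.predicted][output.truth] =
    (matrix[output.predicted][output.truth] ?? 0) + 1
}
```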
Co-authored-by: Valérian Rousset <[email protected]>
…to emmanuel-confusion-matrix
Co-authored-by: Valérian Rousset <[email protected]>
…he task.information.LABEL_LIST
…the matrix is handled
I have no clue why I don't pass the timeout here; maybe my code is suboptimal, or the post-processing takes longer and I'm just above the limit. It would be perfect if @JulienVig or @tharvik could check this point (not review the whole merge request, just check why the test fails), because I've tried random things and the test seems to work fine until the last bit.
Oh, yeah, some integration tests are a bit flaky; feel free to retry the CI/CD workflow if it doesn't look like it's coming from your code.
…d predict randomly on simple_face'
This pull request adds a confusion matrix to the testing page.
The confusion matrix has been built so that it works with multi-class classification, and not only binary classification.
This pull request modifies the Validator class in the discoJS library in order to retrieve the predicted label and whether the prediction was correct.
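For instance, each test result returned by the Validator could carry the predicted label alongside a correctness flag (a hypothetical shape for illustration only; the actual discoJS Validator API may differ):

```ts
// Hypothetical per-sample result shape, not the actual discoJS API.
interface TestResult {
  groundTruth: string // the expected label for the sample
  predicted: string   // the label the model predicted
  correct: boolean    // whether predicted === groundTruth
}
```

The webapp could then fill the confusion matrix from these results and show per-sample correctness next to each tested item.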