I am using my custom dataset with 2 classes, where 10% of the samples have label 1 and 90% have label 0 in both the source and target datasets. The reported accuracy is quite good, around 98%+. However, with this class imbalance, calling plain accuracy 'precision' is misleading; a per-class metric is needed instead.
How can I compute these metrics in this case? Which part of the code must be modified to obtain the following on the target set:
sensitivity (recall) = (no. of target samples correctly classified as 1) / (no. of target samples with ground truth 1)
specificity = (no. of target samples correctly classified as 0) / (no. of target samples with ground truth 0)
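In case it helps, here is a minimal sketch (not taken from this repository's code) of how the two metrics could be computed, assuming the evaluation loop can collect the predicted labels and ground-truth labels for the target set; the function and variable names (`sensitivity_specificity`, `y_true`, `y_pred`) are placeholders, not names from the repo:

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Binary case: label 1 is treated as positive, label 0 as negative."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)

    tp = np.sum((y_pred == 1) & (y_true == 1))  # correctly classified as 1
    fn = np.sum((y_pred == 0) & (y_true == 1))  # missed positives
    tn = np.sum((y_pred == 0) & (y_true == 0))  # correctly classified as 0
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false alarms

    sensitivity = tp / (tp + fn)  # classified as 1 / ground truth 1
    specificity = tn / (tn + fp)  # classified as 0 / ground truth 0
    return sensitivity, specificity
```

In the test loop, instead of only accumulating the number of correct predictions, the per-sample predictions and labels would be appended to two arrays and passed to this function after the loop. Equivalently, `sklearn.metrics.confusion_matrix(y_true, y_pred).ravel()` returns `tn, fp, fn, tp` for binary labels, from which the same ratios follow.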
We promise to give you credit in our publication. Thank you