
Importance of model loss #366

Open
ShadHOH opened this issue Jun 13, 2024 · 0 comments

ShadHOH commented Jun 13, 2024

Hi, thanks for a great tool!

I'm curious about the significance of the model's loss. I've noticed that as you increase the low_count_threshold parameter, both the test and training losses shoot up quite a bit. Currently I am working with low_count_threshold = 25 and see losses around ~5000, whereas the default (and the examples I can find in the documentation and in the repo's issues) shows a loss of around ~1800. Increasing the parameter further naturally increases the loss further.

Based on previously submitted issues, the loss does not appear to be a major concern, and in my case the results only get better when using the parameter and ignoring the change in loss: better separation between real and empty cells, better learning curves, etc. Still, I am curious what the loss reflects and, more importantly, how it may influence the results, if applicable.
