Questions and issues regarding training with nnUNetPlannerResEncL on BraTS2023 #2653
Hi @neuronflow, first off, it is not intended behavior that json files need to be copied around. Could you please tell me which json files you had to copy manually, and into which places? Secondly, the residual encoder nnU-Net can overfit the data much more easily than the standard model. To get a better understanding of what might be going on, could you please do the following:
Best regards, Carsten
Dear @sten2lu, thanks for looking into this. I am testing on the non-public BraTS validation set, which has a distribution similar to the training set. Due to the heavy over-segmentation, the mean Dice scores are very low, between 5 and 10% depending on the class. I copied the …
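For context: the Dice score compares a predicted mask $P$ against the ground truth $G$, and heavy over-segmentation inflates $|P|$ in the denominator, so the score drops toward zero even when the tumor region itself is fully covered:

$$\mathrm{Dice}(P, G) = \frac{2\,|P \cap G|}{|P| + |G|}$$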
Hi @neuronflow, sorry for the long response time; we are actively looking into it. At the moment I am trying to reproduce this issue on the BraTS21 and BraTS23 datasets. Best regards,
Hi @neuronflow, could you please send me the dataset.json, along with a description of how you preprocessed the BraTS23 dataset?
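For reference, a minimal sketch of what a BraTS-style nnU-Net v2 dataset.json typically looks like; the folder name `DatasetXXX_BraTS2023`, the channel naming, and the label scheme here are assumptions, not the reporter's actual file:

```bash
# Illustrative only: write a typical nnU-Net v2 dataset.json for BraTS-style
# data (4 MRI channels, 3 foreground labels). DatasetXXX_BraTS2023 is a
# placeholder; the reporter's real file may differ.
cat > "$nnUNet_raw/DatasetXXX_BraTS2023/dataset.json" << 'EOF'
{
    "channel_names": {"0": "T1", "1": "T1ce", "2": "T2", "3": "FLAIR"},
    "labels": {"background": 0, "NCR": 1, "ED": 2, "ET": 3},
    "numTraining": 1251,
    "file_ending": ".nii.gz"
}
EOF
```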
I am trying to train nnUNet on BraTS 2023 data (1251 exams).
The idea is to train on all cases and then evaluate on a separate test set.
With the standard `3d_fullres` configuration, everything works without trouble. However, nnU-Net advertises that I should use the residual encoders, so I try:
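The exact commands are not shown in the thread; presumably they followed the ResEnc(L) preset from the nnU-Net documentation, roughly like this (DATASET_ID is a placeholder):

```bash
# Plan and preprocess with the large residual-encoder preset (ResEnc L)
nnUNetv2_plan_and_preprocess -d DATASET_ID -pl nnUNetPlannerResEncL

# Train the 3d_fullres configuration on all training cases
# ("all" instead of a fold number uses every case for training)
nnUNetv2_train DATASET_ID 3d_fullres all -p nnUNetResEncUNetLPlans
```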
Notably, the training reaches a pseudo Dice of 1.0 for some channels and 0.999 for others (can this flavor of nnU-Net really overfit the BraTS training set so well?).
After training, I tried to run inference on our test set, for which I used:
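The inference command is not shown either; a minimal sketch of what it presumably looked like, with placeholder paths and dataset ID:

```bash
# Inference with the ResEnc-L plans; -f all matches a model trained on all
# cases. Input/output paths and DATASET_ID are placeholders.
nnUNetv2_predict -i /path/to/test_images -o /path/to/predictions \
    -d DATASET_ID -c 3d_fullres -p nnUNetResEncUNetLPlans -f all
```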
For this to work, I had to manually copy some .json files. Am I using the wrong command here?
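Which files were copied, and where, is not specified in the thread. A hypothetical example of the kind of copying that might be meant, assuming dataset.json and plans.json were missing from the results folder that nnUNetv2_predict reads them from:

```bash
# Hypothetical: the thread does not say which files were copied or where.
# nnUNetv2_predict expects dataset.json and plans.json inside the trained
# model folder under $nnUNet_results.
MODEL_DIR="$nnUNet_results/DatasetXXX_BraTS2023/nnUNetTrainer__nnUNetResEncUNetLPlans__3d_fullres"
cp "$nnUNet_raw/DatasetXXX_BraTS2023/dataset.json" "$MODEL_DIR/"
cp "$nnUNet_preprocessed/DatasetXXX_BraTS2023/nnUNetResEncUNetLPlans.json" "$MODEL_DIR/plans.json"
```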
The resulting segmentations are terrible: