Why is the code for the dataloader this complicated? #102
Comments
Just my personal opinion: some of the preprocessing code of RCAN is inherited from EDSR, and MDSR (a model proposed together with EDSR in the same paper) supports multi-scale training.
So when MDSR is training, the data generator should randomly select a scale for each batch. That feature needs a custom dataloader (see dataloader.py, lines 37-45). By the way, thanks for the code of RCAN. :)
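For illustration only, here is a minimal sketch (not the repository's actual implementation; the class and argument names below are hypothetical) of why multi-scale training needs the dataset and the loader to cooperate on scale selection:

```python
import random
import torch
from torch.utils.data import Dataset

class MultiScaleSRDataset(Dataset):
    """Hypothetical multi-scale SR dataset: serves an aligned LR/HR patch pair
    for whichever scale is currently selected."""

    def __init__(self, hr_images, lr_images_by_scale, scales=(2, 3, 4), patch_size=48):
        self.hr_images = hr_images                    # list of HR tensors, shape (C, H, W)
        self.lr_images_by_scale = lr_images_by_scale  # dict: scale -> list of LR tensors
        self.scales = scales
        self.patch_size = patch_size                  # LR images assumed >= patch_size
        self.current_scale = scales[0]

    def set_scale(self, scale):
        # The part a plain DataLoader cannot do for you: something has to
        # switch the scale between batches.
        self.current_scale = scale

    def __len__(self):
        return len(self.hr_images)

    def __getitem__(self, idx):
        s, p = self.current_scale, self.patch_size
        lr = self.lr_images_by_scale[s][idx]
        hr = self.hr_images[idx]
        # random LR patch plus the matching HR patch
        x = random.randrange(0, lr.shape[-1] - p + 1)
        y = random.randrange(0, lr.shape[-2] - p + 1)
        lr_patch = lr[..., y:y + p, x:x + p]
        hr_patch = hr[..., y * s:(y + p) * s, x * s:(x + p) * s]
        return torch.as_tensor(lr_patch), torch.as_tensor(hr_patch)
```

With num_workers=0 you could call dataset.set_scale(random.choice(scales)) between iterations of a standard torch.utils.data.DataLoader, but with worker processes that change does not propagate to the workers, which is roughly why a custom loader such as MSDataLoader exists.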
But why can't we prepare DIV2K at scales 2, 3, and 4, then randomly crop a patch and feed it to the loader? As it stands, the dataloader is long and it's hard to read and understand what is going on.
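A minimal sketch of the simpler per-scale approach this comment describes, assuming DIV2K has been pre-decoded to .npy arrays; the class name, paths, and shapes are assumptions, not the repository's code:

```python
import glob
import random
import numpy as np
import torch
from torch.utils.data import Dataset

class DIV2KSingleScale(Dataset):
    """Hypothetical single-scale DIV2K dataset: one fixed scale, random patch per item."""

    def __init__(self, hr_dir, lr_dir, scale=2, patch_size=48):
        # Assumes pre-decoded .npy arrays of shape (H, W, C), one per image,
        # with matching sort order in hr_dir and lr_dir.
        self.hr_files = sorted(glob.glob(f"{hr_dir}/*.npy"))
        self.lr_files = sorted(glob.glob(f"{lr_dir}/*.npy"))
        self.scale = scale
        self.patch_size = patch_size

    def __len__(self):
        return len(self.hr_files)

    def __getitem__(self, idx):
        lr = np.load(self.lr_files[idx])
        hr = np.load(self.hr_files[idx])
        p, s = self.patch_size, self.scale
        # random aligned crop: LR patch of size p, HR patch of size p * s
        x = random.randrange(0, lr.shape[1] - p + 1)
        y = random.randrange(0, lr.shape[0] - p + 1)
        lr_patch = lr[y:y + p, x:x + p]
        hr_patch = hr[y * s:(y + p) * s, x * s:(x + p) * s]
        # HWC -> CHW float tensors
        to_tensor = lambda a: torch.from_numpy(np.ascontiguousarray(a.transpose(2, 0, 1))).float()
        return to_tensor(lr_patch), to_tensor(hr_patch)
```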
Sorry, I'm just a newcomer to super-resolution, and I haven't read the source code of DRRN. I have also faced the same problem with the dataloader, and it's incompatible with Torch 1.6. But if you have some patience and read the ... Besides, RCAN doesn't use multi-scale training, so you could just replace MSDataLoader with torch.utils.data.DataLoader; you also need to change some of the arguments passed to it. (I'm testing this method and it seems the data is read properly.)
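A hedged sketch of that swap, reusing the single-scale dataset sketched above; the "before" line mirrors how EDSR-style code typically constructs MSDataLoader, but the exact arguments in this repository may differ:

```python
from torch.utils.data import DataLoader

# Hypothetical dataset from the sketch above; paths are placeholders.
train_dataset = DIV2KSingleScale("DIV2K/HR", "DIV2K/LR_x2", scale=2, patch_size=48)

# Before (custom, multi-scale aware loader; signature is approximate):
# loader_train = MSDataLoader(args, train_dataset, batch_size=args.batch_size,
#                             shuffle=True, pin_memory=not args.cpu)

# After: the standard DataLoader is enough for single-scale RCAN training
# and works on Torch 1.6. Drop arguments DataLoader does not accept
# (such as the args object) and keep the usual ones.
loader_train = DataLoader(
    train_dataset,
    batch_size=16,     # args.batch_size in the repo's config
    shuffle=True,
    num_workers=4,     # args.n_threads in the repo's config
    pin_memory=True,   # set to False when training on CPU
)

for lr_batch, hr_batch in loader_train:
    break  # forward/backward pass would go here
```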
First, thanks for your code.
But TBH, why is the code for the dataloader so complex and so hard to read?