Unable to use Pytorch Dataloader in Skorch NeuralNetClassifier #993
animeshkumarpaul asked this question in Q&A
I am trying to pass a PyTorch DataLoader, but it gives an error.

Sample code:

How can I pass this PyTorch DataLoader here? Please help.
Answered by BenjaminBossan on Jul 9, 2023:
I converted the issue into a discussion, I hope you don't mind. Thanks for providing the code. Some points regarding your example:

- It looks like you just want to use the standard `DataLoader` from PyTorch. In that case, you don't have to pass it at all, it is used by default; its arguments can be set through the `iterator_train__` and `iterator_valid__` prefixes (see the short sketch after this list).
- Don't pass an already initialized module, i.e. `ClassifierModule(layer_sizes=...)`, just pass `ClassifierModule` and `module__layer_sizes=...`.
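A minimal sketch (not part of the original answer) of what this looks like in practice: skorch builds the PyTorch `DataLoader` itself and forwards any `iterator_train__*` / `iterator_valid__*` arguments to it. `TinyModule` and the toy data below are hypothetical stand-ins for the original `ClassifierModule` and dataset.

```python
import numpy as np
from torch import nn
from torch.utils.data import DataLoader
from skorch import NeuralNetClassifier

class TinyModule(nn.Module):
    # hypothetical stand-in for the original ClassifierModule
    def __init__(self, input_size=20, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_size, 32),
            nn.ReLU(),
            nn.Linear(32, num_classes),
        )

    def forward(self, X):
        return self.net(X)

# toy data in the dtypes skorch expects for classification
X = np.random.randn(100, 20).astype(np.float32)
y = np.random.randint(0, 2, size=100).astype(np.int64)

net = NeuralNetClassifier(
    TinyModule,                     # pass the class, not an instance
    module__input_size=20,          # constructor arguments via the module__ prefix
    criterion=nn.CrossEntropyLoss,
    iterator_train=DataLoader,      # already the default, shown for clarity
    iterator_train__shuffle=True,   # forwarded to DataLoader(...)
    iterator_train__num_workers=0,  # forwarded to DataLoader(...)
    max_epochs=3,
    batch_size=16,
    lr=0.01,
)
net.fit(X, y)
```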
Overall, your code could be changed like this:

```python
import numpy as np
import torch
from torch import nn
from torch.utils.data import WeightedRandomSampler
from skorch import NeuralNetClassifier

# X_train, y_train, X_valid, y_valid, samples_weights_train, samples_weights_valid,
# w1, w2, ClassifierModule and the callbacks come from your existing code.
X_train = X_train.astype(np.float32)
y_train = y_train.astype(np.int64)
X_valid = X_valid.astype(np.float32)
y_valid = y_valid.astype(np.int64)

sampler_train = WeightedRandomSampler(
    weights=samples_weights_train, num_samples=len(y_train) * 2, replacement=True)
sampler_valid = WeightedRandomSampler(
    weights=samples_weights_valid, num_samples=len(y_valid) * 2, replacement=True)

net = NeuralNetClassifier(
    ClassifierModule,
    module__layer_sizes=[16384, 2],
    module__input_size=X_train.shape[1],
    max_epochs=500,
    batch_size=500,
    criterion=nn.CrossEntropyLoss,
    criterion__weight=torch.FloatTensor([w1, w2]),
    criterion__reduction='mean',
    optimizer=torch.optim.Adam,
    lr=0.01,
    optimizer__weight_decay=1e-4,
    # DataLoader's argument is num_workers, not workers; shuffle must not be
    # combined with a sampler, so iterator_train__shuffle is dropped here.
    iterator_train__num_workers=1,
    iterator_train__sampler=sampler_train,
    iterator_valid__num_workers=1,
    iterator_valid__sampler=sampler_valid,
    device='cuda:0',
    callbacks=[early_stopping, lr_scheduler, initializer_cb, checkpoint_cb,
               EpochScoring_cb_train, EpochScoring_cb_valid],
    train_split=None,
)

net.fit(X_train, y_train)
```

Maybe you'll need to make a few adjustments, but this is the idea.
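The snippet above relies on several names that are not defined in the excerpt (`samples_weights_train`, `samples_weights_valid`, `w1`, `w2`, `ClassifierModule`, and the callbacks). As a purely illustrative assumption, not something from the original answer, the per-sample and per-class weights could be derived from inverse class frequencies, given the integer label arrays `y_train` and `y_valid` from above:

```python
import numpy as np

def inverse_frequency_weights(y):
    # Hypothetical helper, not part of the original code.
    # Assumes a binary problem with both classes present in y.
    counts = np.bincount(y, minlength=2)
    # one weight per class: rarer classes get larger weights
    class_weights = len(y) / (len(counts) * counts)
    # one weight per sample, as expected by WeightedRandomSampler
    sample_weights = class_weights[y]
    return class_weights, sample_weights

class_weights_train, samples_weights_train = inverse_frequency_weights(y_train)
_, samples_weights_valid = inverse_frequency_weights(y_valid)
w1, w2 = class_weights_train  # per-class weights for criterion__weight
```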