
Custom dataset error #68

Open
jmandivarapu1 opened this issue Aug 19, 2020 · 4 comments

Comments

@jmandivarapu1

Hi,

I am using a custom dataset and running the reg_lstm model, and I am getting the error below. The other models run fine on the same dataset. Can anybody help me out?

RuntimeError: Length of all samples has to be greater than 0, but found an element in 'lengths' that is <= 0
@lintool
Member

lintool commented Aug 19, 2020

The error message seems pretty informative - have you checked the length of your input samples?
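A quick way to do that check, as a hedged sketch: the traceback in this thread shows the lengths arriving as `batch.text[1]`, so inspecting that tensor for non-positive entries before packing should surface the problem. The `lengths` tensor below is a stand-in, not the actual batch.

```python
import torch

# Stand-in for batch.text[1], the per-sample lengths passed to the model.
lengths = torch.tensor([13, 11, 7, 0, 0])

# Report any batch positions whose length is <= 0, which is exactly what
# pack_padded_sequence rejects with this RuntimeError.
bad = (lengths <= 0).nonzero(as_tuple=True)[0].tolist()
if bad:
    print(f"zero-length samples at batch positions {bad}")  # -> [3, 4]
```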

@jmandivarapu1
Author

I did check it; it's of size 32, 32:

length tensor([[  225,  1168,   377,     4,   274,  3939,     8,    54,   105,   268,
            33,  5755, 83638],
        [    8,    54,   105,    19,    44,    44, 56528,   486,   767,    50,
          5304,     1,     1],
        [19369, 15135, 10210,  8251, 43371,  2545,  7481, 11348, 90504,   621,
         56398,     1,     1],
        [72795,  4372,   100, 70551,  1544,  1034,  3004, 13160,     6,  1370,
             1,     1,     1],
        [19919, 53218,    80,  1002,  1115,     8,    54,   105,  3386, 19796,
             1,     1,     1],
        [    2, 69127,  1703,     3,  1077, 59162,    16,  1174, 50016, 88044,
             1,     1,     1],
        [    2,  6726, 67871,  4653, 11485,    83,  6732,    83,   633, 38001,
             1,     1,     1],
        [87454, 20624,  4879,  4357,   430,    80,     6,  6026,  2196,  1489,
             1,     1,     1],
        [36913,   277,   286, 13804,    31, 14327,     6, 36907,  4260,     1,
             1,     1,     1],
        [    9, 15352, 95585,  2684,    72,  5433,   486,    24, 47224,     1,
             1,     1,     1],
        [ 5564,   299, 10120,    58,  5164,  7374,    84,  4070, 70076,     1,
             1,     1,     1],
        [   14,     2,  3648,     3,   519,  3637, 36810,     1,     1,     1,
             1,     1,     1],
        [  519,  6904, 10666, 32768, 74735, 48988, 16957,     1,     1,     1,
             1,     1,     1],
        [83525, 38609,   441, 63150,   557, 50244,   856,     1,     1,     1,
             1,     1,     1],
        [    8,    54,   105,  6929,    74, 10401, 36925,     1,     1,     1,
             1,     1,     1],
        [36812,    21,  1872,   261,   121, 76565,    27,     1,     1,     1,
             1,     1,     1],
        [   66,  1099,   971,     6,    11,  4987,     1,     1,     1,     1,
             1,     1,     1],
        [73323,  1517,  2818, 30047,    19, 80137,     1,     1,     1,     1,
             1,     1,     1],
        [  111,  1034, 69019, 36800,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [94708,    20, 19813,  1470,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [46071, 58952,    76,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [37106, 75713,   232,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [   65, 83533, 36175,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [36828,   428,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [38424, 58372,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [37562,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [94372,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [36801,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [36958,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [    1,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [    1,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1],
        [    1,     1,     1,     1,     1,     1,     1,     1,     1,     1,
             1,     1,     1]], device='cuda:0') tensor([13, 11, 11, 10, 10, 10, 10, 10,  9,  9,  9,  7,  7,  7,  7,  7,  6,  6,
         4,  4,  3,  3,  3,  2,  2,  1,  1,  1,  1,  0,  0,  0],
       device='cuda:0')
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/ubuntu/anaconda3/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/ubuntu/new_mount_device/hedwig/models/reg_lstm/__main__.py", line 149, in <module>
    trainer.train(args.epochs)
  File "/home/ubuntu/new_mount_device/hedwig/common/trainers/classification_trainer.py", line 93, in train
    dev_acc, dev_precision, dev_recall, dev_f1, dev_loss = self.dev_evaluator.get_scores()[0]
  File "/home/ubuntu/new_mount_device/hedwig/common/evaluators/classification_evaluator.py", line 40, in get_scores
    scores = self.model(batch.text[0], lengths=batch.text[1])
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ubuntu/new_mount_device/hedwig/models/reg_lstm/model.py", line 74, in forward
    x = torch.nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/torch/nn/utils/rnn.py", line 244, in pack_padded_sequence
    _VF._pack_padded_sequence(input, lengths, batch_first)
RuntimeError: Length of all samples has to be greater than 0, but found an element in 'lengths' that is <= 0

@achyudh
Member

achyudh commented Aug 19, 2020

I see that the last three elements have values 0, 0, 0. Even though your input is of non-zero length, the length vector might not have been set properly. Could that be the issue?
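Those zeros typically mean some documents end up with no tokens at all after preprocessing. A minimal sketch (plain Python; `find_empty_examples` and the sample data are hypothetical, not Hedwig's actual API) of how one might scan a tokenized dataset for such examples before batching:

```python
# Sketch: scan a tokenized dataset for empty examples, which show up as
# zeros in the batch length vector and crash pack_padded_sequence.

def find_empty_examples(tokenized_examples):
    """Return the indices of examples that contain no tokens."""
    return [i for i, tokens in enumerate(tokenized_examples) if len(tokens) == 0]

docs = [["acq", "trade"], [], ["grain"], []]
print(find_empty_examples(docs))  # -> [1, 3]
```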

@jmandivarapu1
Author

> I see that the last three elements have values 0, 0, 0. Even though your input is of non-zero length, the length vector might not have been set properly. Could that be the issue?

Yeah, I felt the same. But when I run the following, it works fine:

python -m models.bert --dataset Reuters --model bert-base-uncased --max-seq-length 256 --batch-size 16 --lr 2e-5 --epochs 30

but when I run

python -m models.reg_lstm --dataset Reuters --mode static --batch-size 32 --lr 0.01 --epochs 30 --bidirectional --num-layers 1 --hidden-dim 512 --wdrop 0.1 --embed-droprate 0.2 --dropout 0.5 --beta-ema 0.99 --seed 3435

it fails with the error above. So I am wondering where the error comes from. Could it be specific to the LSTM?
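One common workaround (an assumption about a possible fix, not a confirmed Hedwig patch) is to clamp the length vector to a minimum of 1 before packing, so padding-only rows are treated as length-1 sequences instead of crashing `pack_padded_sequence`:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# Hedged sketch of a workaround: clamp zero lengths to 1 before packing.
# This keeps empty (padding-only) samples from triggering the
# "Length of all samples has to be greater than 0" RuntimeError.
x = torch.zeros(3, 5, 4)           # batch of 3, max length 5, embed dim 4
lengths = torch.tensor([5, 3, 0])  # the third sample is empty

safe_lengths = lengths.clamp(min=1)
packed = pack_padded_sequence(x, safe_lengths, batch_first=True)
print(packed.batch_sizes.tolist())  # -> [3, 2, 2, 1, 1]
```

The cleaner fix is to filter empty documents out of the dataset so the length vector never contains zeros in the first place; clamping merely feeds the LSTM one padding token for those rows.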
