
global variables are being used instead of instance variables in DecoderWithAttention in "Show, Attend and Tell.ipynb" #1

Open
naveen-marthala opened this issue Jul 14, 2022 · 0 comments

Comments

@naveen-marthala

I identified a few places where global variables are used inside DecoderWithAttention instead of instance attributes, such as this line in the forward method:

        # flatten image
        encoder_out = encoder_out.view(batch_size, -1, encoder_dim)

I presume encoder_dim should be self.encoder_dim.
And here:

        # create tensors to hold word prediction scores and alphas
        predictions = torch.zeros(batch_size, max(decode_lens), vocab_size).to(device)

vocab_size should instead be self.vocab_size.

Please correct me if I got this wrong.
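To illustrate why this matters, here is a minimal sketch in plain Python (not the notebook's actual code; the class names are hypothetical). In a notebook, a method that references a global name will appear to work as long as a cell has defined that global, but it silently ignores the value the instance was constructed with:

```python
# Minimal sketch (hypothetical class names) of the global-vs-instance bug.

encoder_dim = 2048  # notebook-level global, left over from an earlier cell


class DecoderGlobals:
    """Buggy pattern: forward() reads the module-level encoder_dim."""

    def __init__(self, encoder_dim):
        self.encoder_dim = encoder_dim  # stored but never used below

    def forward(self):
        return encoder_dim  # silently picks up the global, not the argument


class DecoderInstance:
    """Fixed pattern: forward() reads the value stored on the instance."""

    def __init__(self, encoder_dim):
        self.encoder_dim = encoder_dim

    def forward(self):
        return self.encoder_dim  # uses the value the decoder was built with


buggy = DecoderGlobals(encoder_dim=512)
fixed = DecoderInstance(encoder_dim=512)
print(buggy.forward())  # 2048 -- the constructor argument is ignored
print(fixed.forward())  # 512
```

The same reasoning applies to vocab_size: as long as the notebook happens to define a global with the right value, the bug is invisible, but it would break if the decoder were moved into a module or constructed with different dimensions.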

@naveen-marthala changed the title from "global variables are being used instead of instance variables in DecoderWithAttention" to "global variables are being used instead of instance variables in DecoderWithAttention in "Show, Attend and Tell.ipynb"" on Jul 14, 2022