Layers that are never used in ByteLatentTransformer class #28
Hello! Please check this line: blt/bytelatent/model/blt.py, line 834 in 6ffeb66. These layers seem to be there by mistake: all transformer blocks are defined inside the local and global parts, and `self.layers` is never referenced in `forward` at all.
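For anyone auditing this, one quick way to confirm a submodule is dead is to run a single forward/backward pass and list the parameters that received no gradient. A minimal, self-contained sketch with a toy module mirroring the pattern above (not the actual BLT classes):

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self, dim: int = 16, n_layers: int = 2):
        super().__init__()
        self.tok_embeddings = nn.Embedding(256, dim)
        # Defined but never called in forward() -- the suspected dead code.
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_layers))
        self.output = nn.Linear(dim, 256)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        return self.output(self.tok_embeddings(tokens))

model = Toy()
model(torch.randint(0, 256, (1, 8))).sum().backward()

# Parameters with no gradient never influenced the loss.
dead = [name for name, p in model.named_parameters() if p.grad is None]
print(dead)  # ['layers.0.weight', 'layers.0.bias', 'layers.1.weight', 'layers.1.bias']
```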
---

The same question applies to line 831 in 6ffeb66 and line 854 in 6ffeb66, because we already have `tok_embeds` and unembedding layers in the encoder and decoder: blt/bytelatent/model/local_models.py, line 81 in 6ffeb66 and line 330 in 6ffeb66.
---
Also, we initialise `tok_embeds` twice because of the `super()` call in the decoder: blt/bytelatent/model/local_models.py, line 308 in 6ffeb66.
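Schematically, the pattern looks like this (hypothetical class and attribute names, not the real local_models.py code):

```python
import torch.nn as nn

class LocalModelBase(nn.Module):
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.tok_embeds = nn.Embedding(vocab_size, dim)  # first allocation

class LocalDecoder(LocalModelBase):
    def __init__(self, vocab_size: int, dim: int):
        super().__init__(vocab_size, dim)
        # Second allocation: the table built by the base class is discarded,
        # wasting init time and memory and consuming RNG state for nothing.
        self.tok_embeds = nn.Embedding(vocab_size, dim)
```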
---
There is a lot of orphaned code in this project. For example: blt/bytelatent/model/local_models.py, line 83 in a809259. Here, positional embeddings are created when …

---
Thanks for the reports on this; we'll go through and do some cleanup. Feel free to comment on any other spots.

---
Weight-tying will not work here because of line 72 in 7622d28.
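For context, the usual tying idiom makes the embedding and the output head share one `Parameter`, so re-creating either tensor afterwards silently dissolves the tie. A minimal sketch of that failure mode, assuming the problem is the embedding being re-built after the tie is set up (hypothetical names, not the BLT API):

```python
import torch.nn as nn

class TiedLM(nn.Module):
    def __init__(self, vocab: int = 256, dim: int = 16):
        super().__init__()
        self.tok_embeds = nn.Embedding(vocab, dim)
        self.output = nn.Linear(dim, vocab, bias=False)
        self.output.weight = self.tok_embeds.weight  # tie: one shared Parameter

m = TiedLM()
assert m.output.weight is m.tok_embeds.weight

# If a later init step replaces the embedding (e.g. the double
# initialisation noted above), the output head keeps the old tensor:
m.tok_embeds = nn.Embedding(256, 16)
assert m.output.weight is not m.tok_embeds.weight
```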
---
Here, at line 824 in 7044771, n-gram embeddings are initialized, only to be invalidated just after, at line 838. Honestly, lines 838-849 can probably be removed, since the `init_embeddings` call at line 824 already implements that logic gracefully.
At line 959, `h_cross = None`, and it is then passed straight into the encoder call:

```python
(h_encoder, h_cross), cache_encoder = self.local_encoder(
    tokens=local_encoder_tokens,
    embeds=local_encoder_embeds,
    patch_embeds=h_cross if self.cross_attn_encoder else None,
    cross_mask=cross_attn_mask_enc,
    num_patches=patch_lengths.shape[1],
    patch_ids=patch_ids,
)
```

Since `h_cross` is still `None` at this point, `h_cross if self.cross_attn_encoder else None` always evaluates to `None`, and the conditional is dead.

---
Another one here: blt/bytelatent/model/local_models.py, line 92 in 7044771. `self.token_embedding_projection` can be created here, but it is never used anywhere.
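Presumably it was meant to project externally supplied embeddings to the model width when the two differ. A hypothetical sketch of the wiring that appears to be missing (a guess at the intent, not the actual BLT code; all names here are illustrative):

```python
import torch
import torch.nn as nn

class LocalModelSketch(nn.Module):
    """Illustrative only: shows how a token_embedding_projection is typically used."""

    def __init__(self, dim_token_emb: int, dim: int):
        super().__init__()
        # Created only when the incoming embedding width differs from the model width.
        self.token_embedding_projection = (
            nn.Linear(dim_token_emb, dim, bias=False)
            if dim_token_emb != dim
            else None
        )

    def forward(self, embeds: torch.Tensor) -> torch.Tensor:
        # The step that seems to be absent in local_models.py: actually applying it.
        if self.token_embedding_projection is not None:
            embeds = self.token_embedding_projection(embeds)
        return embeds

m = LocalModelSketch(dim_token_emb=32, dim=16)
print(m(torch.randn(1, 8, 32)).shape)  # torch.Size([1, 8, 16])
```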