Merge pull request #3364 from flairNLP/fix_embedding_size_for_xlm_roberta

fix embedding size for xlm roberta models
alanakbik authored Oct 28, 2023
2 parents c375fba + 032260c commit d7a69d3
Showing 1 changed file with 2 additions and 2 deletions.
flair/embeddings/transformer.py (2 additions, 2 deletions)

@@ -1139,8 +1139,8 @@ def is_supported_t5_model(config: PretrainedConfig) -> bool:
         # If we use a context separator, add a new special token
         self.use_context_separator = use_context_separator
         if use_context_separator:
-            self.tokenizer.add_special_tokens({"additional_special_tokens": [SENTENCE_BOUNDARY_TAG]})
-            transformer_model.resize_token_embeddings(len(self.tokenizer))
+            added = self.tokenizer.add_special_tokens({"additional_special_tokens": [SENTENCE_BOUNDARY_TAG]})
+            transformer_model.resize_token_embeddings(transformer_model.config.vocab_size + added)

         super().__init__(**self.to_args())
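The intent of the change can be sketched without transformers installed: resizing to `len(self.tokenizer)` trusts the tokenizer's own count, which for XLM-RoBERTa models can disagree with the number of rows in the model's embedding matrix (`config.vocab_size`), while `config.vocab_size + added` grows the matrix by exactly the number of newly added tokens. Below is a minimal sketch with a stand-in tokenizer; `FakeTokenizer`, the `[FLERT]` token, and all vocabulary sizes are illustrative assumptions, not real XLM-RoBERTa values.

```python
# Hypothetical sketch of the bug the diff fixes; sizes are illustrative.

class FakeTokenizer:
    """Mimics the two tokenizer behaviours the diff relies on."""

    def __init__(self, length: int):
        self._length = length

    def __len__(self) -> int:
        return self._length

    def add_special_tokens(self, mapping: dict) -> int:
        # Hugging Face tokenizers return the number of tokens actually added.
        added = len(mapping["additional_special_tokens"])
        self._length += added
        return added


# Suppose the model's embedding matrix has 250_002 rows (config.vocab_size),
# but the tokenizer reports a different length (hypothetical mismatch).
config_vocab_size = 250_002
tokenizer = FakeTokenizer(250_004)

added = tokenizer.add_special_tokens({"additional_special_tokens": ["[FLERT]"]})

buggy_target = len(tokenizer)             # trusts the tokenizer's count
fixed_target = config_vocab_size + added  # grows the matrix by exactly `added`

print(buggy_target)  # 250_005 — would over-size the embedding matrix
print(fixed_target)  # 250_003 — one new row for the one new token
```

With the old target, any disagreement between the tokenizer's length and the embedding matrix silently resizes the matrix by the wrong amount; the fixed target is anchored to the matrix's actual size, so only the newly added tokens change it.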
