This repository has been archived by the owner on Dec 29, 2022. It is now read-only.
I have a naive question: if I have a pre-trained transformer model (the pre-training task is a regression problem), can I use the embeddings from this model to train a decoder that decodes the embeddings back into sequences, like a translation task? I wonder if seq2seq can help me with this. Thanks, anyway.
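In principle, yes: you can treat the frozen pre-trained embedding as the "source" side and train a decoder with cross-attention over it, exactly as in a seq2seq translation setup. Below is a minimal PyTorch sketch of that idea, not tied to any particular pre-trained model; the sizes (`EMB_DIM`, `VOCAB`, `MAX_LEN`) and the random tensors standing in for real embeddings and target sequences are all hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical sizes; substitute your pre-trained model's embedding dim and vocab.
EMB_DIM, VOCAB, MAX_LEN = 64, 100, 16

class EmbeddingToSequence(nn.Module):
    """Decode a fixed-size embedding back into a token sequence (seq2seq-style)."""
    def __init__(self, emb_dim=EMB_DIM, vocab=VOCAB, nhead=4, layers=2):
        super().__init__()
        self.tok = nn.Embedding(vocab, emb_dim)
        self.pos = nn.Embedding(MAX_LEN, emb_dim)
        layer = nn.TransformerDecoderLayer(emb_dim, nhead, batch_first=True)
        self.dec = nn.TransformerDecoder(layer, layers)
        self.out = nn.Linear(emb_dim, vocab)

    def forward(self, z, tgt):
        # z: (B, emb_dim) embedding from the pre-trained encoder, used as a
        # length-1 "memory" that the decoder cross-attends to.
        B, T = tgt.shape
        pos = torch.arange(T, device=tgt.device)
        x = self.tok(tgt) + self.pos(pos)
        # Causal mask so each position only attends to earlier target tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        h = self.dec(x, z.unsqueeze(1), tgt_mask=mask)
        return self.out(h)

# One teacher-forced training step on dummy data (stand-ins for real
# embeddings produced by the frozen pre-trained model).
model = EmbeddingToSequence()
z = torch.randn(8, EMB_DIM)             # pretend pre-trained embeddings
seq = torch.randint(0, VOCAB, (8, 10))  # pretend target sequences
logits = model(z, seq[:, :-1])          # predict next token at each step
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), seq[:, 1:].reshape(-1)
)
loss.backward()
```

At inference time you would decode autoregressively (greedy or beam search) from a start token, feeding the same embedding as memory at every step. How well this works depends heavily on how much sequence information the regression-trained embedding actually preserves.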