Thanks for your great work! We are wondering whether you have any ideas on using GPT-2 for keyword-based sentence generation. If that is possible, it would be helpful for us to include your work as the main benchmark in our ongoing paper. Thank you very much.
yuchenlin changed the title from "any idea on using GPT-2 as the language model for sampling?" to "Any idea on using GPT-2 as the language model for key-gen?" on Oct 12, 2019.
We actually trained a backward GPT-2 model. It is not difficult, since the code for training is released.
I also recommend using BERT or just a forward GPT-2 as the proposal.
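For readers wondering what "training a backward GPT-2" might involve: one common approach (an assumption here, not something the reply spells out) is to keep the standard GPT-2 training code unchanged and simply reverse each token sequence in the training data, so the resulting model predicts tokens right-to-left and can extend a sentence leftward from a keyword. A minimal sketch of that data preparation, with a hypothetical helper name:

```python
# Sketch of data prep for a "backward" language model (an assumption, not
# the authors' released code): reverse the content tokens of each training
# sequence while keeping the BOS/EOS markers in their usual positions, then
# train an ordinary GPT-2 on the reversed sequences.

def reverse_for_backward_lm(token_ids, bos_id, eos_id):
    """Return the sequence with its content tokens reversed.

    token_ids is assumed to be [bos_id, t1, ..., tn, eos_id].
    """
    assert token_ids[0] == bos_id and token_ids[-1] == eos_id
    content = token_ids[1:-1]
    return [bos_id] + content[::-1] + [eos_id]

# Example with toy IDs (0 = BOS, 9 = EOS):
# [0, 1, 2, 3, 9] -> [0, 3, 2, 1, 9]
```

At sampling time, text drawn from such a model is reversed back before use, which is what lets CGMH-style edits insert words to the left of a fixed keyword.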
Thanks for the prompt reply. I saw in another issue that you mentioned you already have code for using CGMH over GPT-2. Would you please share that code with us as well? It would be a huge help for our CGMH-based experiments. Thank you very much in advance!