
Any idea on using GPT-2 as the language model for key-gen? #9

Open
yuchenlin opened this issue Oct 12, 2019 · 2 comments

@yuchenlin

Hi Ning,

Thanks for your great work! We are wondering whether you have any ideas on how to use GPT-2 for keyword-based sentence generation. If that is possible, it would help us include your work as the main benchmark in our ongoing paper. Thank you very much.

yuchenlin changed the title from "any idea on using GPT-2 as the language model for sampling?" to "Any idea on using GPT-2 as the language model for key-gen?" on Oct 12, 2019
@NingMiao (Owner)

We actually trained a backward GPT-2 model. It is not difficult, since the code for training GPT-2 is publicly released.
I would also recommend using BERT, or just a forward GPT-2, as the proposal distribution.
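For later readers, here is a minimal sketch of what "BERT as the proposal" could look like in practice: BERT proposes a replacement word at a masked position, and a forward GPT-2 scores the Metropolis-Hastings acceptance ratio. This uses the Hugging Face transformers API as an assumption; it is not part of the CGMH codebase, all names below are illustrative, and subword details plus the proposal-correction term are glossed over for brevity.

```python
# Sketch (not from the CGMH repo): BERT as the proposal distribution
# for a word-replacement move, with forward GPT-2 scoring the
# Metropolis-Hastings acceptance ratio. The correction term
# q(x|x')/q(x'|x) is omitted for brevity.
import math
import random

import torch
from transformers import (BertForMaskedLM, BertTokenizer,
                          GPT2LMHeadModel, GPT2Tokenizer)

bert_tok = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertForMaskedLM.from_pretrained("bert-base-uncased").eval()
gpt2_tok = GPT2Tokenizer.from_pretrained("gpt2")
gpt2 = GPT2LMHeadModel.from_pretrained("gpt2").eval()

@torch.no_grad()
def propose_replacement(words, pos, top_k=50):
    """Mask words[pos] and sample a candidate from BERT's top-k distribution."""
    masked = words[:pos] + [bert_tok.mask_token] + words[pos + 1:]
    inputs = bert_tok(" ".join(masked), return_tensors="pt")
    mask_idx = (inputs.input_ids[0] == bert_tok.mask_token_id).nonzero()[0, 0]
    logits = bert(**inputs).logits[0, mask_idx]
    probs, ids = logits.softmax(-1).topk(top_k)
    choice = torch.multinomial(probs, 1).item()  # weights need not sum to 1
    return bert_tok.convert_ids_to_tokens(ids[choice].item())

@torch.no_grad()
def gpt2_log_prob(sentence):
    """Total log-probability of a sentence under forward GPT-2."""
    ids = gpt2_tok(sentence, return_tensors="pt").input_ids
    loss = gpt2(ids, labels=ids).loss  # mean NLL per predicted token
    return -loss.item() * (ids.size(1) - 1)

# One MH replacement step on a toy sentence.
words = "the cat sat on the mat".split()
cand = list(words)
cand[1] = propose_replacement(words, 1)
log_alpha = gpt2_log_prob(" ".join(cand)) - gpt2_log_prob(" ".join(words))
if log_alpha >= 0 or random.random() < math.exp(log_alpha):
    words = cand
print(" ".join(words))
```

The backward GPT-2 mentioned above would presumably be trained exactly like the forward model but on token-reversed text, so it can score the words to the right of an edit position; the released GPT-2 training code can be reused for that with only a data-preprocessing change.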

@yuchenlin (Author)

Thanks for the prompt reply. I saw in another issue that you mentioned you already have the code for running CGMH over GPT-2. Would you please share that code with us as well? It would be a huge help for our CGMH-based experiments. Thank you very much in advance!
