Is this pruning method generally applicable to multi-head attention? #37

Open
shoveller86 opened this issue Jun 17, 2022 · 0 comments
Comments


shoveller86 commented Jun 17, 2022

The documentation shows that the BERT models were tested and achieved good results. One question: can the nn_pruning methods also be applied to other Transformer models, such as Google ViT, Swin Transformer, and so on?
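For concreteness, here is a minimal sketch of the kind of multi-head-attention pruning I have in mind, applied to a ViT checkpoint. This deliberately uses plain `torch.nn.utils.prune` magnitude pruning rather than the nn_pruning API, and the model name, module paths, and sparsity level are only illustrative assumptions:

```python
# Generic illustration (NOT the nn_pruning movement-pruning method):
# magnitude-prune the attention projection weights of a Vision Transformer.
import torch.nn.utils.prune as prune
from transformers import ViTModel

# Assumed checkpoint; any ViT checkpoint with the standard HF layout should work.
model = ViTModel.from_pretrained("google/vit-base-patch16-224-in21k")

# Prune 50% of the smallest-magnitude weights in every attention projection.
for layer in model.encoder.layer:
    self_attn = layer.attention.attention  # query/key/value projections
    for module in (self_attn.query, self_attn.key, self_attn.value,
                   layer.attention.output.dense):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the sparsity permanent
```

Is there a recommended way to achieve something like this (structured head/FFN pruning during fine-tuning) with nn_pruning on non-BERT architectures?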
