
Implement saliency methods (from: The elephant in the interpretability room: Why use attention as explanation when we have saliency methods?) #8

Open
oserikov opened this issue Jan 28, 2022 · 0 comments

oserikov commented Jan 28, 2022

It seems there is no implementation available, so the actual TODO is:

  • implement the saliency methods from the paper
  • provide an API along the lines of import saliency_interpret; saliency_interpret(huggingface_model_name, path_to_data) (see the sketch below)
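For reference, here is a minimal sketch of what that API could look like, assuming a plain gradient × input saliency method (one of the gradient-based methods discussed in the paper) applied to a HuggingFace sequence-classification checkpoint. The function and argument names (`saliency_interpret`, `huggingface_model_name`, `path_to_data`) simply mirror the suggestion above and are placeholders, not an existing package:

```python
# Sketch only: gradient x input saliency for a HuggingFace classifier.
# Assumes path_to_data is a plain-text file with one example per line.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def saliency_interpret(huggingface_model_name: str, path_to_data: str):
    """Return per-token (token, saliency) pairs for each line in path_to_data."""
    tokenizer = AutoTokenizer.from_pretrained(huggingface_model_name)
    model = AutoModelForSequenceClassification.from_pretrained(huggingface_model_name)
    model.eval()

    results = []
    with open(path_to_data) as f:
        for line in f:
            text = line.strip()
            if not text:
                continue
            encoding = tokenizer(text, return_tensors="pt", truncation=True)
            # Embed the input ids manually so gradients can be taken w.r.t. the embeddings.
            embeddings = model.get_input_embeddings()(encoding["input_ids"])
            embeddings.retain_grad()
            outputs = model(inputs_embeds=embeddings,
                            attention_mask=encoding["attention_mask"])
            predicted_class = outputs.logits.argmax(dim=-1).item()
            # Backpropagate the score of the predicted class to the input embeddings.
            outputs.logits[0, predicted_class].backward()
            # Gradient x input, summed over the embedding dimension, gives one
            # scalar saliency score per input token.
            scores = (embeddings.grad * embeddings).sum(dim=-1).squeeze(0)
            tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
            results.append(list(zip(tokens, scores.tolist())))
    return results
```

Returning (token, score) pairs keeps the sketch independent of any particular visualization; a real implementation would presumably also let the caller choose among the saliency methods compared in the paper (e.g. integrated gradients or occlusion) instead of hard-coding one.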