
Consider whether we need to change the @lru_cache default memory limit #18

Open

gaurav opened this issue Nov 1, 2021 · 0 comments

gaurav (Collaborator) commented Nov 1, 2021

We currently use an @lru_cache with the default maximum size of 128 to cache some of the values we look up through the terminology service. If this limit proves to be too low, we can increase it or set maxsize to None, which would replicate the behavior of the previously used @cache decorator (an unbounded cache).
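A minimal sketch of the pattern in question, assuming a lookup function along these lines (the name and signature are hypothetical; the real lookup lives in the terminology-service code):

```python
from functools import lru_cache

# Default cap of 128 cached entries; raise the number, or pass
# maxsize=None for an unbounded cache (equivalent to functools.cache),
# if useful values start getting evicted.
@lru_cache(maxsize=128)
def lookup_term(curie: str) -> dict:
    # Hypothetical call to the terminology service; the real
    # implementation would issue the actual request here.
    ...

# Hit/miss statistics can help decide whether 128 is enough:
# lookup_term.cache_info()
# -> CacheInfo(hits=..., misses=..., maxsize=128, currsize=...)
```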
