Cache File and Dataset IDs for DataStubs #412

Open
2 tasks
lawrence-mbf opened this issue Feb 18, 2022 · 3 comments
Assignees
Labels
status: todo (something needs to be done)

Comments

@lawrence-mbf
Collaborator

  • Cache file and dataset IDs on load and destroy them when possible.
  • Use a different solution for bound pipes, which may cause file corruption if the dataset is not properly closed.
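The first task can be sketched as a small handle cache. This is a hypothetical illustration in plain Python (MatNWB itself is MATLAB and would cache H5F/H5D identifiers via the low-level HDF5 API); the `HandleCache` name and its methods are invented for this sketch, and ordinary file handles stand in for HDF5 IDs:

```python
class HandleCache:
    """Cache open handles keyed by path so repeated access skips open/close cycles.

    Illustrative sketch only: a real implementation would hold HDF5 file and
    dataset identifiers rather than Python file objects.
    """

    def __init__(self, opener=open):
        self._opener = opener
        self._handles = {}

    def get(self, path, mode="r"):
        # Reuse a live handle when one exists; reopen only if it was closed.
        h = self._handles.get(path)
        if h is None or h.closed:
            h = self._opener(path, mode)
            self._handles[path] = h
        return h

    def close_all(self):
        # Destroy cached handles explicitly so nothing leaks past teardown.
        for h in self._handles.values():
            if not h.closed:
                h.close()
        self._handles.clear()
```

Calling `get` twice for the same path returns the same open handle; `close_all` is the explicit "destroy them when possible" step.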
@lawrence-mbf lawrence-mbf self-assigned this Feb 18, 2022
@lawrence-mbf lawrence-mbf mentioned this issue Jul 25, 2022
3 tasks
@lawrence-mbf
Collaborator Author

lawrence-mbf commented Jul 25, 2023

Currently blocked by #448 (comment)

@ehennestad ehennestad added the status: need more info (unclear what the issue is or what needs to be done) label Oct 31, 2024
@ehennestad
Collaborator

@lawrence-mbf

Is the idea here to keep HDF5 IDs in memory to avoid opening and closing files/datasets more often than needed?

@lawrence-mbf
Collaborator Author

@ehennestad I believe this was because network drives were incredibly slow when files were repeatedly opened and closed. Data processing is much faster if the file IDs are kept open, though you have to be diligent about maintaining those IDs so you don't get handle leaks or use-after-close errors.
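The trade-off described above (fewer open/close cycles vs. the burden of diligent cleanup) can be sketched as two access patterns. Hypothetical stdlib Python, with plain files standing in for HDF5 IDs; the function names are invented for this sketch:

```python
def read_reopening(path, n):
    # Opens and closes the file on every access.
    # This is the pattern that is painfully slow on network drives.
    out = []
    for _ in range(n):
        with open(path) as f:
            out.append(f.read())
    return out


def read_cached(path, n):
    # Keeps one handle open across all accesses and closes it exactly once.
    # Faster, but the caller now owns the handle's lifetime.
    out = []
    f = open(path)
    try:
        for _ in range(n):
            f.seek(0)
            out.append(f.read())
    finally:
        f.close()  # diligent cleanup avoids leaks and use-after-close
    return out
```

Both return the same data; the cached variant pays the open cost once, which is where the speedup on slow filesystems comes from.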

@ehennestad ehennestad added the status: todo (something needs to be done) label and removed the status: need more info (unclear what the issue is or what needs to be done) label Nov 1, 2024