I am running an InSAR time-series script for about 50 Sentinel-1 scenes and have hit memory issues once or twice. After aligning the images, there is a line that reads sbas.backup(FILEDIR+'backup') and saves everything I have done up to that point. However, I can't find a way to load that data back into sbas so I can continue from there without rerunning the whole script.
I have tried asking the AI assistant, to no avail: it suggests a load_data function that doesn't appear to exist. Any advice would save me a lot of time and memory, thank you!
The backup is a compact, stitched, and cropped Sentinel-1 GeoTIFF dataset that can be used in place of the potentially much larger original scenes. You can restart the processing by pointing the data directory at the backup directory. If you need to save your current processing state and continue later, use the Stack.dump() and Stack.restore() functions.
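A minimal sketch of the dump/restore workflow, assuming dump() serializes the current object state into the working directory and restore() is called with that same directory to reload it; the WORKDIR path is a placeholder, and the exact signatures should be checked against the library documentation:

```python
from pygmtsar import Stack

# Placeholder: the working directory used when the stack was originally created.
WORKDIR = 'raw'

# --- first session, after alignment and any other heavy processing ---
# Save the current object state so the work so far is not lost.
sbas.dump()

# --- later session ---
# Reload the saved state instead of rerunning download/alignment.
sbas = Stack.restore(WORKDIR)
```

After restoring, processing can continue from the step following the dump, and the backup directory can serve as the (smaller) data source if the original scenes are no longer needed.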