Description of issue
I am trying to use the burt2020 function to create permuted nulls for volumetric MNI152 data (1mm, shape 121 × 145 × 121) and have run into multiple issues:
Disk space: At first I tried to run 1000 permutations on a 1mm voxel-resolution map with n_proc=15. After it had been running for five days I could no longer connect to the server: the tmp directory had filled up with >100GB of temporary files (the distance matrices, I suspect). With no disk space left, the script became unresponsive.
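For scale, a dense voxel-by-voxel distance matrix grows quadratically with the number of voxels. A rough calculation (assuming float32 entries and no brain masking, which is a simplification of what the null functions actually write) shows why the tmp directory fills up:

```python
# Back-of-envelope size of a dense pairwise distance matrix memory-mapped
# to disk (assumption: full n x n float32 matrix; the actual neuromaps
# implementation may mask background voxels and store less than this).
def dist_matrix_gb(n_voxels: int, bytes_per_entry: int = 4) -> float:
    """Size in GiB of a dense n_voxels x n_voxels distance matrix."""
    return n_voxels ** 2 * bytes_per_entry / 1024 ** 3

# Full 121 x 145 x 121 1mm grid: ~2.1 million voxels -> tens of TiB
print(f"1mm grid: {dist_matrix_gb(121 * 145 * 121):,.0f} GiB")
# 61 x 73 x 61 3mm grid: ~270k voxels -> still hundreds of GiB unmasked
print(f"3mm grid: {dist_matrix_gb(61 * 73 * 61):,.0f} GiB")
```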
Permuted nulls contain NaN values: Next I tried a lower resolution (3mm), fewer parallel processes (n_proc=2), and fewer permutations (n_perm=2) to avoid filling the disk and to isolate the problem. This time the code completed (still time-intensive at 17 minutes, but acceptable) yet did not appear to generate usable null distributions: the returned nulls object had the correct shape but contained only NaN values. In addition, despite the code running and returning a nulls object, an error message appeared (see below).
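A quick way to quantify this outcome (assuming the null functions return an array with one column per permutation; a synthetic stand-in mimicking the observed all-NaN result is used here so the snippet is self-contained):

```python
import numpy as np

# Synthetic stand-in for the returned nulls object: one row per 3mm voxel
# (61 * 73 * 61 = 271633), one column per permutation (n_perm=2), all NaN,
# mimicking the failing runs described above.
nulls_arr = np.full((271633, 2), np.nan)
print('shape:', nulls_arr.shape)
print('fraction NaN:', np.isnan(nulls_arr).mean())  # 1.0 for the failing runs
```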
Tmp files not found: I encountered an error saying the tmp files could not be found (full error message at the end of this post). I tried to specify the directory for temporary files via the 'tempdir' argument, but this raised an "unexpected keyword argument" error (despite having the latest version of neuromaps installed). I also tried setting the tmp directory manually with "os.environ['TMPDIR'] = '/tmp/'", but the tmp files were still not found.
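One subtlety that may explain why setting TMPDIR had no effect: Python's tempfile.gettempdir() caches its answer the first time it is called, so changing os.environ['TMPDIR'] later in the same session is ignored. A sketch of redirecting temp files to a scratch directory, assuming the .mmap files are created through the tempfile module and joblib's memmapping (the scratch path here is hypothetical):

```python
import os
import tempfile

# tempfile.gettempdir() caches the default directory on first call, so
# setting os.environ['TMPDIR'] afterwards has no effect in this process.
# Assigning tempfile.tempdir directly bypasses the cached value.
scratch = os.path.join(os.getcwd(), 'neuromaps_tmp')  # hypothetical path
os.makedirs(scratch, exist_ok=True)
tempfile.tempdir = scratch

# joblib (which produces .mmap files for parallel workers) also honors
# this environment variable for its memmapped temp folder.
os.environ['JOBLIB_TEMP_FOLDER'] = scratch

print(tempfile.gettempdir())  # now points at the scratch directory
```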
I have now tried this with different resolutions and different brain maps (my own t-maps as well as maps provided within neuromaps), and with different n_proc settings (1 / 2 / 15). I also tried the burt2018 and moran functions instead, but these did not finish within two hours even at a low 3mm voxel resolution with only n_perm=2. I verified that the NIfTI passed to the function is correctly formatted and contains data, and that Connectome Workbench is properly installed.
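For reference, the input sanity check amounts to something like the following (assuming `data` is the array a NIfTI loader such as nibabel's img.get_fdata() would return; a synthetic 3mm-sized array stands in here so the snippet is self-contained):

```python
import numpy as np

# Synthetic stand-in for the 3D array extracted from the input NIfTI
# (61 x 73 x 61 matches a 3mm MNI152 grid).
rng = np.random.default_rng(1234)
data = rng.standard_normal((61, 73, 61))
data[data < -2] = np.nan  # background voxels are often NaN or 0

print('shape:', data.shape)
print('finite voxels:', int(np.isfinite(data).sum()))
print('contains NaN:', bool(np.isnan(data).any()))
```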
Specs: I have ~200GB RAM, ~1.5TB of free disk storage, and 48 processors available.
I understand that the surface-based spin-null functions are easier to handle. However, my brain map contains brain-wide t-values for volumetric differences, and the effects I am interested in are widespread across cortical AND subcortical regions, so surface data alone seem insufficient. I don't mind if creating the nulls takes some time; with the computational resources at hand it should be feasible within a reasonable period.
I think your toolbox is an amazing way to enrich brain-related findings. Any help with this issue would be invaluable!
Error message regarding missing tmp files:
Exception ignored in: <function _TemporaryFileCloser.__del__ at 0x7f0e217ae040>
Traceback (most recent call last):
File "/home/jgoltermann/.conda/envs/spm/lib/python3.9/tempfile.py", line 445, in __del__
self.close()
File "/home/jgoltermann/.conda/envs/spm/lib/python3.9/tempfile.py", line 441, in close
unlink(self.name)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpq1obodxl.mmap'
Exception ignored in: <function _TemporaryFileCloser.__del__ at 0x7f0e217ae040>
Traceback (most recent call last):
File "/home/jgoltermann/.conda/envs/spm/lib/python3.9/tempfile.py", line 445, in __del__
self.close()
File "/home/jgoltermann/.conda/envs/spm/lib/python3.9/tempfile.py", line 441, in close
unlink(self.name)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpq2nw3sxc.mmap'
My code:
nulls_mni3mm_dat_k2_burt2020 = nulls.burt2020(
    pet_dat_mni_3mm,
    atlas='MNI152',
    density='3mm',
    n_perm=2,
    n_proc=1,
    seed=1234,
    # tempdir=tmp_dir  # raises "unexpected keyword argument"
)