On the full benchmark set, while generating plots with `match_minima.py` using data read in from a pickle file, memory usage grows excessively high (observed > 60 GB) and the process is eventually killed.

I am working on identifying areas of high memory use with the `memory_profiler` package, via `PYTHONPATH=../ python -m memory_profiler ../match_minima.py -i match.in --cutoff 1.0 --plot --readpickle`.
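For reference, here is a minimal sketch of how `memory_profiler` reports line-by-line memory use, assuming a hypothetical plotting function standing in for the plotting routines in `match_minima.py`; the function name, data layout, and matplotlib usage here are placeholders, not the script's actual API:

```python
# Sketch only: a placeholder plotting function decorated for memory_profiler.
# Running `python -m memory_profiler this_script.py` prints per-line memory
# increments for any function decorated with @profile.
from memory_profiler import profile

import matplotlib
matplotlib.use("Agg")            # non-interactive backend; no GUI resources held
import matplotlib.pyplot as plt


@profile
def plot_mol_minima(mol_name, energies):
    """Plot relative energies per minimum for one molecule (hypothetical layout)."""
    fig, ax = plt.subplots()
    for label, values in energies.items():
        ax.plot(range(len(values)), values, label=label)
    ax.set_xlabel("minimum index")
    ax.set_ylabel("relative energy (kcal/mol)")
    ax.legend()
    fig.savefig(f"minima_{mol_name}.png")
    plt.close(fig)               # close explicitly so figures don't accumulate


if __name__ == "__main__":
    plot_mol_minima("mol_0", {"GAFF": [1.2, 0.8, 2.4], "reference": [1.0, 0.9, 2.1]})
```

Figures left open while looping over hundreds of molecules are one common way for matplotlib memory to pile up, so the plotting routines are a natural first place to profile.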
The issue was traced back to molecules with extremely disparate energies. For example, in this plot (disregarding the RMSD axis), some GAFF energies are exceedingly high -- 3.5e7 kcal/mol.

This particular case is due to GAFF missing a specific vdW parameter for polar hydrogen atoms, which leads to overlapping atoms. Additional molecules with this issue are here:

A possible solution is to check whether any of the FF energies deviates from the reference method by more than some cutoff and, if so, skip generating plots for that molecule. The cutoff would be arbitrarily defined, though -- say, 1000 kcal/mol?
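As a rough illustration of that idea, here is a minimal sketch of such a pre-plot filter, assuming a per-molecule dict mapping force field labels to lists of relative energies in kcal/mol; the function names, data layout, and the 1000 kcal/mol threshold are placeholders rather than the actual `match_minima.py` interfaces:

```python
# Sketch only: skip plotting molecules whose FF energies deviate wildly
# from the reference method. Names and data layout are hypothetical.
ENERGY_CUTOFF = 1000.0  # kcal/mol; arbitrary threshold per the discussion above


def exceeds_cutoff(ff_energies, ref_energies, cutoff=ENERGY_CUTOFF):
    """True if any FF energy deviates from the reference by more than cutoff."""
    return any(
        abs(e_ff - e_ref) > cutoff
        for e_ff, e_ref in zip(ff_energies, ref_energies)
    )


def molecules_to_plot(all_energies, ref_label="reference"):
    """Yield (mol_name, energies) pairs that pass the outlier check."""
    for mol_name, energies in all_energies.items():
        ref = energies[ref_label]
        if any(
            exceeds_cutoff(vals, ref)
            for label, vals in energies.items()
            if label != ref_label
        ):
            print(f"Skipping plots for {mol_name}: deviation > {ENERGY_CUTOFF} kcal/mol")
            continue
        yield mol_name, energies
```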
vtlim changed the title from "memory management in match_minima.py" to "handling outlier molecules in match_minima.py" on Feb 10, 2020.