After some time away from Sherlock, I wanted to run a simple pygeoprocessing function call on a relatively small raster but found myself exhausting memory. Of course, I had forgotten that I was running this within a SLURM job, which exposes identifying features such as environment variables.
It would be really great to be able to, at import time, log a warning that we are running in a SLURM job while GDAL's cache max (`GDAL_CACHEMAX`) is unset, since GDAL then defaults to a percentage of total physical RAM rather than the memory actually allocated to the job. Even better would be to detect the SLURM job's memory limit and compare it to the GDAL cache limit, but even a simple warning would be great.
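Something along these lines could work at import time. This is a minimal sketch, assuming the standard SLURM environment variables (`SLURM_JOB_ID`, and `SLURM_MEM_PER_NODE` in MB, which is only set when the job requested memory explicitly) are available; the `warn_if_slurm_memory_mismatch` name is just for illustration:

```python
import logging
import os

from osgeo import gdal

LOGGER = logging.getLogger(__name__)


def warn_if_slurm_memory_mismatch():
    """Warn if GDAL's block cache could exceed the SLURM job's memory limit.

    Sketch only: assumes SLURM exposes SLURM_JOB_ID and (optionally)
    SLURM_MEM_PER_NODE in MB, which is typical but not guaranteed on
    every cluster configuration.
    """
    if 'SLURM_JOB_ID' not in os.environ:
        return  # Not running inside a SLURM job; nothing to check.

    if 'GDAL_CACHEMAX' not in os.environ:
        LOGGER.warning(
            'Running within a SLURM job, but GDAL_CACHEMAX is unset; GDAL '
            'will size its block cache as a percentage of total physical '
            'RAM, which may exceed the memory allocated to this job.')

    slurm_mem_mb = os.environ.get('SLURM_MEM_PER_NODE')
    if slurm_mem_mb is not None:
        # gdal.GetCacheMax() reports the current cache limit in bytes.
        gdal_cache_mb = gdal.GetCacheMax() / 1024 ** 2
        if gdal_cache_mb > int(slurm_mem_mb):
            LOGGER.warning(
                'GDAL cache max (%.0f MB) exceeds the SLURM job memory '
                'limit (%s MB); the job may be killed if the cache fills.',
                gdal_cache_mb, slurm_mem_mb)
```

Calling this once from the package's `__init__` would cover the import-time case described above without adding any cost outside of SLURM jobs.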