Memory Allocation in R on Compute Canada Clusters
anandam asked 1 month ago

I am trying to run a species distribution model analysis in R on the Beluga and Cedar clusters. One of the R objects I need to save is a large raster. Every time I save my R environment and try to load this raster afterwards, I get an error. It seems that the raster's values are kept in a temporary directory and lost after the analysis finishes, so I cannot load the raster object with all its data. One suggested solution is to use raster::readAll(rasterobject) or raster::getValues() so that this information is not stored only temporarily. However, with these I get the error "Error: cannot allocate vector of size 12.5 Gb". I have tried to increase the memory limit for R within the cluster using ulimit::memory_limit(20000), but the memory allocated does not change. My question is: HOW DO I INCREASE THE AMOUNT OF MEMORY FOR R ON THE CLUSTER?
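For context, a minimal sketch of the workaround I am attempting (the file names here are placeholders, not my actual data):

    library(raster)

    # Placeholder input; the real object comes from the SDM analysis
    r <- raster("suitability.tif")

    # Force all cell values into RAM so they are stored with the object
    # instead of in a temporary file; this is the call that fails with
    # "cannot allocate vector of size 12.5 Gb"
    r_mem <- readAll(r)
    inMemory(r_mem)  # TRUE once the values are held in memory

    # Save the object together with its values so it can be reloaded later
    saveRDS(r_mem, "suitability.rds")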

Thank you

1 Answer
zhibin Staff answered 1 month ago

How much memory did you request when you submitted your job? You can request memory for your job with "--mem". Please refer to
 
https://docs.computecanada.ca/wiki/Running_jobs
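
For example, a minimal job script might look like this (the account name, walltime, memory amount, and script name are all placeholders to adapt to your own job):

    #!/bin/bash
    #SBATCH --account=def-someuser   # placeholder account name
    #SBATCH --time=03:00:00          # requested walltime
    #SBATCH --mem=32G                # memory per node; make it larger than the raster
    module load r                    # load the cluster's R module
    Rscript sdm_analysis.R           # placeholder R script

Note that --mem requests memory per node, while --mem-per-cpu requests memory per allocated core. Settings made inside R, such as ulimit::memory_limit(), cannot raise the allocation granted by the scheduler, which is why they have no effect.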