Forcing garbage collection to run in R with the gc() command

"Probably." I do it too, and often even in a loop as in

cleanMem <- function(n = 10) { for (i in 1:n) gc() }  # run the garbage collector n times

Yet that does not, in my experience, restore memory to a pristine state.

So what I usually do is keep the tasks at hand in script files and execute those using the 'r' frontend (on Unix, provided by the 'littler' package); Rscript is an alternative on that other OS.
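
A minimal sketch of that workflow (the file and object names here are made up for illustration):

## analysis.R -- one self-contained task per script
dat <- read.csv("big-input.csv")          # hypothetical input
fit <- lm(y ~ x, data = dat)              # the actual work
saveRDS(fit, "fit.rds")                   # keep only what you need
## Run it in a fresh R process, so all memory is returned to the OS on exit:
##   r analysis.R        # littler frontend, on Unix
##   Rscript analysis.R  # works on any platform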

That workflow happens to agree with

  • workflow-for-statistical-analysis-and-report-writing
  • tricks-to-manage-the-available-memory-in-an-r-session

which we covered here before.


From the help page on gc:

A call of 'gc' causes a garbage collection to take place. This will also take place automatically without user intervention, and the primary purpose of calling 'gc' is for the report on memory usage.

However, it can be useful to call 'gc' after a large object has been removed, as this may prompt R to return memory to the operating system.
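
For instance (the object here is purely illustrative; 1e8 doubles is roughly 800 MB):

x <- numeric(1e8)   # allocate a large object (~800 MB)
rm(x)               # remove it; the memory is not necessarily handed back yet
gc()                # prints the memory-usage report and may return memory to the OS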

So it can be useful to do, but mostly you shouldn't have to. My personal opinion is that it is code of last resort - you shouldn't be littering your code with gc() statements as a matter of course, but if your machine keeps falling over, and you've tried everything else, then it might be helpful.

By everything else, I mean things like

  1. Writing functions rather than raw scripts, so variables go out of scope (see the sketch after this list).

  2. Emptying your workspace if you go from one problem to another unrelated one.

  3. Discarding data/variables that you aren't interested in. (I frequently receive spreadsheets with dozens of uninteresting columns.)
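
A minimal sketch of those three habits (the file names and columns are invented):

## 1. Wrap work in a function: the large intermediate is local and is
##    collected once the call returns
summarise_counts <- function(file) {
  raw <- read.csv(file)                      # big, but local to the call
  aggregate(count ~ group, data = raw, sum)  # only the small summary is returned
}
res <- summarise_counts("counts.csv")

## 2. Empty the workspace before moving to an unrelated problem
rm(list = ls())
gc()

## 3. Drop the columns (or rows) you don't care about as early as possible
dat <- read.csv("wide-spreadsheet.csv")
dat <- dat[, c("id", "date", "count")]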


Supposedly R uses only RAM. That's just not true on a Mac (and I suspect it's not true on Windows either). If it runs out of RAM, it will start using virtual memory. Sometimes, but not always, R processes will 'recognize' that they need to run gc() and free up memory; when they do not, you can see it in Activity Monitor: all the RAM is occupied and disk access jumps up.

I find that when I am doing large Cox regression runs, I can avoid spilling over into virtual memory (and the slow disk access that comes with it) by preceding the calls with gc(), i.e. gc(); cph(...)
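
A minimal sketch of that last pattern (cph() is the Cox model fitter from the rms package; the lung data from the survival package stands in here for a genuinely large dataset):

library(survival)
library(rms)

gc()   # collect before the memory-hungry fit so it starts with as much free RAM as possible
fit <- cph(Surv(time, status) ~ age + sex, data = lung)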