Increasing (or decreasing) the memory available to R processes
I would like to increase (or decrease) the amount of memory available to R. What are the methods for achieving this?
From: http://gking.harvard.edu/zelig/docs/How_do_I2.html (mirror)
Windows users may get the error that R has run out of memory.
If you have R already installed and subsequently install more RAM, you may have to reinstall R in order to take advantage of the additional capacity.
You may also set the amount of available memory manually. Close R, then right-click on your R program icon (the icon on your desktop or in your programs directory). Select "Properties", and then select the "Shortcut" tab. Look for the "Target" field and, after the closing quote around the location of the R executable, add
--max-mem-size=500M
You may increase this value up to 2 GB or the maximum amount of physical RAM you have installed.
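For example, after the edit the "Target" field might look like this (the install path below is purely illustrative; keep whatever path your shortcut already contains):

"C:\Program Files\R\R-3.6.3\bin\Rgui.exe" --max-mem-size=500M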
If you get the error that R cannot allocate a vector of size x, close out of R and add the following line to the "Target" field:
--max-vsize=500M
or as appropriate. You can always check to see how much memory R has available by typing at the R prompt
memory.limit()
which gives you the amount of available memory in MB. In previous versions of R, which reported the value in bytes, you needed round(memory.limit()/2^20, 2) to convert it to MB.
Use memory.limit(). You can increase the default with memory.limit(size=2500), where the size is in MB. You need to be running 64-bit R in order to take real advantage of this.
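A minimal session sketch (these calls apply to R on Windows; the numbers are illustrative):

memory.limit()               # query the current limit, in MB
memory.limit(size = 2500)    # request a limit of 2500 MB
memory.limit()               # confirm the new limit took effect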
One other suggestion is to use memory-efficient objects wherever possible: for instance, use a matrix instead of a data.frame.
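As a rough sketch of the difference, you can compare the two with object.size() (exact sizes vary by R version and platform; for numeric data the data.frame mostly adds per-column and attribute overhead):

m  <- matrix(0, nrow = 1e5, ncol = 10)  # one contiguous block of 1e6 doubles (~8 MB)
df <- as.data.frame(m)                  # same values stored as a list of 10 columns
object.size(m)                          # size of the matrix
object.size(df)                         # slightly larger: adds names, row.names, column wrappers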
For Linux/Unix, I can suggest the unix package.
To increase the memory limit on Linux:
install.packages("unix")
library(unix)
rlimit_as(1e12)  # raises the address-space limit to 1e12 bytes (~1 TB)
You can also check the current limits with this:
rlimit_all()
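Since the question also covers decreasing memory, the same call can lower the cap, for instance to stop a runaway process from exhausting the machine (the 4e9 value below is just an example):

rlimit_as(4e9)  # caps the address space at 4e9 bytes (~4 GB)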
For detailed information: https://rdrr.io/cran/unix/man/rlimit.html
You can also find further info here: limiting memory usage in R under linux