'System.OutOfMemoryException' was thrown when there is still plenty of memory free
This is my code:
int size = 100000000;
double sizeInMegabytes = (size * 8.0) / 1024.0 / 1024.0; // 762 MB
double[] randomNumbers = new double[size];
Exception: Exception of type 'System.OutOfMemoryException' was thrown.
I have 4GB of memory on this machine, and 2.5GB is free when I start this running, so there is clearly enough space on the PC to handle the 762MB of 100,000,000 random numbers. I need to store as many random numbers as possible given the available memory. When I go to production there will be 12GB on the box, and I want to make use of it.
Does the CLR constrain me to a default maximum memory to start with? And how do I request more?
Update
I thought breaking this into smaller chunks and incrementally adding to my memory requirements would help if the issue is due to memory fragmentation, but it doesn't: I can't get past a total List<double> size of 256MB regardless of how I tweak blockSize.
private static IRandomGenerator rnd = new MersenneTwister();
private static IDistribution dist = new DiscreteNormalDistribution(1048576);
private static List<double> ndRandomNumbers = new List<double>();

private static void AddNDRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        ndRandomNumbers.Add(dist.ICDF(rnd.nextUniform()));
    }
}
From my main method:
int blockSize = 1000000;

while (true)
{
    try
    {
        AddNDRandomNumbers(blockSize);
    }
    catch (System.OutOfMemoryException)
    {
        break;
    }
}

double arrayTotalSizeInMegabytes = (ndRandomNumbers.Count * 8.0) / 1024.0 / 1024.0;
You may want to read this: "Out Of Memory" Does Not Refer to Physical Memory, by Eric Lippert.
In short, and very simplified, "Out of memory" does not really mean that the amount of available memory is too small. The most common reason is that within the current address space, there is no contiguous portion of memory that is large enough to serve the wanted allocation. If you have 100 blocks, each 4 MB large, that is not going to help you when you need one 5 MB block.
Key Points:
- The data storage that we call "process memory" is in my opinion best visualized as a massive file on disk.
- RAM can be seen as merely a performance optimization.
- The total amount of virtual memory your program consumes is really not hugely relevant to its performance.
- "Running out of RAM" seldom results in an "out of memory" error. Instead of an error, it results in bad performance, because the full cost of the fact that storage is actually on disk suddenly becomes relevant.
Check that you are building a 64-bit process, and not a 32-bit one, which is the default compilation mode of Visual Studio. To do this, right-click on your project, Properties -> Build -> Platform target: x64. Like any 32-bit process, applications compiled as 32-bit have a virtual address space limit of 2GB.
64-bit processes do not have this limitation, as they use 64-bit pointers, so their theoretical maximum address space (the size of their virtual memory) is 16 exabytes (2^64). In reality, Windows x64 limits the virtual memory of a process to 8TB. The solution to the memory limit problem is therefore to compile in 64-bit.
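As a quick sanity check, a sketch using the standard Environment properties (available since .NET 4) confirms at runtime what the process is actually running as:

using System;

class BitnessCheck
{
    static void Main()
    {
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
        Console.WriteLine("Pointer size:   " + IntPtr.Size + " bytes"); // 8 on x64, 4 on x86
    }
}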
However, an object's size in .NET is still limited to 2GB by default. You will be able to create several arrays whose combined size is greater than 2GB, but you cannot by default create an array bigger than 2GB. Fortunately, if you still want to create arrays bigger than 2GB, you can do so by adding the following code to your app.config file:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
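With that flag set (it only takes effect in a 64-bit process), an allocation like the following sketch becomes possible. Note that even then the element count of a single dimension stays capped near 2^31, so the gain is in total bytes, not in Length:

using System;

class VeryLargeArray
{
    static void Main()
    {
        // Requires gcAllowVeryLargeObjects and a 64-bit process:
        // 300 million doubles is roughly 2.24GB, past the default 2GB object limit.
        double[] big = new double[300000000];
        Console.WriteLine((big.LongLength * sizeof(double)) / 1024.0 / 1024.0 + " MB");
    }
}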
You don't have a contiguous block of memory in which to allocate 762MB; your memory is fragmented and the allocator cannot find a big enough hole to serve the needed allocation.
- You can try working with the /3GB boot option (as others have suggested).
- Or switch to a 64-bit OS.
- Or modify the algorithm so it does not need one big chunk of memory, perhaps allocating a few smaller (relatively) chunks instead.
As you probably figured out, the issue is that you are trying to allocate one large contiguous block of memory, which does not work due to memory fragmentation. If I needed to do what you are doing I would do the following:
int sizeA = 10000,
    sizeB = 10000;
double sizeInMegabytes = (sizeA * sizeB * 8.0) / 1024.0 / 1024.0; // 762 MB
double[][] randomNumbers = new double[sizeA][];
for (int i = 0; i < randomNumbers.Length; i++)
{
    randomNumbers[i] = new double[sizeB];
}
Then, to get a particular index you would use randomNumbers[i / sizeB][i % sizeB].
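If you'd rather not sprinkle that index arithmetic through the calling code, a small wrapper can hide it behind an indexer. This ChunkedDoubleArray is a hypothetical helper of my own, a minimal sketch rather than part of the suggestion above:

// Exposes a jagged array of fixed-size blocks as one flat, indexable store,
// so no single allocation ever exceeds blockSize doubles.
class ChunkedDoubleArray
{
    private readonly double[][] blocks;
    private readonly int blockSize;

    public ChunkedDoubleArray(long length, int blockSize)
    {
        this.blockSize = blockSize;
        int blockCount = (int)((length + blockSize - 1) / blockSize);
        blocks = new double[blockCount][];
        for (int i = 0; i < blockCount; i++)
            blocks[i] = new double[blockSize];
    }

    public double this[long index]
    {
        get { return blocks[index / blockSize][index % blockSize]; }
        set { blocks[index / blockSize][index % blockSize] = value; }
    }
}

Usage would then be, for example, var numbers = new ChunkedDoubleArray(100000000, 1000000); followed by numbers[99999999] = 0.5;.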
Another option, if you always access the values in order, might be to use the overloaded Random constructor to specify the seed. This way you would get a semi-random seed (like DateTime.Now.Ticks), store it in a variable, and then whenever you start going through the list you would create a new Random instance using the original seed:
private static int randSeed = (int)DateTime.Now.Ticks; // Must stay the same unless you want different random numbers.
private static Random GetNewRandomIterator()
{
    return new Random(randSeed);
}
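Usage would then look something like this sketch: two iterators built from the same stored seed replay an identical sequence, so the full list of numbers never has to be held in memory at all:

Random first = GetNewRandomIterator();
Random second = GetNewRandomIterator();
Console.WriteLine(first.NextDouble() == second.NextDouble()); // True: same seed, same sequence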
It is important to note that while the blog linked in Fredrik Mörk's answer indicates that the issue is usually due to a lack of address space, it does not list a number of other issues, like the 2GB CLR object size limitation (mentioned in a comment from ShuggyCoUk on the same blog), glosses over memory fragmentation, and fails to mention the impact of page file size (and how it can be addressed with the use of the CreateFileMapping function).
The 2GB limitation means that randomNumbers must be less than 2GB. Since arrays are classes and have some overhead themselves, this means an array of double will need to be smaller than 2^31 bytes. I am not sure how much smaller than 2^31 the Length would have to be, but "Overhead of a .NET array?" indicates 12-16 bytes.
Memory fragmentation is very similar to HDD fragmentation. You might have 2GB of address space, but as you create and destroy objects there will be gaps between the values. If these gaps are too small for your large object, and additional space cannot be requested, then you will get the System.OutOfMemoryException. For example, if you create 2 million 1024-byte objects, then you are using 1.9GB. If you then delete every object whose address is not a multiple of 3, you will be using 0.6GB of memory, but it will be spread out across the address space with 2048-byte open blocks in between. If you need to create an object that is 0.2GB, you would not be able to do it, because there is no block large enough to fit it and additional space cannot be obtained (assuming a 32-bit environment).

Possible solutions to this issue are things like using smaller objects, reducing the amount of data you store in memory, or using a memory management algorithm to limit/prevent memory fragmentation. It should be noted that unless you are developing a large program which uses a large amount of memory, this will not be an issue. Also, this issue can arise on 64-bit systems, as Windows is limited mostly by the page file size and the amount of RAM on the system.
Since most programs request working memory from the OS and do not request a file mapping, they will be limited by the system's RAM and page file size. As noted in the comment by Néstor Sánchez on the blog, with managed code like C# you are stuck with the RAM/page file limitation and the address space of the operating system.
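For completeness, the managed wrapper over CreateFileMapping is System.IO.MemoryMappedFiles, available since .NET 4. A minimal sketch, assuming a local file named numbers.bin, that backs the data with a file mapping instead of working memory:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedStore
{
    static void Main()
    {
        long count = 100000000; // 100 million doubles, ~762MB backed by the file
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   "numbers.bin", FileMode.Create, "numbers", count * sizeof(double)))
        using (var accessor = mmf.CreateViewAccessor())
        {
            accessor.Write(0, 3.14159);                // store a double at offset 0
            Console.WriteLine(accessor.ReadDouble(0)); // read it back
        }
    }
}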
That was way longer than expected. Hopefully it helps someone. I posted it because I ran into the System.OutOfMemoryException running an x64 program on a system with 24GB of RAM, even though my array was only holding 2GB of stuff.
I'd advise against the /3GB Windows boot option. Apart from everything else (it's overkill to do this for one badly behaved application, and it probably won't solve your problem anyway), it can cause a lot of instability.
Many Windows drivers are not tested with this option, so quite a few of them assume that user-mode pointers always point to the lower 2GB of the address space. Which means they may break horribly with /3GB.
In any case, even without /3GB, Windows normally limits a 32-bit process to a 2GB address space. But that doesn't mean you should expect to be able to allocate 2GB!
The address space is already littered with all sorts of allocated data. There's the stack, and all the assemblies that are loaded, static variables and so on. There's no guarantee that there will be 800MB of contiguous unallocated memory anywhere.
Allocating two 400MB chunks would probably fare better. Or four 200MB chunks. Smaller allocations are much easier to find room for in a fragmented memory space.
Either way, if you're going to deploy this to a 12GB machine, you'll want to run it as a 64-bit application, which should solve all the problems.