Segmentation fault on large array sizes
The following code gives me a segmentation fault when run on a machine with 2 GB of RAM, but works on a machine with 4 GB.
#include <iostream>
using namespace std;

int main()
{
    int c[1000000];
    cout << "done\n";
    return 0;
}
The size of the array is just 4 MB. Is there a limit on the size of an array that can be used in C++?
Solution 1:
You're probably just getting a stack overflow here. The array is too big to fit in your program's stack region; the stack size limit is typically 1 MiB to 8 MiB for user-space code on mainstream desktop and server OSes. (Normal C++ implementations use the hardware stack for automatic storage, i.e. non-static local variables, including arrays. That makes deallocating them free when a function returns or an exception propagates through it.)
If you dynamically allocate the array you should be fine, assuming your machine has enough memory.
int* array = new int[1000000]; // may throw std::bad_alloc
But remember that this requires you to delete[] the array manually to avoid a memory leak, even if your function exits via an exception. Manual new/delete is strongly discouraged in modern C++; prefer RAII.
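If you do want a raw heap allocation but not the manual cleanup, a minimal sketch (assuming C++14 for std::make_unique; not part of the original answer) would be:

#include <memory>

int main()
{
    // One million ints on the heap; freed automatically when the unique_ptr
    // goes out of scope, even if an exception is thrown.
    auto array = std::make_unique<int[]>(1000000); // elements value-initialized to zero
    array[0] = 42;
    return 0;
}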
A better solution is to use std::vector<int> array (cppreference). You can reserve space for 1000000 elements if you know how large it will grow, or resize it to default-construct the elements (which zero-initializes the memory, unlike declaring a plain C-style array with no initializer), e.g. std::vector<int> array(1000000).
When the std::vector
object goes out of scope, its destructor will deallocate the storage for you, even if that happens via an exception in a child function that's caught by a parent function.
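A minimal sketch of the vector version of the original program (the element count is the one from the question):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> array(1000000); // storage lives on the heap, elements zero-initialized
    array[0] = 1;
    std::cout << "done\n";
    return 0;
} // array's destructor releases the memory here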
Solution 2:
In C or C++, local objects are usually allocated on the stack. You are allocating a large array on the stack, larger than the stack can handle, so you are getting a stack overflow.
Don't allocate it as a local on the stack; put it somewhere else instead. You can do this either by making the object global or by allocating it on the heap. Global variables are fine if you don't use them from any other compilation unit. To make sure that doesn't happen by accident, add the static storage-class specifier; otherwise just use the heap.
This will allocate the array in the BSS segment, which is part of the program's static data rather than the stack. Since it has static storage duration, it's zero-initialized if you don't specify otherwise, unlike local variables (automatic storage), including arrays.
#include <iostream>
using namespace std;

static int c[1000000];

int main()
{
    cout << "done\n";
    return 0;
}
A non-zero initializer will make the compiler allocate the array in the DATA segment, which is also static data rather than the stack. (All the data for the array initializer then takes space in the executable, including all the implicit trailing zeros, instead of just a size to zero-initialize in the BSS.)
#include <iostream>
using namespace std;

int c[1000000] = {1, 2, 3};

int main()
{
    cout << "done\n";
    return 0;
}
This will allocate at some unspecified location in the heap:
#include <iostream>
using namespace std;

int main()
{
    int* c = new int[1000000]; // size can be a runtime variable, unlike with static storage
    cout << "done\n";
    delete[] c; // dynamic storage needs manual freeing
    return 0;
}
Solution 3:
Also, on most UNIX and Linux systems you can temporarily increase the stack size limit with the following command:
ulimit -s unlimited
But be careful: memory is a limited resource, and with great power comes great responsibility :)
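If you'd rather do this from inside the program than from the shell, a rough sketch using the POSIX getrlimit/setrlimit API (Linux/UNIX only; this is an assumption added here, not part of the original answer) is:

#include <sys/resource.h>
#include <cstdio>

int main()
{
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0)
    {
        rl.rlim_cur = rl.rlim_max; // raise the soft limit up to the hard limit
        if (setrlimit(RLIMIT_STACK, &rl) != 0)
            std::perror("setrlimit");
    }
    // The raised limit reliably applies to processes started after this point;
    // whether the current main thread's stack can still grow further depends on the OS.
    return 0;
}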
Solution 4:
Your array is being allocated on the stack in this case. Try allocating an array of the same size dynamically instead (e.g. with malloc or new[]).
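A minimal sketch of that approach, assuming the intended function is malloc from <cstdlib> (the original answer only says "alloc"):

#include <cstdlib>
#include <iostream>

int main()
{
    int* c = static_cast<int*>(std::malloc(1000000 * sizeof(int))); // heap, not stack
    if (c == nullptr)
        return 1; // allocation failed
    c[0] = 1;
    std::cout << "done\n";
    std::free(c); // malloc'd memory must be freed manually
    return 0;
}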