C programming, why does this large array declaration produce a segmentation fault?

Well, for one thing, that's two billion integers. If you have a 32-bit address space and int is four bytes on your platform (typical for a 32-bit platform), you can't store that many integers, period: 2000000000 * 4 bytes is roughly 8 GB, twice the entire 4 GB address space.

Even if it did fit, you only have so much space available to you on the stack, which is where automatic variables live.

If you need a really large array, you should dynamically allocate it using malloc() (and if you do so, be sure to release it with free() when you are done with it!).
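A minimal sketch of the heap approach (the size and names are taken from the question; this assumes a 64-bit platform, since on a 32-bit one the multiplication below would overflow size_t and the request could never succeed anyway):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t nums_size = 2000000000;   /* two billion elements */

        /* Allocate on the heap instead of the stack, and always check
           the result: a request this large can easily be refused. */
        int *nums = malloc(nums_size * sizeof *nums);
        if (nums == NULL) {
            perror("malloc");
            return EXIT_FAILURE;
        }

        nums[0] = 42;   /* use it like any other array */

        free(nums);     /* give the memory back when done */
        return 0;
    }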


    int nums_size = 2000000000;
    int nums[nums_size];

does not declare 2000000000 bytes of ints; it declares 2000000000 elements of type int, which with four-byte ints comes to almost 8 GB of memory. On a 32-bit platform that is simply impossible.
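To make the elements-versus-bytes distinction concrete, a small sketch (run on a 64-bit host so the multiplication doesn't overflow):

    #include <stdio.h>

    int main(void)
    {
        /* An array declaration counts elements, not bytes: the total
           footprint is element_count * sizeof(element). */
        size_t count = 2000000000;
        printf("%zu elements * %zu bytes each = %zu bytes total\n",
               count, sizeof(int), count * sizeof(int));
        return 0;
    }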


You are allocating a giant array on the stack. The compiler will accept it, but virtually no environment gives a thread anywhere near that much stack space, so the program crashes as soon as it runs.

You might be able to get away with moving it into the globals (which gives it static storage duration: a zero-initialized global lives in the .bss segment and is reserved when the program is loaded, not carved out of the stack), or by switching to a malloc'd array.

Of course, that's still a LOT of memory to ask for at one go, but at least the methods I'm mentioning will avoid an immediate segfault.
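A sketch of the static-storage variant (the name nums is arbitrary; whether the OS will actually map a single ~8 GB object is a separate question, so treat this as illustrative):

    #include <stdio.h>

    /* File scope: static storage duration. The array lives in .bss
       rather than on the stack and is zero-initialized at load time. */
    static int nums[2000000000];

    int main(void)
    {
        nums[0] = 42;
        printf("%d\n", nums[0]);
        return 0;
    }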


Local variables are allocated on the stack. There is a fixed amount of stack space (typically 1MB–8MB, varying with the OS) provided to the application. The general rule is to use malloc() to allocate large amounts of data.
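On POSIX systems you can inspect that limit with getrlimit() (a hedged sketch; RLIMIT_STACK is standard POSIX, but the numbers you get back vary by OS and shell configuration):

    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        /* Query the current (soft) and maximum (hard) stack-size
           limits for this process. */
        if (getrlimit(RLIMIT_STACK, &rl) == 0) {
            printf("soft stack limit: %llu bytes\n",
                   (unsigned long long)rl.rlim_cur);
            printf("hard stack limit: %llu bytes\n",
                   (unsigned long long)rl.rlim_max);
        }
        return 0;
    }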


The answer to your question is simple: stackoverflow. No, no, not the site, but the actual process of "overflowing the stack". You don't have enough stack to store that array. As simple as that. Doing this on memory-constrained systems is pure madness. Also see this question.