How many chars can be in a char array?

See this response by Jack Klein (see original post):

The original C standard (ANSI 1989/ISO 1990) required that a compiler successfully translate at least one program containing at least one example of a set of environmental limits. One of those limits was being able to create an object of at least 32,767 bytes.

This minimum limit was raised in the 1999 update to the C standard to be at least 65,535 bytes.

No C implementation is required to support objects larger than that, which means, for example, that it need not accept an array of int with more than 65535 / sizeof(int) elements.
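For illustration (assuming a C99 compiler; the names are placeholders), the following file-scope declarations stay within that guaranteed minimum, while anything larger is at the implementation's discretion:

/* 65,535 bytes is the only object size C99 guarantees every implementation accepts */
static char bytes[65535];                 /* within the guaranteed minimum            */
static int  values[65535 / sizeof(int)];  /* the same limit expressed in int elements */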

In very practical terms, on modern computers, it is not possible to say in advance how large an array can be created. It can depend on things like the amount of physical memory installed in the computer, the amount of virtual memory provided by the OS, and the number of other tasks, drivers, and programs already running and how much memory they are using. So your program may be able to use more or less memory today than it could yesterday or than it will be able to tomorrow.

Many platforms place their strictest limits on automatic objects, that is, those defined inside a function without the 'static' keyword. On some platforms you can create larger arrays if they are static or dynamically allocated.
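A minimal sketch of those three placements (the 8 MB figure is an arbitrary size chosen because it exceeds many default stack limits):

#include <stdlib.h>

#define BIG (8 * 1024 * 1024)    /* hypothetical 8 MB, larger than many default stacks */

static char static_buf[BIG];     /* static storage: reserved when the program loads */

void demo(void)
{
    /* char auto_buf[BIG]; */       /* automatic storage: likely to overflow the stack */
    char *heap_buf = malloc(BIG);   /* dynamic storage: failure is reported via NULL   */
    if (heap_buf != NULL)
        free(heap_buf);
}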

Now, to provide a slightly more tailored answer, DO NOT DECLARE HUGE ARRAYS TO AVOID BUFFER OVERFLOWS. That's close to the worst practice one can think of in C. Rather, spend some time writing good code, and carefully make sure that no buffer overflow will occur. Also, if you do not know the size of your array in advance, look at malloc, it might come in handy :P
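A tiny sketch of that last point, where the required size is only known at run time (the function and names are made up for illustration):

#include <stdlib.h>
#include <string.h>

char *copy_name(const char *name)
{
    size_t len = strlen(name) + 1;   /* size known only at run time          */
    char *buf = malloc(len);         /* exactly as much room as needed       */
    if (buf != NULL)
        memcpy(buf, name, len);      /* no oversized "just in case" array    */
    return buf;                      /* caller is responsible for free()     */
}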


It depends on where char string[HUGE_NUMBER]; is placed.

  • Is it inside a function? Then the array will be on the stack, and whether (and how quickly) your OS can grow the stack depends on the OS. So here is the general rule: don't place huge arrays on the stack.

  • Is it outside a function? Then it is global (process memory). If the OS cannot allocate that much memory when it tries to load your program, your program will crash before it starts, with no chance to notice or handle the failure (so the following is better):

  • Large arrays should be malloc'ed. malloc returns a null pointer if the allocation fails; depending on the OS and its paging and memory-mapping scheme, this happens either when 1) there is no contiguous region of free address space large enough for the array, or 2) the OS cannot map enough free physical memory into what appears to your process as contiguous memory.

So, with large arrays do this:

char *largeArray = malloc(HUGE_NUMBER);
if (largeArray == NULL) {
    fprintf(stderr, "could not allocate buffer\n");   /* error recovery: report to the user and bail out */
}

Declaring arbitrarily huge arrays to avoid buffer overflows is bad practice. If you really don't know in advance how large a buffer needs to be, use malloc or realloc to dynamically allocate and extend the buffer as necessary, possibly using a smaller, fixed-size buffer as an intermediary.

Example:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE 1024  // 1K intermediate buffer; you can make this larger or smaller

/**
 * Read up to the next newline character from the specified stream.
 * Dynamically allocate and extend a buffer as necessary to hold
 * the line contents.
 *
 * The length of the collected line (not counting the terminating
 * null character) is written to bufferSize.
 *
 * Returns NULL if the buffer cannot be allocated or extended, or if
 * nothing could be read from the stream.
 */
char *getNextLine(FILE *stream, size_t *bufferSize)
{
    char input[PAGE_SIZE];  // fixed-size intermediate buffer on the stack
    int done = 0;
    char *targetBuffer = NULL;
    *bufferSize = 0;

    while (!done)
    {
        if (fgets(input, sizeof input, stream) != NULL)
        {
            char *tmp;
            char *newline = strchr(input, '\n');
            if (newline != NULL)
            {
                done = 1;
                *newline = 0;
            }
            // +1 leaves room for the terminating null character
            tmp = realloc(targetBuffer, *bufferSize + strlen(input) + 1);
            if (tmp)
            {
                if (targetBuffer == NULL)
                    tmp[0] = 0;  // first chunk: start from an empty string
                targetBuffer = tmp;
                *bufferSize += strlen(input);
                strcat(targetBuffer, input);
            }
            else
            {
                free(targetBuffer);
                targetBuffer = NULL;
                *bufferSize = 0;
                fprintf(stderr, "Unable to allocate or extend input buffer\n");
                done = 1;
            }
        }
        else
        {
            done = 1;  // EOF or read error; return whatever was collected so far
        }
    }

    return targetBuffer;
}

If the array is going to be allocated on the stack, then you are limited by the stack size (typically 1 MB on Windows, and some of it will already be in use, so you have even less). Otherwise I imagine the limit would be quite large.

However, making the array really big is not a solution to buffer overflow issues. Don't do it. Use functions that take the buffer size as a parameter, so you can never overstep the buffer, and make the size something more reasonable (1 KB, for example).
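For example (a rough sketch; the 1 KB sizes are arbitrary), fgets() and snprintf() both take the destination size as an argument, so they cannot write past the end of the buffer:

#include <stdio.h>

int main(void)
{
    char line[1024];   /* 1K fixed-size buffer */

    /* fgets() never writes more than sizeof line bytes, including the '\0' */
    if (fgets(line, sizeof line, stdin) != NULL)
    {
        char msg[1100];
        /* snprintf() likewise truncates rather than overflowing */
        snprintf(msg, sizeof msg, "you typed: %s", line);
        fputs(msg, stdout);
    }
    return 0;
}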


You can use malloc() to get a larger block of memory than an ordinary array declaration could normally handle.
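A small sketch of that (the 100 MB figure is arbitrary), allocating a block far larger than would be sensible as a stack array:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t size = 100u * 1024 * 1024;   /* arbitrary 100 MB, far beyond a typical stack */
    char *big = malloc(size);

    if (big == NULL)
    {
        fprintf(stderr, "malloc of %zu bytes failed\n", size);
        return EXIT_FAILURE;
    }

    /* ... use big[0] through big[size - 1] ... */

    free(big);
    return 0;
}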