What is the maximum size of an array in C?

There is no fixed limit to the size of an array in C.

The size of any single object, including any array object, is limited by SIZE_MAX, the maximum value of type size_t, which is the type of the result of the sizeof operator. (It's not entirely clear whether the C standard permits objects larger than SIZE_MAX bytes, but in practice such objects are not supported; see the footnote below.) Since SIZE_MAX is determined by the implementation and cannot be modified by any program, it imposes an upper bound of SIZE_MAX bytes on any single object. (That's an upper bound, not a least upper bound; implementations may, and typically do, impose smaller limits.)
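
To make the bound concrete: malloc takes its request as a size_t, so a single allocation larger than SIZE_MAX bytes cannot even be expressed. A minimal sketch (the request size here is arbitrary, and on any realistic system it will simply fail):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* sizeof yields a size_t, whose maximum value is SIZE_MAX. */
    printf("SIZE_MAX = %zu\n", (size_t)SIZE_MAX);

    /* malloc's parameter is a size_t, so no single allocation request
       can exceed SIZE_MAX bytes, no matter how much memory exists.
       In practice a request this large returns NULL. */
    void *p = malloc(SIZE_MAX / 2);
    printf("malloc(SIZE_MAX / 2) returned %s\n", p ? "non-NULL" : "NULL");
    free(p);
    return 0;
}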

The width of the type void*, a generic pointer type, imposes an upper bound on the total size of all objects in an executing program (which may be larger than the maximum size of a single object).

The C standard imposes lower bounds, but not upper bounds, on these fixed sizes. No conforming C implementation can support infinite-sized objects, but it can in principle support objects of any finite size. Upper bounds are imposed by individual C implementations, by the environments in which they operate, and by physics, not by the language.

For example, a conforming implementation could have SIZE_MAX equal to 2^1024 - 1, which means it could in principle have objects up to 179769313486231590772930519078902473361797697894230657273430081157732675805500963132708477322407536021120113879871393357658789768814416622492847430639474124377767893424865485276302219601246094119453082952085005768838150682342462881473913110540827237163350510684586298239947245938479716304835356329624224137215 bytes.

Good luck finding hardware that actually supports such objects.

Footnote: There is no explicit rule that no object can be bigger than SIZE_MAX bytes. You couldn't usefully apply the sizeof operator to such an object, but like any other operator, sizeof can overflow; that doesn't mean you couldn't perform operations on such an object. But in practice, any sane implementation will make size_t big enough to represent the size of any object it supports.


Without regard for memory, the maximum size of an array is limited by the type of integer used to index the array.
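
This is why size_t is the idiomatic index type: it can by definition index any object the implementation supports, whereas a narrower type such as int tops out at INT_MAX elements. A small sketch (fill_bytes is a made-up helper for illustration):

#include <stddef.h>

/* size_t can index any object the implementation supports; indexing
   with int would cap the reachable elements at INT_MAX. */
void fill_bytes(unsigned char *a, size_t n) {
    for (size_t i = 0; i < n; i++)
        a[i] = 0;
}

int main(void) {
    unsigned char buf[64];
    fill_bytes(buf, sizeof buf);  /* sizeof yields a size_t */
    return 0;
}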


A 64-bit machine could theoretically address a maximum of 2^64 bytes of memory.


C99 5.2.4.1 "Translation limits" minimum size

The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits: 13)

  • 65535 bytes in an object (in a hosted environment only)
  13) Implementations should avoid imposing fixed translation limits whenever possible.

This suggests that a conforming implementation could refuse to compile an object (which includes arrays) larger than 65535 bytes.
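
For illustration, this is the one object size that every conforming hosted implementation is guaranteed to handle (the array name is made up):

/* C99 5.2.4.1 only guarantees that some program containing a
   65535-byte object is translated and executed; anything larger is
   entirely up to the implementation. */
static unsigned char guaranteed[65535];

int main(void) {
    guaranteed[65534] = 1;  /* the last byte the guarantee covers */
    return 0;
}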

PTRDIFF_MAX also imposes limits on array size

The C99 standard 6.5.6 Additive operators says:

9 When two pointers are subtracted, both shall point to elements of the same array object, or one past the last element of the array object; the result is the difference of the subscripts of the two array elements. The size of the result is implementation-defined, and its type (a signed integer type) is ptrdiff_t defined in the <stddef.h> header. If the result is not representable in an object of that type, the behavior is undefined.

This implies to me that arrays larger than PTRDIFF_MAX bytes are allowed in theory, but then you cannot portably take the difference of pointers into them.

So perhaps for this reason, GCC just seems to limit you to PTRDIFF_MAX. This is also mentioned at: Why is the maximum size of an array "too large"?
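
To make 6.5.6p9 concrete, here is a sketch of well-defined pointer subtraction; the array is deliberately small so the difference is certainly representable in ptrdiff_t:

#include <stddef.h>
#include <stdio.h>

int main(void) {
    static char a[1u << 20];
    char *first = &a[0];
    char *last  = &a[sizeof a];  /* one past the end: allowed */

    /* Pointer subtraction yields a ptrdiff_t (C99 6.5.6p9). If the
       true difference were not representable in ptrdiff_t, the
       behavior would be undefined, which is why compilers cap
       object sizes at PTRDIFF_MAX. */
    ptrdiff_t d = last - first;
    printf("%td\n", d);  /* prints 1048576 */
    return 0;
}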

Experiments

Maybe what ultimately matters is whatever your compiler will accept, so here we go:

main.c

#include <stdint.h>

/* TYPE and NELEMS are defined on the compiler command line via -D. */
TYPE a[(NELEMS)];

int main(void) {
    return 0;
}

sizes.c

#include <stdint.h>
#include <stdio.h>

int main(void) {
    printf("PTRDIFF_MAX 0x%jx\n", (uintmax_t)PTRDIFF_MAX);
    printf("SIZE_MAX    0x%jx\n", (uintmax_t)SIZE_MAX);
    return 0;
}

And then we try to compile with:

gcc -ggdb3 -O0 -std=c99 -Wall -Wextra -pedantic -o sizes.out sizes.c
./sizes.out
gcc -ggdb3 -O0 -std=c99 -Wall -Wextra -pedantic -o main.out \
  -DNELEMS='((2lu << 62) - 1)' -DTYPE=uint8_t main.c 

Results:

  • PTRDIFF_MAX: 0x7fffffffffffffff = 2^63 - 1

  • SIZE_MAX: 0xffffffffffffffff = 2^64 - 1

  • -DNELEMS='((2lu << 62) - 1)' -DTYPE=uint8_t: compiles (== 2^63 - 1). Running it segfaults immediately on my mere 32 GB RAM system :-)

  • -DNELEMS='(2lu << 62)' -DTYPE=uint8_t: compilation fails with:

    error: size of array ‘a’ is too large
    
  • -DNELEMS='((2lu << 62) - 1)' -DTYPE=uint16_t: compilation fails with:

    error: size ‘18446744073709551614’ of array ‘a’ exceeds maximum object size ‘9223372036854775807’
    

    where 9223372036854775807 == 0x7fffffffffffffff

So from this we understand that GCC imposes two limits, with different error messages:

  • the number of elements cannot exceed 2^63 - 1 (which equals PTRDIFF_MAX)
  • the array size in bytes cannot exceed 2^63 - 1 (also PTRDIFF_MAX)

Tested on Ubuntu 20.04 amd64, GCC 9.3.0.
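
If you want to enforce the limit yourself instead of relying on the compiler's error message, a static assertion works. A sketch using C11's _Static_assert (under plain C99 you would need a workaround such as the negative-array-size trick):

#include <stdint.h>

#define NELEMS (1lu << 20)

/* Guarantee at compile time that the array's byte size fits in
   ptrdiff_t, so pointer subtraction on it is always well defined. */
_Static_assert(NELEMS * sizeof(uint8_t) <= PTRDIFF_MAX,
               "array too large for well-defined pointer arithmetic");

uint8_t a[NELEMS];

int main(void) {
    return 0;
}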

See also

  • Are there any size limitations for C structures?
  • What is the correct type for array indexes in C?
