Working with arrays in V8 (performance issue)

Seemingly Unlimited Arrays [2020]

In modern V8, arrays can have almost any size. You can use [] or new Array(len) in any way you like, even with random access.

In current Chrome (and presumably any V8 environment), arrays can have a length of up to 2^32 - 1.

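You can verify the cap directly in the console. A quick sketch (the exact RangeError message varies by engine):

const max = new Array(2 ** 32 - 1); // largest allowed length
console.log(max.length);            // 4294967295

try {
  new Array(2 ** 32);               // one past the cap
} catch (e) {
  console.log(e instanceof RangeError); // true: "Invalid array length"
}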

However, there are a few caveats:

Dictionary-mode Still Applies

As jmrk mentioned in the comments, arrays are not magical beings. Instead, smaller arrays (up to some threshold, apparently a few million elements now) are not sparse at all; they only appear to be sparse, and therefore use up the actual memory for all of their elements. Once the threshold has been reached, arrays fall back into dictionary mode.

They are easier to use now, but they internally still work the same as before.
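If you want to check which representation a particular array ended up with, V8 ships internal helpers behind a flag. A minimal sketch, assuming Node; %DebugPrint is an internal, version-dependent intrinsic, so treat its output as informational only:

// inspect.js — run with: node --allow-natives-syntax inspect.js
const small = new Array(1000);
const huge = new Array(100 * 1000 * 1000);

%DebugPrint(small); // output contains "elements kind", e.g. HOLEY_SMI_ELEMENTS
%DebugPrint(huge);  // large enough to show DICTIONARY_ELEMENTS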

You Need to Initialize an Empty Array

Plain for loops work as intended; however, Array's built-in higher-order functions (such as map, filter, find, some, and forEach) skip unassigned elements. They require fill (or some other method of population) first:

const a = new Array(10);
const b = new Array(10).fill(0);

a.forEach(x => console.log(x)); // does nothing
b.forEach(x => console.log(x)); // works as intended
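fill is not the only option; any technique that actually assigns every index works. Two standard alternatives:

// Array.from materializes every element, optionally via a mapping function:
const c = Array.from({ length: 10 }, (_, i) => i * 2);

// Spreading converts holes into real undefined values:
const d = [...new Array(10)];

c.forEach(x => console.log(x)); // works: 0, 2, 4, ...
d.forEach(x => console.log(x)); // works: undefined, ten times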

Judging from the Array constructor code, the SetLengthWouldNormalize function, and the kMaxFastArrayLength constant, V8 can now support an almost arbitrarily large number of elements (currently capped at 32 million) before resorting to dictionary mode.
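You can observe that cliff without any flags. A rough sketch — the timings and the exact threshold depend on your V8 version and machine, and 40 million is simply chosen to be past the cap mentioned above:

function writeFirstMillion(arr) {
  for (let i = 0; i < 1000 * 1000; i++) arr[i] = i;
}

const fast = new Array(30 * 1000 * 1000); // below the ~32M cap: holey but fast
const slow = new Array(40 * 1000 * 1000); // above the cap: dictionary mode

console.time('below cap');
writeFirstMillion(fast);
console.timeEnd('below cap');

console.time('above cap');
writeFirstMillion(slow); // each store is a hash-table insert
console.timeEnd('above cap');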

Note, however, that there are many more considerations at play now, as V8 optimization has become ever more complicated. This official blog post from 2017 explains that V8 distinguishes between 21 different kinds of arrays (or rather, array element kinds), and that, to quote:

"each of which comes with its own set of possible optimizations"

If "sparse arrays work, let's leave it at that!" is not good enough for you, I would recommend the following:

  • Start with that blog post.
  • Learn how to use the built-in node profiler tools.
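For the profiler, the short version is (app.js being a placeholder for your own script; both flags are documented Node.js options):

# Sample the script and write an isolate-*.log file:
node --prof app.js

# Post-process the log into a readable summary of where time was spent:
node --prof-process isolate-0x*.log > processed.txt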

Original Post

If you pre-allocate an array with more than 100,000 elements in Chrome or Node (or more generally, in V8), it falls back to dictionary mode, making things uber-slow.

Thanks to some of the comments in this thread, I was able to track things down to object.h's kInitialMaxFastElementArray.

I then used that information to file an issue in the v8 repository, which is now starting to gain some traction, but it will still take a while. And I quote:

I hope we'll be able to do this work eventually. But it's still probably a ways away.


If you preallocate, do not use .push, because you will create a sparse array backed by a hash table. You can preallocate sparse arrays of up to 99,999 elements and they will be backed by a C array; after that, it's a hash table.
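The question's code is not reproduced here, but the two patterns being compared look roughly like this (variable names are mine):

// First pattern: preallocate, then push. The 200000 preallocated slots
// are holes, and push appends after them, so the array stays sparse
// and is backed by a hash table:
var first = new Array(200000);
for (var i = 0; i < 200000; i++) first.push(i);

// Second pattern: start empty and add elements contiguously from 0,
// which keeps the fast, C-array-backed representation:
var second = [];
for (var i = 0; i < 200000; i++) second.push(i);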

With the second array, you are adding elements in a contiguous way starting from 0, so it will be backed by a real C array, not a hash table.

So roughly:

If your array indices go nicely from 0 to length - 1, with no holes, then it can be represented by a fast C array. If you have holes in your array, then it will be represented by a much slower hash table. The exception is that if you preallocate an array of fewer than 100,000 elements, then you can have holes in the array and still be backed by a C array:

var N = 99999; // anything below 100000

var a = new Array(N);

// Because a was preallocated with N < 100000, this will not
// make the array a hash table:
a[50000] = "sparse";

var b = []; // or new Array(N) with N >= 100000

// b will be backed by a hash table, since it was not preallocated
// below the threshold and index 50000 leaves holes at 0..49999:
b[50000] = "sparse";
// b.push("sparse") would be roughly the same as the line above
// if you had used new Array(N) with N > 0