How to fix "Requested array size exceeds VM limit" error in Java?

You get this error because you are trying to create an array whose length exceeds the limit your Java VM imposes on a single array, which sits just below Integer.MAX_VALUE on HotSpot.

https://plumbr.eu/outofmemoryerror/requested-array-size-exceeds-vm-limit
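For illustration, a minimal snippet that triggers the error on HotSpot (where the per-type array limit is a few elements below Integer.MAX_VALUE):

public class ArrayLimitDemo {
    public static void main(String[] args) {
        // On HotSpot the maximum array length is slightly below
        // Integer.MAX_VALUE, so this throws
        // "java.lang.OutOfMemoryError: Requested array size exceeds VM limit"
        // no matter how much heap you give the JVM.
        int[] tooBig = new int[Integer.MAX_VALUE];
        System.out.println(tooBig.length); // never reached
    }
}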

What is the solution?

The java.lang.OutOfMemoryError: Requested array size exceeds VM limit can appear as a result of either of the following situations:

Your arrays grow too big and end up with a size between the platform limit and Integer.MAX_VALUE

You deliberately try to allocate arrays larger than 2^31-1 elements to experiment with the limits.

In the first case, check your code base to see whether you really need arrays that large. Maybe you could reduce the size of the arrays and be done with it. Or divide the array into smaller chunks and load the data you need to work with in batches that fit within your platform limit.
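If batching fits your problem, one pattern is to back a logically large array with fixed-size chunks, so that no single allocation approaches the VM limit. A minimal sketch (the class name and chunk size are my own illustration, not from the original answer):

public class ChunkedLongArray {
    // 2^27 elements per chunk (1 GB of longs) -- an arbitrary size chosen
    // to stay far below the per-array limit.
    private static final int CHUNK_SIZE = 1 << 27;

    private final long[][] chunks;
    private final long length;

    public ChunkedLongArray(long length) {
        this.length = length;
        int chunkCount = (int) ((length + CHUNK_SIZE - 1) / CHUNK_SIZE);
        chunks = new long[chunkCount][];
        long remaining = length;
        for (int i = 0; i < chunkCount; i++) {
            // The last chunk may be shorter than CHUNK_SIZE.
            chunks[i] = new long[(int) Math.min(CHUNK_SIZE, remaining)];
            remaining -= CHUNK_SIZE;
        }
    }

    public long get(long index) {
        return chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)];
    }

    public void set(long index, long value) {
        chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)] = value;
    }

    public long length() {
        return length;
    }
}

Because the index is a long, the logical array can exceed 2^31-1 elements while each underlying long[] stays comfortably small.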

In the second case, remember that Java arrays are indexed by int, so you cannot go beyond 2^31-1 elements with the platform's standard data structures. In fact, if you write such a size as a literal, the compiler already blocks you with “error: integer number too large” during compilation. But if you really do work with truly large data sets, you need to rethink your options: you can load the data in smaller batches and still use standard Java tools, or you can go beyond the standard utilities. One way to do that is the sun.misc.Unsafe class, which lets you allocate memory directly, like you would in C.
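For illustration, a sketch of the sun.misc.Unsafe route. Obtaining the instance via reflection is the usual workaround, since Unsafe.getUnsafe() refuses application code; keep in mind this API is unsupported and can change between JDK releases:

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class OffHeapSketch {
    public static void main(String[] args) throws Exception {
        // Unsafe.getUnsafe() throws SecurityException for application code,
        // so grab the singleton field by reflection instead.
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);

        long size = 3_000_000_000L;              // ~3 GB -- beyond any int[] limit
        long base = unsafe.allocateMemory(size); // malloc-style, off-heap
        try {
            long i = 2_500_000_000L;             // an index no Java array can reach
            unsafe.putByte(base + i, (byte) 42); // conceptually "array[i] = 42"
            System.out.println(unsafe.getByte(base + i));
        } finally {
            unsafe.freeMemory(base);             // off-heap memory is not GC-managed
        }
    }
}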


If you want to find out what is causing the OutOfMemoryError, you can add

-XX:+HeapDumpOnOutOfMemoryError

to your Java options.

The next time you run out of memory, the JVM will write a heap dump file that you can analyze with jhat, which ships in the JDK's bin directory. jhat will show you what objects exist in your heap and how much memory they consume.
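For example (the application name, dump path, and PID in the file name are placeholders):

java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp MyApp

jhat /tmp/java_pid12345.hprof

By default jhat serves the report as a browsable page on http://localhost:7000.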


I suspect you might be sorting on a large index. That's one thing I know for sure can require a large array with Lucene. Either way, you might want to try using a 64-bit JVM with these options:

-Xmx6G -XX:MaxPermSize=128M -XX:+UseCompressedOops

The last option reduces 64-bit memory pointers to 32-bit (as long as the heap is under 32GB). This typically reduces the memory overhead by about 40%, so it can help stretch your memory significantly.

Update: Most likely you don't need such a large permanent generation size, certainly not 1G. You're probably fine with 128M, and with Java 6 you'll get a specific error if you exceed it. Since you're limited to 8G on your server, you might be able to get away with 7G for the heap with a smaller perm gen. Be careful not to dip into swap; that can seriously slow things down for Java.

I noticed you didn't mention -XX:+UseCompressedOops in your update. That can make a huge difference if you haven't tried it yet. You might be able to squeeze out a little more space by reducing the size of eden to give the tenured generation more room. Beyond that, I think you'll simply need more memory or fewer sort fields.
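Putting that together, the launch flags might look something like this (the heap and young-generation sizes are illustrative, and -Xmn shrinks the whole young generation, which includes eden; -XX:MaxPermSize only applies before Java 8):

java -Xmx7G -Xmn512M -XX:MaxPermSize=128M -XX:+UseCompressedOops MyApp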