Memory-mapping huge files in Java

Take a look at the code in Using a memory mapped file for a huge matrix, which shows how to create a list of MappedByteBuffer objects, each smaller than 2 GB, that together map the entire file:

private static final int MAPPING_SIZE = 1 << 30;  // 1 GiB per mapping, safely below the 2 GB limit
...
// One double (8 bytes) per matrix cell
long size = 8L * width * height;
List<MappedByteBuffer> mappings = new ArrayList<>();
for (long offset = 0; offset < size; offset += MAPPING_SIZE) {
    // The last chunk may be smaller than MAPPING_SIZE
    long size2 = Math.min(size - offset, MAPPING_SIZE);
    mappings.add(raf.getChannel().map(FileChannel.MapMode.READ_WRITE, offset, size2));
}
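
Individual cells are then addressed by picking the buffer that holds the byte position, then the offset within that buffer, which is the pattern the linked answer uses. A minimal sketch, assuming the mappings list, width and MAPPING_SIZE from the snippet above; note that because MAPPING_SIZE is a multiple of 8, an 8-byte double never straddles two mappings:

double get(long x, long y) {
    long p = (x + y * width) * 8;         // absolute byte position in the file
    int mapN = (int) (p / MAPPING_SIZE);  // which MappedByteBuffer holds it
    int offN = (int) (p % MAPPING_SIZE);  // offset within that buffer
    return mappings.get(mapN).getDouble(offN);
}

void set(long x, long y, double d) {
    long p = (x + y * width) * 8;
    mappings.get((int) (p / MAPPING_SIZE)).putDouble((int) (p % MAPPING_SIZE), d);
}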

As per JDK-6347833, "(fs) Enhance MappedByteBuffer to support sizes >2GB on 64 bit platforms", the reason for the 2 GB limit is:

A MappedByteBuffer is a ByteBuffer with additional operations to support memory-mapped file regions. To support mapping a region larger than Integer.MAX_VALUE would require a parallel hierarchy of classes. For now the only solution is to create multiple MappedByteBuffers where each corresponds to a region that is no larger than 2GB.

As mentioned, MappedByteBuffer is capped at 2 GB because ByteBuffer uses int index and position values, which cannot address more than Integer.MAX_VALUE bytes.
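
The limit is enforced when you create the mapping: FileChannel.map throws an IllegalArgumentException ("Size exceeds Integer.MAX_VALUE") if the requested size is larger than that. A minimal illustration, with a placeholder file name:

try (RandomAccessFile raf = new RandomAccessFile("huge.bin", "rw");
     FileChannel channel = raf.getChannel()) {
    // Requesting a single 3 GiB mapping fails with IllegalArgumentException
    channel.map(FileChannel.MapMode.READ_WRITE, 0, 3L << 30);
}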

To get around that, you could use an alternative implementation such as larray, which addresses mapped memory with long offsets and so is not restricted to 2 GB per buffer.
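
A rough sketch of what that looks like, based on the MMapBuffer API shown in larray's README (treat the exact class names and signatures as an assumption and check the project before relying on them):

import java.io.File;
import xerial.larray.mmap.MMapBuffer;
import xerial.larray.mmap.MMapMode;

// Map a 100 GiB file as a single buffer addressed with long offsets
MMapBuffer m = new MMapBuffer(new File("huge.bin"), 0, 100L << 30, MMapMode.READ_WRITE);
m.putLong(5L << 30, 42L);  // offsets beyond 2 GB work directly
long v = m.getLong(5L << 30);
m.flush();                 // persist changes to disk
m.close();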