How to handle huge data/images in RAM in Java?

You can read a specific portion of the image, then scale it down to a reduced resolution for display purposes.

So in your case you can read the image in chunks (read image portions just like we read data from a database, row by row).

For example:

// Define the portion / row size, e.g. 50px or 100px
int rowHeight = 50;
int rowsToScan = imageHeight / rowHeight;
if (imageHeight % rowHeight > 0) rowsToScan++;

int x = 0;
int y = 0;
int w = imageWidth;

ArrayList<BufferedImage> scaledImagePortions = new ArrayList<>();

for (int i = 1; i <= rowsToScan; i++) {
    // Clamp the last row so it doesn't run past the bottom edge
    int h = Math.min(rowHeight, imageHeight - y);

    // Read the portion of the image, scale it,
    // and push the scaled version into the list
    BufferedImage scaledPortionOfImage = this.getScaledPortionOfImage(img, x, y, w, h);
    scaledImagePortions.add(scaledPortionOfImage);

    y = rowHeight * i;
}

// Create single image out of scaled images portions

Thread which can help you to get a portion of an image: Read region from very large image file in Java

Thread which can help you to scale the image (my quick search result :) ): how to resize Image in java?

Thread which can help you in merging the buffered images: Merging two images

You can always tweak the snippets :)
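The `getScaledPortionOfImage` helper used in the snippet above is not a standard API, so here is one possible sketch of it. It assumes the whole source is already a `BufferedImage` (for truly huge files you'd combine it with region reading from the threads above), and it adds a hypothetical `scale` factor parameter; it crops with `getSubimage` and scales with `Graphics2D`:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class PortionScaler {
    // Hypothetical helper matching the snippet above: crops a region
    // from the source image and scales it down by the given factor.
    static BufferedImage getScaledPortionOfImage(BufferedImage img,
                                                 int x, int y, int w, int h,
                                                 double scale) {
        // Clamp the region so the last strip doesn't run past the image edge
        w = Math.min(w, img.getWidth() - x);
        h = Math.min(h, img.getHeight() - y);
        BufferedImage portion = img.getSubimage(x, y, w, h);

        int sw = Math.max(1, (int) (w * scale));
        int sh = Math.max(1, (int) (h * scale));
        BufferedImage scaled = new BufferedImage(sw, sh, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = scaled.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        // Draw the cropped portion into the smaller target image
        g.drawImage(portion, 0, 0, sw, sh, null);
        g.dispose();
        return scaled;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(400, 130, BufferedImage.TYPE_INT_RGB);
        BufferedImage strip = getScaledPortionOfImage(img, 0, 100, 400, 50, 0.25);
        System.out.println(strip.getWidth() + "x" + strip.getHeight());
    }
}
```

Note that `getSubimage` shares the parent's pixel buffer, so the crop itself allocates almost nothing; only the small scaled copy takes new memory.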


  1. OutOfMemoryError is self-explanatory - you are out of memory. That said, it is not the physical RAM on your machine, but rather the JVM hitting the upper memory allocation limit set by the -Xmx setting.
  2. Your -Xmx testing makes little sense, as you are trying to put a 3.8 GB image into a 512 MB memory block. It cannot work - you cannot put 10 liters of water into a 5-liter bottle. For memory usage you need at least 3x the size of the image, as you are storing every pixel separately and each consists of 3 bytes (RGB). And that is just the pure image data. On top of that comes the overhead of the whole app and its data object structures, plus additional space required for computation, and probably plenty more that I didn't mention and am not even aware of.
  3. You don't want to "dynamically set" -Xmx. Set it to the maximum possible value on your system (trial and error). The JVM will not take that much memory unless it needs it. With additional -X settings you can tell the JVM to free up unused memory, so you don't have to worry about unused memory being "frozen" by the JVM.
  4. I have never worked on image processing applications. Is Photoshop or GIMP capable of opening and doing something useful with such big images? Maybe you should look there for clues about processing that much data (if it works at all).
  5. If the point above is too naive because you need this for scientific purposes (and that is not what Photoshop or GIMP are made for, unless you are a flat-earther :) ), you will need scientific-grade hardware.
  6. One thing that comes to my mind is not to read the image into memory at all, but to process it on the fly. This could reduce memory consumption to the order of megabytes.
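To see what -Xmx limit is actually in effect for points 2 and 3, you can query the runtime from inside your app; `Runtime.maxMemory()` reports the maximum heap the JVM will attempt to use:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Reports the effective heap ceiling (roughly the -Xmx value) in bytes
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap: %.1f MB%n", maxBytes / (1024.0 * 1024.0));
    }
}
```

Run it with e.g. `java -Xmx8g HeapCheck` and compare the printed value against the ~11+ GB you'd need for a 3.8 GB image at 3x overhead.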
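The "process it on the fly" idea from point 6 can be sketched with the standard `javax.imageio` API: `ImageReadParam.setSourceRegion` lets you decode one strip at a time, so only a single strip ever sits on the heap. This is a minimal sketch that writes a small sample PNG as a stand-in for a huge file (the strip-by-strip loop is what matters):

```java
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class StripProcessor {
    // Decodes the image strip by strip; only one strip is in memory at a time.
    static long processInStrips(File file, int rowHeight) throws Exception {
        long pixelCount = 0;
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            ImageReader reader = ImageIO.getImageReaders(in).next();
            reader.setInput(in);
            int width = reader.getWidth(0);
            int height = reader.getHeight(0);
            for (int y = 0; y < height; y += rowHeight) {
                int h = Math.min(rowHeight, height - y);
                ImageReadParam param = reader.getDefaultReadParam();
                // Decode only this strip; the rest of the file stays on disk
                param.setSourceRegion(new Rectangle(0, y, width, h));
                BufferedImage strip = reader.read(0, param);
                pixelCount += (long) strip.getWidth() * strip.getHeight();
                // ...scale / merge / discard the strip here...
            }
            reader.dispose();
        }
        return pixelCount;
    }

    public static void main(String[] args) throws Exception {
        // Small sample PNG as a stand-in for a multi-gigabyte file
        File file = File.createTempFile("sample", ".png");
        ImageIO.write(new BufferedImage(200, 95, BufferedImage.TYPE_INT_RGB), "png", file);
        System.out.println("Processed pixels: " + processInStrips(file, 50));
        file.delete();
    }
}
```

One caveat: some readers still buffer a full decoded row internally, so for very wide images the per-strip memory is proportional to the image width, not the full image size.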

Take a closer look at the ImageReader API - its readTile method suggests it might be possible to read only an area of the image (e.g. for zooming in).
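Note that readTile only helps for formats that are natively tiled; a related ImageReader feature that works with ordinary readers is `setSourceSubsampling`, which decodes every Nth pixel to produce a low-resolution overview (useful as the zoomed-out view). A minimal sketch, assuming a PNG on disk:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

public class OverviewReader {
    // Decodes every `step`-th pixel so a huge image yields a small overview
    static BufferedImage readOverview(File file, int step) throws Exception {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            ImageReader reader = ImageIO.getImageReaders(in).next();
            reader.setInput(in);
            ImageReadParam param = reader.getDefaultReadParam();
            // Keep 1 pixel out of every step x step block, starting at (0, 0)
            param.setSourceSubsampling(step, step, 0, 0);
            BufferedImage overview = reader.read(0, param);
            reader.dispose();
            return overview;
        }
    }

    public static void main(String[] args) throws Exception {
        // Sample file standing in for a huge image
        File file = File.createTempFile("big", ".png");
        ImageIO.write(new BufferedImage(800, 600, BufferedImage.TYPE_INT_RGB), "png", file);
        BufferedImage overview = readOverview(file, 8);
        System.out.println(overview.getWidth() + "x" + overview.getHeight());
        file.delete();
    }
}
```

Combined with setSourceRegion from the region-reading thread above, this gives you the usual big-image viewer pattern: a subsampled overview for the full view, and full-resolution region reads when zooming in.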