How does Google's Page Speed lossless image compression work?

If you're really interested in the technical details, check out the source code:

  • png_optimizer.cc
  • jpeg_optimizer.cc
  • webp_optimizer.cc

For PNG files, they use OptiPNG with a trial-and-error approach:

// we use these four combinations because different images seem to benefit from
// different parameters and this combination of 4 seems to work best for a large
// set of PNGs from the web.
const PngCompressParams kPngCompressionParams[] = {
  PngCompressParams(PNG_ALL_FILTERS, Z_DEFAULT_STRATEGY),
  PngCompressParams(PNG_ALL_FILTERS, Z_FILTERED),
  PngCompressParams(PNG_FILTER_NONE, Z_DEFAULT_STRATEGY),
  PngCompressParams(PNG_FILTER_NONE, Z_FILTERED)
};

When all four combinations are applied, the smallest result is kept. Simple as that.

(N.B.: the optipng command-line tool does this too if you pass -o2 through -o7.)
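The "try several parameter sets, keep the smallest output" step is easy to mimic in shell. The optipng flag mapping in the comment below (-f0/-f5 for the filter sets, -zs0/-zs1 for the zlib strategies) is my reading of the optipng man page, not something taken from the Page Speed source:

```shell
# Explicit optipng flags that roughly mirror the four combinations above
# (flag mapping is my assumption; optipng itself keeps the smallest trial):
#   optipng -zc9 -zm8 -zs0,1 -f0,5 image.png
#
# The "keep the smallest candidate" logic itself, as a plain shell helper:
smallest_file() {
    # Print the path of the smallest file among the arguments.
    best=""
    for f in "$@"; do
        if [ -z "$best" ] || [ "$(wc -c < "$f")" -lt "$(wc -c < "$best")" ]; then
            best="$f"
        fi
    done
    printf '%s\n' "$best"
}
```

For example, write each trial to its own file and run `smallest_file trial1.png trial2.png trial3.png trial4.png` to see which one to keep.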


For JPEG files, they use jpeglib with the following options:

 JpegCompressionOptions()
     : progressive(false), retain_color_profile(false),
       retain_exif_data(false), lossy(false) {}
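Those defaults (baseline rather than progressive, metadata stripped, no recompression loss) are roughly what jpegtran, which ships with libjpeg, does with the flags below. The mapping is my own reading of the options, not taken from jpeg_optimizer.cc:

```shell
# -copy none  -> drop EXIF and ICC data (retain_exif_data / retain_color_profile = false)
# -optimize   -> losslessly rebuild the Huffman tables (lossy = false)
# no -progressive flag, matching progressive = false
jpegtran -copy none -optimize input.jpg > optimized.jpg
```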

Similarly, WebP is compressed using libwebp with these options:

  WebpConfiguration()
      : lossless(true), quality(100), method(3), target_size(0),
        alpha_compression(0), alpha_filtering(1), alpha_quality(100) {}
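If you want roughly the same settings from the command line, this is my best guess at the equivalent cwebp invocation (check `cwebp -longhelp`; the flag mapping is an assumption, not taken from webp_optimizer.cc):

```shell
# -lossless -q 100   -> lossless(true), quality(100)
# -m 3               -> method(3)
# -alpha_method 0    -> alpha_compression(0)
# -alpha_filter fast -> alpha_filtering(1)
# -alpha_q 100       -> alpha_quality(100)
cwebp -lossless -q 100 -m 3 -alpha_method 0 -alpha_filter fast -alpha_q 100 in.png -o out.webp
```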

There is also image_converter.cc, which losslessly converts an image to whichever supported format produces the smallest file.
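A minimal sketch of that idea: re-encode the image in another format, then keep the new file only if it actually came out smaller. The cwebp line in the comment is an assumption; the helper itself is just a byte-count comparison:

```shell
keep_smaller() {
    # Usage: keep_smaller original candidate
    # Prints the path of the file worth keeping and deletes the losing candidate.
    if [ "$(wc -c < "$2")" -lt "$(wc -c < "$1")" ]; then
        printf '%s\n' "$2"
    else
        rm -f "$2"
        printf '%s\n' "$1"
    fi
}

# e.g. (cwebp usage is an assumption):
#   cwebp -lossless -q 100 image.png -o image.webp
#   keep_smaller image.png image.webp
```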


I use jpegoptim to optimize JPG files and optipng to optimize PNG files.

If you're on bash, the command to losslessly optimize all JPGs in a directory (recursively) is:

find /path/to/jpgs/ -type f -name "*.jpg" -exec jpegoptim --strip-all {} \;

You can add -m<quality> (or --max=<quality>) to jpegoptim to compress JPEGs lossily at the given maximum quality, for example:

 find /path/to/jpgs/ -type f -name "*.jpg" -exec jpegoptim -m70 --strip-all {} \;

To optimize all PNGs in a directory:

find /path/to/pngs/ -type f -name "*.png" -exec optipng -o2 {} \;

-o2 is the default optimization level; you can change this from -o2 up to -o7. Note that higher optimization levels mean longer processing times.


Take a look at http://code.google.com/speed/page-speed/docs/payload.html#CompressImages, which describes some of these techniques and tools.