How do you decide which interpolation method to use when resampling raster data?

A clarification to the question indicates that methods for resampling a raster are sought. Many are in use in the imaging and photography communities. For GIS work, though, a few straightforward methods are in common use:

  • Nearest-neighbor resampling. Each cell in the new raster is assigned the value of the nearest cell (center to center) in the original raster. Use this for categorical data like land use and other classifications.

  • Bilinear interpolation. Each cell in the new raster is assigned an average based on the four nearest original cells. The averaging is linear in the horizontal and vertical directions. (The resulting formula, though, is not linear; because of the cross term it is actually quadratic.) This is good for general-purpose smoothing of continuous data, but the averaging typically clips local peaks and valleys a bit (see the sketch after this list).

  • Cubic convolution. This is similar in spirit to bilinear interpolation but can slightly extrapolate values from nearby cells. It does so in a way intended to reproduce local averages and variability in the new grid; in particular, the clipping of local extrema should not be as severe. (One untoward consequence, evident as a bug in ESRI's ArcGIS, is that the values in the new grid may extend beyond the range of the old one, causing some of the new extremes not to be rendered correctly. But this is a matter of data display only.) The tradeoff is that cubic convolution takes a little more time to compute than bilinear interpolation.
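
To make the differences concrete, here is a minimal sketch in Python using SciPy's ndimage.zoom. The test raster and zoom factor are just illustrative assumptions, and SciPy's order=3 resampling is a cubic spline rather than ArcGIS's cubic convolution, but it exhibits the same smoothing and overshoot behavior described above.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    original = rng.random((10, 10))      # a small test raster of continuous values

    factor = 4                           # resample to 4x the original resolution
    nearest  = ndimage.zoom(original, factor, order=0)  # nearest neighbor: categorical data
    bilinear = ndimage.zoom(original, factor, order=1)  # bilinear: linear averaging in x and y
    cubic    = ndimage.zoom(original, factor, order=3)  # cubic: smoother, may overshoot slightly

    for name, grid in (("nearest", nearest), ("bilinear", bilinear), ("cubic", cubic)):
        print(f"{name:8s} min={grid.min():.3f} max={grid.max():.3f}")
    # Nearest and bilinear stay within the original range; the cubic result
    # can report extremes slightly outside it, as noted above.

For a categorical raster only the nearest-neighbor result is meaningful, since averaging class codes produces values with no interpretation.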

I discuss the latter two methods in some detail at http://www.quantdec.com/SYSEN597/GTKAV/section9/map_algebra.htm

For quick one-off calculations I am usually content to perform bilinear interpolation (for continuous data) or nearest-neighbor interpolation (for categorical data). For everything else, especially when preparing master datasets or when anticipating extensive manipulation, I recommend cubic convolution (as well as giving some thought to ordering the operations to minimize the propagation of floating-point error).


According to ESRI, the available interpolation methods (available as tools in Spatial Analyst and other extensions) compare as follows (quoting):

The IDW (Inverse Distance Weighted) tool uses a method of interpolation that estimates cell values by averaging the values of sample data points in the neighborhood of each processing cell. The closer a point is to the center of the cell being estimated, the more influence, or weight, it has in the averaging process.
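
As a rough illustration of that weighting scheme (not ESRI's implementation), a bare-bones IDW estimator can be written in a few lines of NumPy; the power parameter, the toy sample coordinates, and the helper name idw below are illustrative assumptions.

    import numpy as np

    def idw(xy_samples, z_samples, xy_targets, power=2.0):
        """Inverse-distance-weighted average of sample z-values at target locations."""
        # Distance from every target cell center to every sample point.
        d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
        d = np.where(d == 0, 1e-12, d)      # guard against division by zero at coincident points
        w = 1.0 / d ** power                # closer samples receive more weight
        return (w * z_samples).sum(axis=1) / w.sum(axis=1)

    # Estimate a value at (2.5, 2.5) from three scattered samples.
    samples = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
    values  = np.array([10.0, 20.0, 30.0])
    print(idw(samples, values, np.array([[2.5, 2.5]])))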

Kriging is an advanced geostatistical procedure that generates an estimated surface from a scattered set of points with z-values. More so than other interpolation methods supported by ArcGIS Spatial Analyst, a thorough investigation of the spatial behavior of the phenomenon represented by the z-values should be done before you select the best estimation method for generating the output surface.
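
For a non-ArcGIS sketch of the same idea, the third-party PyKrige package can fit a variogram and produce an ordinary-kriging surface; the spherical variogram model and the toy coordinates below are assumptions and should in practice come from exactly the kind of investigation ESRI describes.

    import numpy as np
    from pykrige.ok import OrdinaryKriging   # third-party package: pip install pykrige

    x = np.array([0.0, 2.0, 5.0, 7.0, 9.0])
    y = np.array([1.0, 6.0, 3.0, 8.0, 2.0])
    z = np.array([12.0, 15.0, 11.0, 18.0, 14.0])

    # Fit a spherical variogram to the samples and krige onto a regular grid.
    ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
    grid_x = np.arange(0.0, 10.0, 1.0)
    grid_y = np.arange(0.0, 10.0, 1.0)
    z_est, variance = ok.execute("grid", grid_x, grid_y)
    print(z_est.shape, variance.shape)        # estimated surface and kriging variance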

Natural Neighbor interpolation finds the closest subset of input samples to a query point and applies weights to them based on proportionate areas to interpolate a value (Sibson, 1981). It is also known as Sibson or "area-stealing" interpolation.

The Spline tool uses an interpolation method that estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points.
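
A rough open-source analogue (an assumption, not ESRI's algorithm) is a thin-plate-spline radial basis function, which SciPy exposes as RBFInterpolator; with smoothing set to zero the fitted surface passes exactly through the input points.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    points = np.array([[0.0, 0.0], [0.0, 4.0], [4.0, 0.0], [4.0, 4.0], [2.0, 2.0]])
    values = np.array([1.0, 2.0, 3.0, 4.0, 10.0])

    # Thin-plate spline: minimizes overall curvature; smoothing=0 forces exact interpolation.
    spline = RBFInterpolator(points, values, kernel="thin_plate_spline", smoothing=0.0)

    # Evaluate on a regular grid of cell centers.
    gx, gy = np.meshgrid(np.linspace(0.0, 4.0, 5), np.linspace(0.0, 4.0, 5))
    grid = spline(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
    print(grid.round(2))     # reproduces the sample values at the sample locations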

The Spline with Barriers tool uses a method similar to the technique used in the Spline tool, with the major difference being that this tool honors discontinuities encoded in both the input barriers and the input point data.

The Topo to Raster and Topo to Raster by File tools use an interpolation technique specifically designed to create a surface that more closely represents a natural drainage surface and better preserves both ridgelines and stream networks from input contour data.

The algorithm used is based on that of ANUDEM, developed by Hutchinson et al. at the Australian National University.

Trend is a global polynomial interpolation that fits a smooth surface defined by a mathematical function (a polynomial) to the input sample points. The trend surface changes gradually and captures coarse-scale patterns in the data.
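
To illustrate the idea (not the Trend tool itself), a first-order trend surface can be fit by ordinary least squares with NumPy; the sample data and the choice of a planar (first-order) polynomial are assumptions.

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([0.0, 2.0, 1.0, 3.0, 2.0, 4.0])
    z = np.array([1.0, 2.1, 2.9, 4.2, 4.8, 6.1])

    # Design matrix for z ~ a + b*x + c*y, a first-order (planar) trend surface.
    A = np.column_stack([np.ones_like(x), x, y])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    print(f"fitted trend: z = {a:.2f} + {b:.2f}*x + {c:.2f}*y")

    # The fitted surface changes gradually; evaluate it at any cell center:
    print(a + b * 2.5 + c * 1.5)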

You could also take a look at this article: http://proceedings.esri.com/library/userconf/proc95/to100/p089.html


I agree there are no hard-and-fast rules, but there are some guidelines for the various interpolation methods. For example, IDW works best when you have fairly dense points to begin with. Kriging is processor-intensive and is usually used in soil/geology modelling. Spline is usually used when a smooth surface is desired, e.g. temperature data. Some methods force the resulting raster to pass through the original points, while others do not.

Although it is ArcGIS-centric, a good overview of the different methods can be found in the four-page paper

Interpolating Surfaces in ArcGIS Spatial Analyst