Why aren't features like gzip enabled by default on many web servers?

"resources" - it takes ram & CPU to compress your content on the fly... a very little bit but resources nonetheless, on a site or two it's trivial, but on 1000 or more... things can get funky. Also - Falcon's comment is valid - vendors do want the failsafe 'bare bones' configuration as the default.

-sean

UPDATE: I forgot as well: besides the extra execution time on each request to do the actual compression, there is the extra memory used by loading mod_deflate [deflate_module] in the first place, so every process gets a little bit fatter and a little bit slower. Like I say, generally trivial, but if you are strapped for resources to begin with....
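To make that cost concrete, here is a minimal Apache 2.2 mod_deflate sketch that tries to keep the overhead down by compressing only text types and using a cheap compression level; the specific level and the MIME type list are assumptions to tune for your own workload:

    # Loading the module at all is where the per-process memory
    # overhead comes from.
    LoadModule deflate_module modules/mod_deflate.so

    # Compress only highly compressible text types; images and
    # archives are already compressed and would just waste CPU.
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript

    # Lower level = less CPU per request (valid range 1-9;
    # zlib's default is 6). Level 1 here is an assumed cheap setting.
    DeflateCompressionLevel 1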


One reason could be IE6 compatibility. While IE6 supports gzip, it only does so up to a compressed size of 65535 bytes.

If your page is larger than that, the rest will be cut off without explanation. But here's the funny part: it only happens if the page was loaded from the file cache, not if it's received over the network, which makes the whole mess hard to debug.

It can be worked around by inspecting the request headers, but sometimes there is a transparent proxy that uses IE6's engine, e.g. antivirus software. The proxy does not alter the User-Agent header, so you're pretty much out of luck. Good thing that IE6 is almost eradicated.
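For Apache, the header-based workaround is roughly the following mod_setenvif/mod_deflate sketch; the exact regex is an assumption, and as noted above it can't catch a proxy that leaves the User-Agent header untouched:

    # BrowserMatch (mod_setenvif) sets the no-gzip environment
    # variable, which mod_deflate honors, so the matched IE
    # versions get uncompressed responses.
    BrowserMatch "MSIE [1-6]\." no-gzip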

I'm not sure whether all versions were affected or only some, but the above is the reason the website I work for had gzip turned off for a long time.

Tags:

IIS

Apache 2.2