Why would a website serve different versions of a file over HTTP and HTTPS?

The simple answer is: because it wants to! The web server can serve whatever it likes, either by configuration or coincidence.

Right now, I get the same 75916c7b file over both HTTP and HTTPS, so I cannot confirm your theory that the web server is serving different content over HTTPS versus HTTP. However, if you accessed the site around the time the file was updated, it could very well be that different servers were serving the two protocols and the update had not yet propagated to the server that gave you the old file.
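If you want to check this yourself, here is a minimal sketch (Python, standard library only; the URL is a placeholder, not the actual download location) that fetches the same path over both protocols and compares the SHA-256 digests:

    import hashlib
    import urllib.request

    PATH = "example.com/curl-7.46.0-win64.zip"  # hypothetical host and path

    def sha256_of(url):
        # Download the whole body and return its hex SHA-256 digest.
        with urllib.request.urlopen(url) as resp:
            return hashlib.sha256(resp.read()).hexdigest()

    http_digest = sha256_of("http://" + PATH)
    https_digest = sha256_of("https://" + PATH)
    print("HTTP: ", http_digest)
    print("HTTPS:", https_digest)
    print("identical" if http_digest == https_digest else "DIFFERENT - worth investigating")

Even if the digests differ, that alone does not prove foul play; as explained below, caches can simply be serving different generations of the file.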

Remember that one URL can be served by any number of servers - the fact that you get one file from a URL does not exclude there being 20 copies of this file on 20 servers/caches, some of which may be out of date.

This is practically a certainty in this case, as the website appears to be using CloudFront, a Content Delivery Network (CDN) - infrastructure explicitly designed to cache files on many distributed servers for delivery at global scale.
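One way to see the CDN at work is to inspect the response headers. CloudFront typically adds headers such as Via, X-Cache and X-Amz-Cf-Id that reveal whether an edge cache served the response (a rough sketch; the URL is a placeholder and the exact header set may vary):

    import urllib.request

    url = "https://example.com/curl-7.46.0-win64.zip"  # hypothetical URL

    # HEAD request: we only care about the headers, not the body.
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        for name in ("Via", "X-Cache", "X-Amz-Cf-Id", "Age", "Last-Modified", "ETag"):
            if name in resp.headers:
                print(name + ":", resp.headers[name])

Comparing Last-Modified or ETag across the HTTP and HTTPS responses can also show whether two different generations of the file are being served.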


Author of the above-referenced files here.

As things currently stand, it is normal for the checksum of said files to change whenever the build scripts are modified and a new build is automatically triggered and uploaded. In the case of curl 7.46.0, the build scripts (and consequently the downloadable packages) changed five times as of this writing:

  • HTTP/2 support was enabled by building and statically linking the nghttp2 dependency: commit, log
  • the nghttp2 dependency was updated from version 1.5.0 to 1.6.0: commit, log
  • a libcurl.dll build bug, caused by a missing nghttp2 build option, was identified and fixed: commit, log
  • the underlying MinGW C compiler was updated to 5.3.0, along with multiple internal fixes and updates: commit, log
  • a VirusTotal false positive was fixed by dropping an optional .vbs file that had previously been copied from the original curl source package: commit, log

Any commit made here (to the master branch) will trigger a new build:

  • except when explicitly excluded via the commit message marker [ci skip]
  • the build process is designed to be reproducible/deterministic, so the checksum will only change if the underlying source code or the compiler options are altered. (A repeated build with the same options won't change it.)

As for downloading the binaries via different protocols, HTTP vs. HTTPS should result in the exact same binary content (if the downloads are made at the same point in time) — though HTTPS is highly recommended.

As for potential malware, see my thoughts on GitHub: https://github.com/bagder/curl/issues/583#issuecomment-167520488

In short: they are almost certainly false positives, unlikely to be influenced by the transfer protocol and more likely caused by scanner engine problems. The complete build process is public and auditable, as is all the source code that goes into it.

UPDATE [2015-12-28]: As a general answer to the HTTP vs. HTTPS issue: the latter, secure protocol makes certain guarantees that the content comes from the right party and that it is not tampered with on its way to the other end of the wire. For these reasons it can be trusted more, hence my recommending it in the original answer. An even better assurance is to verify SHA256 hashes of the downloaded content against the ones generated on the build server itself (visible at the end of the build logs linked above). An even stronger assurance is to have the content digitally signed (e.g. using PGP or minisign) and that signature verified by the receiver. (My curl downloads don't feature a digital signature at this time.)
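For illustration, such a hash check can be as simple as the following sketch (Python; the filename and expected digest are placeholders, the real digest would come from the build log):

    import hashlib

    # Both values are placeholders: take the real digest from the build log
    # and use the actual name of the downloaded package.
    expected = "<sha256 from the build log>"
    filename = "curl-7.46.0-win64-mingw.7z"

    h = hashlib.sha256()
    with open(filename, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)

    print("OK" if h.hexdigest() == expected.lower() else "MISMATCH: " + h.hexdigest())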

UPDATE [2016-01-06]: Updated the list of package updates. Fixing the VirusTotal false positive was one of them.


Look at the timestamps on the scans: the "Clean" result is from weeks ago. When you click the link for a newer scan date, you see that both the HTTP and HTTPS downloads are flagged as infected.

Basically, there is no difference between the files; the scans were just run at two different times (pre-infection and post-infection).