Is there parallel wget? Something like fping but only for downloading?

Using GNU Parallel,

$ parallel -j ${jobs} wget < urls.txt

or xargs from GNU Findutils,

$ xargs -n 1 -P ${jobs} wget < urls.txt

where ${jobs} is the maximum number of wget processes you want to allow to run concurrently (-n 1 gives one wget invocation per line of urls.txt). Without -j/-P, parallel runs as many jobs at a time as there are CPU cores (which doesn't necessarily make sense for wget, which is bound by network I/O), and xargs runs one job at a time.

One nice feature that parallel has over xargs is keeping the output of the concurrently-running jobs separated, but if you don't care about that, xargs is more likely to be pre-installed.
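For instance, with a concrete limit of 8 simultaneous downloads (8 is only an illustrative value; pick one that suits your bandwidth) and wget's -nv flag to keep per-job output terse:

$ parallel -j 8 wget -nv < urls.txt

$ xargs -n 1 -P 8 wget -nv < urls.txt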


aria2 does this.

http://sourceforge.net/apps/trac/aria2/wiki/UsageExample#Downloadfileslistedinafileconcurrently

Example:

$ aria2c http://example.org/mylinux.iso
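The linked page covers downloading the URLs listed in a file concurrently; a minimal sketch of that, assuming your URLs are in urls.txt and using 5 as an example concurrency limit, uses aria2's -i/--input-file together with -j/--max-concurrent-downloads:

$ aria2c -i urls.txt -j 5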


You can implement that using Python and the pycurl library. The pycurl library has the "multi" interface, which implements its own event loop to enable multiple simultaneous connections.

However, the interface is rather C-like and therefore a bit cumbersome compared to more "Pythonic" code.

I wrote a wrapper that builds a more complete browser-like client on top of it. You can use that as an example. See the pycopia.WWW.client module. The HTTPConnectionManager wraps the multi interface.
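For reference, here is a minimal sketch of the multi interface used directly (the input file urls.txt and the way output file names are derived from each URL are assumptions made for this example):

import pycurl

# Read one URL per line; urls.txt is just an assumed input file for the example.
urls = [line.strip() for line in open("urls.txt") if line.strip()]

multi = pycurl.CurlMulti()
handles = []
for url in urls:
    easy = pycurl.Curl()
    easy.setopt(pycurl.URL, url)
    # Save each download under the last path component of its URL.
    outfile = open(url.rstrip("/").rsplit("/", 1)[-1] or "index.html", "wb")
    easy.setopt(pycurl.WRITEDATA, outfile)
    easy.outfile = outfile
    multi.add_handle(easy)
    handles.append(easy)

# Drive the event loop until every transfer has finished.
remaining = len(handles)
while remaining:
    while True:
        ret, remaining = multi.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM:
            break
    multi.select(1.0)  # block until at least one connection is ready for I/O

for easy in handles:
    multi.remove_handle(easy)
    easy.outfile.close()
    easy.close()
multi.close()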