How do I use cURL to perform multiple simultaneous requests?

Curl may not do it by itself, but bash can.

curl -o 1.txt -X GET https://foo & curl -o 2.txt -X GET https://foo
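The same idea scales past two requests with a plain shell loop and wait. A minimal sketch, assuming https://foo stands in for your real URL:

# Start 10 curl jobs in the background, then wait for all of them to finish
for i in $(seq 1 10); do
  curl -s -o "$i.txt" https://foo &
done
wait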

While curl is a very useful and flexible tool, it isn't intended for this type of use. There are other tools available that will let you make multiple concurrent requests to the same URL.

ab is a very simple yet effective tool of this type, which works against any web server (despite the introduction focusing on the Apache HTTP Server).
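As a sketch of a basic run (assuming https://www.example.com/ is the page you want to test), the following asks ab for 100 requests total, 10 of them concurrent at any time:

# -n: total number of requests, -c: number of concurrent requests
ab -n 100 -c 10 https://www.example.com/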

Grinder is a more sophisticated tool that lets you specify many different URLs to use in a load test. This lets you mix requests for cheap and expensive pages, which may more closely resemble the typical load on your website.


Starting with version 7.68.0, curl can fetch several URLs in parallel. This example fetches URLs from the urls.txt file with 3 parallel connections:

curl --parallel --parallel-immediate --parallel-max 3 --config urls.txt

urls.txt:

url = "example1.com"
url = "example2.com"
url = "example3.com"
url = "example4.com"
url = "example5.com"
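Note that by default curl writes all of the responses to stdout. To save each response to its own file instead, the config file can pair an output entry with each url entry in order; a sketch, reusing the example hostnames:

url = "example1.com"
output = "example1.html"
url = "example2.com"
output = "example2.html"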

Another method is to use GNU Parallel with Curl.

Here is a simplified example that does 100 curl requests total, 10 at a time (concurrently):

seq 100 | parallel --max-args 0 --jobs 10 "curl https://www.example.com"

seq 100 generates a list of numbers that we pipe into parallel:

1
2
3
4
5
6
7
8
9
10
... and so on

Then we use the --max-args 0 option, which means that parallel will execute one job per argument. Don't change this number. An identical alias for this option is -n.

Docs say:

-n 0 means read one argument, but insert 0 arguments on the command line.

Then we use the --jobs 10 option, which will run up to 10 jobs in parallel/concurrently. Identical aliases for this option are -j, --procs, and -P.

Docs say:

Number of jobslots on each machine. Run up to N jobs in parallel. 0 means as many as possible. Default is 100% which will run one job per CPU core on each machine.
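For reference, here is the same command written with the short aliases mentioned above; it should behave identically to the long-option form:

seq 100 | parallel -n 0 -j 10 "curl https://www.example.com"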

Below is a more functional example which prints the HTTP status code, hides the response body, and, depending on whether the command succeeded (&&) or failed (||), prints a SUCCESS or FAIL message along with it, which I find useful for debugging purposes:

seq 100 | parallel --max-args 0 --jobs 10 "curl -w '%{http_code}\n' https://www.example.com --output /dev/null --location --silent && printf 'SUCCESS\n\n' || printf 'FAIL\n\n'"
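Each job prints the status code on its own line followed by the SUCCESS/FAIL marker, so the combined output will look roughly like this (assuming the requests succeed):

200
SUCCESS

200
SUCCESS

...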

