Ubuntu: Using curl to download an image

For those who don't have wget, or don't want to install it, curl -O (a capital letter "O", not a zero) will do the same thing as wget. For example, my old netbook doesn't have wget; it's a 2.68 MB install that I don't need.

curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
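
This isn't part of the original tip, but if you're unsure whether curl itself is installed, a quick check-and-install on Ubuntu (using apt) is:

command -v curl >/dev/null || sudo apt install curl

command -v prints the path to curl if it is present, so the install step only runs when the check fails.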

If you want to keep the original file name, use uppercase -O:

curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png

If you want to save the remote file under a different name, use lowercase -o:

curl -o myPic.png https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
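
One thing neither form handles on its own: if the URL redirects (say, from http to https), curl won't follow the redirect by default, so you end up saving the short redirect response instead of the image. Adding -L (--location) tells curl to follow redirects, and it combines with either flag:

curl -LO https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
curl -L -o myPic.png https://www.python.org/static/apple-touch-icon-144x144-precomposed.png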


curl without any options performs a GET request: it simply writes the data from the specified URI to standard output and does not save the file to your local machine.

When you do,

$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png

you will receive the raw binary data in your terminal:

                   |�>�$! <R�HP@T*�Pm�Z��jU֖��ZP+UAUQ@�
��{X\� K���>0c�yF[i�}4�!�V̧�H_�)nO#�;I��vg^_ ��-Hm$$N0.
���%Y[�L�U3�_^9��P�T�0'u8�l�4 ...

To save this, you can use:

$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png > image.png

to store that raw image data in a file.
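
As a quick sanity check (not from the original answer), the standard file utility will tell you whether you really saved an image or, say, an HTML error page:

$ file image.png

If the download worked, it should report PNG image data; if it reports HTML or ASCII text, you probably saved an error or redirect page instead.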

An easier way, though, is to just use wget:

$ wget https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
$ ls
apple-touch-icon-144x144-precomposed.png

Create a new file called files.txt and paste the URLs into it, one per line. Then run the following command:

xargs -n 1 curl -O < files.txt
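
If the list is long, GNU xargs can also run several transfers at once via -P; for example, four parallel downloads (a small variation on the command above, not from the linked article):

xargs -n 1 -P 4 curl -O < files.txt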

source: https://www.abeautifulsite.net/downloading-a-list-of-urls-automatically