How do I use wget to download all links from my site and save to a text file?

wget does not offer such an option. Please read its man page.
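If you want to stay with wget anyway, a rough workaround is to run it in spider mode and scrape the URLs out of its log output. This is only a sketch, not an exact recipe: http://example.com stands in for your own site, the recursion depth (-l 2) is arbitrary, and the grep/awk step depends on the format of wget's verbose progress lines, so it may need tweaking for your version.

    # Crawl without keeping files (--spider) and pull each visited URL
    # out of the "--<timestamp>--  <url>" progress lines wget prints.
    wget --spider -r -l 2 http://example.com 2>&1 \
        | grep '^--' \
        | awk '{ print $3 }' \
        | sort -u > links.txt

The lynx approach below is usually cleaner, since it parses the HTML itself rather than wget's log.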

You could use lynx for this:

    lynx -dump -listonly http://aligajani.com | grep -v facebook.com > file.txt

From its man page:

   -listonly
          for -dump, show only the list of links.
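The list that -listonly produces is numbered and begins with a "References" heading. If you want only bare URLs in the file, one variation (same example domain and facebook.com filter as above) is to extract them with grep:

    # Keep just the URLs, one per line, dropping the "References"
    # heading and the "  1." numbering that lynx adds.
    lynx -dump -listonly http://aligajani.com \
        | grep -Eo 'https?://[^ ]+' \
        | grep -v facebook.com > file.txt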

Use the following in a terminal (-r recurses through the site's links, -p also fetches page requisites such as images and CSS, and -k converts links so the local copy is browsable):

      wget -r -p -k http://website

or

      wget -r -p -k --wait=#SECONDS http://website

Note: the second form adds a delay between requests, for sites that may flag or block you if you download too quickly (and to avoid degrading their service), so prefer it in most circumstances to be courteous. Everything will be placed in a folder named after the website, created inside whatever directory your terminal is in when you run the command.
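If the goal is still a plain text file of links rather than a browsable local copy, you can pull the href targets out of the mirrored HTML afterwards. A minimal sketch, assuming the mirror landed in a directory called website/ (wget names it after the host) and that GNU grep and sed are available; the naive regex only catches double-quoted href attributes and leaves relative URLs relative:

    # Scan every file in the mirror for href="..." and keep just the URL part.
    grep -rhoE 'href="[^"]+"' website/ \
        | sed -e 's/^href="//' -e 's/"$//' \
        | sort -u > links.txt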
