SSH to multiple hosts in file and run command fails - only goes to the first host

ssh pre-reads stdin on the assumption that the remote job will want some data, which makes for a faster start-up. So the first ssh invocation eats the rest of your host list.
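
For illustration, this is the sort of loop where it bites (hosts.txt and uptime are just stand-ins for your host file and command):

while read -r host; do
    ssh "$host" uptime        # this ssh also reads from hosts.txt and swallows the remaining hosts
done < hosts.txt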

You can prevent this either by using the ssh -n option or by redirecting stdin, as in ssh < /dev/null. Being a belt-and-braces man, I do both: ssh -n < /dev/null <other ssh options>.
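
Applied to the loop above, a minimal sketch (again assuming hosts.txt and uptime as placeholders):

while read -r host; do
    ssh -n "$host" uptime < /dev/null    # -n plus /dev/null keeps ssh away from hosts.txt
done < hosts.txt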

If you really want the remote job to read something from the local system, you need to pipe it, or here-doc it, for each iteration. Even with a for <host> loop, which does not read the host names from stdin, ssh will read an indeterminate amount of data from stdin at each start-up.
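
One sketch of feeding the remote side per iteration with a here-document (the script body is just a placeholder):

while read -r host; do
    ssh "$host" 'bash -s' <<'EOF'
echo "running on $(hostname)"
EOF
done < hosts.txt

The here-document replaces ssh's stdin for that iteration only, so the outer loop still reads its host names from hosts.txt undisturbed.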

I would always ping a remote host to check it is up before ssh'ing to it. IIRC, ping -W gives a hard timeout, while ssh can hang in DNS lookup or while connecting for 90+ seconds on an unknown or dead host.
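
A sketch of that pre-check, assuming Linux iputils ping (where -c sets the packet count and -W the per-reply wait in seconds):

while read -r host; do
    if ping -c 1 -W 2 "$host" > /dev/null 2>&1; then
        ssh -n "$host" uptime < /dev/null
    else
        echo "skipping $host: no ping reply" >&2
    fi
done < hosts.txt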


You could also use pdsh for this; it runs a command on multiple hosts in parallel.

pdsh -w ^hostlist.txt -R ssh "cat /etc/redhat-release"

The ^ prefix specifies a file containing the list of hostnames; alternatively, a comma-separated list can be used:

pdsh -w host1,host2,host3,... -R ssh "cat /etc/redhat-release"

The result would look like this:

pdsh -w raspi,raspi.local,192.168.0.3,localhost,127.0.0.1 -R ssh "cat /etc/issue"

raspi: Raspbian GNU/Linux 10 \n \l
raspi: 
raspi.local: Raspbian GNU/Linux 10 \n \l
raspi.local: 
192.168.0.3: Raspbian GNU/Linux 10 \n \l
192.168.0.3: 
localhost: Debian GNU/Linux 10 \n \l
localhost: 
127.0.0.1: Debian GNU/Linux 10 \n \l
127.0.0.1: 
