Not able to SSH into EKS Worker Nodes

I think you are either missing an SSH rule in the instance's security group or using the wrong SSH key to connect to the worker nodes.

Please check your security group ID from the console and add an SSH inbound rule like in the screenshot if you don't have one (screenshot: SSH rule for the worker nodes' security group).
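If you prefer the CLI, a sketch like the following (the instance ID is a placeholder for your worker node's instance ID) lists the security groups attached to a node:

aws ec2 describe-instances --instance-ids <instance-id> --query "Reservations[].Instances[].SecurityGroups[]" --output table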

Or you can add the same rule via the AWS CLI, for example:

aws ec2 authorize-security-group-ingress --group-id <security-group-id> --protocol tcp --port 22 --cidr 0.0.0.0/0
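Note that 0.0.0.0/0 opens port 22 to the whole internet; if you only need access from your own machine, you can restrict the CIDR (the IP below is a placeholder):

aws ec2 authorize-security-group-ingress --group-id <security-group-id> --protocol tcp --port 22 --cidr <your-public-ip>/32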

Then, specifying a valid SSH key, you can run the command below to connect to your worker node.

ssh -i "ssh-key.pem" ec2-user@<node-external-ip or node-dns-name>
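If you are not sure of the node's external IP or DNS name, you can get it from kubectl (the wide output includes the external and internal IPs):

kubectl get nodes -o wide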

If you have lost your key, you need to create a new CloudFormation stack with a new SSH key pair, as described in the following tutorials.

Creating a Key Pair Using Amazon EC2 and Launch and Configure Amazon EKS Worker Nodes
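For reference, a new key pair can also be created from the CLI; the key name below is just an example:

aws ec2 create-key-pair --key-name my-eks-key --query "KeyMaterial" --output text > my-eks-key.pem
chmod 400 my-eks-key.pem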

I hope this helps.


I found a workaround. I created an EC2 instance in the same VPC as the worker nodes, using the same security group and key pair as the worker nodes. Logging in to this newly created instance works like a charm (I don't know why it doesn't work for the worker nodes). Once logged in, I SSHed from there to the worker nodes using their private IPs, which works as expected.

Again, this is only a workaround. I am not sure why I wasn't able to log in to the worker nodes directly.
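For reference, that bastion hop can also be done in a single command with OpenSSH's -J (ProxyJump) option; the host addresses below are placeholders:

ssh -i "ssh-key.pem" -J ec2-user@<bastion-public-ip> ec2-user@<worker-node-private-ip>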


When we create the compute (node group), we need to make sure that "Allow remote access to nodes" is enabled. If this is not selected, we can't SSH into the worker nodes.

In addition, EKS itself creates a security group that is attached to the worker EC2 machines. I believe you might have missed enabling "Allow remote access to nodes" in Step 3.
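If you create the node group from the CLI instead of the console, the equivalent is the --remote-access option of aws eks create-nodegroup; the cluster, subnet, role, and key names below are placeholders:

aws eks create-nodegroup \
  --cluster-name <cluster-name> \
  --nodegroup-name <nodegroup-name> \
  --subnets <subnet-id-1> <subnet-id-2> \
  --node-role <node-role-arn> \
  --remote-access ec2SshKey=<key-pair-name>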

The workaround works because the bastion host and the worker nodes are in the same VPC and share the same security group; the rule that allows that traffic is added to the security group by default.
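You can verify that rule by inspecting the group's inbound permissions; the group ID below is a placeholder:

aws ec2 describe-security-groups --group-ids <security-group-id> --query "SecurityGroups[].IpPermissions[]"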