spark worker not connecting to master

Please check the configuration file "spark-env.sh" on your master node. Have you set the SPARK_MASTER_HOST variable to the IP address of the master node? If not, set it and restart the master and the slaves. For example, if your master node's IP is 192.168.0.1, you should have SPARK_MASTER_HOST=192.168.0.1 in that file. Note that you don't need to set this variable on the slaves.
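As a sketch, the relevant lines in conf/spark-env.sh on the master might look like this (192.168.0.1 is the example IP from above, and the port shown is Spark's default master port):

```shell
# $SPARK_HOME/conf/spark-env.sh on the master node
# Bind the master to a fixed address so workers can reach it.
# 192.168.0.1 is the example IP from above -- use your master's real IP.
SPARK_MASTER_HOST=192.168.0.1
# 7077 is the default master RPC port; only change it if you need to.
SPARK_MASTER_PORT=7077
```

After editing this file, restart the cluster so the master binds to the new address.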


1) Make sure you have set up passwordless SSH between nodes

Please refer to the link below to set up passwordless SSH between nodes:

http://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/
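The gist of the linked guide, as a command sketch run on the master (the hostnames worker1/worker2 and the user name are placeholders for your own nodes):

```shell
# On the master: generate a key pair if one does not already exist
# (-N "" means no passphrase, so Spark's scripts can log in unattended)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Copy the public key to each worker (placeholder hostnames)
ssh-copy-id user@worker1
ssh-copy-id user@worker2

# Verify: this should print the worker's hostname without a password prompt
ssh user@worker1 hostname
```

If the final command still asks for a password, check the permissions on ~/.ssh (700) and ~/.ssh/authorized_keys (600) on the worker.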

2) Specify the slaves' IP addresses in the slaves file present in the $SPARK_HOME/conf directory

[This is the Spark folder containing the conf directory] on the master node
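A sketch of what conf/slaves looks like, one worker hostname or IP per line (the addresses below are placeholders):

```shell
# $SPARK_HOME/conf/slaves on the master node
# One worker hostname or IP address per line (placeholder addresses)
192.168.0.2
192.168.0.3
```

Lines starting with # are treated as comments, so you can annotate entries as above.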

3) Once you have specified the IP addresses in the slaves file, start the Spark cluster

[Execute the start-all.sh script present in the $SPARK_HOME/sbin directory] on the master node
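The launch step as commands on the master (assuming SPARK_HOME points at your Spark installation):

```shell
# Start the master, then SSH into every host listed in conf/slaves
# and start a worker on each of them
$SPARK_HOME/sbin/start-all.sh

# The master web UI (default port 8080) should now list the workers:
#   http://<master-ip>:8080
```

If a worker does not appear in the UI, check its log under $SPARK_HOME/logs on that worker for connection errors.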

Hope this helps.


If you are able to ping the master node from the worker, then it has network connectivity. To add the new worker node to the Spark master, you need to update a few things in spark-env.sh. Please check the official documentation on launching a Spark cluster and update the required fields.
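Note that ping only proves ICMP reachability; the worker also needs to reach the master's RPC port. A quick check from the worker might look like this (7077 is Spark's default master port, and the IP is a placeholder for your master's address):

```shell
# From the worker: verify the master's RPC port is reachable
# (-z: just scan, don't send data; -v: verbose output)
nc -zv 192.168.0.1 7077
```

If this fails while ping succeeds, look for a firewall rule blocking the port or a master bound to the wrong interface.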

Here is another blog post on Spark cluster mode that may help you.

Tags:

Apache Spark