ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed

There are a few different solutions:

  1. Get your hostname

    $ hostname
    

    then set your hostname to the loopback address:

    $ sudo hostname -s 127.0.0.1
    

    Start spark-shell.

  2. Add your hostname to your /etc/hosts file, if it is not already present (see the combined sketch after this list):

    127.0.0.1      your_hostname
    
  3. Set the SPARK_LOCAL_IP environment variable, then run load-spark-env.sh:

    export SPARK_LOCAL_IP="127.0.0.1" 
    
    load-spark-env.sh 
    
  4. The steps above solved my problem, but you can also try adding

    export SPARK_LOCAL_IP=127.0.0.1 
    

    under the comment for the local IP in the template file spark-env.sh.template (in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/)

    and then run:

    cp spark-env.sh.template spark-env.sh
    spark-shell
    
  5. If none of the above fixes it, check your firewall and enable it if it is not already enabled (see the check after this list).
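
Steps 2 and 3 can be combined into one quick sketch. This assumes a macOS or Linux shell and that spark-shell is on your PATH; adjust paths for your install:

    # Map the current hostname to loopback in /etc/hosts (step 2),
    # unless some entry for it already exists.
    grep -q "$(hostname)" /etc/hosts || \
      echo "127.0.0.1   $(hostname)" | sudo tee -a /etc/hosts

    # Pin the Spark driver to loopback for this shell session (step 3).
    export SPARK_LOCAL_IP="127.0.0.1"

    # The driver should now bind without the BindException.
    spark-shell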
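
For step 5 on macOS (which the Homebrew path above suggests), you can check the firewall state from the terminal. A sketch using the stock socketfilterfw tool:

    # Prints whether the macOS application firewall is enabled.
    /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate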


Add SPARK_LOCAL_IP to load-spark-env.sh as follows:

export SPARK_LOCAL_IP="127.0.0.1"

The load-spark-env.sh file is located in the spark/bin directory.

Alternatively, you can add your hostname to the /etc/hosts file as:

127.0.0.1   hostname 

You can get your hostname by typing hostname in the terminal.
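
To confirm the setting took effect, one quick check (a sketch; it assumes spark-shell reads commands from stdin, which the Scala REPL does, and uses the sc variable the shell creates) is to print the driver host from inside spark-shell:

# Prints the address the driver registered with; expect 127.0.0.1.
echo 'println(sc.getConf.get("spark.driver.host"))' | spark-shell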

Hope this solves the issue!


  • Had a similar issue in IntelliJ

    Reason: I was on Cisco AnyConnect VPN

    Fix: after disconnecting from the VPN, the issue did not appear
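
    If disconnecting from the VPN is not an option, a possible workaround (my own suggestion, not part of the fix above; the spark.driver.bindAddress setting exists from Spark 2.1.0, the version mentioned earlier) is to pin the driver's bind address to loopback at launch:

    # Force the driver to bind to loopback even while the VPN owns routing.
    spark-shell --conf spark.driver.bindAddress=127.0.0.1

    In IntelliJ, the same effect can be had by adding SPARK_LOCAL_IP=127.0.0.1 to the run configuration's environment variables.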