The SPARK_HOME env variable is set but Jupyter Notebook doesn't see it. (Windows)

I had the same problem and wasted a lot of time on it before finding two solutions:

  1. Copy the downloaded Spark folder somewhere on the C: drive and pass that path to findspark:

    import findspark
    findspark.init('C:/spark')
    
  2. Use findspark to locate the Spark folder automatically (a quick sanity check for either option is sketched after this list):

    import findspark
    location = findspark.find()   # locate the Spark installation automatically
    findspark.init(location)      # then point findspark at that folder
    

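Either way, you can confirm the setup works by starting a small Spark session directly in the notebook. This is only a minimal sanity-check sketch; it assumes pyspark is installed and that the path given to `findspark.init()` matches your installation:

    import findspark
    findspark.init('C:/spark')  # or: findspark.init(findspark.find())

    from pyspark.sql import SparkSession

    # If Spark is resolved correctly, this starts a local session
    spark = SparkSession.builder.master('local[*]').appName('check').getOrCreate()
    print(spark.version)
    spark.stop()
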
Also note that environment variables are only picked up after a restart: if you set SPARK_HOME recently, it will not be visible to Jupyter until you restart your system.
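
If you are unsure whether the notebook kernel can see the variable at all, a quick check from inside the notebook (using only the standard library) is:

    import os

    # Prints None if the Jupyter kernel was started before SPARK_HOME
    # was set, e.g. before the reboot mentioned above.
    print(os.environ.get('SPARK_HOME'))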