Spark-submit ClassNotFound exception

I also had the same issue. I suspect --jars was not shipping the jars to the executors. After I added them to the SparkConf, it worked fine:

 val conf = new SparkConf().setMaster("...").setJars(Seq("/a/b/x.jar", "/c/d/y.jar"))

This troubleshooting web page is useful too.
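For reference, here is a minimal sketch of that programmatic approach (the master URL, app name, and jar paths below are placeholders, not values from the original question):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder master URL and jar paths -- substitute your own.
    val conf = new SparkConf()
      .setAppName("MyApp")
      .setMaster("spark://master-host:7077")
      .setJars(Seq("/a/b/x.jar", "/c/d/y.jar")) // shipped to every executor

    val sc = new SparkContext(conf)
    // Jars can also be attached after the context is created:
    // sc.addJar("/e/f/z.jar")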


You should set SPARK_CLASSPATH in the spark-env.sh file like this:

SPARK_LOCAL_IP=<your local IP>
SPARK_CLASSPATH=<path to your external jars>
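For illustration, a hypothetical spark-env.sh entry might look like this (the IP and jar paths are made up; use your own):

    # Hypothetical values -- replace with your driver host IP and your jars.
    SPARK_LOCAL_IP=192.168.1.10
    SPARK_CLASSPATH=/opt/libs/external-dep-1.jar:/opt/libs/external-dep-2.jar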

and you should submit the job with spark-submit like this:

    spark-submit --class your.runclass --master spark://yourSparkMasterHostname:7077 /your.jar

and your Java code should look like this:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    SparkConf sparkConf = new SparkConf().setAppName("sparkOnHbase");
    JavaSparkContext sc = new JavaSparkContext(sparkConf);

then it will work.


If you are using Maven and the Maven Assembly Plugin to build your jar file with mvn package, ensure that the plugin is configured to point to your Spark app's main class.

Something like this should be added to your pom.xml to avoid java.lang.ClassNotFoundException errors:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>2.4.1</version>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>com.my.package.SparkDriverApp</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <skipAssembly>false</skipAssembly>
            </configuration>
            <executions>
                <execution>
                    <id>package</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
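Running mvn package with this configuration produces an additional *-jar-with-dependencies.jar under target/, and that assembled jar (not the thin one) is what you pass to spark-submit. For orientation, a minimal sketch of the driver class named in <mainClass> might look like this (the package and names below are placeholders, shown in Scala; the com.my.package path in the pom is itself only an example):

    // Hypothetical driver corresponding to the <mainClass> entry above.
    package com.example.app

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkDriverApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("SparkDriverApp")
        val sc = new SparkContext(conf)
        // ... your job logic here ...
        sc.stop()
      }
    }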

I had this same issue. If the master is local, the program runs fine for most people. If it is set to "spark://myurl:7077" (as in my case), it doesn't work. Most people hit the error because an anonymous class is not found during execution. It is resolved by using SparkContext.addJar("path/to/jar").

Make sure you are doing the following things:

  • SparkContext.addJar("path to the jar created by Maven [hint: mvn package]").
  • I have used SparkConf.setMaster("spark://myurl:7077") in the code and supplied the same master as an argument when submitting the job to Spark via the command line.
  • When you specify the class on the command line, make sure you write its fully qualified name, e.g. "packageName.ClassName".
  • The final command should look like this: bin/spark-submit --class "packageName.ClassName" --master spark://myurl:7077 pathToYourJar/target/yourJarFromMaven.jar

Note: the jar pathToYourJar/target/yourJarFromMaven.jar in the last point is the same one that is added in code in the first point of this answer; a short sketch tying these steps together follows.
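A minimal sketch combining these points, assuming a standalone master at spark://myurl:7077 and the jar produced by mvn package (both are the placeholder values from the list above):

    import org.apache.spark.{SparkConf, SparkContext}

    // The master here must match the --master passed to spark-submit.
    val conf = new SparkConf()
      .setAppName("MyApp") // placeholder app name
      .setMaster("spark://myurl:7077")

    val sc = new SparkContext(conf)
    // Same jar that is passed to spark-submit in the last bullet.
    sc.addJar("pathToYourJar/target/yourJarFromMaven.jar")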