Unresolved dependency issue when compiling spark project with sbt

Your problem is that this build of Spark is compiled against Scala 2.10 (note the `_2.10` suffix on the artifact name), so you should use Scala 2.10 instead of 2.11.

Example:

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0-cdh5.3.2" % "provided"

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
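As a side note (this is a suggestion, not part of the original question), sbt can derive the `_2.10` suffix automatically from `scalaVersion` if you use the `%%` operator, which keeps the artifact name consistent if you change the Scala version later:

```scala
// build.sbt sketch (hypothetical; versions taken from the example above)
scalaVersion := "2.10.5"

// %% appends the Scala binary version ("_2.10") to the artifact name,
// so it always matches scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0-cdh5.3.2" % "provided"

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
```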

I had a similar dependency issue and solved it by reloading the plugins and updating the dependencies. Your issue is likely caused by the Ivy cache: normally, if no dependency management configuration has changed since the last successful resolution and the retrieved files are still present, sbt does not ask Ivy to perform resolution again.

Try running:

sbt reload plugins
sbt update
sbt reload
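If the cached metadata itself is stale, removing the affected artifacts from the local Ivy cache forces a fresh resolution on the next update. This is a sketch assuming the default cache location (`~/.ivy2/cache`); adjust the path if you have configured a custom cache:

```shell
# Remove only the stale Spark entries from the default Ivy cache,
# then have sbt resolve them again from the configured resolvers
rm -rf ~/.ivy2/cache/org.apache.spark
sbt update
```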

If that doesn't work, follow the instructions at http://www.scala-sbt.org/0.13/docs/Dependency-Management-Flow.html