java.lang.NoSuchMethodError: scala.Predef$.refArrayOps

I used IntelliJ and just re-imported the project: close the open project, then import it again as a Maven or SBT project. Note: I selected "Import Maven projects automatically". The error disappeared.


I had an SDK in Global Libraries with a different Scala version (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> remove the SDK -> Rebuild. That fixed the exception for me.


scalatest_2.11 is the version of ScalaTest that is compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
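Under the hood, %% simply appends your project's Scala binary version to the artifact name, so with scalaVersion set to 2.11.8 the two declarations below resolve to the same JAR (a build.sbt sketch; the surrounding settings are illustrative):

```scala
// build.sbt (fragment)
scalaVersion := "2.11.8"

// These two dependency declarations are equivalent here:
// %% appends "_2.11" automatically; % uses the artifact name verbatim.
libraryDependencies += "org.scalatest" %% "scalatest"      % "3.0.0" % "test"
libraryDependencies += "org.scalatest" %  "scalatest_2.11" % "3.0.0" % "test"
```

Preferring %% means the suffix updates automatically when you bump scalaVersion.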


This error occurs when you use a JAR file that was compiled for Scala 2.11 in a Scala 2.12 project: Scala minor versions (2.11, 2.12, 2.13) are not binary compatible with each other.

Scala libraries are generally cross compiled against different versions of Scala, so a separate JAR file is published to Maven for each supported Scala version. For example, ScalaTest 3.2.3 publishes separate JAR files for Scala 2.10, 2.11, 2.12, and 2.13.

Lots of Spark programmers run into this error when they attach a JAR file that was compiled with Scala 2.11 to a cluster that's running Scala 2.12. Migrating a Spark project from Scala 2.11 to Scala 2.12 mainly means recompiling it with the 2.12 compiler against 2.12 dependencies.
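A quick way to see which Scala version is actually on your cluster's classpath (paste the body into spark-shell, or run it as a plain Scala program) is to print the scala-library version string; a minimal sketch:

```scala
object CheckScalaVersion {
  def main(args: Array[String]): Unit = {
    // Reports the version of the scala-library JAR on the classpath,
    // e.g. "2.12.12" -- any JARs you attach must match this binary version.
    println(scala.util.Properties.versionNumberString)
  }
}
```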

As the accepted answer mentions, the SBT %% operator should be used when specifying Scala dependencies, so that SBT automatically grabs the library JAR that corresponds to your project's Scala version. The %% operator won't help you, however, if the library doesn't publish a JAR for the Scala version you need. Look at the Spark releases, for example:

[table of Spark releases and the Scala versions they are published for]

This build.sbt file will work because there is a Scala 2.12 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

This build.sbt file will not work, because there isn't a Scala 2.11 release for Spark 3.0.1:

scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

You can cross compile your project and build JAR files for different Scala versions if your library dependencies are also cross compiled. Spark 2.4.7 is cross compiled with Scala 2.11 and Scala 2.12, so you can cross compile your project with this code:

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.10")
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7"

Running sbt +assembly will build two JAR files for your project: one compiled with Scala 2.11 and another compiled with Scala 2.12. Libraries that release multiple JAR files follow a similar cross compilation workflow.
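The + prefix works for any sbt task, so you can cross compile, package, or publish the same way (a sketch; the artifact paths assume sbt's default layout and an illustrative project name):

```shell
sbt +compile       # compiles once per version in crossScalaVersions
sbt +package       # writes e.g. target/scala-2.11/myproject_2.11-0.1.jar
                   #        and target/scala-2.12/myproject_2.12-0.1.jar
sbt +publishLocal  # publishes one artifact per Scala version
```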
