How to run external jar functions in spark-shell

You can provide the jars as a command-line argument when launching the shell, as shown below:

./spark-shell --jars pathOfJarsWithCommaSeparated
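
For example, here is a minimal sketch assuming a jar at /opt/libs/my-udfs.jar that contains a hypothetical object com.example.udfs.StringUtils:

./spark-shell --jars /opt/libs/my-udfs.jar

scala> import com.example.udfs.StringUtils   // provided by the external jar
scala> StringUtils.normalize("  Hello Spark  ")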

Or you can add the following configuration to your spark-defaults.conf, but remember to remove the .template suffix from the file name first (rename spark-defaults.conf.template to spark-defaults.conf). Note that, unlike --jars, classpath entries here are separated by : on Linux/macOS (or ; on Windows) rather than by commas:

spark.driver.extraClassPath  /path/to/jar1.jar:/path/to/jar2.jar
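
spark.driver.extraClassPath only affects the driver JVM; if your functions also run inside Spark tasks you will usually want spark.executor.extraClassPath as well, and the jar must then exist at that path on every worker node. A minimal sketch of the relevant spark-defaults.conf lines, again assuming the jar lives at /opt/libs/my-udfs.jar:

spark.driver.extraClassPath    /opt/libs/my-udfs.jar
spark.executor.extraClassPath  /opt/libs/my-udfs.jar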

If you want to add a .jar to the classpath after you've entered spark-shell, use :require, like this:

scala> :require /path/to/file.jar
Added '/path/to/file.jar' to classpath.
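
To confirm the jar is actually on the shell's classpath, you can try resolving one of its classes by name (the class below is just a placeholder for one from your jar):

scala> Class.forName("com.example.udfs.StringUtils")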

I tried two options and both worked for me.

  1. spark-shell --jars <path of jar>
  2. Open spark-shell and type :help to see all the available commands. Use the following to add a jar:

    :require /full_path_of_jar
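
Either way, once the jar is visible to the shell you can also call its functions on Spark data. A sketch reusing the hypothetical com.example.udfs.StringUtils from above (on a real cluster the jar must also reach the executors, e.g. via --jars or spark.executor.extraClassPath):

scala> import org.apache.spark.sql.functions.{col, udf}
scala> val normalize = udf((s: String) => com.example.udfs.StringUtils.normalize(s))   // wrap the jar's function as a Spark SQL UDF
scala> val df = spark.createDataFrame(Seq(Tuple1("  Hello  "), Tuple1("  Spark  "))).toDF("raw")
scala> df.select(normalize(col("raw"))).show()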
