Increase Spark memory when using local[*]

In Spark 2.x you can use SparkSession, which is created through its builder:

        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("MyApp")
          .config("spark.driver.memory", "4g")
          .getOrCreate()

Note that in local[*] mode the executors run inside the driver JVM, so spark.executor.memory is effectively ignored; the driver heap is what matters.

I was able to solve this by running SBT with:

sbt -mem 4096

However, the MemoryStore reported at startup is only about half that size. Still looking into where this fraction comes from.
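The missing half is likely Spark's memory accounting: under the unified memory model (Spark 1.6+), the MemoryStore capacity is roughly (heap − 300 MB reserved) × spark.memory.fraction, which defaults to 0.6; older versions used spark.storage.memoryFraction with a similar net effect. A rough sanity check, assuming those defaults:

```shell
# Rough MemoryStore estimate, assuming Spark 1.6+ defaults:
# 300 MB reserved system memory, spark.memory.fraction = 0.6.
heap_mb=4096                                # from `sbt -mem 4096`
reserved_mb=300                             # Spark's reserved system memory
usable_mb=$(( (heap_mb - reserved_mb) * 6 / 10 ))
echo "Estimated MemoryStore capacity: ~${usable_mb} MB"
```

That lands at roughly 2.2 GB for a 4 GB heap, which matches "half the size".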


I tried --driver-memory 4g and --executor-memory 4g; neither increased the working memory. However, I noticed that bin/spark-submit was picking up _JAVA_OPTIONS, and setting that to -Xmx4g resolved it. I'm using JDK 7.
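A sketch of that workaround, assuming a Unix shell; the class and jar names are the placeholders used later in this thread:

```shell
# _JAVA_OPTIONS is read by every JVM at startup, so it also applies to
# the JVM that bin/spark-submit launches, overriding its default heap.
export _JAVA_OPTIONS="-Xmx4g"
./bin/spark-submit --class main.class yourApp.jar
```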


Assuming that you are using spark-shell: setting spark.driver.memory in your application isn't working because your driver process has already started with the default memory.

You can either launch your spark-shell using:

./bin/spark-shell --driver-memory 4g

or you can set it in spark-defaults.conf:

spark.driver.memory 4g

If you are launching an application using spark-submit, you must specify the driver memory as an argument:

./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar