AWS Glue executor memory limit

  1. Open Glue > Jobs > Edit your job > Script libraries and job parameters (optional) > Job parameters (near the bottom)
  2. Set the following job parameter. Key: --conf, Value: spark.yarn.executor.memoryOverhead=1024 spark.driver.memory=10g
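The two Spark settings in step 2 are packed, space-separated, into a single --conf value. A minimal sketch of assembling that value (pure Python, no AWS calls; `build_conf_value` is my own hypothetical helper, not a Glue API):

```python
# Sketch: assembling several Spark settings into the single --conf
# job-parameter value, space-separated as in step 2 above.
# build_conf_value is a hypothetical helper, not an AWS Glue API.

def build_conf_value(settings):
    """Join key=value Spark settings into one --conf value string."""
    return " ".join(f"{key}={value}" for key, value in settings.items())

conf_value = build_conf_value({
    "spark.yarn.executor.memoryOverhead": "1024",
    "spark.driver.memory": "10g",
})
# conf_value == "spark.yarn.executor.memoryOverhead=1024 spark.driver.memory=10g"
```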

The official AWS Glue documentation suggests that Glue doesn't support custom Spark configuration.

There are also several argument names used by AWS Glue internally that you should never set:

--conf — Internal to AWS Glue. Do not set!

--debug — Internal to AWS Glue. Do not set!

--mode — Internal to AWS Glue. Do not set!

--JOB_NAME — Internal to AWS Glue. Do not set!

Any better suggestions for solving this problem?


You can override the parameters by editing the job and adding job parameters. The key and value I used are here:

Key: --conf

Value: spark.yarn.executor.memoryOverhead=7g

This seemed counterintuitive, since the setting key is actually embedded in the value, but it was recognized. So if you're attempting to set spark.executor.memory (note that spark.yarn.executor.memory is not a standard Spark property), the following parameter would be appropriate:

Key: --conf

Value: spark.executor.memory=7g
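The same Key/Value pair can also be applied programmatically. A hedged sketch using boto3's `update_job` (the job name "my-glue-job" is a hypothetical placeholder; the actual API call is shown commented out because it requires AWS credentials and the job's existing Role/Command fields):

```python
# Sketch: applying the --conf job parameter from the answer above via boto3.
# "my-glue-job" is a hypothetical placeholder job name.
default_arguments = {
    "--conf": "spark.yarn.executor.memoryOverhead=7g",
}

# Real invocation (needs AWS credentials; update_job requires the job's
# existing Role and Command, which get_job returns):
# import boto3
# glue = boto3.client("glue")
# job = glue.get_job(JobName="my-glue-job")["Job"]
# glue.update_job(
#     JobName="my-glue-job",
#     JobUpdate={
#         "Role": job["Role"],
#         "Command": job["Command"],
#         "DefaultArguments": {**job.get("DefaultArguments", {}),
#                              **default_arguments},
#     },
# )
```

Merging into the job's existing DefaultArguments (rather than replacing them) preserves any other parameters already set on the job.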


Despite the AWS documentation stating that the --conf parameter should not be passed, our AWS support team told us to pass --conf spark.driver.memory=10g, which corrected the issue we were having.