Spark: Kill a Running Application

  • copy/paste the application ID from the Spark scheduler, for instance application_1428487296152_25597
  • connect to the server that launched the job
  • kill it with yarn application -kill application_1428487296152_25597, as in the sketch below
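
A minimal sketch of that sequence, with an optional status check before and after the kill (the application ID is just the example from above):

# confirm the application is still running
yarn application -status application_1428487296152_25597
# ask YARN to kill it
yarn application -kill application_1428487296152_25597
# the reported state should now be KILLED
yarn application -status application_1428487296152_25597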

It can be time-consuming to collect all the application IDs from YARN and kill them one by one. A Bash for loop makes this repetitive task quick, as shown below:

Kill all applications on YARN that are in the ACCEPTED state:

for x in $(yarn application -list -appStates ACCEPTED | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done

Kill all applications on YARN that are in the RUNNING state:

for x in $(yarn application -list -appStates RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done
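
The awk 'NR > 2' filter skips the two header lines that yarn application -list prints before the data rows, leaving only the application IDs in the first column. Since -appStates accepts a comma-separated list of states, both cases can also be handled in a single pass:

for x in $(yarn application -list -appStates ACCEPTED,RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done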


Alternatively, an application can be killed through the ResourceManager REST API (Cluster Application State API):

https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Application_State_API

PUT http://{rm http address:port}/ws/v1/cluster/apps/{appid}/state

with the request body:

{
  "state":"KILLED"
}
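
A minimal curl sketch of that call, assuming the ResourceManager web UI listens on its default port 8088 and using rm-host as a placeholder hostname (the application ID is the example from above):

# ask the ResourceManager to transition the application to the KILLED state
curl -X PUT \
  -H "Content-Type: application/json" \
  -d '{"state":"KILLED"}' \
  "http://rm-host:8088/ws/v1/cluster/apps/application_1428487296152_25597/state"

On a simple-auth cluster you may need to append ?user.name=<your user> to the URL; a Kerberos-secured cluster additionally requires SPNEGO authentication (e.g. curl --negotiate -u :).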