Spark job execution time

Every SparkContext launches its own instance of the web UI, which is available at

http://[master]:4040

by default (the port can be changed with spark.ui.port).
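A minimal sketch of overriding that port when the SparkContext is created (the app name, local master, and port value 4041 are arbitrary examples, not anything the answer prescribes):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Start a local application with the web UI bound to a non-default port.
val conf = new SparkConf()
  .setAppName("ui-port-example")   // hypothetical app name
  .setMaster("local[*]")           // local mode, just for illustration
  .set("spark.ui.port", "4041")    // override the default 4040

val sc = new SparkContext(conf)
// ... run jobs here; the UI is now served at http://localhost:4041 ...
sc.stop()
```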

It offers pages (tabs) with the following information:

Jobs, Stages, Storage (with RDD size and memory use), Environment, Executors, and SQL

By default, this information is only available while the application is running.

Tip: You can still inspect the web UI after the application has finished by enabling spark.eventLog.enabled; the recorded events can then be replayed through the Spark History Server.
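A minimal sketch of turning event logging on from code (the log directory below is a placeholder; spark.eventLog.dir must point to a location that actually exists):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Persist the UI's event data so it can be replayed after the app finishes.
val conf = new SparkConf()
  .setAppName("event-log-example")                        // hypothetical app name
  .setMaster("local[*]")
  .set("spark.eventLog.enabled", "true")                  // keep UI data after the app ends
  .set("spark.eventLog.dir", "file:///tmp/spark-events")  // placeholder path, must exist

val sc = new SparkContext(conf)
// ... jobs run here; their timings can be inspected later via the History Server ...
sc.stop()
```

The same properties are commonly set in spark-defaults.conf or passed with --conf on spark-submit instead; the effect is the same.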

Sample web UI where you can see the job time, 3.2 hours in this example:

[screenshot of the web UI showing the total duration]