How to check whether the SparkContext has been stopped?

This applies to the Scala/Java API (as of this writing)

Before Spark 1.6, you could not check whether the context was stopped; you could only stop it:

sc.stop()
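
If you are stuck on such a version, a hedged workaround (not part of the original answer) is to probe the context and treat the IllegalStateException that Spark raises when methods are called on a stopped SparkContext as the signal. The helper below is hypothetical and assumes a Spark version that throws in that situation:

import org.apache.spark.SparkContext

import scala.util.Try

// Hypothetical helper, not part of the Spark API: parallelize on a stopped
// SparkContext throws IllegalStateException ("Cannot call methods on a
// stopped SparkContext"), so a failure here signals a stopped context.
def seemsStopped(sc: SparkContext): Boolean =
  Try(sc.parallelize(Seq.empty[Int])).isFailure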

From version 1.6 onward, there is a boolean method that returns true if the context is stopped or in the midst of stopping:

sc.isStopped
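
For example, a minimal usage sketch (a local master is assumed just for illustration):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
// ... run jobs ...
if (!sc.isStopped) {
  sc.stop()
}
println(sc.isStopped) // true once stop() has completed (or is in progress)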

This applies to the PySpark API

Thanks to @zero323's comment:

sc._jsc.sc().isStopped()

This goes through sc._jsc (the JavaSparkContext wrapper) to the underlying Scala SparkContext and calls its isStopped() method.


If you use Spark 1.5, this can be done via the reflection API, since the flag is kept in a private field named stopped:

import java.lang.reflect.Field;
import java.util.concurrent.atomic.AtomicBoolean;

import org.apache.spark.SparkContext;

boolean isStopped(SparkContext sc) throws NoSuchFieldException, IllegalAccessException {
    // SparkContext keeps its state in a private AtomicBoolean field named "stopped"
    Field f = sc.getClass().getDeclaredField("stopped");
    f.setAccessible(true); // bypass the private access modifier
    AtomicBoolean stopped = (AtomicBoolean) f.get(sc);
    return stopped.get();
}
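
Keep in mind that this peeks at a private implementation detail, so it can break between Spark versions; prefer the public isStopped once you are on 1.6 or later.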