Encountering " WARN ProcfsMetricsGetter: Exception when trying to compute pagesize" error when running Spark

I received this same message, running Spark 3.0.1 on Windows 10 with Scala 2.12.10. It's not actually an error, in the sense that it ends your program's execution. It's a warning related to the /proc file system on Linux machines.

If you are also on a Windows machine, the answer may be, to quote Wing Yew Poon @ Apache: "The warning happened because the command "getconf PAGESIZE" was run and it is not a valid command on Windows so an exception was caught." (From the Spark JIRA issue here.)

If your program failed right after this warning was thrown, it failed for some other reason. In my case, Spark was crashing with this message right after the warning:

20/11/13 12:41:51 ERROR MicroBatchExecution: Query [id = 32320bc7-d7ba-49b4-8a56-1166a4f2d6db, runId = d7cc93c2-41ef-4765-aecd-9cd453c25905] terminated with error
org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down

This warning can be hidden by setting spark.executor.processTreeMetrics.enabled to false. To quote Mr. Poon again, "it is a minor bug that you see this warning. But it can be safely ignored."
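If you are using PySpark, one place to set that property is on the session builder. A minimal sketch (the app name is arbitrary):

from pyspark.sql import SparkSession

# Disable the executor's procfs-based process-tree metrics, which
# trigger the ProcfsMetricsGetter warning on non-Linux systems.
spark = (
    SparkSession.builder
    .appName("example")
    .config("spark.executor.processTreeMetrics.enabled", "false")
    .getOrCreate()
)

The same property can also be passed on the command line via --conf with spark-submit.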


The same problem occurred for me because the Python path was not added to the system environment. I added it, and now everything works perfectly.

Adding a PYTHONPATH environment variable with the value:

%SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH%

helped resolve the issue. Just check which py4j version you have in your spark/python/lib folder and substitute it for <version>.
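If you'd rather check programmatically than browse the folder, a small sketch like this (assuming SPARK_HOME is set in your environment) lists the bundled py4j archive, whose filename contains the version to plug into PYTHONPATH:

import glob
import os

# Find the py4j source zip shipped with the Spark installation.
spark_home = os.environ["SPARK_HOME"]
pattern = os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")
for path in glob.glob(pattern):
    print(os.path.basename(path))  # e.g. py4j-<version>-src.zip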