Spark DataFrame: trim a column and convert empty strings to null

You can create a simple function to do it. First, a couple of imports:

import org.apache.spark.sql.functions.{trim, length, when}
import org.apache.spark.sql.Column

and the definition:

// Rows where the trimmed value is blank fall through when() with no
// otherwise branch and therefore evaluate to null.
def emptyToNull(c: Column): Column = when(length(trim(c)) > 0, c)

Finally, a quick test:

import spark.implicits._ // for toDF and $; assumes a SparkSession named spark (already in scope in spark-shell)

val df = Seq(" ", "foo", "", "bar").toDF("value")
df.withColumn("value", emptyToNull($"value")).show()

which should yield the following result:

+-----+
|value|
+-----+
| null|
|  foo|
| null|
|  bar|
+-----+
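
If you need this on more than one column, a small sketch (the helper name emptyToNullAll is my own, not part of Spark) can fold emptyToNull over every string column of a DataFrame:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.types.StringType

// Rewrite every StringType column in place; non-string columns are left untouched.
def emptyToNullAll(df: DataFrame): DataFrame =
  df.schema.fields
    .filter(_.dataType == StringType)
    .foldLeft(df) { (acc, field) =>
      acc.withColumn(field.name, emptyToNull(acc(field.name)))
    }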

If you want to replace empty strings with the string "NULL", you can add an otherwise clause:

def emptyToNullString(c: Column): Column = when(length(trim(c)) > 0, c).otherwise("NULL")
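
Used on the same test DataFrame from above:

// Blank and empty strings now show up as the literal string "NULL" instead of null.
df.withColumn("value", emptyToNullString($"value")).show()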