How to add days (taken from the values of another column) to a date column?

You can use expr:

import org.apache.spark.sql.functions.expr

data.withColumn("future", expr("date_add(date, id)")).show
// +---+----------+----------+
// | id|      date|    future|
// +---+----------+----------+
// |  0| 2016-01-1|2016-01-01|
// |  1| 2016-02-2|2016-02-03|
// |  2|2016-03-22|2016-03-24|
// |  3|2016-04-25|2016-04-28|
// |  4|2016-05-21|2016-05-25|
// |  5| 2016-06-1|2016-06-06|
// |  6|2016-03-21|2016-03-27|
// +---+----------+----------+
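For reference, a `data` frame matching the output above can be built like this (the column names and values are inferred from the output; the original question's construction isn't shown):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._

// id doubles as the number of days to add; the dates are plain strings,
// which date_add casts to DateType automatically
val data = Seq(
  (0, "2016-01-1"),
  (1, "2016-02-2"),
  (2, "2016-03-22"),
  (3, "2016-04-25"),
  (4, "2016-05-21"),
  (5, "2016-06-1"),
  (6, "2016-03-21")
).toDF("id", "date")
```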

selectExpr can be used in a similar way:

data.selectExpr("*", "date_add(date, id) as future").show

The other answers work, but they aren't a drop-in replacement for the existing date_add function. I had a case where expr wouldn't work for me, so here is a drop-in replacement:

import org.apache.spark.sql.Column
import org.apache.spark.sql.catalyst.expressions.DateAdd

// Wrap the Catalyst DateAdd expression directly so that both
// arguments can be Columns rather than requiring a literal day count
def date_add(date: Column, days: Column): Column = {
    new Column(DateAdd(date.expr, days.expr))
}

Basically, all the machinery is already there in Spark: the underlying Catalyst DateAdd expression accepts arbitrary child expressions; it's only the public date_add(start: Column, days: Int) signature that forces the day count to be a literal.
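With that helper in scope, the call site looks just like the expr version (assuming the same `data` frame as above):

```scala
import org.apache.spark.sql.functions.col

// days comes from the id column instead of an Int literal
data.withColumn("future", date_add(col("date"), col("id"))).show
```

Also worth checking: if you're on Spark 3.0 or later, I believe org.apache.spark.sql.functions gained a date_add(start: Column, days: Column) overload, in which case this helper is no longer needed.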