Scala add new column to dataframe by expression

In Spark 2.x, you can create a new column C5 with the expression "C2/C3 + C4" using withColumn() and the functions in org.apache.spark.sql.functions._:

 import spark.implicits._                  // for .toDF
 import org.apache.spark.sql.functions.col // for col()

 val currentDf = Seq(
   ("steak", 1, 1, 150),
   ("steak", 2, 2, 180),
   ("fish", 3, 3, 100)
 ).toDF("C1", "C2", "C3", "C4")

 val requiredDf = currentDf
   .withColumn("C5", col("C2") / col("C3") + col("C4"))

You can also do the same using org.apache.spark.sql.Column. (The space overhead is slightly higher in this approach than with org.apache.spark.sql.functions._, because of the extra Column objects created.)

 import org.apache.spark.sql.Column

 val requiredDf = currentDf
   .withColumn("C5", new Column("C2") / new Column("C3") + new Column("C4"))

This worked perfectly for me. I am using Spark 2.0.2.
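As a sanity check, the same arithmetic can be run on plain Scala values (no Spark needed) to see what C5 evaluates to for the rows above. Note that Spark's / on integer columns produces a Double, which the sketch mirrors with toDouble:

```scala
// Plain-Scala sketch of C5 = C2/C3 + C4 for the three rows above.
// Spark's `/` on integer columns yields a Double, so divide as Double here.
val rows = Seq((1, 1, 150), (2, 2, 180), (3, 3, 100)) // (C2, C3, C4)
val c5 = rows.map { case (c2, c3, c4) => c2.toDouble / c3 + c4 }
println(c5) // List(151.0, 181.0, 101.0)
```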


Two approaches:

    import spark.implicits._ // so that you can use .toDF
    val df = Seq(
      ("steak", 1, 1, 150),
      ("steak", 2, 2, 180),
      ("fish", 3, 3, 100)
    ).toDF("C1", "C2", "C3", "C4")

    import org.apache.spark.sql.functions._

    // 1st approach using expr
    df.withColumn("C5", expr("C2/(C3 + C4)")).show()

    // 2nd approach using selectExpr
    df.selectExpr("*", "(C2/(C3 + C4)) as C5").show()

+-----+---+---+---+--------------------+
|   C1| C2| C3| C4|                  C5|
+-----+---+---+---+--------------------+
|steak|  1|  1|150|0.006622516556291391|
|steak|  2|  2|180| 0.01098901098901099|
| fish|  3|  3|100| 0.02912621359223301|
+-----+---+---+---+--------------------+

This can be done using expr to create a Column from an expression string:

import org.apache.spark.sql.functions.expr

val df = Seq((1, 2)).toDF("x", "y")

val myExpression = "x+y"

df.withColumn("z", expr(myExpression)).show()

+---+---+---+
|  x|  y|  z|
+---+---+---+
|  1|  2|  3|
+---+---+---+
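Because expr takes a plain String, the expression can also be assembled at runtime, for example from column names. A minimal sketch of just the string-building step (plain Scala; the column names are the x/y example above, and the resulting string is what you would pass to expr or selectExpr):

```scala
// Assemble an arithmetic expression string from column names at runtime;
// the result can then be passed to expr(...) or selectExpr(...).
val columns = Seq("x", "y")
val sumExpression = columns.mkString(" + ") // "x + y"
println(sumExpression)
```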