How to explode an array into multiple columns in Spark

You can use foldLeft to add one column per element of DataArray.

First, make a list of the column names you want to add:

val columns = List("col1", "col2", "col3")

import org.apache.spark.sql.functions.col

columns.zipWithIndex.foldLeft(df) { (memoDF, column) =>
  // column is a (name, index) pair: add one new column per array element
  memoDF.withColumn(column._1, col("DataArray")(column._2))
}.drop("DataArray")
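To make this concrete, here is a minimal, self-contained sketch. The sample schema (an id column plus a 3-element DataArray) and the SparkSession named spark are assumptions for illustration, not taken from the question:

// Minimal sketch; assumes a SparkSession in scope named `spark`
// and a made-up schema of (id, DataArray) for illustration.
import org.apache.spark.sql.functions.col
import spark.implicits._

val df = Seq(
  (1, Seq(10, 20, 30)),
  (2, Seq(40, 50, 60))
).toDF("id", "DataArray")

val columns = List("col1", "col2", "col3")

val result = columns.zipWithIndex.foldLeft(df) { (memoDF, column) =>
  memoDF.withColumn(column._1, col("DataArray")(column._2))
}.drop("DataArray")

result.show()
// +---+----+----+----+
// | id|col1|col2|col3|
// +---+----+----+----+
// |  1|  10|  20|  30|
// |  2|  40|  50|  60|
// +---+----+----+----+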

Hope this helps!


Alternatively, use apply to index the array column directly in a select:

import org.apache.spark.sql.functions.col

df.select(
  // keep the id column and add one aliased column per array index
  col("id") +: (0 until 3).map(i => col("DataArray")(i).alias(s"col$i")): _*
)
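
If the array length is not known up front, one option (an assumption on my part, not part of the original answer, and it presumes every row's DataArray has the same length) is to read the array size from the first row and build the index range from that:

// Sketch only: derive the index range from the data instead of hardcoding 3.
import org.apache.spark.sql.functions.{col, size}

val numCols = df.select(size(col("DataArray"))).first().getInt(0)

df.select(
  col("id") +: (0 until numCols).map(i => col("DataArray")(i).alias(s"col$i")): _*
)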