Mapping Spark Dataset row values into a new hash column

One way is to use the withColumn function:

import org.apache.spark.sql.functions.{col, hash}

// Hash every column of each row into a single Int column named "hash".
dataset.withColumn("hash", hash(dataset.columns.map(col): _*))
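For completeness, here's a self-contained sketch of the same approach; the SparkSession setup and the sample data are made up for illustration:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, hash}

val spark = SparkSession.builder().master("local[*]").appName("hash-column").getOrCreate()
import spark.implicits._

// Sample data, invented for this example; any DataFrame works the same way.
val dataset = Seq(("alice", 1), ("bob", 2)).toDF("name", "score")

// hash takes a varargs of Columns, so expand every column of the dataset.
val hashed = dataset.withColumn("hash", hash(dataset.columns.map(col): _*))

hashed.printSchema() // the new "hash" column is an IntegerType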

It turns out that Spark already implements this as the hash function in the org.apache.spark.sql.functions package:

/**
 * Calculates the hash code of given columns, and returns the result as an int column.
 *
 * @group misc_funcs
 * @since 2.0.0
 */
@scala.annotation.varargs
def hash(cols: Column*): Column = withExpr {
  new Murmur3Hash(cols.map(_.expr))
}
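Under the hood this is Murmur3, a fast non-cryptographic 32-bit hash, so collisions are possible on large datasets. If you're on Spark 3.0 or later, xxhash64 follows the same pattern and returns a 64-bit hash instead:

import org.apache.spark.sql.functions.{col, xxhash64}

// xxhash64 (Spark 3.0+) returns a Long instead of an Int, lowering the
// collision probability when hashing many rows.
val hashed64 = dataset.withColumn("hash64", xxhash64(dataset.columns.map(col): _*))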

And in my case, applied as:

import org.apache.spark.sql.functions.{col, hash}

val newDs = typedRows.withColumn("hash", hash(typedRows.columns.map(col): _*))
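Here's a sketch of one way the hash column can then be used, assuming a hypothetical earlier snapshot previousDs with the same schema (including a "hash" column computed the same way) and a stable key column "id": keep only the rows whose contents changed between snapshots.

import org.apache.spark.sql.functions.col

// previousDs and the "id" key column are assumptions for this sketch.
val changed = newDs.as("cur")
  .join(previousDs.as("prev"), col("cur.id") === col("prev.id"))
  .filter(col("cur.hash") =!= col("prev.hash"))
  .select(col("cur.*"))

A matching hash is a strong hint of equality rather than proof, since 32-bit hashes can collide.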

I truly have a lot to learn about Spark SQL :(

Leaving this here in case someone else needs it. Thanks!