Solution 1:

Actually, you don't need a window function here: you can just sort the data in descending order and return the first record using limit(1).
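
For example, a minimal sketch of that simpler approach (assuming, as in the window version below, that the timestamp column is insert_dttm and the session id is the first column of df):

import org.apache.spark.sql.functions.col

// sort by insert_dttm descending and keep only the newest row
val lastSessionId = df.orderBy(col("insert_dttm").desc).limit(1).first.getString(0)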

But for the sake of practice, here is how you can do it with a window function:

import org.apache.spark.sql.functions.{col, row_number}
import org.apache.spark.sql.expressions.Window


// order all rows by insert_dttm, newest first
val windowSpec = Window.orderBy(col("insert_dttm").desc)
// keep the top-ranked row and read the session id from its first column
val lastSessionId = df.withColumn("row_number", row_number().over(windowSpec))
  .filter("row_number = 1")
  .first
  .getString(0)