Hi,
I have the following scenario:

scala> val df = spark.sql("select * from danieltest3")
df: org.apache.spark.sql.DataFrame = [iid: string, activity: string ... 34 more fields]

Now I'm trying to map over the rows I'm getting back:
scala> df.map(r=>r.toSeq)
<console>:32: error: Unable to find encoder for type stored in a Dataset.
Primitive types (Int, String, etc) and Product types (case classes) are
supported by importing spark.implicits._  Support for serializing other
types will be added in future releases.
       df.map(r=>r.toSeq)


What am I missing here?
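
(For context, the cause appears to be that Row.toSeq returns Seq[Any], and the encoders brought in by spark.implicits._ cover primitives and case classes but not Seq[Any], hence the error. One workaround sketch, assuming it's acceptable to drop to the RDD API, where no encoder is needed:

scala> val seqs = df.rdd.map(r => r.toSeq)

Or, to stay in the Dataset API, pull out typed columns explicitly so an implicit encoder exists, e.g. for the iid column:

scala> import spark.implicits._
scala> val iids = df.map(r => r.getAs[String]("iid"))
)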

Thank you,
Daniel
