You need to supply a RowEncoder, built from the schema of the rows you expect, as the implicit Encoder[Row] at the call site.
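A minimal sketch of what that looks like (assuming Spark 2.x; the "format"/"path" keys and the User schema are made-up stand-ins for whatever your read() actually does with its params):

```scala
import org.apache.spark.sql.{Dataset, Encoder, Row, SparkSession}
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types.StructType

case class User(id: Long, name: String)

object ReadExample {
  val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .appName("row-encoder-sketch")
    .getOrCreate()

  // Same signature as in the question. The params keys ("format", "path")
  // are assumptions about how read() builds its DataFrame.
  def read[T](params: Map[String, Any])(implicit encoder: Encoder[T]): Dataset[T] =
    spark.read
      .format(params("format").toString)
      .load(params("path").toString)
      .as[T]

  def main(args: Array[String]): Unit = {
    // Custom-type path: spark.implicits._ derives Encoder[User] for the case class.
    import spark.implicits._
    val users: Dataset[User] =
      read[User](Map("format" -> "parquet", "path" -> "/data/users"))

    // Row path: no implicit Encoder[Row] exists, so build one explicitly
    // from the expected schema with RowEncoder and put it in implicit scope.
    val schema = new StructType().add("id", "long").add("name", "string")
    implicit val rowEncoder: Encoder[Row] = RowEncoder(schema)
    val rows: Dataset[Row] =
      read[Row](Map("format" -> "parquet", "path" -> "/data/users"))
  }
}
```

With the implicit RowEncoder in scope, the same generic read serves both case classes and plain Rows without any change to its signature.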

Regards,
Ramandeep Singh

On Thu, May 9, 2019, 11:33 SNEHASISH DUTTA <info.snehas...@gmail.com> wrote:

> Hi,
>
> I am trying to write a generic method that returns Datasets of custom
> types as well as of spark.sql.Row:
>
> def read[T](params: Map[String, Any])(implicit encoder: Encoder[T]): 
> Dataset[T]
>
> is my method signature. It works fine for custom types, but when I try
> to obtain a Dataset[Row] it fails with the following message:
>
> " Unable to find encoder for type org.apache.spark.sql.Row. An implicit
> Encoder[org.apache.spark.sql.Row] is needed to store
> org.apache.spark.sql.Row instances in a Dataset. Primitive types (Int,
> String, etc) and Product types (case classes) are supported by importing
> spark.implicits._ "
>
> Is it possible to make some changes so that it can handle both custom
> types and the Row type?
>
> Regards,
> Snehasish
>
