Re: Spark SQL - Encoders - case class

2016-06-06 Thread Dave Maughan
Hi,

Thanks for the quick replies. I've tried those suggestions but Eclipse is
showing:

Unable to find encoder for type stored in a Dataset. Primitive
types (Int, String, etc) and Product types (case classes) are supported by
importing sqlContext.implicits._ Support for serializing other types will
be added in future.
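
If the sqlContext.implicits._ import is already in place and the error
persists, a frequently reported cause is the case class being declared
inside a method or another class rather than at the top level; Spark
derives encoders via a TypeTag, which it cannot obtain for a local class.
A minimal sketch of the layout that works, with names assumed from the
thread rather than confirmed here:

// Top-level definition: an encoder can be derived for it.
case class Table1(fooBar: String)

object Job {
  def run(sql: org.apache.spark.sql.SQLContext): Unit = {
    import sql.implicits._ // Encoder[Table1] now resolves
    sql.sql("select foo_bar as `fooBar` from table1").as[Table1].show()
    // A case class declared here, inside run(), would typically
    // reproduce the "Unable to find encoder" error instead.
  }
}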


Thanks

- Dave


Re: Spark SQL - Encoders - case class

2016-06-06 Thread Han JU
Hi,

I think encoders for case classes are already provided in Spark. You'll
just need to import them.

val sql = new SQLContext(sc) // sc is your existing SparkContext
import sql.implicits._

And then cast the DataFrame to a Dataset with .as[Table1].
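
For completeness, a minimal end-to-end sketch (assuming a Spark 1.6-era
setup, a local master, and the Table1/fooBar names from the original
question; the table itself must already be visible to the context, e.g.
through Hive as in the original post):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// The case class must be defined at the top level, not inside a method,
// so that Spark can derive its encoder.
case class Table1(fooBar: String)

object EncoderExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("encoder-example").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sql = new SQLContext(sc)

    // Brings the implicit Encoder[Table1] into scope.
    import sql.implicits._

    // The DataFrame -> Dataset cast now resolves the encoder implicitly.
    val ds = sql.sql("select foo_bar as `fooBar` from table1").as[Table1]
    ds.show()

    sc.stop()
  }
}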

2016-06-06 14:13 GMT+02:00 Dave Maughan :

> Hi,
>
> I've figured out how to select data from a remote Hive instance and encode
> the DataFrame -> Dataset using a Java POJO class:
>
> TestHive.sql("select foo_bar as `fooBar` from table1")
>   .as(Encoders.bean(classOf[Table1]))
>   .show()
>
> However, I'm struggling to find out how to do the equivalent in Scala if
> Table1 is a case class. Could someone please point me in the right
> direction?
>
> Thanks
> - Dave
>



-- 
*JU Han*

Software Engineer @ Teads.tv

+33 061960


Spark SQL - Encoders - case class

2016-06-06 Thread Dave Maughan
Hi,

I've figured out how to select data from a remote Hive instance and encode
the DataFrame -> Dataset using a Java POJO class:

TestHive.sql("select foo_bar as `fooBar` from table1")
  .as(Encoders.bean(classOf[Table1]))
  .show()

However, I'm struggling to find out how to do the equivalent in Scala if
Table1 is a case class. Could someone please point me in the right direction?
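
For reference, the direct Scala analogue of the Encoders.bean call above
is the explicit product encoder; a sketch against the Spark 1.6-era API
(the single fooBar field on Table1 is an assumption):

import org.apache.spark.sql.Encoders
import org.apache.spark.sql.hive.test.TestHive

case class Table1(fooBar: String) // top-level, so the encoder can be derived

// Encoders.product is the case-class counterpart of Encoders.bean.
TestHive.sql("select foo_bar as `fooBar` from table1")
  .as(Encoders.product[Table1])
  .show()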

Thanks
- Dave