Hi Ted,

Sure! It works with map but not with select. I wonder whether that's by
design or something that will be fixed soon? Thanks again for your help.
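For the archives, here's a minimal plain-Scala sketch (no Spark required) of why `ds.select(_.id)` dies with "missing parameter type for expanded function": the compiler can only infer the parameter type of an anonymous function like `_.id` when there is a single expected function type at the call site. The `Demo.select` overloads below are hypothetical stand-ins, not Spark's API (Spark's untyped `select` takes Column arguments, not a Scala function, so there is no expected function type at all):

```scala
final case class Text(id: Int, text: String)

object Demo {
  val data = Seq(Text(0, "hello"), Text(1, "world"))

  // Two overloads, so no single expected function type for `_.id`
  // (hypothetical stand-ins for illustration only):
  def select(f: Text => Int): Seq[Int]   = data.map(f)
  def select(name: String): Seq[String]  = data.map(_ => name)
}

// Demo.select(_.id)                       // won't compile: missing parameter type
val ids = Demo.select((t: Text) => t.id)   // annotating the parameter resolves it
```

In Spark 2.0 itself, the typed routes should be `ds.map(_.id)` or, if I read the Dataset API right, `ds.select($"id".as[Int])`, which goes through the `TypedColumn` overload of `select`. As for `Seq(...).as[Text]`: `as` is a method on Dataset, not on Seq, so you need `toDS`/`toDF` (via `spark.implicits._`) first.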

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Mar 31, 2016 at 10:57 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> I tried this:
>
> scala> final case class Text(id: Int, text: String)
> warning: there was one unchecked warning; re-run with -unchecked for details
> defined class Text
>
> scala> val ds = Seq(Text(0, "hello"), Text(1, "world")).toDF.as[Text]
> ds: org.apache.spark.sql.Dataset[Text] = [id: int, text: string]
>
> scala> ds.map(t => t.id).show
> +-----+
> |value|
> +-----+
> |    0|
> |    1|
> +-----+
>
> On Thu, Mar 31, 2016 at 5:02 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>>
>> Hi,
>>
>> I can't seem to select a single field from a Dataset of case classes (or
>> tuples):
>>
>> scala> final case class Text(id: Int, text: String)
>> warning: there was one unchecked warning; re-run with -unchecked for
>> details
>> defined class Text
>>
>> scala> val ds = Seq(Text(0, "hello"), Text(1, "world")).toDF.as[Text]
>> ds: org.apache.spark.sql.Dataset[Text] = [id: int, text: string]
>>
>> // selecting a field as a Symbol works fine
>> scala> ds.select('id).show
>> +---+
>> | id|
>> +---+
>> |  0|
>> |  1|
>> +---+
>>
>> // but selecting via a Scala field accessor does not
>> scala> ds.select(_.id).show
>> <console>:40: error: missing parameter type for expanded function
>> ((x$1) => x$1.id)
>>        ds.select(_.id).show
>>                  ^
>>
>> Is this supposed to work in Spark 2.0 (today's build)?
>>
>> BTW, Why is Seq(Text(0, "hello"), Text(1, "world")).as[Text] not possible?
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
