I'll note that the DSL is pretty experimental. That said, you should be
able to do something like "user.id".attr.
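For context, `.attr` works because the DSL enriches String with an implicit conversion that wraps the qualified column name in an attribute-reference node. The following is a minimal self-contained sketch of that pattern, not Spark's actual classes: `UnresolvedAttribute` and `DslString` here are illustrative stand-ins.

```scala
// Stand-alone sketch of the enrichment pattern behind "user.id".attr.
// NOT Spark's real implementation -- just the same idea: an implicit
// class gives String an .attr method that turns a qualified column
// name into an attribute-reference value the DSL can consume.
case class UnresolvedAttribute(name: String)

implicit class DslString(s: String) {
  // "user.id".attr  =>  UnresolvedAttribute("user.id")
  def attr: UnresolvedAttribute = UnresolvedAttribute(s)
}

val userId = "user.id".attr
println(userId.name)
```

The qualified form ("user.id" rather than bare "id") matters when two tables in a join both have an `id` column, which is the situation in the session quoted below.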
On Mon, Sep 29, 2014 at 3:39 PM, Benyi Wang wrote:
> scala> user
> res19: org.apache.spark.sql.SchemaRDD =
> SchemaRDD[0] at RDD at SchemaRDD.scala:98
> == Query Plan ==
> ParquetTableScan [id#0,name#1], (ParquetRelation
> /user/hive/warehouse/user), None
>
> scala> order
> res20: org.apache.spark.sql.SchemaRDD =
> SchemaRDD[72] at RDD at SchemaRDD.scala:98
> == Query Plan ==