Hi Will,

We're also very interested in windowing support in SparkSQL. Let us know once this is available for testing. Thanks.
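For reference, below is a minimal, untested sketch of the kind of queries we have in mind, assuming a hypothetical "sales" table with columns region, product, and amount. Both statements are standard HiveQL (a window function and GROUP BY ... WITH ROLLUP); as Yi notes below, they don't yet run on SparkSQL:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object WindowRollupSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("WindowRollupSketch"))
    val hc = new HiveContext(sc)

    // Window function: rank each product by amount within its region.
    // (Valid HiveQL, but not yet supported by SparkSQL's parser.)
    hc.sql(
      """SELECT region, product, amount,
        |       rank() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        |FROM sales""".stripMargin)

    // Rollup: per-(region, product) totals, per-region subtotals, and a
    // grand total in one pass. (Also valid HiveQL, not yet in SparkSQL.)
    hc.sql(
      """SELECT region, product, SUM(amount) AS total
        |FROM sales
        |GROUP BY region, product WITH ROLLUP""".stripMargin)

    sc.stop()
  }
}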
Sincerely,

DB Tsai
-------------------------------------------------------
My Blog: https://www.dbtsai.com
LinkedIn: https://www.linkedin.com/in/dbtsai


On Tue, Sep 23, 2014 at 8:39 AM, Will Benton <wi...@redhat.com> wrote:
> Hi Yi,
>
> I've had some interest in implementing windowing and rollup in particular for
> some of my applications, but they haven't made it to the front of my plate
> yet. If you need them as well, I'm happy to start taking a look this week.
>
>
> best,
> wb
>
>
> ----- Original Message -----
>> From: "Yi Tian" <tianyi.asiai...@gmail.com>
>> To: dev@spark.apache.org
>> Sent: Tuesday, September 23, 2014 2:47:17 AM
>> Subject: Question about SparkSQL and Hive-on-Spark
>>
>> Hi all,
>>
>> I have some questions about SparkSQL and Hive-on-Spark.
>>
>> Will SparkSQL support all the Hive features in the future, or will it just
>> use Hive as a data source for Spark?
>>
>> Since Spark 1.1.0, we have had Thrift server support for running HQL on
>> Spark. Will this feature be replaced by Hive on Spark?
>>
>> The reason for asking these questions is that we found some Hive functions
>> (like the window, cube, and rollup functions) do not run well on SparkSQL.
>>
>> Is it worth making the effort to implement these functions in SparkSQL?
>> Could you give us some advice?
>>
>> Thank you.
>>
>>
>> Best Regards,
>>
>> Yi Tian
>> tianyi.asiai...@gmail.com