My approach may be partly influenced by my limited experience with SQL and
Hive, but I just converted all my dates to seconds-since-epoch and then
selected samples from specific time ranges using integer comparisons.
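
Roughly, assuming the data is already registered as a table named "stocks" and
the converted column is an integer named "epoch" (both names are just
illustrative here), the selection looks something like:

import time

def to_epoch(s):
    # 'YYYY-MM-DD' -> seconds since the Unix epoch (local time)
    return int(time.mktime(time.strptime(s, "%Y-%m-%d")))

# all of January 2014, using plain integer comparisons
jan = sqlContext.sql(
    "SELECT * FROM stocks WHERE epoch >= %d AND epoch < %d"
    % (to_epoch("2014-01-01"), to_epoch("2014-02-01")))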


On Thu, Sep 4, 2014 at 6:38 PM, Cheng, Hao <hao.ch...@intel.com> wrote:

>  There are two SQL dialects: one is very basic SQL support and the other is
> Hive QL. In most cases I think people prefer HQL, which also means you have
> to use HiveContext instead of SQLContext.
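>
> For example, a minimal sketch in PySpark (this assumes a Spark build with
> Hive support; sc is the existing SparkContext, and depending on the Spark
> version the call is hiveCtx.sql(...) or hiveCtx.hql(...)):
>
> from pyspark.sql import HiveContext
>
> hiveCtx = HiveContext(sc)   # gives you the HQL dialect
> rows = hiveCtx.sql("SELECT * FROM Stocks LIMIT 10")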
>
>
>
> In this particular query you showed, it seems datetime is of type Date;
> unfortunately, neither of those SQL dialects supports Date, only Timestamp.
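>
> So one option is to store (or cast) the column as a timestamp and compare it
> against timestamp literals; a sketch only, reusing the table and column names
> from your mail:
>
> jan1 = hiveCtx.sql(
>     "SELECT * FROM Stocks "
>     "WHERE datetime >= CAST('2014-01-01 00:00:00' AS TIMESTAMP) "
>     "AND datetime < CAST('2014-02-01 00:00:00' AS TIMESTAMP)")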
>
>
>
> Cheng Hao
>
>
>
> *From:* Benjamin Zaitlen [mailto:quasi...@gmail.com]
> *Sent:* Friday, September 05, 2014 5:37 AM
> *To:* user@spark.apache.org
> *Subject:* TimeStamp selection with SparkSQL
>
>
>
> I may have missed this, but is it possible to select on a datetime in a
> SparkSQL query?
>
>
>
> jan1 = sqlContext.sql("SELECT * FROM Stocks WHERE datetime = '2014-01-01'")
>
>
>
> Additionally, is there a guide as to what SQL is valid? The guide says,
> "Note that Spark SQL currently uses a very basic SQL parser." It would be
> great to post what is currently supported.
>
>
>
> --Ben
>
>
>
>
>
