For PostgreSQL:

postgres=# SELECT date_part('year',TIMESTAMP '2017-01-01');
 date_part
-----------
      2017
(1 row)

postgres=# SELECT date_part('year',TIMESTAMP '2017');
ERROR:  invalid input syntax for type timestamp: "2017"
LINE 1: SELECT date_part('year',TIMESTAMP '2017');
                                          ^
postgres=# SELECT date_part('month',TIMESTAMP '2017-01-01');
 date_part
-----------
         1
(1 row)

postgres=# SELECT date_part('year',TIMESTAMP '2017-1-1');
 date_part
-----------
      2017
(1 row)
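
For comparison, the current Spark behavior (as reported in the thread below) can be reproduced from spark-shell; a minimal sketch, assuming the default SparkSession named spark:

scala> // Spark currently accepts the bare year string and defaults the
scala> // missing fields, so this shows 1912, 1 and 0 respectively
scala> spark.sql("SELECT year('1912'), month('1912'), hour('1912')").show()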
We'd better follow the Hive semantics here. Removing support for yyyy and yyyy-[m]m would also simplify the parsing routine.

I'll create a Pull Request later.
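
For illustration, a minimal sketch of the stricter check in the spirit of stringToDate (my own sketch of the idea, not the actual DateTimeUtils patch): only a full yyyy-[m]m-[d]d string parses, while bare yyyy or yyyy-[m]m returns None, i.e. NULL, matching Hive and Presto.

import java.time.LocalDate
import scala.util.Try

// Sketch only -- not the real DateTimeUtils routine. Accept a full
// yyyy-[m]m-[d]d date; reject bare "yyyy" and "yyyy-[m]m".
def stringToDateStrict(s: String): Option[LocalDate] = {
  val FullDate = """(\d{4})-(\d{1,2})-(\d{1,2})""".r
  s.trim match {
    case FullDate(y, m, d) =>
      // Guard against out-of-range fields such as 2017-13-1
      Try(LocalDate.of(y.toInt, m.toInt, d.toInt)).toOption
    case _ => None // "1912" and "2017-1" no longer parse => NULL
  }
}

stringToDateStrict("2017-1-1") // Some(2017-01-01)
stringToDateStrict("1912")     // None

With this shape, year("1912") would return NULL instead of 1912.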




---- On Sat, 16 Feb 2019 00:51:43 +0800 Xiao Li <gatorsm...@gmail.com> wrote ----

We normally do not follow MySQL. Check the commercial databases (like Oracle), or the open-source PostgreSQL?
Sean Owen <mailto:sro...@gmail.com> wrote on Fri, Feb 15, 2019 at 5:34 AM:

year("1912") == 1912 makes sense; month("1912") == 1 is odd but not wrong. On the one hand, some answer might be better than none. But then, we are trying to match Hive semantics where the SQL standard is silent. Is this actually defined behavior in a SQL standard, or what does MySQL do?
On Fri, Feb 15, 2019 at 2:07 AM Darcy Shen <mailto:sad...@zoho.com.invalid> wrote:
>
> See https://issues.apache.org/jira/browse/SPARK-26885 and
> https://github.com/apache/spark/blob/71170e74df5c7ec657f61154212d1dc2ba7d0613/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
>
> stringToTimestamp and stringToDate support yyyy, and as a result:
>
> select year("1912")  => 1912
> select month("1912") => 1
> select hour("1912")  => 0
>
> In Presto or Hive:
>
> select year("1912")  => null
> select month("1912") => null
> select hour("1912")  => null
>
> It is not a good idea to support yyyy for a Date/DateTime, nor yyyy-[m]m.
>
> What's your opinion?
