[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-15 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314600209
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,22 +187,21 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
---date_part( 'usec', d1) AS usec
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'isoyear', d1) AS isoyear, date_part( 'week', d1) AS week,
---date_part( 'dow', d1) AS dow
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
 
 Review comment:
   @MaxGekk, the master branch seems to be the same. Could you push your change?
   ```
   spark-sql> select 1 as year;
   1
   ```





[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-15 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314599685
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,22 +187,21 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
---date_part( 'usec', d1) AS usec
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'isoyear', d1) AS isoyear, date_part( 'week', d1) AS week,
---date_part( 'dow', d1) AS dow
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
 
 Review comment:
   Spark 2.4.3 shows the following, and PostgreSQL also allows it. Let me check the logs.
   ```
   spark-sql> select 1 as year;
   1
   ```
   Also, cc @maropu.





[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-15 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314599253
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,22 +187,21 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
---date_part( 'usec', d1) AS usec
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'isoyear', d1) AS isoyear, date_part( 'week', d1) AS week,
---date_part( 'dow', d1) AS dow
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
 
 Review comment:
   Oh, that sounds like a regression we introduced recently.





[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-15 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314593244
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,22 +187,21 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
---date_part( 'usec', d1) AS usec
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
-
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'isoyear', d1) AS isoyear, date_part( 'week', d1) AS week,
---date_part( 'dow', d1) AS dow
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
 
 Review comment:
   Oops. What I meant was backquoting only for special values like `54`.
   For other values like `year` and `month`, we don't need backquoting.
   
   So:
   1. For the places where PostgreSQL uses `AS ".."`, we can use backquoting.
   2. For the places where PostgreSQL uses `AS xxx`, we don't need backquoting either.
   
   Both cases are sketched below.
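   A minimal sketch of the two cases, reusing `d1` and `TIMESTAMP_TBL` from this test file (the column lists are illustrative, not the final test queries):
   ```
   -- Case 1: PostgreSQL quotes the alias (AS "54", AS "timestamp"),
   -- so keep the quoting, but with backquotes.
   SELECT '' AS `54`, d1 AS `timestamp` FROM TIMESTAMP_TBL;

   -- Case 2: PostgreSQL uses a plain alias (AS year, AS month),
   -- so Spark SQL does not need any quoting either.
   SELECT date_part('year', d1) AS year, date_part('month', d1) AS month
   FROM TIMESTAMP_TBL;
   ```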





[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-14 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314085860
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,12 +187,11 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
+date_part( 'year', d1) AS `year`, date_part( 'month', d1) AS `month`,
+date_part( 'day', d1) AS `day`, date_part( 'hour', d1) AS `hour`,
+date_part( 'minute', d1) AS `minute`, date_part( 'second', d1) AS `second`
+FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
 
 -- SELECT '' AS "54", d1 as "timestamp",
 --date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
 
 Review comment:
   For these lines, you need to update the SQL slightly by using backquotes instead of `"` for the `AS` aliases; see the sketch below.
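   A minimal sketch of that rewrite, applied to the next commented-out block (whether Spark's `date_part` accepts every field used there, e.g. `msec`/`usec`, is exactly what enabling the test will verify):
   ```
   -- PostgreSQL form (double-quoted aliases):
   --   SELECT '' AS "54", d1 as "timestamp", ...
   -- Spark SQL form: only the double-quoted aliases change, from "..." to `...`.
   SELECT '' AS `54`, d1 AS `timestamp`,
      date_part('quarter', d1) AS quarter, date_part('msec', d1) AS msec,
      date_part('usec', d1) AS usec
   FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
   ```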





[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] Add `date_part` function for timestamps/dates

2019-08-14 Thread GitBox
dongjoon-hyun commented on a change in pull request #25410: [SPARK-28690][SQL] 
Add `date_part` function for timestamps/dates
URL: https://github.com/apache/spark/pull/25410#discussion_r314085423
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/timestamp.sql
 ##
 @@ -187,12 +187,11 @@ SELECT '' AS date_trunc_week, date_trunc( 'week', timestamp '2004-02-29 15:44:17
 --   WHERE d1 BETWEEN timestamp '1902-01-01'
 --AND timestamp '2038-01-01';
 
--- [SPARK-28420] Date/Time Functions: date_part
--- SELECT '' AS "54", d1 as "timestamp",
---date_part( 'year', d1) AS year, date_part( 'month', d1) AS month,
---date_part( 'day', d1) AS day, date_part( 'hour', d1) AS hour,
---date_part( 'minute', d1) AS minute, date_part( 'second', d1) AS second
---FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
+SELECT '' AS `54`, d1 as `timestamp`,
+date_part( 'year', d1) AS `year`, date_part( 'month', d1) AS `month`,
+date_part( 'day', d1) AS `day`, date_part( 'hour', d1) AS `hour`,
+date_part( 'minute', d1) AS `minute`, date_part( 'second', d1) AS `second`
+FROM TIMESTAMP_TBL WHERE d1 BETWEEN '1902-01-01' AND '2038-01-01';
 
 -- SELECT '' AS "54", d1 as "timestamp",
 --date_part( 'quarter', d1) AS quarter, date_part( 'msec', d1) AS msec,
 
 Review comment:
   We need to enable lines 196 ~ 204 in this PR, @MaxGekk.

