[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-18 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r175339309
  
--- Diff: sql/core/src/test/resources/sql-tests/results/interval.sql.out ---
@@ -0,0 +1,375 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 31
+
+
+-- !query 0
+select
+  '1' second,
+  2  seconds,
+  '1' minute,
+  2  minutes,
+  '1' hour,
+  2  hours,
+  '1' day,
+  2  days,
+  '1' month,
+  2  months,
+  '1' year,
+  2  years
+-- !query 0 schema
+struct
+-- !query 0 output
+interval 1 seconds	interval 2 seconds	interval 1 minutes	interval 2 minutes	interval 1 hours	interval 2 hours	interval 1 days	interval 2 days	interval 1 months	interval 2 months	interval 1 years	interval 2 years
+
+
+-- !query 1
+select
+  interval '10-11' year to month,
+  interval '10' year,
+  interval '11' month
+-- !query 1 schema
+struct
+-- !query 1 output
+interval 10 years 11 months	interval 10 years	interval 11 months
+
+
+-- !query 2
+select
+  '10-11' year to month,
+  '10' year,
+  '11' month
+-- !query 2 schema
+struct
+-- !query 2 output
+interval 10 years 11 months	interval 10 years	interval 11 months
+
+
+-- !query 3
+select
+  interval '10 9:8:7.987654321' day to second,
+  interval '10' day,
+  interval '11' hour,
+  interval '12' minute,
+  interval '13' second,
+  interval '13.123456789' second
+-- !query 3 schema
+struct
+-- !query 3 output
+interval 1 weeks 3 days 9 hours 8 minutes 7 seconds 987 milliseconds 654 microseconds	interval 1 weeks 3 days	interval 11 hours	interval 12 minutes	interval 13 seconds	interval 13 seconds 123 milliseconds 456 microseconds
+
+
+-- !query 4
+select
+  '10 9:8:7.987654321' day to second,
+  '10' day,
+  '11' hour,
+  '12' minute,
+  '13' second,
+  '13.123456789' second
+-- !query 4 schema
+struct
+-- !query 4 output
+interval 1 weeks 3 days 9 hours 8 minutes 7 seconds 987 milliseconds 654 microseconds	interval 1 weeks 3 days	interval 11 hours	interval 12 minutes	interval 13 seconds	interval 13 seconds 123 milliseconds 456 microseconds
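The golden-file outputs above show how Spark renders a `day to second` interval: 10 days become `1 weeks 3 days`, and the fractional second is split into milliseconds and microseconds (nanosecond digits are dropped). A minimal Python sketch of that rendering, with a hypothetical helper name (not Spark's actual implementation):

```python
def format_dt_interval(days, hours, minutes, seconds, micros):
    """Render a day-time interval the way the golden files above do:
    days are split into weeks, sub-second time into ms/us."""
    parts = []
    weeks, days = divmod(days, 7)
    for value, unit in [(weeks, "weeks"), (days, "days"), (hours, "hours"),
                        (minutes, "minutes"), (seconds, "seconds")]:
        if value != 0:
            parts.append(f"{value} {unit}")
    millis, micros = divmod(micros, 1000)
    if millis:
        parts.append(f"{millis} milliseconds")
    if micros:
        parts.append(f"{micros} microseconds")
    return "interval " + " ".join(parts)

# '10 9:8:7.987654321' day to second; nanoseconds beyond micros are dropped
print(format_dt_interval(10, 9, 8, 7, 987654))
```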
+
+
+-- !query 5
+create temporary view interval_arithmetic as
+  select CAST(dateval AS date), CAST(tsval AS timestamp) from values
+('2012-01-01', '2012-01-01')
+as interval_arithmetic(dateval, tsval)
+-- !query 5 schema
+struct<>
+-- !query 5 output
+
+
+
+-- !query 6
+select
+  dateval,
+  dateval - interval '2-2' year to month,
+  dateval - interval '-2-2' year to month,
+  dateval + interval '2-2' year to month,
+  dateval + interval '-2-2' year to month,
+  - interval '2-2' year to month + dateval,
+  interval '2-2' year to month + dateval
+from interval_arithmetic
+-- !query 6 schema
+struct
+-- !query 6 output
+2012-01-01	2009-11-01	2014-03-01	2014-03-01	2009-11-01	2009-11-01	2014-03-01
+
+
+-- !query 7
+select
+  dateval,
+  dateval - '2-2' year to month,
+  dateval - '-2-2' year to month,
+  dateval + '2-2' year to month,
+  dateval + '-2-2' year to month,
+  - '2-2' year to month + dateval,
+  '2-2' year to month + dateval
+from interval_arithmetic
+-- !query 7 schema
+struct
+-- !query 7 output
+2012-01-01	2009-11-01	2014-03-01	2014-03-01	2009-11-01	2009-11-01	2014-03-01
+
+
+-- !query 8
+select
+  tsval,
+  tsval - interval '2-2' year to month,
+  tsval - interval '-2-2' year to month,
+  tsval + interval '2-2' year to month,
+  tsval + interval '-2-2' year to month,
+  - interval '2-2' year to month + tsval,
+  interval '2-2' year to month + tsval
+from interval_arithmetic
+-- !query 8 schema
+struct
+-- !query 8 output
+2012-01-01 00:00:00	2009-11-01 00:00:00	2014-03-01 00:00:00	2014-03-01 00:00:00	2009-11-01 00:00:00	2009-11-01 00:00:00	2014-03-01 00:00:00
+
+
+-- !query 9
+select
+  tsval,
+  tsval - '2-2' year to month,
+  tsval - '-2-2' year to month,
+  tsval + '2-2' year to month,
+  tsval + '-2-2' year to month,
+  - '2-2' year to month + tsval,
+  '2-2' year to month + tsval
+from interval_arithmetic
+-- !query 9 schema
+struct
+-- !query 9 output
+2012-01-01 00:00:00	2009-11-01 00:00:00	2014-03-01 00:00:00	2014-03-01 00:00:00	2009-11-01 00:00:00	2009-11-01 00:00:00	2014-03-01 00:00:00
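Queries 6-9 exercise adding and subtracting a `'2-2' year to month` interval (2 years 2 months, i.e. 26 months) from a date or timestamp. The month arithmetic can be sketched in plain Python; this is an illustration only, not Spark's CalendarInterval code:

```python
import datetime

def add_months(d: datetime.date, months: int) -> datetime.date:
    """Shift a date by a (possibly negative) number of months,
    clamping the day to the last day of the target month."""
    total = d.year * 12 + (d.month - 1) + months
    year, month = divmod(total, 12)
    month += 1
    # clamp the day, e.g. Jan 31 + 1 month -> Feb 28/29
    last_day = (datetime.date(year + month // 12, month % 12 + 1, 1)
                - datetime.timedelta(days=1)).day
    return datetime.date(year, month, min(d.day, last_day))

d = datetime.date(2012, 1, 1)
months = 2 * 12 + 2               # '2-2' year to month
print(add_months(d, -months))     # dateval - interval -> 2009-11-01
print(add_months(d, months))      # dateval + interval -> 2014-03-01
```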
+
+
+-- !query 10
+select
+  interva

[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-18 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r175336924
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala ---
@@ -155,6 +155,7 @@ class QueryExecution(val sparkSession: SparkSession, val logical: LogicalPlan) {
       case (null, _) => "null"
       case (s: String, StringType) => "\"" + s + "\""
       case (decimal, DecimalType()) => decimal.toString
+      case (interval, CalendarIntervalType) => interval.toString
--- End diff --

ok, I'll try to add tests for that.
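The diff above adds a `CalendarIntervalType` case to the value formatter that prints query results into golden files. The type-dispatch pattern can be sketched as a Python analogue (hypothetical type tags, not Spark's actual `QueryExecution` code):

```python
def format_cell(value, data_type):
    """Format one result cell by its SQL type for a golden file,
    mirroring the pattern match in the diff above."""
    if value is None:
        return "null"
    if data_type == "string":
        return '"' + str(value) + '"'
    # decimals and calendar intervals both render via toString/str;
    # the added case makes intervals explicit instead of hitting a default
    return str(value)

print(format_cell(None, "int"))            # null
print(format_cell("abc", "string"))        # "abc"
print(format_cell("interval 1 days", "calendarinterval"))  # interval 1 days
```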


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-18 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r175336329
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala ---
@@ -155,6 +155,7 @@ class QueryExecution(val sparkSession: SparkSession, val logical: LogicalPlan) {
       case (null, _) => "null"
      case (s: String, StringType) => "\"" + s + "\""
       case (decimal, DecimalType()) => decimal.toString
+      case (interval, CalendarIntervalType) => interval.toString
--- End diff --

Do we have a test case to capture this change?





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-12 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r174016181
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala ---
@@ -83,6 +83,15 @@ class SQLQueryTestSuite extends QueryTest with SharedSQLContext {
 
   private val regenerateGoldenFiles: Boolean = System.getenv("SPARK_GENERATE_GOLDEN_FILES") == "1"
 
+  private val testFilter: Option[String] = {
+    val testFilter = System.getenv("SPARK_SQL_QUERY_TEST_FILTER")
+    if (testFilter != null && !testFilter.isEmpty) {
+      Some(testFilter.toLowerCase(Locale.ROOT))
+    } else {
+      None
+    }
+  }
+
--- End diff --

ok, I'll do it later.





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r174016110
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala ---
@@ -83,6 +83,15 @@ class SQLQueryTestSuite extends QueryTest with SharedSQLContext {
 
   private val regenerateGoldenFiles: Boolean = System.getenv("SPARK_GENERATE_GOLDEN_FILES") == "1"
 
+  private val testFilter: Option[String] = {
+    val testFilter = System.getenv("SPARK_SQL_QUERY_TEST_FILTER")
+    if (testFilter != null && !testFilter.isEmpty) {
+      Some(testFilter.toLowerCase(Locale.ROOT))
+    } else {
+      None
+    }
+  }
+
--- End diff --

Shall we create a separate PR?





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-12 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r174011447
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala ---
@@ -83,6 +83,15 @@ class SQLQueryTestSuite extends QueryTest with SharedSQLContext {
 
   private val regenerateGoldenFiles: Boolean = System.getenv("SPARK_GENERATE_GOLDEN_FILES") == "1"
 
+  private val testFilter: Option[String] = {
+    val testFilter = System.getenv("SPARK_SQL_QUERY_TEST_FILTER")
+    if (testFilter != null && !testFilter.isEmpty) {
+      Some(testFilter.toLowerCase(Locale.ROOT))
+    } else {
+      None
+    }
+  }
+
--- End diff --

This is not related to this PR, but I think it would be useful to run tests selectively in `SQLQueryTestSuite` (since the number of tests there has been growing recently). If possible, could we add this feature in a separate PR? Otherwise, I'll drop this.
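The snippet under review reads an opt-in environment variable and turns it into an `Option[String]` filter. The same pattern in Python (illustrative only; the variable name comes from the diff above):

```python
import os

def read_test_filter(env=os.environ):
    """Return the lowercased test-name filter, or None when the
    variable is unset or empty (the filter is strictly opt-in)."""
    value = env.get("SPARK_SQL_QUERY_TEST_FILTER")
    if value:                     # rejects both None and ""
        return value.lower()
    return None

print(read_test_filter({"SPARK_SQL_QUERY_TEST_FILTER": "Interval.SQL"}))  # interval.sql
print(read_test_filter({}))                                               # None
```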





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-05 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r172427740
  
--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -790,6 +796,16 @@ ASC: 'ASC';
 DESC: 'DESC';
 FOR: 'FOR';
 INTERVAL: 'INTERVAL';
+YEAR: 'YEAR' | 'YEARS';
--- End diff --

Also update `TableIdentifierParserSuite`





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-05 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r172427617
  
--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -790,6 +796,16 @@ ASC: 'ASC';
 DESC: 'DESC';
 FOR: 'FOR';
 INTERVAL: 'INTERVAL';
+YEAR: 'YEAR' | 'YEARS';
+MONTH: 'MONTH' | 'MONTHS';
+WEEK: 'WEEK' | 'WEEKS';
+DAY: 'DAY' | 'DAYS';
+HOUR: 'HOUR' | 'HOURS';
+MINUTE: 'MINUTE' | 'MINUTES';
+SECOND: 'SECOND' | 'SECONDS';
+MILLISECOND: 'MILLISECOND' | 'MILLISECONDS';
+MICROSECOND: 'MICROSECOND' | 'MICROSECONDS';
+NANOSECOND: 'NANOSECOND' | 'NANOSECONDS';
--- End diff --

We do not support `nanosecond`. 
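The lexer rules above make each unit keyword match both singular and plural forms (`YEAR` | `YEARS`, and so on). Outside of ANTLR this is just a normalization step; a hedged Python sketch (the unit list follows the diff, minus `NANOSECOND`, which the review comment says Spark does not support):

```python
# Units the grammar in the diff accepts (NANOSECOND excluded per the review).
UNITS = {"year", "month", "week", "day", "hour",
         "minute", "second", "millisecond", "microsecond"}

def normalize_unit(token: str) -> str:
    """Map 'DAYS' / 'day' / 'Days' to the canonical singular unit,
    raising on anything the grammar would reject."""
    unit = token.lower()
    if unit.endswith("s") and unit[:-1] in UNITS:
        unit = unit[:-1]
    if unit not in UNITS:
        raise ValueError(f"unknown interval unit: {token}")
    return unit

print(normalize_unit("DAYS"))    # day
print(normalize_unit("second"))  # second
```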





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-05 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r172427354
  
--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -790,6 +796,16 @@ ASC: 'ASC';
 DESC: 'DESC';
 FOR: 'FOR';
 INTERVAL: 'INTERVAL';
+YEAR: 'YEAR' | 'YEARS';
+MONTH: 'MONTH' | 'MONTHS';
+WEEK: 'WEEK' | 'WEEKS';
+DAY: 'DAY' | 'DAYS';
+HOUR: 'HOUR' | 'HOURS';
+MINUTE: 'MINUTE' | 'MINUTES';
+SECOND: 'SECOND' | 'SECONDS';
+MILLISECOND: 'MILLISECOND' | 'MILLISECONDS';
--- End diff --

yea.





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-05 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r172426790
  
--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -790,6 +796,16 @@ ASC: 'ASC';
 DESC: 'DESC';
 FOR: 'FOR';
 INTERVAL: 'INTERVAL';
+YEAR: 'YEAR' | 'YEARS';
+MONTH: 'MONTH' | 'MONTHS';
+WEEK: 'WEEK' | 'WEEKS';
+DAY: 'DAY' | 'DAYS';
+HOUR: 'HOUR' | 'HOURS';
+MINUTE: 'MINUTE' | 'MINUTES';
+SECOND: 'SECOND' | 'SECONDS';
+MILLISECOND: 'MILLISECOND' | 'MILLISECONDS';
--- End diff --

nvm, it sounds like we already support them. 





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-03-05 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r172426643
  
--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -790,6 +796,16 @@ ASC: 'ASC';
 DESC: 'DESC';
 FOR: 'FOR';
 INTERVAL: 'INTERVAL';
+YEAR: 'YEAR' | 'YEARS';
+MONTH: 'MONTH' | 'MONTHS';
+WEEK: 'WEEK' | 'WEEKS';
+DAY: 'DAY' | 'DAYS';
+HOUR: 'HOUR' | 'HOURS';
+MINUTE: 'MINUTE' | 'MINUTES';
+SECOND: 'SECOND' | 'SECONDS';
+MILLISECOND: 'MILLISECOND' | 'MILLISECONDS';
--- End diff --

I am wondering which systems support `MILLISECOND`, `MICROSECOND`, and `NANOSECOND`?





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-01-31 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r165259575
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ExpressionParserSuite.scala ---
@@ -561,8 +561,11 @@ class ExpressionParserSuite extends PlanTest {
       Literal(CalendarInterval.fromSingleUnitString(u, s))
     }
 
-    // Empty interval statement
-    intercept("interval", "at least one time unit should be given for interval literal")
--- End diff --

Yeah, ANTLR just throws an exception when hitting this case:
```
scala> sql("select cast('2018-01-12' as DATE) + 1 days").show
+---------------------------------------------------------------------------+
|CAST(CAST(CAST(2018-01-12 AS DATE) AS TIMESTAMP) + interval 1 days AS DATE)|
+---------------------------------------------------------------------------+
|                                                                 2018-01-13|
+---------------------------------------------------------------------------+

scala> sql("select cast('2018-01-12' as DATE) + interval").show
org.apache.spark.sql.AnalysisException: cannot resolve '`interval`' given input columns: []; line 1 pos 36;
'Project [unresolvedalias((cast(2018-01-12 as date) + 'interval), None)]
+- OneRowRelation
```





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-01-31 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/20433#discussion_r165233944
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ExpressionParserSuite.scala ---
@@ -561,8 +561,11 @@ class ExpressionParserSuite extends PlanTest {
       Literal(CalendarInterval.fromSingleUnitString(u, s))
     }
 
-    // Empty interval statement
-    intercept("interval", "at least one time unit should be given for interval literal")
--- End diff --

We should still check the empty interval statement; it should now produce a different error message?





[GitHub] spark pull request #20433: [SPARK-23264][SQL] Support interval values withou...

2018-01-29 Thread maropu
GitHub user maropu opened a pull request:

https://github.com/apache/spark/pull/20433

[SPARK-23264][SQL] Support interval values without INTERVAL clauses

## What changes were proposed in this pull request?
This PR updates the parsing rules in `SqlBase.g4` to support SQL queries like the one below:
```
SELECT CAST('2017-08-04' AS DATE) + 1 days;
```
The current master cannot parse it, though other DBMS-like systems (e.g., Hive and MySQL) support the syntax. The syntax is also frequently used in the official TPC-DS queries.
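The feature can be illustrated outside of Spark: an interval literal written without the `INTERVAL` keyword is just a count plus a unit name (singular or plural), and adding it to a date is ordinary date arithmetic. A hedged Python sketch, not the ANTLR rule itself:

```python
import datetime
import re

def parse_bare_interval(text):
    """Parse an interval value written without the INTERVAL keyword,
    e.g. '1 days' or '3 hours'. Only day-and-below units map onto
    timedelta here; year/month would need calendar math."""
    m = re.fullmatch(r"\s*(-?\d+)\s+([a-zA-Z]+?)s?\s*", text)
    if m is None:
        raise ValueError(f"not an interval: {text!r}")
    value, unit = int(m.group(1)), m.group(2).lower()
    return datetime.timedelta(**{unit + "s": value})

# SELECT CAST('2017-08-04' AS DATE) + 1 days
print(datetime.date(2017, 8, 4) + parse_bare_interval("1 days"))  # 2017-08-05
```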

## How was this patch tested?
Added tests in `SQLQuerySuite`.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maropu/spark SPARK-23264

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/20433.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #20433


commit 830cf8d014ae17ade5fd771ca98c8c846c93
Author: Takeshi Yamamuro 
Date:   2018-01-30T06:15:35Z

Fix



