[GitHub] [spark] maropu commented on a change in pull request #28807: [SPARK-26905][SQL] Follow the SQL:2016 reserved keywords

2020-06-14 Thread GitBox


maropu commented on a change in pull request #28807:
URL: https://github.com/apache/spark/pull/28807#discussion_r439927771



##
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/TableIdentifierParserSuite.scala
##
@@ -388,12 +396,24 @@ class TableIdentifierParserSuite extends SparkFunSuite with SQLHelper {
   val reservedKeywordsInAnsiMode = allCandidateKeywords -- nonReservedKeywordsInAnsiMode
 
   test("check # of reserved keywords") {
-    val numReservedKeywords = 78
+    val numReservedKeywords = 74

Review comment:
   Note: `ANTI`, `SEMI`, `MINUS`, and `!` are removed.
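The count drops from 78 to 74 because the reserved set is computed as a set difference, so moving those four keywords into the non-reserved set shrinks it by exactly four. A toy illustration of that arithmetic (hypothetical keyword lists, not Spark's actual ones):

```scala
// Toy example: reserved keywords are derived by set difference, so each
// keyword moved into the non-reserved set removes one from the reserved set.
object KeywordCountSketch {
  def main(args: Array[String]): Unit = {
    val allCandidateKeywords = Set("SELECT", "WHERE", "ANTI", "SEMI", "MINUS", "!")
    val nonReservedKeywordsInAnsiMode = Set("ANTI", "SEMI", "MINUS", "!")
    val reservedKeywordsInAnsiMode =
      allCandidateKeywords -- nonReservedKeywordsInAnsiMode
    // Removing 4 keywords from a 6-element candidate set leaves 2 reserved.
    assert(reservedKeywordsInAnsiMode == Set("SELECT", "WHERE"))
  }
}
```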





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] maropu commented on a change in pull request #28807: [SPARK-26905][SQL] Follow the SQL:2016 reserved keywords

2020-06-14 Thread GitBox


maropu commented on a change in pull request #28807:
URL: https://github.com/apache/spark/pull/28807#discussion_r439905202



##
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/TableIdentifierParserSuite.scala
##
@@ -388,12 +391,24 @@ class TableIdentifierParserSuite extends SparkFunSuite with SQLHelper {
   val reservedKeywordsInAnsiMode = allCandidateKeywords -- nonReservedKeywordsInAnsiMode
 
   test("check # of reserved keywords") {
-    val numReservedKeywords = 78
+    val numReservedKeywords = 75
     assert(reservedKeywordsInAnsiMode.size == numReservedKeywords,
       s"The expected number of reserved keywords is $numReservedKeywords, but " +
         s"${reservedKeywordsInAnsiMode.size} found.")
   }
 
+  test("should follow reserved keywords in SQL:2016") {
+    withTempDir { dir =>
+      val tmpFile = new File(dir, "tmp")
+      val is = Thread.currentThread().getContextClassLoader
+        .getResourceAsStream("ansi-sql-2016-reserved-keywords.txt")
+      Files.copy(is, tmpFile.toPath)
+      val reservedKeywordsInSql2016 = Files.readAllLines(tmpFile.toPath)
+        .asScala.filterNot(_.startsWith("--")).map(_.trim).toSet
+      assert(((reservedKeywordsInAnsiMode -- Set("!")) -- reservedKeywordsInSql2016).isEmpty)

Review comment:
   Yea, will do.
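The test in the diff above copies the classpath resource to a temporary file before reading it. A sketch of a more direct alternative (hypothetical, not from the PR; assumes Scala 2.13 for `scala.util.Using`) reads the input stream in place with the same comment-filtering logic:

```scala
import scala.io.Source
import scala.util.Using

object ReservedKeywordsSketch {
  // Read the keyword resource directly, skipping "--" comment lines,
  // without the intermediate temp-file copy used in the test above.
  def loadReservedKeywords(resource: String): Set[String] =
    Using.resource(Source.fromInputStream(
        Thread.currentThread().getContextClassLoader
          .getResourceAsStream(resource))) { src =>
      src.getLines().filterNot(_.startsWith("--")).map(_.trim).toSet
    }
}
```

This avoids the temp-file round trip, though the `Files.copy` version in the test has the advantage of working on plain `java.nio` APIs available in Scala 2.12.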








[GitHub] [spark] maropu commented on a change in pull request #28807: [SPARK-26905][SQL] Follow the SQL:2016 reserved keywords

2020-06-14 Thread GitBox


maropu commented on a change in pull request #28807:
URL: https://github.com/apache/spark/pull/28807#discussion_r439905098



##
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/TableIdentifierParserSuite.scala
##
@@ -388,12 +391,24 @@ class TableIdentifierParserSuite extends SparkFunSuite with SQLHelper {
   val reservedKeywordsInAnsiMode = allCandidateKeywords -- nonReservedKeywordsInAnsiMode
 
   test("check # of reserved keywords") {
-    val numReservedKeywords = 78
+    val numReservedKeywords = 75
     assert(reservedKeywordsInAnsiMode.size == numReservedKeywords,
       s"The expected number of reserved keywords is $numReservedKeywords, but " +
         s"${reservedKeywordsInAnsiMode.size} found.")
   }
 
+  test("should follow reserved keywords in SQL:2016") {

Review comment:
   Looks clearer, okay, I'll update. Thanks!




