peter-toth commented on a change in pull request #23531: [SPARK-24497][SQL] Support recursive SQL query
URL: https://github.com/apache/spark/pull/23531#discussion_r307475054
 
 

 ##########
 File path: sql/core/src/test/resources/sql-tests/results/cte.sql.out
 ##########
 @@ -328,16 +328,891 @@ struct<scalarsubquery():int>
 
 
 -- !query 25
-DROP VIEW IF EXISTS t
+WITH r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r WHERE level < 10
+)
+SELECT * FROM r
 -- !query 25 schema
 struct<>
 -- !query 25 output
-
+org.apache.spark.sql.AnalysisException
+Table or view not found: r; line 4 pos 24
 
 
 -- !query 26
-DROP VIEW IF EXISTS t2
+WITH RECURSIVE r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r WHERE level < 10
+)
+SELECT * FROM r
 -- !query 26 schema
-struct<>
+struct<level:int>
 -- !query 26 output
+0
+1
+10
+2
+3
+4
+5
+6
+7
+8
+9
+
+
+-- !query 27
+WITH RECURSIVE r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r
+)
+SELECT * FROM r
+-- !query 27 schema
+struct<>
+-- !query 27 output
+org.apache.spark.SparkException
+Recursion level limit 100 reached but query has not exhausted, try increasing spark.sql.cte.recursion.level.limit
+
+
+-- !query 28
+WITH RECURSIVE r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r
+)
+SELECT * FROM r LIMIT 10
+-- !query 28 schema
+struct<level:int>
+-- !query 28 output
+0
+1
+2
+3
+4
+5
+6
+7
+8
+9
+
+
+-- !query 29
+WITH RECURSIVE r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r
+)
+SELECT level, level FROM r LIMIT 10
+-- !query 29 schema
+struct<level:int,level:int>
+-- !query 29 output
+0      0
+1      1
+2      2
+3      3
+4      4
+5      5
+6      6
+7      7
+8      8
+9      9
+
+
+-- !query 30
+WITH RECURSIVE r(level) AS (
+  VALUES (0)
+  UNION ALL
+  SELECT level + 1 FROM r
+)
+SELECT level, level FROM r ORDER BY level LIMIT 10
+-- !query 30 schema
+struct<>
+-- !query 30 output
+org.apache.spark.SparkException
+Recursion level limit 100 reached but query has not exhausted, try increasing spark.sql.cte.recursion.level.limit
+
+
+-- !query 31
+WITH RECURSIVE r(c) AS (
+  SELECT 'a'
+  UNION ALL
+  SELECT c || ' b' FROM r WHERE LENGTH(c) < 10
+)
+SELECT * FROM r
+-- !query 31 schema
+struct<c:string>
+-- !query 31 output
+a
+a b
+a b b
+a b b b
+a b b b b
+a b b b b b
+
+
+-- !query 32
+WITH RECURSIVE r(level) AS (
+  SELECT level + 1 FROM r WHERE level < 10
+  UNION ALL
+  VALUES (0)
+)
+SELECT * FROM r
+-- !query 32 schema
+struct<level:int>
+-- !query 32 output
 
 Review comment:
   This is the same issue that we have in the ported `with.sql` here: https://github.com/apache/spark/pull/23531/files#diff-ed1773ed16f538a75da7b4a75fe38ebdR613-R616
   PostgreSQL requires the first term to be the anchor (it must contain no recursive reference) and the second term (the other side of the `UNION` or `UNION ALL`) to be the recursive one.
   But even the original comment says that the order of the terms should not matter at all, so my implementation also allows the terms in reversed order.
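   For reference, the conventional anchor-first form (the one query 26 above uses) can be sketched against SQLite, which, like PostgreSQL, insists that the non-recursive anchor term come before the recursive term; this is a hedged illustration of the semantics being discussed, not Spark's implementation:

```python
import sqlite3

# Anchor-first recursive CTE, mirroring query 26 in the diff above.
# SQLite (like PostgreSQL) requires the anchor term before the
# recursive term; the PR under review relaxes that ordering in Spark.
conn = sqlite3.connect(":memory:")
rows = conn.execute("""
    WITH RECURSIVE r(level) AS (
        VALUES (0)                               -- anchor: no reference to r
        UNION ALL
        SELECT level + 1 FROM r WHERE level < 10 -- recursive term
    )
    SELECT level FROM r
""").fetchall()
print([level for (level,) in rows])  # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

   Feeding SQLite the reversed-order variant from query 32 (recursive term first, `VALUES (0)` second) raises an error, which is exactly the restriction the comment argues Spark need not inherit.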

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
