[jira] [Updated] (SPARK-37696) Optimizer exceeds max iterations

2022-04-25 Thread Nicholas Chammas (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nicholas Chammas updated SPARK-37696:
-
Affects Version/s: 3.2.1

> Optimizer exceeds max iterations
> 
>
> Key: SPARK-37696
> URL: https://issues.apache.org/jira/browse/SPARK-37696
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.2.0, 3.2.1
>Reporter: Denis Tarima
>Priority: Minor
>
> A specific scenario causes Spark to fail in tests and to emit a warning in production:
> 21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
> reached for batch Operator Optimization before Inferring Filters, please set 
> 'spark.sql.optimizer.maxIterations' to a larger value.
> 21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
> reached for batch Operator Optimization after Inferring Filters, please set 
> 'spark.sql.optimizer.maxIterations' to a larger value.
>  
> To reproduce, run the following commands in `spark-shell`:
> {{// define case class for a struct type in an array}}
> {{case class S(v: Int, v2: Int)}}
>  
> {{// prepare a table with an array of structs}}
> {{Seq((10, Seq(S(1, 2)))).toDF("i", "data").write.saveAsTable("tbl")}}
>  
> {{// select using SQL and join with a dataset using "left_anti"}}
> {{spark.sql("select i, data[size(data) - 1].v from 
> tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()}}
>  
> All of the following conditions are required to reproduce the issue:
>  # Having an additional `v2` field in `S`
>  # Using {{data[size(data) - 1]}} instead of {{element_at(data, -1)}}
>  # Using {{left_anti}} in the join operation
>  
> The same behavior was observed on the `master` branch and in `3.1.1`.






[jira] [Updated] (SPARK-37696) Optimizer exceeds max iterations

2021-12-20 Thread Denis Tarima (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-37696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Denis Tarima updated SPARK-37696:
-
Description: 
A specific scenario causes Spark to fail in tests and to emit a warning in production:

21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
reached for batch Operator Optimization before Inferring Filters, please set 
'spark.sql.optimizer.maxIterations' to a larger value.
21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
reached for batch Operator Optimization after Inferring Filters, please set 
'spark.sql.optimizer.maxIterations' to a larger value.
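
As the warning suggests, the iteration limit can be raised while investigating. A minimal spark-shell sketch, assuming the conf is session-settable and using 1000 purely as an example value (this only hides the symptom; it does not make the optimization converge):

{{// raise the optimizer's fixed-point iteration limit for this session (default is 100)}}
{{spark.conf.set("spark.sql.optimizer.maxIterations", "1000")}}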

 

To reproduce, run the following commands in `spark-shell`:

{{// define case class for a struct type in an array}}

{{case class S(v: Int, v2: Int)}}

 

{{// prepare a table with an array of structs}}

{{Seq((10, Seq(S(1, 2)))).toDF("i", "data").write.saveAsTable("tbl")}}

 

{{// select using SQL and join with a dataset using "left_anti"}}

{{spark.sql("select i, data[size(data) - 1].v from 
tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()}}

 

All of the following conditions are required to reproduce the issue:
 # Having an additional `v2` field in `S`
 # Using {{data[size(data) - 1]}} instead of {{element_at(data, -1)}} (see the sketch below)
 # Using {{left_anti}} in the join operation
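
Per condition 2, rewriting the projection with {{element_at}} avoids the scenario. A minimal sketch of that variant, assuming the {{tbl}} table created above (same join, only the array access changes):

{{// equivalent query using element_at(data, -1) instead of data[size(data) - 1]}}
{{spark.sql("select i, element_at(data, -1).v from tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()}}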

 

The same behavior was observed on the `master` branch and in `3.1.1`.

  was:
A specific scenario causes Spark to fail in tests and to emit a warning in production:

21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
reached for batch Operator Optimization before Inferring Filters, please set 
'spark.sql.optimizer.maxIterations' to a larger value.
21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
reached for batch Operator Optimization after Inferring Filters, please set 
'spark.sql.optimizer.maxIterations' to a larger value.

 

To reproduce, run the following commands in `spark-shell`:

{{// define case class for a struct type in an array}}

{{case class S(v: Int, v2: Int)}}

{{// prepare a table with an array of structs}}
{{Seq((10, Seq(S(1, 2)))).toDF("i", "data").write.saveAsTable("tbl")}}

{{// select using SQL and join with a dataset using "left_anti"}}

{{spark.sql("select i, data[size(data) - 1].v from tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()}}

 

All of the following conditions are required to reproduce the issue:
 # Having an additional `v2` field in `S`
 # Using {{data[size(data) - 1]}} instead of {{element_at(data, -1)}}
 # Using {{left_anti}} in the join operation

 

The same behavior was observed on the `master` branch and in `3.1.1`.


> Optimizer exceeds max iterations
> 
>
> Key: SPARK-37696
> URL: https://issues.apache.org/jira/browse/SPARK-37696
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.2.0
>Reporter: Denis Tarima
>Priority: Minor
>
> A specific scenario causes Spark to fail in tests and to emit a warning in production:
> 21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
> reached for batch Operator Optimization before Inferring Filters, please set 
> 'spark.sql.optimizer.maxIterations' to a larger value.
> 21/12/20 06:45:24 WARN BaseSessionStateBuilder$$anon$2: Max iterations (100) 
> reached for batch Operator Optimization after Inferring Filters, please set 
> 'spark.sql.optimizer.maxIterations' to a larger value.
>  
> To reproduce, run the following commands in `spark-shell`:
> {{// define case class for a struct type in an array}}
> {{case class S(v: Int, v2: Int)}}
>  
> {{// prepare a table with an array of structs}}
> {{Seq((10, Seq(S(1, 2)))).toDF("i", "data").write.saveAsTable("tbl")}}
>  
> {{// select using SQL and join with a dataset using "left_anti"}}
> {{spark.sql("select i, data[size(data) - 1].v from 
> tbl").join(Seq(10).toDF("i"), Seq("i"), "left_anti").show()}}
>  
> All of the following conditions are required to reproduce the issue:
>  # Having an additional `v2` field in `S`
>  # Using {{data[size(data) - 1]}} instead of {{element_at(data, -1)}}
>  # Using {{left_anti}} in the join operation
>  
> The same behavior was observed on the `master` branch and in `3.1.1`.


