[ https://issues.apache.org/jira/browse/SPARK-35365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17342978#comment-17342978 ]

yao edited comment on SPARK-35365 at 5/12/21, 3:04 AM:
-------------------------------------------------------

[~yumwang] I have uploaded two files. I removed some columns and adjusted the
SQL; Spark 3.1 now takes around 20-30 minutes. Please help me check whether
there is something I can do to improve the performance in Spark 3.1.1. Thanks.



> Spark 3.1.1 takes too long to analyze table fields
> --------------------------------------------------
>
>                 Key: SPARK-35365
>                 URL: https://issues.apache.org/jira/browse/SPARK-35365
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.1
>            Reporter: yao
>            Priority: Major
>         Attachments: spark2.4report, spark3.11report
>
>
> I have a big SQL query that joins a few wide tables with complex logic. When
> I run it in Spark 2.4, the analyze phase takes about 20 minutes; in Spark
> 3.1.1, it takes about 40 minutes.
> I need to set spark.sql.analyzer.maxIterations=1000 in Spark 3.1.1,
> or spark.sql.optimizer.maxIterations=1000 in Spark 2.4;
> there is no other special setting.
> On the Spark UI I see that no job is generated and no executor has
> active tasks. When I set the log level to debug, I find that the job
> is in the analyze phase, analyzing the field references.
> This phase takes too long.
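For reference, here is a minimal Scala sketch of how the two settings quoted
above can be applied when building the session. The app name is hypothetical,
and the same keys can equally be passed to spark-submit via --conf:

import org.apache.spark.sql.SparkSession

// Raise the caps on the fixed-point runs of the analyzer/optimizer rule
// batches so a very wide, deeply nested plan can converge.
val spark = SparkSession.builder()
  .appName("WideJoinJob") // hypothetical name, for illustration only
  .config("spark.sql.analyzer.maxIterations", "1000") // Spark 3.x key
  .config("spark.sql.optimizer.maxIterations", "1000") // Spark 2.4 key
  .getOrCreate()

// Debug logging shows which analyzer rules are running; this is how the
// reporter confirmed the time was spent resolving field references.
spark.sparkContext.setLogLevel("DEBUG")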



