[ 
https://issues.apache.org/jira/browse/SPARK-29329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17291477#comment-17291477
 ] 

Yikun Jiang edited comment on SPARK-29329 at 2/26/21, 2:21 PM:
---------------------------------------------------------------

Thanks to [~hyukjin.kwon] for bringing me in here from [1]. I wrote a short 
introduction to the Zinc/scala-maven-plugin history [2]. Hope this helps.

Yes, that's right: the Zinc server mode (0.3.X) is no longer supported, and 
scala-maven-plugin 4.X uses an embedded Zinc v1.X (a more powerful version than 
0.3.X [3]). But I don't think the Zinc upgrade is the root cause of this 
problem.

The Zinc server upgrade may affect people who run Zinc 0.3.X on a separate, 
dedicated compile server, but it has no effect when the Zinc server and the mvn 
build both run on the local machine (i.e., our Spark dev environment).

I think this problem was probably caused by [4], which was fixed in v4.3.0; 
fortunately, the plugin was already upgraded to that version in SPARK-29528 
(merged in Spark 3.0.0).

I also did a simple test on the master branch with `build/mvn package 
-DskipTests -pl core`: the first build takes about 3 minutes and the second 
about 1 minute. Would you mind confirming whether the problem still exists? 
[~tgraves]
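For reference, a minimal sketch of that check (assuming a local Spark checkout with `build/mvn` on hand; if incremental compilation works, the second run should be noticeably faster):

```shell
# Build the core module twice from the Spark source root; with incremental
# compilation the second run skips unchanged sources and finishes much faster.
time build/mvn package -DskipTests -pl core   # first (cold) build
time build/mvn package -DskipTests -pl core   # second (incremental) build
```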

[1] [https://github.com/apache/spark/pull/31647]

[2] [https://docs.google.com/document/d/1u4kCHDx7KjVlHGerfmbcKSB0cZo6AD4cBdHSse-SBsM]

[3] [https://www.scala-lang.org/blog/2017/11/03/zinc-blog-1.0.html]

[4] [https://github.com/davidB/scala-maven-plugin/issues/368]

 



> maven incremental builds not working
> ------------------------------------
>
>                 Key: SPARK-29329
>                 URL: https://issues.apache.org/jira/browse/SPARK-29329
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.0.0
>            Reporter: Thomas Graves
>            Priority: Major
>
> It looks like since we Upgraded scala-maven-plugin to 4.2.0 
> https://issues.apache.org/jira/browse/SPARK-28759 spark incremental builds 
> stop working.  Everytime you build its building all files, which takes 
> forever.
> It would be nice to fix this.
>  
> To reproduce, just build spark once ( I happened to be using the command 
> below):
> build/mvn -Phadoop-3.2 -Phive-thriftserver -Phive -Pyarn -Pkinesis-asl 
> -Pkubernetes -Pmesos -Phadoop-cloud -Pspark-ganglia-lgpl package -DskipTests
> Then build it again and you will see that it compiles all the files and takes 
> 15-30 minutes. With incremental it skips all unnecessary files and takes 
> closer to 5 minutes.


