[ https://issues.apache.org/jira/browse/SPARK-7726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14550170#comment-14550170 ]

Iulian Dragos edited comment on SPARK-7726 at 5/19/15 10:11 AM:
----------------------------------------------------------------

The problem is the different visibility rules in Scala and Java w.r.t. statics 
(and the fact that scaladoc is invoked on Java sources).

In this particular case, `Type` is a static enumeration inherited from 
`BlockTransferMessage`. In Scala, statics are part of the companion object and 
are not visible without an import. Since scaladoc needs to resolve all the 
types it documents, this surfaces as an error.
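
To illustrate with a minimal sketch (hypothetical names, not the actual Spark 
classes): javac accepts the unqualified reference, while Scala, and therefore 
scaladoc, needs an import.

{code}
// Base.java: a superclass with a nested static enum, mirroring the shape
// of BlockTransferMessage.Type (names here are hypothetical).
class Base {
  public enum Type { FOO, BAR }
}

// Sub.java: legal Java, since the member type Type is inherited from Base
// and is in scope by its simple name, so javac needs no import.
// Scala models Java statics as members of the companion object, which are
// not inherited into the subclass's lexical scope, so scaladoc cannot
// resolve the bare "Type" here without an explicit import.
class Sub extends Base {
  protected Type type() { return Type.FOO; }
}
{code}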

I fixed this by adding a static import to each of those files:

{code}
import static org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type;
{code}
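
In context, each message subclass then looks roughly like this (a sketch based 
on the error output below; the other members of the real classes are elided):

{code}
package org.apache.spark.network.shuffle.protocol;

import static org.apache.spark.network.shuffle.protocol.BlockTransferMessage.Type;

public class UploadBlock extends BlockTransferMessage {
  // With the static import in place, the bare "Type" now resolves for
  // scaladoc as well as for javac.
  protected Type type() { return Type.UPLOAD_BLOCK; }
}
{code}

With that change, the two affected modules build cleanly: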

{code}
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Networking ........................... SUCCESS [ 11.395 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  5.460 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17.582 s
[INFO] Finished at: 2015-05-19T12:07:01+02:00
[INFO] Final Memory: 42M/456M
[INFO] ------------------------------------------------------------------------
{code}

Should I open a pull request that reverts the reverting commit and adds this 
fix? Or would you prefer to suppress scaladoc running on Java files (assuming 
that's not intentional...)?
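
If the latter, one possible direction (a sketch I have not verified; it 
assumes the doc-jar goal honors the same source excludes as the plugin's 
compile goals) would be to filter Java sources out of the scaladoc run:

{code}
<!-- pom.xml sketch, unverified assumption: exclude Java sources from the
     scaladoc run, if the doc-jar goal honors source excludes. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-scaladocs</id>
      <goals>
        <goal>doc-jar</goal>
      </goals>
      <configuration>
        <excludes>
          <exclude>**/*.java</exclude>
        </excludes>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}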


> Maven Install Breaks When Upgrading Scala 2.11.2-->[2.11.3 or higher]
> ---------------------------------------------------------------------
>
>                 Key: SPARK-7726
>                 URL: https://issues.apache.org/jira/browse/SPARK-7726
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>            Reporter: Patrick Wendell
>            Priority: Blocker
>
> This one took a long time to track down. The Maven install phase is part of 
> our release process. It runs the "scala:doc" target to generate doc jars. 
> Between Scala 2.11.2 and Scala 2.11.3, the behavior of this plugin changed in 
> a way that breaks our build. In both cases it returned an error (there has 
> been a long-running error here that we've always ignored); however, in 2.11.3 
> the error became fatal and failed the entire build process. The upgrade 
> occurred in SPARK-7092. Here is a simple reproduction:
> {code}
> ./dev/change-version-to-2.11.sh
> mvn clean install -pl network/common -pl network/shuffle -DskipTests 
> -Dscala-2.11
> {code} 
> This command succeeds when Spark is at Scala 2.11.2 and fails with 2.11.3 or 
> higher. In either case, an error is printed:
> {code}
> [INFO] 
> [INFO] --- scala-maven-plugin:3.2.0:doc-jar (attach-scaladocs) @ 
> spark-network-shuffle_2.11 ---
> /Users/pwendell/Documents/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/UploadBlock.java:56:
>  error: not found: type Type
>   protected Type type() { return Type.UPLOAD_BLOCK; }
>             ^
> /Users/pwendell/Documents/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/StreamHandle.java:37:
>  error: not found: type Type
>   protected Type type() { return Type.STREAM_HANDLE; }
>             ^
> /Users/pwendell/Documents/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/RegisterExecutor.java:44:
>  error: not found: type Type
>   protected Type type() { return Type.REGISTER_EXECUTOR; }
>             ^
> /Users/pwendell/Documents/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/OpenBlocks.java:40:
>  error: not found: type Type
>   protected Type type() { return Type.OPEN_BLOCKS; }
>             ^
> model contains 22 documentable templates
> four errors found
> {code}
> Ideally we'd just dig in and fix this error. Unfortunately, it's a very 
> confusing error and I have no idea why it is appearing. I'd propose reverting 
> SPARK-7092 in the meantime.


