[ https://issues.apache.org/jira/browse/SPARK-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14208998#comment-14208998 ]

Marcelo Vanzin commented on SPARK-4326:
---------------------------------------

So, this is really weird. Unidoc is run by the sbt build, where none of the 
shading shenanigans from the Maven build should apply. The root pom.xml adds 
guava as a compile-scope dependency for every module when the sbt profile is 
enabled.

That being said, if you look at the output of {{show allDependencies}} from 
within an sbt shell, it shows some components with a "guava 11.0.2 provided" 
dependency. So it seems the profile isn't taking effect?

Another fun fact is that the dependencies for the core project, where the 
errors in the issue description come from, are correct in the output of 
{{show allDependencies}}: it shows "guava 14.0.1 compile" as it should.

I was able to work around this by adding guava explicitly in SparkBuild.scala, 
in the {{sharedSettings}} variable:

{code}
    libraryDependencies += "com.google.guava" % "guava" % "14.0.1"
{code}
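
For context, that line just sits alongside the other entries in 
{{sharedSettings}}; roughly like this (a sketch only, the real definition 
composes several other setting groups that I've elided):

{code}
// project/SparkBuild.scala -- sketch, surrounding settings elided
lazy val sharedSettings = Seq(
  // ... existing shared settings ...
  // force a compile-scope guava so the unidoc run resolves the right version
  libraryDependencies += "com.google.guava" % "guava" % "14.0.1"
)
{code}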

That got rid of the above errors, but it didn't fix the overall build. Does 
anyone more familiar with sbt/unidoc know what's going on here?

Here are the errors with that hack applied:

{noformat}
[error] /work/apache/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/UploadBlock.java:55: not found: type Type
[error]   protected Type type() { return Type.UPLOAD_BLOCK; }
[error]             ^
[error] /work/apache/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/RegisterExecutor.java:44: not found: type Type
[error]   protected Type type() { return Type.REGISTER_EXECUTOR; }
[error]             ^
[error] /work/apache/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/OpenBlocks.java:40: not found: type Type
[error]   protected Type type() { return Type.OPEN_BLOCKS; }
[error]             ^
[error] /work/apache/spark/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/StreamHandle.java:39: not found: type Type
[error]   protected Type type() { return Type.STREAM_HANDLE; }
[error]             ^
{noformat}


> unidoc is broken on master
> --------------------------
>
>                 Key: SPARK-4326
>                 URL: https://issues.apache.org/jira/browse/SPARK-4326
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Documentation
>    Affects Versions: 1.3.0
>            Reporter: Xiangrui Meng
>
> On master, `jekyll build` throws the following error:
> {code}
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/AppendOnlyMap.scala:205: value hashInt is not a member of com.google.common.hash.HashFunction
> [error]   private def rehash(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
> [error]                                                          ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalAppendOnlyMap.scala:426: value limit is not a member of object com.google.common.io.ByteStreams
> [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
> [error]                                                                  ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala:558: value limit is not a member of object com.google.common.io.ByteStreams
> [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
> [error]                                                                  ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/OpenHashSet.scala:261: value hashInt is not a member of com.google.common.hash.HashFunction
> [error]   private def hashcode(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
> [error]                                                            ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/Utils.scala:37: type mismatch;
> [error]  found   : java.util.Iterator[T]
> [error]  required: Iterable[?]
> [error]     collectionAsScalaIterable(ordering.leastOf(asJavaIterator(input), num)).iterator
> [error]                                                              ^
> [error] /Users/meng/src/spark/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala:421: value putAll is not a member of com.google.common.cache.Cache[org.apache.hadoop.fs.FileStatus,parquet.hadoop.Footer]
> [error]           footerCache.putAll(newFooters)
> [error]                       ^
> [warn] /Users/meng/src/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/parquet/FakeParquetSerDe.scala:34: @deprecated now takes two arguments; see the scaladoc.
> [warn] @deprecated("No code should depend on FakeParquetHiveSerDe as it is only intended as a " +
> [warn]  ^
> [info] No documentation generated with unsucessful compiler run
> [warn] two warnings found
> [error] 6 errors found
> [error] (spark/scalaunidoc:doc) Scaladoc generation failed
> [error] Total time: 48 s, completed Nov 10, 2014 1:31:01 PM
> {code}
> It doesn't happen on branch-1.2.


