[ https://issues.apache.org/jira/browse/SPARK-10390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14907986#comment-14907986 ]
Sean Owen commented on SPARK-10390:
-----------------------------------
My guess is that it ends up pulling in a different Guava dependency when built via
SBT? I'm still not entirely sure. I do know the dependency resolution rules differ,
and that's why only the Maven build 'counts'. I'd try Maven anyway, just to see if
it works; if not, then we know this guess isn't correct.
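As a concrete starting point, a Maven build mirroring the SBT invocation below would look roughly like this (a sketch only, using the same profiles and Hadoop version the reporter passed to SBT; adjust flags to the actual checkout):
{code}
# Hedged sketch: Maven equivalent of the reporter's SBT flags.
# -DskipTests is optional and only speeds up producing the assembly.
build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
{code}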
> Py4JJavaError java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsedMillis()J
> --------------------------------------------------------------------------------------------
>
> Key: SPARK-10390
> URL: https://issues.apache.org/jira/browse/SPARK-10390
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Reporter: Zoltán Zvara
>
> While running PySpark through IPython:
> {code}
> Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
> : java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsedMillis()J
> at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:245)
> at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
> at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:207)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
> at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
> at org.apache.spark.api.python.PythonRDD.getPartitions(PythonRDD.scala:58)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:1910)
> at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:905)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
> at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
> at org.apache.spark.rdd.RDD.collect(RDD.scala:904)
> at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:373)
> at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
> at py4j.Gateway.invoke(Gateway.java:259)
> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
> at py4j.commands.CallCommand.execute(CallCommand.java:79)
> at py4j.GatewayConnection.run(GatewayConnection.java:207)
> at java.lang.Thread.run(Thread.java:745)
> {code}
> {{spark-env.sh}}
> {code}
> export IPYTHON=1
> export PYSPARK_PYTHON=/usr/bin/python3
> export PYSPARK_DRIVER_PYTHON=ipython3
> export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
> {code}
> Spark built with:
> {{build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 assembly --error}}
> Not a problem when built against {{Hadoop 2.4}}!
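Regarding the {{-Phadoop-2.6}} build above: the stack trace shows Hadoop's {{FileInputFormat}} calling {{Stopwatch.elapsedMillis()}}, a Guava method that was removed in later Guava releases, so the SBT assembly has likely picked up a newer Guava than Hadoop 2.6 expects. One way to check what actually got bundled is to point {{javap}} at the assembly jar (a hedged diagnostic sketch; the jar name below is only an example and should be replaced with the jar the build actually produced):
{code}
# Diagnostic sketch, not from the report: list the Stopwatch methods bundled into
# the assembly and look for elapsedMillis(). The jar path is an example only.
javap -classpath assembly/target/scala-2.10/spark-assembly-1.5.0-SNAPSHOT-hadoop2.6.0.jar \
  com.google.common.base.Stopwatch | grep -i elapsed
{code}
If {{elapsedMillis}} does not appear in the output, that points at a Guava version mismatch in the assembly rather than anything PySpark-specific.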