[
https://issues.apache.org/jira/browse/SPARK-30364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17084033#comment-17084033
]
Nick Hryhoriev commented on SPARK-30364:
----------------------------------------
I hit the same issue with my Spark application on macOS, running Spark 2.4.3.
It appears only when I extend the Spark metrics system with a custom Source.
Is there a known way to avoid it?
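
For context, extending the metrics system with a custom Source typically looks
like the minimal sketch below (the class, source, and metric names are
illustrative placeholders, not code from this ticket; in Spark 2.4 the Source
trait is package-private, so such a class usually has to be declared under an
org.apache.spark package):
{code}
package org.apache.spark.metrics.source

import com.codahale.metrics.{Counter, MetricRegistry}

// Minimal custom Source sketch. The class, source, and metric names are
// illustrative placeholders, not taken from this ticket.
class MyAppSource extends Source {
  override val sourceName: String = "myApp"
  override val metricRegistry: MetricRegistry = new MetricRegistry

  // Example metric exposed through this source.
  val processedRecords: Counter =
    metricRegistry.counter(MetricRegistry.name("processedRecords"))
}
{code}
Registering such a source, e.g. via
SparkEnv.get.metricsSystem.registerSource(new MyAppSource), forces the compiler
to resolve MetricsSystem, whose getServletHandlers signature references the
Jetty (org.eclipse) types mentioned in the error quoted below, which is
presumably why the problem only appears once a custom Source is added.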
> The spark-streaming-kafka-0-10_2.11 test cases are failing on ppc64le
> ---------------------------------------------------------------------
>
> Key: SPARK-30364
> URL: https://issues.apache.org/jira/browse/SPARK-30364
> Project: Spark
> Issue Type: Test
> Components: Build, DStreams
> Affects Versions: 2.4.0
> Environment: os: rhel 7.6
> arch: ppc64le
> Reporter: AK97
> Priority: Major
>
> I have been trying to build Apache Spark on rhel_7.6/ppc64le; however, the
> spark-streaming-kafka-0-10_2.11 test cases are failing with the following
> error:
> {code}
> [ERROR]
> /opt/spark/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumerSuite.scala:85:
> Symbol 'term org.eclipse' is missing from the classpath.
> This symbol is required by 'method
> org.apache.spark.metrics.MetricsSystem.getServletHandlers'.
> Make sure that term eclipse is in your classpath and check for conflicting
> dependencies with `-Ylog-classpath`.
> A full rebuild may help if 'MetricsSystem.class' was compiled against an
> incompatible version of org.
> [ERROR] testUtils.sendMessages(topic, data.toArray)
> ^
> {code}
> I would like some help understanding the cause of this failure. I am running
> the build on a high-end VM with good connectivity.
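
Since the missing symbol is org.eclipse (Jetty), required by
MetricsSystem.getServletHandlers, one possible workaround for application
builds that hit the same message (as in the comment above) is to put the Jetty
servlet artifact on the compile classpath explicitly. The sbt fragment below is
only a sketch: the coordinates are real Spark/Jetty artifacts, but the versions
are assumptions to verify against the jetty.version property of the Spark
release being compiled against. For the Spark source build in this ticket, the
analogous check is whether the Jetty jars actually end up on the module's
compile classpath (e.g. via the suggested -Ylog-classpath).
{code}
// build.sbt fragment (sketch only). The Jetty version is an assumption;
// align it with the jetty.version property of the Spark release you build
// against instead of taking this value verbatim.
val sparkVersion = "2.4.3"             // version mentioned in the comment above
val jettyVersion = "9.3.24.v20180605"  // assumed, verify against Spark's pom

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % sparkVersion % "provided",
  // Supplies org.eclipse.jetty.servlet.ServletContextHandler, the type
  // referenced by MetricsSystem.getServletHandlers in the error above.
  "org.eclipse.jetty"  % "jetty-servlet" % jettyVersion % "provided"
)
{code}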