Thanks, will try this out and get back...
On Tue, Jun 23, 2015 at 2:30 AM, Tathagata Das t...@databricks.com wrote:
Try adding the provided scope:

<dependency> <!-- Spark dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <scope>provided</scope>
</dependency>
Hi,
I have the following piece of code, where I am trying to transform a Spark
stream and compute the min and max of each RDD. However, at run time I get an
error saying the max method does not exist (the code compiles fine). I am
using spark-1.4.
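Since the original code is not included in this message, here is a minimal sketch of the per-batch min/max pattern being described, assuming a hypothetical DStream[Long] named `stream` obtained from an existing StreamingContext:

```scala
import org.apache.spark.streaming.dstream.DStream

// Sketch: log the min and max of every micro-batch of a DStream.
// `stream` is a hypothetical DStream[Long]; empty batches are skipped
// because RDD.min()/RDD.max() throw on an empty RDD.
def logMinMax(stream: DStream[Long]): Unit = {
  stream.foreachRDD { rdd =>
    if (!rdd.isEmpty()) {
      println(s"min=${rdd.min()} max=${rdd.max()}")
    }
  }
}
```

If these RDD methods resolve at compile time but fail at run time, the classpath at run time is usually supplying a different Spark version than the one compiled against, which is what the rest of this thread addresses.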
I have added the question to stackoverflow as well:
Hi Tathagata,
When you say "please mark spark-core and spark-streaming as provided
dependencies", what exactly do you mean?
I have installed the pre-built spark-1.4 for Hadoop 2.6 from the Spark
downloads page. In my Maven pom.xml, I am using version 1.4 as described.
Please let me know how I can fix this.
Thanks
Nipun
On
I think you may be including a different version of Spark Streaming in your
assembly. Please mark spark-core and spark-streaming as provided
dependencies. Any installation of Spark will automatically provide Spark in
the classpath, so you do not have to bundle it.
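A pom.xml sketch of the provided-scope dependencies described here might look like the following; the 1.4.0 version string is an assumption based on the "spark-1.4" mentioned elsewhere in this thread:

```xml
<!-- Both Spark artifacts marked provided so the assembly does not bundle
     them; the cluster's own Spark installation supplies them at run time. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.4.0</version>
  <scope>provided</scope>
</dependency>
```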
On Thu, Jun 18, 2015 at 8:44 AM,