Re: Streaming Job gives error after changing to version 1.5.2

2015-11-19 Thread swetha kasireddy
That was actually an issue with our Mesos setup.


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-18 Thread Tathagata Das
If possible, could you give us the root cause and solution for future
readers of this thread.


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-18 Thread swetha kasireddy
It works fine after some changes.

-Thanks,
Swetha


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread Tathagata Das
Can you verify that the cluster is running the correct version of Spark, 1.5.2?



Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread swetha kasireddy
Sorry, the compile scope makes it work locally. But the cluster still
seems to have issues with the provided scope. Basically it does not
seem to process any records: no data is shown in any of the tabs of the
Streaming UI except the Streaming tab. Executors, Storage, Stages, etc. show
empty RDDs.
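
One common packaging pattern for this situation (a sketch, not necessarily the setup used in this thread): keep the Spark artifacts at provided scope so they are not bundled into the application jar, and let spark-submit supply them on the cluster. A fat jar built with the maven-shade-plugin then bundles only the non-Spark dependencies (such as spark-streaming-kafka, which in Spark 1.x is not part of the cluster's assembly). A minimal plugin stanza, with an illustrative version number:

```
<!-- Sketch: build a fat jar that bundles app dependencies but not Spark itself.
     The Spark core/streaming/sql/hive artifacts stay at provided scope;
     spark-submit adds them back on the cluster classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```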



Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread swetha kasireddy
Hi TD,

Basically, I see two issues. With the provided scope the job does not
start locally. It does start in the cluster, but it seems no data is getting
processed.

Thanks,
Swetha



RE: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread Tim Barthram
If you are running a local context, could it be that you should use:

<scope>provided</scope>

?

Thanks,
Tim
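
One way to get both behaviours from a single POM (a sketch; the property name and profile id below are made up for illustration): drive the scope from a Maven property and flip it with a profile, so local runs use compile and cluster builds use provided.

```
<!-- Sketch with a hypothetical spark.scope property, defaulting to provided
     for cluster builds; the "local" profile switches it to compile so the
     job can run inside the IDE. Each Spark dependency would then declare
     <scope>${spark.scope}</scope>. -->
<properties>
  <spark.scope>provided</spark.scope>
</properties>

<profiles>
  <profile>
    <id>local</id>
    <properties>
      <spark.scope>compile</spark.scope>
    </properties>
  </profile>
</profiles>
```

Building with `mvn package -Plocal` would then bundle Spark for local runs, while a plain `mvn package` keeps it provided for spark-submit.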


_

The information transmitted in this message and its attachments (if any) is 
intended 
only for the person or entity to which it is addressed.
The message may contain confidential and/or privileged material. Any review, 
retransmission, dissemination or other use of, or taking of any action in 
reliance 
upon this information, by persons or entities other than the intended recipient 
is 
prohibited.

If you have received this in error, please contact the sender and delete this 
e-mail 
and associated material from any computer.

The intended recipient of this e-mail may only use, reproduce, disclose or 
distribute 
the information contained in this e-mail and any attached files, with the 
permission 
of the sender.

This message has been scanned for viruses.
_


Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread swetha kasireddy
I see this error locally.



Re: Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread Tathagata Das
Are you running a 1.5.2-compiled jar on a Spark 1.5.2 cluster?



Streaming Job gives error after changing to version 1.5.2

2015-11-17 Thread swetha


Hi,

I see a java.lang.NoClassDefFoundError after changing the Streaming job
version to 1.5.2. Any idea why this is happening? Following are my
dependencies and the error that I get.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${sparkVersion}</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>${sparkVersion}</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>${sparkVersion}</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>${sparkVersion}</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>${sparkVersion}</version>
</dependency>


Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/streaming/StreamingContext
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2693)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3040)
    at java.lang.Class.getMethod0(Class.java:3010)
    at java.lang.Class.getMethod(Class.java:1776)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:125)
Caused by: java.lang.ClassNotFoundException:
org.apache.spark.streaming.StreamingContext
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)



--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-Job-gives-error-after-changing-to-version-1-5-2-tp25406.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org