RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-09 Thread Judy Nash
To report back how I ultimately solved this issue, in case someone else runs into it:

1) Check each jar on the classpath and make sure the jars are listed in order of 
their Guava version (i.e. spark-assembly needs to be listed before Hadoop 2.4, 
because spark-assembly has Guava 14 and Hadoop 2.4 has Guava 11). This may 
require updating compute-classpath.sh to get the ordering right; see the sketch 
after this list. 

2) If the other jars use a higher Guava version, bump Spark's Guava library to 
the higher version. Guava is supposed to be very backward compatible.  
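For example, the ordering fix in 1) comes down to where compute-classpath.sh 
appends entries. A minimal sketch, assuming the ASSEMBLY_JAR variable the 
script already defines; the external Hadoop jar path is illustrative, not one 
from this thread:

# The JVM resolves a class from the first classpath entry that contains it,
# so the Spark assembly (Guava 14) must be appended before any external
# Hadoop jar (Guava 11).
CLASSPATH="$CLASSPATH:$ASSEMBLY_JAR"
CLASSPATH="$CLASSPATH:/path/to/hadoop-common-2.4.0.jar"   # illustrative

For 2), the bump amounts to raising the Guava version in Spark's root pom 
before rebuilding the assembly.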

Hope this helps. 

-Original Message-
From: Marcelo Vanzin [mailto:van...@cloudera.com] 
Sent: Tuesday, December 2, 2014 11:35 AM
To: Judy Nash
Cc: Patrick Wendell; Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

On Tue, Dec 2, 2014 at 11:22 AM, Judy Nash  
wrote:
> Any suggestion on how a user with a custom Hadoop jar can solve this issue?

You'll need to add all the dependencies of that custom Hadoop jar to the 
classpath. Those will include Guava (which is not included in its original form 
as part of the Spark dependencies).
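For example (a sketch only; the jar paths are illustrative), those dependencies 
can be put on the driver classpath when launching the Thrift server, mirroring 
the spark-class command used elsewhere in this thread:

.\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit ^
  --driver-class-path "C:\custom-hadoop\hadoop-common.jar;C:\custom-hadoop\lib\guava-11.0.2.jar" ^
  --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 ^
  spark-internal --hiveconf hive.server2.thrift.port=1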



> -Original Message-
> From: Patrick Wendell [mailto:pwend...@gmail.com]
> Sent: Sunday, November 30, 2014 11:06 PM
> To: Judy Nash
> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with 
> NoClassDefFoundError on Guava
>
> Thanks Judy. While this is not directly caused by a Spark issue, it is likely 
> other users will run into this. It is an unfortunate consequence of the way 
> we've shaded Guava in this release: we rely on byte-code shading of Hadoop 
> itself as well, and if the user has their own Hadoop classes present it can 
> cause issues.
>
> On Sun, Nov 30, 2014 at 10:53 PM, Judy Nash  
> wrote:
>> Thanks Patrick and Cheng for the suggestions.
>>
>> The issue was that the Hadoop common jar was added to the classpath. After I 
>> removed the Hadoop common jar from both master and slave, I was able to bypass the error.
>> This was caused by a local change, so there is no impact on the 1.2 release.
>> -Original Message-
>> From: Patrick Wendell [mailto:pwend...@gmail.com]
>> Sent: Wednesday, November 26, 2014 8:17 AM
>> To: Judy Nash
>> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>> Just to double check - I looked at our own assembly jar and I confirmed that 
>> our Hadoop Configuration class does use the correctly shaded version of 
>> Guava. My best guess here is that a separate Hadoop library is 
>> ending up on the classpath, possibly because Spark put it there somehow.
>>
>>> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
>>> cd org/apache/hadoop/
>>> javap -v Configuration | grep Precond
>>
>> Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration
>>
>>    #497 = Utf8               org/spark-project/guava/common/base/Preconditions
>>
>>    #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"
>>
>>    #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> 12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> 50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
>>> Hi Judy,
>>>
>>> Are you somehow modifying Spark's classpath to include jars from 
>>> Hadoop and Hive that you have running on the machine? The issue 
>>> seems to be that you are somehow including a version of Hadoop that 
>>> references the original guava package. The Hadoop that is bundled in 
>>> the Spark jars should not do this.
>>>
>>> - Patrick
>>>
>>> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash 
>>>  wrote:
>>>> Looks like a config issue. I ran the spark-pi job and it still failed 
>>>> with the same Guava error
>>>>
>>>> Command ran:
>>>>
>>>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
>>>> org.apache.spark.examples.SparkPi --master 
>>>> spark://headnodehost:7077 --executor-memory 1G --num-executors 1 
>>>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-02 Thread Marcelo Vanzin
On Tue, Dec 2, 2014 at 11:22 AM, Judy Nash
 wrote:
> Any suggestion on how a user with a custom Hadoop jar can solve this issue?

You'll need to add all the dependencies of that custom Hadoop jar
to the classpath. Those will include Guava (which is not included in
its original form as part of the Spark dependencies).



> -Original Message-
> From: Patrick Wendell [mailto:pwend...@gmail.com]
> Sent: Sunday, November 30, 2014 11:06 PM
> To: Judy Nash
> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
> Guava
>
> Thanks Judy. While this is not directly caused by a Spark issue, it is likely 
> other users will run into this. It is an unfortunate consequence of the way 
> we've shaded Guava in this release: we rely on byte-code shading of Hadoop 
> itself as well, and if the user has their own Hadoop classes present it can 
> cause issues.
>
> On Sun, Nov 30, 2014 at 10:53 PM, Judy Nash  
> wrote:
>> Thanks Patrick and Cheng for the suggestions.
>>
>> The issue was that the Hadoop common jar was added to the classpath. After I 
>> removed the Hadoop common jar from both master and slave, I was able to bypass the error.
>> This was caused by a local change, so there is no impact on the 1.2 release.
>> -Original Message-
>> From: Patrick Wendell [mailto:pwend...@gmail.com]
>> Sent: Wednesday, November 26, 2014 8:17 AM
>> To: Judy Nash
>> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with
>> NoClassDefFoundError on Guava
>>
>> Just to double check - I looked at our own assembly jar and I confirmed that 
>> our Hadoop Configuration class does use the correctly shaded version of 
>> Guava. My best guess here is that a separate Hadoop library is 
>> ending up on the classpath, possibly because Spark put it there somehow.
>>
>>> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
>>> cd org/apache/hadoop/
>>> javap -v Configuration | grep Precond
>>
>> Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration
>>
>>    #497 = Utf8               org/spark-project/guava/common/base/Preconditions
>>
>>    #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"
>>
>>    #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> 12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> 50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>>
>> On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
>>> Hi Judy,
>>>
>>> Are you somehow modifying Spark's classpath to include jars from
>>> Hadoop and Hive that you have running on the machine? The issue seems
>>> to be that you are somehow including a version of Hadoop that
>>> references the original guava package. The Hadoop that is bundled in
>>> the Spark jars should not do this.
>>>
>>> - Patrick
>>>
>>> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash
>>>  wrote:
>>>> Looks like a config issue. I ran the spark-pi job and it still failed with
>>>> the same Guava error
>>>>
>>>> Command ran:
>>>>
>>>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>>>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077
>>>> --executor-memory 1G --num-executors 1
>>>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>>>
>>>>
>>>>
>>>> I had used the same build steps on Spark 1.1 and had no issue.
>>>>
>>>>
>>>>
>>>> From: Denny Lee [mailto:denny.g@gmail.com]
>>>> Sent: Tuesday, November 25, 2014 5:47 PM
>>>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>>>
>>>>
>>>> Subject: Re: latest Spark 1.2 thrift server fail with
>>>> NoClassDefFoundError on Guava
>>>>
>>>>
>>>>
>>>> To determine if this is a Windows vs. other configuration, can you
>>>> just try to call the Spark-class.cmd SparkSubmit without actually
>>>> referencing the Hadoop or Thrift server classes?

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-02 Thread Judy Nash
Any suggestion on how a user with a custom Hadoop jar can solve this issue? 

-Original Message-
From: Patrick Wendell [mailto:pwend...@gmail.com] 
Sent: Sunday, November 30, 2014 11:06 PM
To: Judy Nash
Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Thanks Judy. While this is not directly caused by a Spark issue, it is likely 
other users will run into this. It is an unfortunate consequence of the way 
we've shaded Guava in this release: we rely on byte-code shading of Hadoop 
itself as well, and if the user has their own Hadoop classes present it can 
cause issues.

On Sun, Nov 30, 2014 at 10:53 PM, Judy Nash  
wrote:
> Thanks Patrick and Cheng for the suggestions.
>
> The issue was that the Hadoop common jar was added to the classpath. After I 
> removed the Hadoop common jar from both master and slave, I was able to bypass the error.
> This was caused by a local change, so there is no impact on the 1.2 release.
> -Original Message-
> From: Patrick Wendell [mailto:pwend...@gmail.com]
> Sent: Wednesday, November 26, 2014 8:17 AM
> To: Judy Nash
> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with 
> NoClassDefFoundError on Guava
>
> Just to double check - I looked at our own assembly jar and I confirmed that 
> our Hadoop Configuration class does use the correctly shaded version of 
> Guava. My best guess here is that a separate Hadoop library is ending 
> up on the classpath, possibly because Spark put it there somehow.
>
>> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
>> cd org/apache/hadoop/
>> javap -v Configuration | grep Precond
>
> Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration
>
>    #497 = Utf8               org/spark-project/guava/common/base/Preconditions
>
>    #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"
>
>    #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> 12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> 50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
>> Hi Judy,
>>
>> Are you somehow modifying Spark's classpath to include jars from 
>> Hadoop and Hive that you have running on the machine? The issue seems 
>> to be that you are somehow including a version of Hadoop that 
>> references the original guava package. The Hadoop that is bundled in 
>> the Spark jars should not do this.
>>
>> - Patrick
>>
>> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash 
>>  wrote:
>>> Looks like a config issue. I ran the spark-pi job and it still failed with 
>>> the same Guava error
>>>
>>> Command ran:
>>>
>>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
>>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077 
>>> --executor-memory 1G --num-executors 1 
>>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>>
>>>
>>>
>>> I had used the same build steps on Spark 1.1 and had no issue.
>>>
>>>
>>>
>>> From: Denny Lee [mailto:denny.g@gmail.com]
>>> Sent: Tuesday, November 25, 2014 5:47 PM
>>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>>
>>>
>>> Subject: Re: latest Spark 1.2 thrift server fail with 
>>> NoClassDefFoundError on Guava
>>>
>>>
>>>
>>> To determine if this is a Windows vs. other configuration, can you 
>>> just try to call the Spark-class.cmd SparkSubmit without actually 
>>> referencing the Hadoop or Thrift server classes?
>>>
>>>
>>>
>>>
>>>
>>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
>>> 
>>> wrote:
>>>
>>> I traced the code and used the following to call:
>>>
>>> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>>> spark-internal --hiveconf hive.server2.thrift.port=1
>>>
>>>
>>>
>>> The issue ended up being much more fundamental, however. Spark 
>>> doesn't work at all in the configuration below. When I open 
>>> spark-shell, it fails with the same ClassNotFound error.

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-30 Thread Patrick Wendell
Thanks Judy. While this is not directly caused by a Spark issue, it is
likely other users will run into this. It is an unfortunate
consequence of the way we've shaded Guava in this release: we
rely on byte-code shading of Hadoop itself as well, and if the user
has their own Hadoop classes present it can cause issues.
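One way to see the shading at work (a sketch; the jar paths are illustrative): 
compare which Preconditions class the bundled Hadoop Configuration references 
against a stock hadoop-common jar.

# Spark's assembly bundles a byte-code-shaded Hadoop that references the relocated Guava:
javap -v -classpath spark-assembly-1.2.0-hadoop2.4.0.jar org.apache.hadoop.conf.Configuration | grep Precond
# -> org/spark-project/guava/common/base/Preconditions

# A stock hadoop-common jar still references the original package:
javap -v -classpath /path/to/hadoop-common-2.4.0.jar org.apache.hadoop.conf.Configuration | grep Precond
# -> com/google/common/base/Preconditions

If a stock Hadoop class is loaded first, it asks for 
com/google/common/base/Preconditions, which the assembly ships only in the 
relocated package, hence the NoClassDefFoundError.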

On Sun, Nov 30, 2014 at 10:53 PM, Judy Nash
 wrote:
> Thanks Patrick and Cheng for the suggestions.
>
> The issue was that the Hadoop common jar was added to the classpath. After I 
> removed the Hadoop common jar from both master and slave, I was able to bypass the error.
> This was caused by a local change, so there is no impact on the 1.2 release.
> -Original Message-
> From: Patrick Wendell [mailto:pwend...@gmail.com]
> Sent: Wednesday, November 26, 2014 8:17 AM
> To: Judy Nash
> Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
> Guava
>
> Just to double check - I looked at our own assembly jar and I confirmed that 
> our Hadoop Configuration class does use the correctly shaded version of 
> Guava. My best guess here is that a separate Hadoop library is ending 
> up on the classpath, possibly because Spark put it there somehow.
>
>> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
>> cd org/apache/hadoop/
>> javap -v Configuration | grep Precond
>
> Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration
>
>    #497 = Utf8               org/spark-project/guava/common/base/Preconditions
>
>    #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"
>
>    #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> 12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> 50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V
>
> On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
>> Hi Judy,
>>
>> Are you somehow modifying Spark's classpath to include jars from
>> Hadoop and Hive that you have running on the machine? The issue seems
>> to be that you are somehow including a version of Hadoop that
>> references the original guava package. The Hadoop that is bundled in
>> the Spark jars should not do this.
>>
>> - Patrick
>>
>> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash
>>  wrote:
>>> Looks like a config issue. I ran the spark-pi job and it still failed with
>>> the same Guava error
>>>
>>> Command ran:
>>>
>>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077
>>> --executor-memory 1G --num-executors 1
>>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>>
>>>
>>>
>>> I had used the same build steps on Spark 1.1 and had no issue.
>>>
>>>
>>>
>>> From: Denny Lee [mailto:denny.g@gmail.com]
>>> Sent: Tuesday, November 25, 2014 5:47 PM
>>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>>
>>>
>>> Subject: Re: latest Spark 1.2 thrift server fail with
>>> NoClassDefFoundError on Guava
>>>
>>>
>>>
>>> To determine if this is a Windows vs. other configuration, can you
>>> just try to call the Spark-class.cmd SparkSubmit without actually
>>> referencing the Hadoop or Thrift server classes?
>>>
>>>
>>>
>>>
>>>
>>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash
>>> 
>>> wrote:
>>>
>>> I traced the code and used the following to call:
>>>
>>> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>>> spark-internal --hiveconf hive.server2.thrift.port=1
>>>
>>>
>>>
>>> The issue ended up being much more fundamental, however. Spark doesn't
>>> work at all in the configuration below. When I open spark-shell, it fails
>>> with the same ClassNotFound error.
>>>
>>> Now I wonder whether this is a Windows-only issue or a problem with the
>>> Hive/Hadoop configuration.
>>>
>>>
>>>
>>> From: Cheng Lian [mailto:lian.cs@gmail.com]
>>> Sent: Tuesday, November 25,

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-30 Thread Judy Nash
Thanks Patrick and Cheng for the suggestions.

The issue was that the Hadoop common jar was added to the classpath. After I removed 
the Hadoop common jar from both master and slave, I was able to bypass the error. 
This was caused by a local change, so there is no impact on the 1.2 release. 
-Original Message-
From: Patrick Wendell [mailto:pwend...@gmail.com] 
Sent: Wednesday, November 26, 2014 8:17 AM
To: Judy Nash
Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Just to double check - I looked at our own assembly jar and I confirmed that 
our Hadoop Configuration class does use the correctly shaded version of Guava. 
My best guess here is that a separate Hadoop library is ending up on 
the classpath, possibly because Spark put it there somehow.

> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
> cd org/apache/hadoop/
> javap -v Configuration | grep Precond

Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration

   #497 = Utf8               org/spark-project/guava/common/base/Preconditions

   #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"

   #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
> Hi Judy,
>
> Are you somehow modifying Spark's classpath to include jars from 
> Hadoop and Hive that you have running on the machine? The issue seems 
> to be that you are somehow including a version of Hadoop that 
> references the original guava package. The Hadoop that is bundled in 
> the Spark jars should not do this.
>
> - Patrick
>
> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash 
>  wrote:
>> Looks like a config issue. I ran the spark-pi job and it still failed with 
>> the same Guava error
>>
>> Command ran:
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077 
>> --executor-memory 1G --num-executors 1 
>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>
>>
>>
>> I had used the same build steps on Spark 1.1 and had no issue.
>>
>>
>>
>> From: Denny Lee [mailto:denny.g@gmail.com]
>> Sent: Tuesday, November 25, 2014 5:47 PM
>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>
>>
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> To determine if this is a Windows vs. other configuration, can you 
>> just try to call the Spark-class.cmd SparkSubmit without actually 
>> referencing the Hadoop or Thrift server classes?
>>
>>
>>
>>
>>
>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
>> 
>> wrote:
>>
>> I traced the code and used the following to call:
>>
>> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 
>> spark-internal --hiveconf hive.server2.thrift.port=1
>>
>>
>>
>> The issue ended up being much more fundamental, however. Spark doesn't 
>> work at all in the configuration below. When I open spark-shell, it fails 
>> with the same ClassNotFound error.
>>
>> Now I wonder whether this is a Windows-only issue or a problem with the 
>> Hive/Hadoop configuration.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs@gmail.com]
>> Sent: Tuesday, November 25, 2014 1:50 AM
>>
>>
>> To: Judy Nash; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Oh so you're using Windows. What command are you using to start the 
>> Thrift server then?
>>
>> On 11/25/14 4:25 PM, Judy Nash wrote:
>>
>> Made progress but still blocked.
>>
>> After recompiling the code in cmd instead of PowerShell, I can now 
>> see all 5 classes you mentioned.
>>
>> However I am still seeing the same error as before. Anything else I 
>> can check for?
>>
>>
>>
>> From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
>> Sent: Monday, November 24, 2014 11:50 PM
>

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-26 Thread Patrick Wendell
Just to double check - I looked at our own assembly jar and I
confirmed that our Hadoop Configuration class does use the correctly
shaded version of Guava. My best guess here is that a separate
Hadoop library is ending up on the classpath, possibly because Spark
put it there somehow.

> tar xvzf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
> cd org/apache/hadoop/
> javap -v Configuration | grep Precond

Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration

   #497 = Utf8               org/spark-project/guava/common/base/Preconditions

   #498 = Class              #497   // "org/spark-project/guava/common/base/Preconditions"

   #502 = Methodref          #498.#501   // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

12: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

50: invokestatic  #502   // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell  wrote:
> Hi Judy,
>
> Are you somehow modifying Spark's classpath to include jars from
> Hadoop and Hive that you have running on the machine? The issue seems
> to be that you are somehow including a version of Hadoop that
> references the original guava package. The Hadoop that is bundled in
> the Spark jars should not do this.
>
> - Patrick
>
> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash
>  wrote:
>> Looks like a config issue. I ran the spark-pi job and it still failed with the
>> same Guava error
>>
>> Command ran:
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077
>> --executor-memory 1G --num-executors 1
>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>
>>
>>
>> I had used the same build steps on Spark 1.1 and had no issue.
>>
>>
>>
>> From: Denny Lee [mailto:denny.g@gmail.com]
>> Sent: Tuesday, November 25, 2014 5:47 PM
>> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>>
>>
>> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError
>> on Guava
>>
>>
>>
>> To determine if this is a Windows vs. other configuration, can you just try
>> to call the Spark-class.cmd SparkSubmit without actually referencing the
>> Hadoop or Thrift server classes?
>>
>>
>>
>>
>>
>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
>> wrote:
>>
>> I traced the code and used the following to call:
>>
>> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
>> --hiveconf hive.server2.thrift.port=1
>>
>>
>>
>> The issue ended up being much more fundamental, however. Spark doesn't work
>> at all in the configuration below. When I open spark-shell, it fails with the
>> same ClassNotFound error.
>>
>> Now I wonder whether this is a Windows-only issue or a problem with the
>> Hive/Hadoop configuration.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs@gmail.com]
>> Sent: Tuesday, November 25, 2014 1:50 AM
>>
>>
>> To: Judy Nash; u...@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError
>> on Guava
>>
>>
>>
>> Oh so you're using Windows. What command are you using to start the Thrift
>> server then?
>>
>> On 11/25/14 4:25 PM, Judy Nash wrote:
>>
>> Made progress but still blocked.
>>
>> After recompiling the code in cmd instead of PowerShell, I can now see all 5
>> classes you mentioned.
>>
>> However I am still seeing the same error as before. Anything else I can
>> check for?
>>
>>
>>
>> From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
>> Sent: Monday, November 24, 2014 11:50 PM
>> To: Cheng Lian; u...@spark.incubator.apache.org
>> Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError
>> on Guava
>>
>>
>>
>> This is what I got from jar tf:
>>
>> org/spark-project/guava/common/base/Preconditions.class
>>
>> org/spark-project/guava/common/math/MathPreconditions.class
>>
>> com/clearspring/analytics/util/Preconditions.class
>>
>> parquet/Preconditions.class
>>
>>
>>
>> I seem to have the class that was reported missing, but I am missing this 
>> file: com/google/inject/internal/util/$Preconditions.class

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-26 Thread Patrick Wendell
Hi Judy,

Are you somehow modifying Spark's classpath to include jars from
Hadoop and Hive that you have running on the machine? The issue seems
to be that you are somehow including a version of Hadoop that
references the original guava package. The Hadoop that is bundled in
the Spark jars should not do this.

- Patrick
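One quick way to check is to print the classpath Spark actually computes. A 
sketch, assuming the bin/compute-classpath.sh script that 1.x ships on Linux 
(Windows builds have a compute-classpath.cmd equivalent):

# Print the computed classpath one entry per line and look for Hadoop
# jars that are not the Spark assembly itself.
./bin/compute-classpath.sh | tr ':' '\n' | grep -i hadoop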

On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash
 wrote:
> Looks like a config issue. I ran the spark-pi job and it still failed with the
> same Guava error
>
> Command ran:
>
> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077
> --executor-memory 1G --num-executors 1
> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>
>
>
> I had used the same build steps on Spark 1.1 and had no issue.
>
>
>
> From: Denny Lee [mailto:denny.g@gmail.com]
> Sent: Tuesday, November 25, 2014 5:47 PM
> To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
>
>
> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError
> on Guava
>
>
>
> To determine if this is a Windows vs. other configuration, can you just try
> to call the Spark-class.cmd SparkSubmit without actually referencing the
> Hadoop or Thrift server classes?
>
>
>
>
>
> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
> wrote:
>
> I traced the code and used the following to call:
>
> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
> --hiveconf hive.server2.thrift.port=1
>
>
>
> The issue ended up being much more fundamental, however. Spark doesn't work
> at all in the configuration below. When I open spark-shell, it fails with the same
> ClassNotFound error.
>
> Now I wonder whether this is a Windows-only issue or a problem with the
> Hive/Hadoop configuration.
>
>
>
> From: Cheng Lian [mailto:lian.cs@gmail.com]
> Sent: Tuesday, November 25, 2014 1:50 AM
>
>
> To: Judy Nash; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError
> on Guava
>
>
>
> Oh so you're using Windows. What command are you using to start the Thrift
> server then?
>
> On 11/25/14 4:25 PM, Judy Nash wrote:
>
> Made progress but still blocked.
>
> After recompiling the code in cmd instead of PowerShell, I can now see all
> 5 classes you mentioned.
>
> However I am still seeing the same error as before. Anything else I can
> check for?
>
>
>
> From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
> Sent: Monday, November 24, 2014 11:50 PM
> To: Cheng Lian; u...@spark.incubator.apache.org
> Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError
> on Guava
>
>
>
> This is what I got from jar tf:
>
> org/spark-project/guava/common/base/Preconditions.class
>
> org/spark-project/guava/common/math/MathPreconditions.class
>
> com/clearspring/analytics/util/Preconditions.class
>
> parquet/Preconditions.class
>
>
>
> I seem to have the class that was reported missing, but I am missing this file:
>
> com/google/inject/internal/util/$Preconditions.class
>
>
>
> Any suggestion on how to fix this?
>
> I very much appreciate the help, as I am very new to Spark and open-source
> technologies.
>
>
>
> From: Cheng Lian [mailto:lian.cs@gmail.com]
> Sent: Monday, November 24, 2014 8:24 PM
> To: Judy Nash; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError
> on Guava
>
>
>
> Hm, I tried exactly the same commit and the build command locally, but
> couldn't reproduce this.
>
> Usually this kind of error is caused by a classpath misconfiguration. Could
> you please try this to ensure the corresponding Guava classes are included in
> the assembly jar you built?
>
> jar tf
> assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar |
> grep Preconditions
>
> On my machine I got these lines (the first line is the one reported as
> missing in your case):
>
> org/spark-project/guava/common/base/Preconditions.class
>
> org/spark-project/guava/common/math/MathPreconditions.class
>
> com/clearspring/analytics/util/Preconditions.class
>
> parquet/Preconditions.class
>
> com/google/inject/internal/util/$Preconditions.class
>
> On 11/25/14 6:25 AM, Judy Nash wrote:
>
> Thank you Cheng for responding.
>
>
> Here is the commit SHA1 on the 1.2 branch I saw this failure in:
>
> commit 6f70e0295572e3037660004797040e026e440dbd
>
> Author: zsxwing 
>
> Date:   Fri Nov 21 00:42:43 2014 -0800
>
>
>

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
Looks like a config issue. I ran the spark-pi job and it still failed with the 
same Guava error
Command ran:
.\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.examples.SparkPi --master spark://headnodehost:7077 
--executor-memory 1G --num-executors 1 
.\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100

I had used the same build steps on Spark 1.1 and had no issue.

From: Denny Lee [mailto:denny.g@gmail.com]
Sent: Tuesday, November 25, 2014 5:47 PM
To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

To determine if this is a Windows vs. other configuration, can you just try to 
call the Spark-class.cmd SparkSubmit without actually referencing the Hadoop or 
Thrift server classes?


On Tue Nov 25 2014 at 5:42:09 PM Judy Nash wrote:
I traced the code and used the following to call:
Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal 
--hiveconf hive.server2.thrift.port=1

The issue ended up being much more fundamental, however. Spark doesn't work at 
all in the configuration below. When I open spark-shell, it fails with the same 
ClassNotFound error.
Now I wonder whether this is a Windows-only issue or a problem with the 
Hive/Hadoop configuration.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Tuesday, November 25, 2014 1:50 AM

To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Oh so you're using Windows. What command are you using to start the Thrift 
server then?
On 11/25/14 4:25 PM, Judy Nash wrote:
Made progress but still blocked.
After recompiling the code in cmd instead of PowerShell, I can now see all 5 
classes you mentioned.

However I am still seeing the same error as before. Anything else I can check 
for?

From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
Sent: Monday, November 24, 2014 11:50 PM
To: Cheng Lian; u...@spark.incubator.apache.org
Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

This is what I got from jar tf:
org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class

I seem to have the class that was reported missing, but I am missing this file:

com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?
I very much appreciate the help, as I am very new to Spark and open-source 
technologies.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Monday, November 24, 2014 8:24 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava


Hm, I tried exactly the same commit and the build command locally, but couldn’t 
reproduce this.

Usually this kind of error is caused by a classpath misconfiguration. Could you 
please try this to ensure the corresponding Guava classes are included in the 
assembly jar you built?

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar 
| grep Preconditions

On my machine I got these lines (the first line is the one reported as missing 
in your case):

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:
Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:
commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing <zsxw...@gmail.com>
Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...

... successfully

It's weird that it prints "Spark context available as sc" even when the 
SparkContext was not created successfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? 
Thanks!
On 11/22/14 11:05 AM, Judy Nash wrote:
Hi,

The Thrift server is failing to start for me on the latest Spark 1.2 branch.

I get the error below when I start the Thrift server.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Denny Lee
To determine if this is a Windows vs. other configuration, can you just try
to call the Spark-class.cmd SparkSubmit without actually referencing the
Hadoop or Thrift server classes?
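For example, something as small as the following exercises SparkSubmit and the
computed classpath without touching the Hadoop or Thrift entry points (a
sketch; treat the --version flag as an assumption about 1.x SparkSubmit):

.\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --version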


On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
wrote:

>  I traced the code and used the following to call:
>
> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
> --hiveconf hive.server2.thrift.port=1
>
>
>
> The issue ended up being much more fundamental, however. Spark doesn't work
> at all in the configuration below. When I open spark-shell, it fails with the
> same ClassNotFound error.
>
> Now I wonder whether this is a Windows-only issue or a problem with the
> Hive/Hadoop configuration.
>
>
>
> From: Cheng Lian [mailto:lian.cs@gmail.com]
> Sent: Tuesday, November 25, 2014 1:50 AM
>
>
> To: Judy Nash; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with
> NoClassDefFoundError on Guava
>
>
>
> Oh so you're using Windows. What command are you using to start the Thrift
> server then?
>
> On 11/25/14 4:25 PM, Judy Nash wrote:
>
> Made progress but still blocked.
>
> After recompiling the code in cmd instead of PowerShell, I can now see all
> 5 classes you mentioned.
>
>
>  However I am still seeing the same error as before. Anything else I can
> check for?
>
>
>
> From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
> Sent: Monday, November 24, 2014 11:50 PM
> To: Cheng Lian; u...@spark.incubator.apache.org
> Subject: RE: latest Spark 1.2 thrift server fail with
> NoClassDefFoundError on Guava
>
>
>
> This is what I got from jar tf:
>
> org/spark-project/guava/common/base/Preconditions.class
>
> org/spark-project/guava/common/math/MathPreconditions.class
>
> com/clearspring/analytics/util/Preconditions.class
>
> parquet/Preconditions.class
>
>
>
> I seem to have the class that was reported missing, but I am missing this file:
>
> com/google/inject/internal/util/$Preconditions.class
>
>
>
> Any suggestion on how to fix this?
>
> I very much appreciate the help, as I am very new to Spark and open-source
> technologies.
>
>
>
> From: Cheng Lian [mailto:lian.cs@gmail.com]
> Sent: Monday, November 24, 2014 8:24 PM
> To: Judy Nash; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with
> NoClassDefFoundError on Guava
>
>
>
> Hm, I tried exactly the same commit and the build command locally, but
> couldn’t reproduce this.
>
> Usually this kind of error is caused by a classpath misconfiguration.
> Could you please try this to ensure the corresponding Guava classes are
> included in the assembly jar you built?
>
> jar tf 
> assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | 
> grep Preconditions
>
> On my machine I got these lines (the first line is the one reported as
> missing in your case):
>
> org/spark-project/guava/common/base/Preconditions.class
>
> org/spark-project/guava/common/math/MathPreconditions.class
>
> com/clearspring/analytics/util/Preconditions.class
>
> parquet/Preconditions.class
>
> com/google/inject/internal/util/$Preconditions.class
>
> On 11/25/14 6:25 AM, Judy Nash wrote:
>
> Thank you Cheng for responding.
>
>
> Here is the commit SHA1 on the 1.2 branch I saw this failure in:
>
> commit 6f70e0295572e3037660004797040e026e440dbd
>
> Author: zsxwing  
>
> Date:   Fri Nov 21 00:42:43 2014 -0800
>
>
>
> [SPARK-4472][Shell] Print "Spark context available as sc." only when
> SparkContext is created...
>
>
>
> ... successfully
>
>
>
> It's weird that it prints "Spark context available as sc" even when the
> SparkContext was not created successfully.
>
>
>
> Let me know if you need anything else.
>
>
>
> From: Cheng Lian [mailto:lian.cs@gmail.com]
> Sent: Friday, November 21, 2014 8:02 PM
> To: Judy Nash; u...@spark.incubator.apache.org
> Subject: Re: latest Spark 1.2 thrift server fail with
> NoClassDefFoundError on Guava
>
>
>
> Hi Judy, could you please provide the commit SHA1 of the version you're
> using? Thanks!
>
> On 11/22/14 11:05 AM, Judy Nash wrote:
>
> Hi,
>
>
>
> The Thrift server is failing to start for me on the latest Spark 1.2 branch.
>
>
>
> I get the error below when I start the Thrift server.
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/base/Preconditions
>
> at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
I traced the code and used the following to call:
Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal 
--hiveconf hive.server2.thrift.port=1

The issue ended up being much more fundamental, however. Spark doesn't work at 
all in the configuration below. When I open spark-shell, it fails with the same 
ClassNotFound error.
Now I wonder whether this is a Windows-only issue or a problem with the 
Hive/Hadoop configuration.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Tuesday, November 25, 2014 1:50 AM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Oh so you're using Windows. What command are you using to start the Thrift 
server then?
On 11/25/14 4:25 PM, Judy Nash wrote:
Made progress but still blocked.
After recompiling the code in cmd instead of PowerShell, I can now see all 5 
classes you mentioned.


However I am still seeing the same error as before. Anything else I can check 
for?

From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
Sent: Monday, November 24, 2014 11:50 PM
To: Cheng Lian; u...@spark.incubator.apache.org
Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

This is what I got from jar tf:
org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class

I seem to have the class that was reported missing, but I am missing this file:

com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?
I very much appreciate the help, as I am very new to Spark and open-source 
technologies.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Monday, November 24, 2014 8:24 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava


Hm, I tried exactly the same commit and the build command locally, but couldn’t 
reproduce this.

Usually this kind of error is caused by a classpath misconfiguration. Could you 
please try this to ensure the corresponding Guava classes are included in the 
assembly jar you built?

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar 
| grep Preconditions

On my machine I got these lines (the first line is the one reported as missing 
in your case):

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:
Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:
commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing <zsxw...@gmail.com>
Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...

... successfully

It's weird that it prints "Spark context available as sc" even when the 
SparkContext was not created successfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? 
Thanks!
On 11/22/14 11:05 AM, Judy Nash wrote:
Hi,

The Thrift server is failing to start for me on the latest Spark 1.2 branch.

I get the error below when I start the Thrift server.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1)  Latest spark 1.2 branch build

2)  Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package

3)  Added hive-site.xml to \conf

4)  Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---
Full Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Cheng Lian
Oh so you're using Windows. What command are you using to start the 
Thrift server then?


On 11/25/14 4:25 PM, Judy Nash wrote:


Made progress but still blocked.

After recompiling the code in cmd instead of PowerShell, I can now see 
all 5 classes you mentioned.


However I am still seeing the same error as before. Anything else I 
can check for?


From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
Sent: Monday, November 24, 2014 11:50 PM
To: Cheng Lian; u...@spark.incubator.apache.org
Subject: RE: latest Spark 1.2 thrift server fail with 
NoClassDefFoundError on Guava


This is what I got from jar tf:

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

I seem to have the class that was reported missing, but I am missing this 
file:


com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?

I very much appreciate the help, as I am very new to Spark and open-source 
technologies.


From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Monday, November 24, 2014 8:24 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with 
NoClassDefFoundError on Guava


Hm, I tried exactly the same commit and the build command locally, but 
couldn’t reproduce this.


Usually this kind of error is caused by a classpath misconfiguration. 
Could you please try this to ensure the corresponding Guava classes are 
included in the assembly jar you built?


jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | grep Preconditions

On my machine I got these lines (the first line is the one reported as 
missing in your case):


org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class
com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:

Thank you Cheng for responding.


Here is the commit SHA1 on the 1.2 branch I saw this failure in:

commit 6f70e0295572e3037660004797040e026e440dbd

Author: zsxwing <zsxw...@gmail.com>

Date: Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only
when SparkContext is created...

... successfully

It's weird that it prints "Spark context available as sc" even when
the SparkContext was not created successfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with
NoClassDefFoundError on Guava

Hi Judy, could you please provide the commit SHA1 of the version
you're using? Thanks!

On 11/22/14 11:05 AM, Judy Nash wrote:

Hi,

The Thrift server is failing to start for me on the latest Spark 1.2
branch.

I get the error below when I start the Thrift server.

Exception in thread "main" java.lang.NoClassDefFoundError:
com/google/common/base/Preconditions

at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1) Latest spark 1.2 branch build

2) Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive
-Phive-thriftserver -DskipTests clean package

3) Added hive-site.xml to \conf

4) Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---

Full Stacktrace:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions

at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)

at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)

at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)

at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)

at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)

at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)

at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)

at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
Made progress but still blocked.
After recompiling the code in cmd instead of PowerShell, I can now see all 5 
classes you mentioned.

However I am still seeing the same error as before. Anything else I can check 
for?

From: Judy Nash [mailto:judyn...@exchange.microsoft.com]
Sent: Monday, November 24, 2014 11:50 PM
To: Cheng Lian; u...@spark.incubator.apache.org
Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

This is what I got from jar tf:
org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class

I seem to have the class that was reported missing, but I am missing this file:

com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?
I very much appreciate the help, as I am very new to Spark and open-source 
technologies.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Monday, November 24, 2014 8:24 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava


Hm, I tried exactly the same commit and the build command locally, but couldn’t 
reproduce this.

Usually this kind of error is caused by a classpath misconfiguration. Could you 
please try this to ensure the corresponding Guava classes are included in the 
assembly jar you built?

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar 
| grep Preconditions

On my machine I got these lines (the first line is the one reported as missing 
in your case):

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:
Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:
commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing <zsxw...@gmail.com>
Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...

... successfully

It's weird that it prints "Spark context available as sc" even when the 
SparkContext was not created successfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? 
Thanks!
On 11/22/14 11:05 AM, Judy Nash wrote:
Hi,

The Thrift server is failing to start for me on the latest Spark 1.2 branch.

I get the error below when I start the Thrift server.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1)  Latest spark 1.2 branch build

2)  Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package

3)  Added hive-site.xml to \conf

4)  Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---
Full Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-24 Thread Judy Nash
This is what I got from jar tf:
org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class

I seem to have the class that was reported missing, but I am missing this file:

com/google/inject/internal/util/$Preconditions.class

Any suggestion on how to fix this?
I very much appreciate the help, as I am very new to Spark and open-source 
technologies.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Monday, November 24, 2014 8:24 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava


Hm, I tried exactly the same commit and the build command locally, but couldn’t 
reproduce this.

Usually this kind of error is caused by a classpath misconfiguration. Could you 
please try this to ensure the corresponding Guava classes are included in the 
assembly jar you built?

jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar 
| grep Preconditions

On my machine I got these lines (the first line is the one reported as missing 
in your case):

org/spark-project/guava/common/base/Preconditions.class

org/spark-project/guava/common/math/MathPreconditions.class

com/clearspring/analytics/util/Preconditions.class

parquet/Preconditions.class

com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:
Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:
commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing <zsxw...@gmail.com>
Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...

... successfully

It's weird that it prints "Spark context available as sc" even when the 
SparkContext was not created successfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? 
Thanks!
On 11/22/14 11:05 AM, Judy Nash wrote:
Hi,

The Thrift server is failing to start for me on the latest Spark 1.2 branch.

I get the error below when I start the Thrift server.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1)  Latest spark 1.2 branch build

2)  Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package

3)  Added hive-site.xml to \conf

4)  Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?
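
One way to narrow this down is to let the JVM report where it loads each class from. A minimal sketch, assuming a 1.2-era checkout where sbin/start-thriftserver.sh forwards extra arguments to spark-submit:

# -verbose:class prints the code source of every loaded class, so a stray
# Hadoop or Guava jar on the classpath shows up immediately.
./sbin/start-thriftserver.sh --driver-java-options "-verbose:class" 2>&1 \
  | grep -E 'Configuration|Preconditions'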

---
Full Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-24 Thread Cheng Lian
Hm, I tried exactly the same commit and the build command locally, but 
couldn’t reproduce this.


Errors like this are usually caused by a classpath misconfiguration. 
Could you please run the following to make sure the corresponding Guava 
classes are included in the assembly jar you built?


jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | grep Preconditions

On my machine I got these lines (the first line is the one reported as 
missing in your case):


org/spark-project/guava/common/base/Preconditions.class
org/spark-project/guava/common/math/MathPreconditions.class
com/clearspring/analytics/util/Preconditions.class
parquet/Preconditions.class
com/google/inject/internal/util/$Preconditions.class

On 11/25/14 6:25 AM, Judy Nash wrote:


Thank you Cheng for responding.


Here is the commit SHA1 on the 1.2 branch I saw this failure in:

commit 6f70e0295572e3037660004797040e026e440dbd

Author: zsxwing 

Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...


... successfully

It's weird that "Spark context available as sc" is printed even when 
SparkContext creation fails.


Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava


Hi Judy, could you please provide the commit SHA1 of the version 
you're using? Thanks!


On 11/22/14 11:05 AM, Judy Nash wrote:

Hi,

Thrift server is failing to start for me on latest spark 1.2 branch.

I got the error below when I start thrift server.

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1) Latest spark 1.2 branch build

2) Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package

3) Added hive-site.xml to \conf

4) Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---

Full Stacktrace:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-24 Thread Judy Nash
Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:
commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing 
Date:   Fri Nov 21 00:42:43 2014 -0800

[SPARK-4472][Shell] Print "Spark context available as sc." only when 
SparkContext is created...

... successfully

It's weird that "Spark context available as sc" is printed even when 
SparkContext creation fails.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; u...@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on 
Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? 
Thanks!
On 11/22/14 11:05 AM, Judy Nash wrote:
Hi,

Thrift server is failing to start for me on latest spark 1.2 branch.

I got the error below when I start thrift server.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)

Here is my setup:

1)  Latest spark 1.2 branch build

2)  Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package

3)  Added hive-site.xml to \conf

4)  Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---
Full Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
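
Since the JVM resolves a class from the first classpath entry that contains it, the computed classpath ordering is worth a look. A minimal sketch, assuming the 1.2-era bin/compute-classpath.sh script is present in your build:

# Print the classpath one entry per line, numbered; whichever jar comes first
# and contains org/apache/hadoop/conf/Configuration is the copy that wins.
./bin/compute-classpath.sh | tr ':' '\n' | nl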



Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-21 Thread Cheng Lian
Hi Judy, could you please provide the commit SHA1 of the version you're 
using? Thanks!
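
For reference, a minimal way to get that SHA1 from the source tree the build came from:

# Run inside the Spark git checkout that produced the build.
git rev-parse HEAD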


On 11/22/14 11:05 AM, Judy Nash wrote:


Hi,

Thrift server is failing to start for me on latest spark 1.2 branch.

I got the error below when I start thrift server.

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)….

Here is my setup:

1) Latest spark 1.2 branch build

2) Used build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package

3) Added hive-site.xml to \conf

4) Version on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?

---

Full Stacktrace:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)