Re: NoClassDefFoundError: scala/Product$class

2020-06-07 Thread charles_cai
The org.bdgenomics.adam library is one of the components of GATK, and I just
downloaded the release version from its GitHub website. However, when I build
a new docker image with Spark 2.4.5 and Scala 2.12.4, it works well, which
makes me confused.


root@master2:~# pyspark 
Python 2.7.17 (default, Apr 15 2020, 17:20:14) 
[GCC 7.5.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
20/06/08 01:44:16 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.5
      /_/

Using Python version 2.7.17 (default, Apr 15 2020 17:20:14)
SparkSession available as 'spark'.


root@master2:~# scala -version
Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and
Lightbend, Inc.
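
Note that scala -version only reports the standalone Scala installed on the
machine; Spark bundles its own Scala library, so what matters is the Scala
version the jars on Spark's classpath were compiled against. A quick way to
see the version Spark itself runs on, from inside spark-shell:

    scala> util.Properties.versionString

If that says 2.11.x while a library on the classpath is a _2.12 artifact (or
vice versa), errors like scala/Product$class are expected.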







Re: NoClassDefFoundError: scala/Product$class

2020-06-06 Thread James Moore
How are you depending on that org.bdgenomics.adam library?  Maybe you're
pulling the 2.11 version of that.
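
If the build is Maven-based, one quick way to check (the same
dependency:tree trick used elsewhere on this list) is:

    mvn dependency:tree -Dincludes=org.bdgenomics.adam

Any artifact there ending in _2.11 still references scala.Product$class,
which no longer exists under Scala 2.12.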


Re: NoClassDefFoundError: scala/Product$class

2020-06-06 Thread Sean Owen
Spark 3 supports only Scala 2.12. This actually sounds like a third-party
library compiled for 2.11 or something.
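
For background: Scala 2.11 compiled trait method bodies into static helper
classes such as scala.Product$class, while 2.12 emits Java 8 default methods
instead, so that helper class simply does not exist on a 2.12 classpath. If
the job is built with sbt, using %% keeps every dependency on the build's
Scala binary version; a minimal sketch (the ADAM coordinates and versions
here are assumptions, not the exact GATK ones):

    // build.sbt sketch: %% resolves to the _2.12 artifact automatically;
    // a hard-coded _2.11 suffix would reintroduce the missing
    // scala.Product$class reference
    scalaVersion := "2.12.10"
    libraryDependencies += "org.bdgenomics.adam" %% "adam-core-spark3" % "0.32.0"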

On Fri, Jun 5, 2020 at 11:11 PM charles_cai <1620075...@qq.com> wrote:

> Hi Pol,
>
> thanks for your suggestion. I am going to use Spark 3.0.0 for GPU
> acceleration, so I updated Scala to *version 2.12.11* and then to the latest
> *2.13*, but the error is still there. By the way, the Spark version is
> *spark-3.0.0-preview2-bin-without-hadoop*.
>
> Caused by: java.lang.ClassNotFoundException: scala.Product$class
> at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>
> Charles cai
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: NoClassDefFoundError: scala/Product$class

2020-06-05 Thread charles_cai
Hi Pol, 

thanks for your suggestion. I am going to use Spark 3.0.0 for GPU
acceleration, so I updated Scala to *version 2.12.11* and then to the latest
*2.13*, but the error is still there. By the way, the Spark version is
*spark-3.0.0-preview2-bin-without-hadoop*.

Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)

Charles cai






Re: NoClassDefFoundError: scala/Product$class

2020-06-03 Thread Pol Santamaria
Hi Charles,

I believe Spark 3.0 removed support for Scala 2.11, and that error is a
version-compatibility issue. You should try Spark 2.4.5 with your current
setup (it works with Scala 2.11 by default).

Pol Santamaria

On Wed, Jun 3, 2020 at 7:44 AM charles_cai <1620075...@qq.com> wrote:

> Hi,
>
> I run GATK MarkDuplicates in Spark mode and it throws a
> *NoClassDefFoundError: scala/Product$class*. The GATK versions are 4.1.7 and
> 4.0.0; the environment is spark-3.0.0, scala-2.11.12
>
> *GATK commands:*
>
> gatk MarkDuplicatesSpark \
> -I hdfs://master2:9000/Drosophila/output/Drosophila.sorted.bam \
> -O hdfs://master2:9000/Drosophila/output/Drosophila.sorted.markdup.bam \
> -M
> hdfs://master2:9000/Drosophila/output/Drosophila.sorted.markdup_metrics.txt
> \
> -- \
> --spark-runner SPARK --spark-master spark://master2:7077
>
> *error logs:*
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> scala/Product$class
>at
> org.bdgenomics.adam.serialization.InputStreamWithDecoder.<init>(ADAMKryoRegistrator.scala:35)
>
>at
> org.bdgenomics.adam.serialization.AvroSerializer.<init>(ADAMKryoRegistrator.scala:45)
>
>at
> org.bdgenomics.adam.models.VariantContextSerializer.<init>(VariantContext.scala:94)
>
>at
> org.bdgenomics.adam.serialization.ADAMKryoRegistrator.registerClasses(ADAMKryoRegistrator.scala:179)
>
>at
> org.broadinstitute.hellbender.engine.spark.GATKRegistrator.registerClasses(GATKRegistrator.java:78)
>
>at
> org.apache.spark.serializer.KryoSerializer.$anonfun$newKryo$8(KryoSerializer.scala:170)
>
>at
> org.apache.spark.serializer.KryoSerializer.$anonfun$newKryo$8$adapted(KryoSerializer.scala:170)
>
>at scala.Option.foreach(Option.scala:407)
>at
> org.apache.spark.serializer.KryoSerializer.$anonfun$newKryo$5(KryoSerializer.scala:170)
>
>at
> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
>at
> org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:221)
>at
> org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:161)
>
>at
> org.apache.spark.serializer.KryoSerializer$$anon$1.create(KryoSerializer.scala:102)
>
>at
> com.esotericsoftware.kryo.pool.KryoPoolQueueImpl.borrow(KryoPoolQueueImpl.java:48)
>
>at
> org.apache.spark.serializer.KryoSerializer$PoolWrapper.borrow(KryoSerializer.scala:109)
>
>at
> org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:336)
>
>at
> org.apache.spark.serializer.KryoSerializationStream.<init>(KryoSerializer.scala:256)
>
>at
> org.apache.spark.serializer.KryoSerializerInstance.serializeStream(KryoSerializer.scala:422)
>
>at
> org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:309)
>
>at
> org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:137)
>
>at
> org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:91)
>
>at
> org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:35)
>
>at
> org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:77)
>
>at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1494)
>at org.apache.spark.rdd.NewHadoopRDD.<init>(NewHadoopRDD.scala:80)
>at
> org.apache.spark.SparkContext.$anonfun$newAPIHadoopFile$2(SparkContext.scala:1235)
>
>at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>
>at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>
>at org.apache.spark.SparkContext.withScope(SparkContext.scala:771)
>at
> org.apache.spark.SparkContext.newAPIHadoopFile(SparkContext.scala:1221)
>at
> org.apache.spark.api.java.JavaSparkContext.newAPIHadoopFile(JavaSparkContext.scala:484)
>
>at
> org.broadinstitute.hellbender.engine.spark.datasources.ReadsSparkSource.getParallelReads(ReadsSparkSource.java:112)
>at
> org.broadinstitute.hellbender.engine.spark.GATKSparkTool.getUnfilteredReads(GATKSparkTool.java:254)
>
>at
> org.broadinstitute.hellbender.engine.spark.GATKSparkTool.getReads(GATKSparkTool.java:220)
>
>at
> org.broadinstitute.hellbender.tools.spark.transforms.markduplicates.MarkDuplicatesSpark.runTool(MarkDuplicatesSpark.java:72)
>at
> org.broadinstitute.hellbender.engine.spark.GATKSparkTool.runPipeline(GATKSparkTool.java:387)
>
>at
> org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:30)
>at
> org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:136)
>
>at
> org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:179)
>at
> org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLinePro

Re: NoClassDefFoundError

2016-12-21 Thread Vadim Semenov
You better ask folks in the spark-jobserver gitter channel:
https://github.com/spark-jobserver/spark-jobserver

On Wed, Dec 21, 2016 at 8:02 AM, Reza zade  wrote:

> Hello
>
> I've extended the JavaSparkJob (job-server-0.6.2) and created an object
> of SQLContext class. my maven project doesn't have any problem during
> compile and packaging phase. but when I send .jar of project to sjs and run
> it "NoClassDefFoundError" will be issued. the trace of exception is :
>
>
> job-server[ERROR] Exception in thread "pool-20-thread-1"
> java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
> job-server[ERROR]  at sparkdesk.SparkSQLJob2.runJob(SparkSQLJob2.java:61)
> job-server[ERROR]  at sparkdesk.SparkSQLJob2.runJob(SparkSQLJob2.java:45)
> job-server[ERROR]  at spark.jobserver.JavaSparkJob.r
> unJob(JavaSparkJob.scala:17)
> job-server[ERROR]  at spark.jobserver.JavaSparkJob.r
> unJob(JavaSparkJob.scala:14)
> job-server[ERROR]  at spark.jobserver.JobManagerActo
> r$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.
> apply(JobManagerActor.scala:301)
> job-server[ERROR]  at scala.concurrent.impl.Future$P
> romiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> job-server[ERROR]  at scala.concurrent.impl.Future$P
> romiseCompletingRunnable.run(Future.scala:24)
> job-server[ERROR]  at java.util.concurrent.ThreadPoo
> lExecutor.runWorker(ThreadPoolExecutor.java:1145)
> job-server[ERROR]  at java.util.concurrent.ThreadPoo
> lExecutor$Worker.run(ThreadPoolExecutor.java:615)
> job-server[ERROR]  at java.lang.Thread.run(Thread.java:745)
> job-server[ERROR] Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.sql.SQLContext
> job-server[ERROR]  at java.net.URLClassLoader$1.run(
> URLClassLoader.java:366)
> job-server[ERROR]  at java.net.URLClassLoader$1.run(
> URLClassLoader.java:355)
> job-server[ERROR]  at java.security.AccessController.doPrivileged(Native
> Method)
> job-server[ERROR]  at java.net.URLClassLoader.findCl
> ass(URLClassLoader.java:354)
> job-server[ERROR]  at java.lang.ClassLoader.loadClas
> s(ClassLoader.java:425)
> job-server[ERROR]  at java.lang.ClassLoader.loadClas
> s(ClassLoader.java:358)
> job-server[ERROR]  ... 10 more
>
>
> what is the problem?
> do you have any solution about this?
>
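
For reference, this particular trace usually means spark-sql is not on the
classpath the job server hands to the job at runtime. A minimal sketch of the
Maven dependency the job would typically declare (the Spark and Scala
versions here are assumptions and must match the cluster):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.6.2</version>
      <!-- provided if the server already ships Spark SQL on its classpath -->
      <scope>provided</scope>
    </dependency>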


Re: NoClassDefFoundError: org/apache/spark/Logging in SparkSession.getOrCreate

2016-10-17 Thread Saisai Shao
Not sure why your code will search Logging class under org/apache/spark,
this should be “org/apache/spark/internal/Logging”, and it changed long
time ago.
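
A quick way to find which dependency still drags in a pre-2.0 Spark (where
org.apache.spark.Logging lived) is to inspect the dependency tree, for
example with Maven:

    mvn dependency:tree -Dincludes=org.apache.spark

Anything that resolves spark-core 1.x next to the 2.x artifacts is the
likely culprit.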


On Sun, Oct 16, 2016 at 3:25 AM, Brad Cox  wrote:

> I'm experimenting with Spark 2.0.1 for the first time and hitting a
> problem right out of the gate.
>
> My main routine starts with this which I think is the standard idiom.
>
> SparkSession sparkSession = SparkSession
> .builder()
> .master("local")
> .appName("DecisionTreeExample")
> .getOrCreate();
>
> Running this in the eclipse debugger, execution fails in getOrCreate()
> with this exception
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/Logging
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:122)
> at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:77)
> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:840)
> at titanic.DecisionTreeExample.main(DecisionTreeExample.java:54)
>
> java.lang.NoClassDefFoundError means a class is not found at run time that
> was present at
> compile time. I've googled everything I can think of and found no
> solutions. Can someone
> help? Thanks!
>
> These are my spark-relevant dependencies:
>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-core_2.11</artifactId>
>   <version>2.0.1</version>
> </dependency>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-mllib_2.11</artifactId>
>   <version>2.0.1</version>
> </dependency>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-sql_2.11</artifactId>
>   <version>2.0.1</version>
> </dependency>
>
>
>
> Dr. Brad J. CoxCell: 703-594-1883 Skype: dr.brad.cox
>
>
>
>
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: NoClassDefFoundError with ZonedDateTime

2016-07-24 Thread Timur Shenkao
Which version of Java 8 do you use? AFAIK, it's recommended to use Java
1.8.0_66 or later.

On Fri, Jul 22, 2016 at 8:49 PM, Jacek Laskowski  wrote:

> On Fri, Jul 22, 2016 at 6:43 AM, Ted Yu  wrote:
> > You can use this command (assuming log aggregation is turned on):
> >
> > yarn logs --applicationId XX
>
> I don't think it's gonna work for an already-running application (and I
> wish I were mistaken, since I needed it just yesterday); you have to
> resort to the stderr of the ApplicationMaster in container 1.
>
> Jacek
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: NoClassDefFoundError with ZonedDateTime

2016-07-22 Thread Jacek Laskowski
On Fri, Jul 22, 2016 at 6:43 AM, Ted Yu  wrote:
> You can use this command (assuming log aggregation is turned on):
>
> yarn logs --applicationId XX

I don't think it's gonna work for an already-running application (and I
wish I were mistaken, since I needed it just yesterday); you have to
resort to the stderr of the ApplicationMaster in container 1.

Jacek




Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ted Yu
You can use this command (assuming log aggregation is turned on):

yarn logs --applicationId XX

In the log, you should see snippet such as the following:

java.class.path=...

FYI
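
A programmatic alternative, since the NoClassDef shows up on the data nodes:
run a tiny job so each executor reports its own Java version and classpath.
A minimal sketch for spark-shell (it assumes enough tasks are spawned to
touch every node):

    val info = sc.parallelize(1 to 1000)
      .map(_ => (System.getProperty("java.version"),
                 System.getProperty("java.class.path")))
      .distinct()
      .collect()
    info.foreach(println)

If any executor prints a pre-1.8 java.version, that node is the one missing
ZonedDateTime.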

On Thu, Jul 21, 2016 at 9:38 PM, Ilya Ganelin  wrote:

> what's the easiest way to get the Classpath for the spark application
> itself?
>
> On Thu, Jul 21, 2016 at 9:37 PM Ted Yu  wrote:
>
>> Might be classpath issue.
>>
>> Mind pastebin'ning the effective class path ?
>>
>> Stack trace of NoClassDefFoundError may also help provide some clue.
>>
>> On Thu, Jul 21, 2016 at 8:26 PM, Ilya Ganelin  wrote:
>>
>>> Hello - I'm trying to deploy the Spark TimeSeries library in a new
>>> environment. I'm running Spark 1.6.1 submitted through YARN in a cluster
>>> with Java 8 installed on all nodes but I'm getting the NoClassDef at
>>> runtime when trying to create a new TimeSeriesRDD. Since ZonedDateTime is
>>> part of Java 8 I feel like I shouldn't need to do anything else. The weird
>>> thing is I get it on the data nodes, not the driver. Any thoughts on what's
>>> causing this or how to track it down? Would appreciate the help.
>>>
>>> Thanks!
>>>
>>
>>


Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ilya Ganelin
what's the easiest way to get the Classpath for the spark application
itself?
On Thu, Jul 21, 2016 at 9:37 PM Ted Yu  wrote:

> Might be classpath issue.
>
> Mind pastebin'ning the effective class path ?
>
> Stack trace of NoClassDefFoundError may also help provide some clue.
>
> On Thu, Jul 21, 2016 at 8:26 PM, Ilya Ganelin  wrote:
>
>> Hello - I'm trying to deploy the Spark TimeSeries library in a new
>> environment. I'm running Spark 1.6.1 submitted through YARN in a cluster
>> with Java 8 installed on all nodes but I'm getting the NoClassDef at
>> runtime when trying to create a new TimeSeriesRDD. Since ZonedDateTime is
>> part of Java 8 I feel like I shouldn't need to do anything else. The weird
>> thing is I get it on the data nodes, not the driver. Any thoughts on what's
>> causing this or how to track it down? Would appreciate the help.
>>
>> Thanks!
>>
>
>


Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ted Yu
Might be classpath issue.

Mind pastebin'ning the effective class path ?

Stack trace of NoClassDefFoundError may also help provide some clue.

On Thu, Jul 21, 2016 at 8:26 PM, Ilya Ganelin  wrote:

> Hello - I'm trying to deploy the Spark TimeSeries library in a new
> environment. I'm running Spark 1.6.1 submitted through YARN in a cluster
> with Java 8 installed on all nodes but I'm getting the NoClassDef at
> runtime when trying to create a new TimeSeriesRDD. Since ZonedDateTime is
> part of Java 8 I feel like I shouldn't need to do anything else. The weird
> thing is I get it on the data nodes, not the driver. Any thoughts on what's
> causing this or how to track it down? Would appreciate the help.
>
> Thanks!
>


RE: NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2015-07-29 Thread Benjamin Ross
Hey Ted,
Thanks for the quick response.  Sadly, all of those are 2.10.x:
$ mvn dependency:tree | grep -A 2 -B 2 org.scala-lang
[INFO] |  |  \- org.tukaani:xz:jar:1.0:compile
[INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
[INFO] +- org.scala-lang:scala-library:jar:2.10.4:compile
[INFO] +- org.apache.spark:spark-core_2.10:jar:1.4.1:compile
[INFO] |  +- com.twitter:chill_2.10:jar:0.5.0:compile
--
[INFO] |  |  \- org.json4s:json4s-core_2.10:jar:3.2.10:compile
[INFO] |  | +- org.json4s:json4s-ast_2.10:jar:3.2.10:compile
[INFO] |  | \- org.scala-lang:scalap:jar:2.10.0:compile
[INFO] |  +- com.sun.jersey:jersey-server:jar:1.9:compile
[INFO] |  |  \- asm:asm:jar:3.1:compile
--
[INFO] +- org.apache.spark:spark-sql_2.10:jar:1.4.1:compile
[INFO] |  +- org.apache.spark:spark-catalyst_2.10:jar:1.4.1:compile
[INFO] |  |  +- org.scala-lang:scala-compiler:jar:2.10.4:compile
[INFO] |  |  \- org.scalamacros:quasiquotes_2.10:jar:2.0.1:compile
[INFO] |  \- org.jodd:jodd-core:jar:3.6.3:compile
--
[INFO] |  +- org.joda:joda-convert:jar:1.2:compile
[INFO] |  +- com.twitter:jsr166e:jar:1.1.0:compile
[INFO] |  \- org.scala-lang:scala-reflect:jar:2.10.5:compile
[INFO] +- com.datastax.spark:spark-cassandra-connector-java_2.10:jar:1.2.4:compile
[INFO] +- commons-codec:commons-codec:jar:1.4:compile

Ben


From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Wednesday, July 29, 2015 8:30 PM
To: Benjamin Ross
Cc: user@spark.apache.org
Subject: Re: NoClassDefFoundError: scala/collection/GenTraversableOnce$class

You can generate dependency tree using:

mvn dependency:tree

and grep for 'org.scala-lang' in the output to see if there is any clue.

Cheers

On Wed, Jul 29, 2015 at 5:14 PM, Benjamin Ross <br...@lattice-engines.com> wrote:
Hello all,
I’m new to both spark and scala, and am running into an annoying error 
attempting to prototype some spark functionality.  From forums I’ve read 
online, this error should only present itself if there’s a version mismatch 
between the version of scala used to compile spark and the scala version that 
I’m using.  However, that’s not the case for me.  I’m using scala 2.10.4, and 
spark was compiled against scala 2.10.x.  Perhaps I’m missing something here.

Also, the NoClassDefFoundError presents itself when debugging in eclipse, but 
running directly via the jar, the following error appears:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
at com.latticeengines.test.CassandraTest.main(CassandraTest.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 1 more

I am getting the following warning when trying to invoke maven, but it doesn’t 
seem to be related to the underlying issue:
[INFO] Checking for multiple versions of scala
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.mycompany:test:2.0.5-SNAPSHOT requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires scala 
version: 2.10.4
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires scala 
version: 2.10.4
[WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires scala 
version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.4.1 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] includes = [**/*.scala,**/*.java,]

Here’s the code I’m trying to run:

object CassandraTest {
  def main(args: Array[String]) {
    println("Hello, scala!")

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.driver.extraClassPath",
        "/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar")

    val sc = new SparkContext("local", "test", conf)
    val sqlContext = new SQLContext(sc)
    val df = sqlContext
      .read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("table" -> "kv", "keyspace" -> "test"))
      .load()

    val w = Window.orderBy("value").rowsBetween(-2, 0)
    df.select(mean("value").over(w))
  }
}
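
Worth noting for the "running directly via the jar" failure on
scala.collection.Seq: a plain (non-assembly) jar does not bundle
scala-library, so launching it with java alone needs the Scala runtime on
the classpath explicitly. A sketch with placeholder paths:

    java -cp target/test.jar:/home/bross/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar \
      com.latticeengines.test.CassandraTest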

Re: NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2015-07-29 Thread Ted Yu
You can generate dependency tree using:

mvn dependency:tree

and grep for 'org.scala-lang' in the output to see if there is any clue.

Cheers

On Wed, Jul 29, 2015 at 5:14 PM, Benjamin Ross 
wrote:

>  Hello all,
>
> I’m new to both spark and scala, and am running into an annoying error
> attempting to prototype some spark functionality.  From forums I’ve read
> online, this error should only present itself if there’s a version mismatch
> between the version of scala used to compile spark and the scala version
> that I’m using.  However, that’s not the case for me.  I’m using scala
> 2.10.4, and spark was compiled against scala 2.10.x.  Perhaps I’m missing
> something here.
>
>
>
> Also, the NoClassDefFoundError presents itself when debugging in eclipse,
> but running directly via the jar, the following error appears:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> scala/collection/Seq
>
> at com.latticeengines.test.CassandraTest.main(CassandraTest.scala)
>
> Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
>
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>
> ... 1 more
>
>
>
> I am getting the following warning when trying to invoke maven, but it
> doesn’t seem to be related to the underlying issue:
>
> [INFO] Checking for multiple versions of scala
>
> [WARNING]  Expected all dependencies to require Scala version: 2.10.4
>
> [WARNING]  com.mycompany:test:2.0.5-SNAPSHOT requires scala version: 2.10.4
>
> [WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
>
> [WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires
> scala version: 2.10.4
>
> [WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires
> scala version: 2.10.4
>
> [WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires
> scala version: 2.10.4
>
> [WARNING]  org.apache.spark:spark-core_2.10:1.4.1 requires scala version:
> 2.10.4
>
> [WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version:
> 2.10.0
>
> [WARNING] Multiple versions of scala libraries detected!
>
> [INFO] includes = [**/*.scala,**/*.java,]
>
>
>
> Here’s the code I’m trying to run:
>
>
>
> object CassandraTest {
>
>   def main(args: Array[String]) {
>
> println("Hello, scala!")
>
>
>
> val conf = new SparkConf(true).set("spark.cassandra.connection.host",
> "127.0.0.1").set(
>
> "spark.driver.extraClassPath",
>
>
> "/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar");
>
>
>
> val sc = new SparkContext("local", "test", conf)
>
> val sqlContext = new SQLContext(sc)
>
> val df = sqlContext
>
>   .read
>
>   .format("org.apache.spark.sql.cassandra")
>
>   .options(Map( "table" -> "kv", "keyspace" -> "test"))
>
>   .load()
>
> val w = Window.orderBy("value").rowsBetween(-2, 0)
>
> df.select(mean("value").over(w))
>
>
>
>   }
>
> }
>
>
>
> Here’s my maven file:
>
> 
>
> http://maven.apache.org/POM/4.0.0"; xmlns:xsi="
> http://www.w3.org/2001/XMLSchema-instance";
>
> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
> http://maven.apache.org/maven-v4_0_0.xsd";>
>
>
>
> 4.0.0
>
> test
>
> jar
>
> ${component-name}
>
>
>
> 
>
> le-sparkdb
>
> 2.6.0.2.2.0.0-2041
>
> 2.10.4
>
> 1.4.1
>
> 1.7.7
>
> 1.4.3
>
> 2.0.5-SNAPSHOT
>
> 2.0.5-SNAPSHOT
>
> 2.0.5-SNAPSHOT
>
> 1.2.4
>
> 
>
> 
>
> com.mycompany
>
> le-parent
>
> 2.0.5-SNAPSHOT
>
> le-parent
>
> 
>
>
>
> 
>
> 
>
> 
>
> org.scala-tools
>
> maven-scala-plugin
>
> 
>
> 
>
> 
>
> compile
>
> testCompile
>
> 
>
> 
>
> 
>
> 
>
> 
>
> org.apache.maven.plugins
>
> maven-eclipse-plugin
>
> ${maven.eclipse.version}
>
> 
>
> true
>
> true
>
> 
>
>
> org.scala-ide.sdt.core.scalanature
>
>
> org.eclipse.jdt.core.javanature
>
>   

Re: NoClassDefFoundError for NativeS3FileSystem in pyspark (1.3.1)

2015-05-11 Thread steaz
Solved:

The HADOOP_CONF_DIR wasn't set in spark-env.sh properly
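
For anyone hitting the same thing, the relevant line is a one-liner in
conf/spark-env.sh (the path is an assumption; it should point at the
directory holding core-site.xml and friends):

    export HADOOP_CONF_DIR=/etc/hadoop/conf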








Re: NoClassDefFoundError when trying to run spark application

2015-01-02 Thread Pankaj Narang
Do you assemble an uber jar?

You can use sbt assembly to build the jar and then run it. That should fix
the issue.
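
For sbt users, a minimal sketch of wiring that up (the plugin version is an
assumption): add to project/plugins.sbt

    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

then run sbt assembly and submit the resulting fat jar, which bundles the
external libraries that the plain package jar leaves out.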






Re: NoClassDefFoundError

2014-12-08 Thread Akhil Das
Hi Julius,

You can add those external jars to Spark while creating the SparkContext
(sc.addJar("/path/to/the/jar")). If you are submitting the job using
spark-submit, you can use the --jars option to get those jars shipped.

Thanks
Best Regards
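
A sketch of the spark-submit form (class name and paths are placeholders):

    spark-submit --class com.example.Main \
      --jars /path/to/external-lib1.jar,/path/to/external-lib2.jar \
      my-app.jar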

On Sun, Dec 7, 2014 at 11:05 PM, Julius K  wrote:

> Hi everyone,
> I am new to Spark and encountered a problem.
> I want to use an external library in a java project and compiling
> works fine with maven, but during runtime (locally) I get a
> NoClassDefFoundError.
> Do I have to put the jars somewhere, or tell spark where they are?
>
> I can send the pom.xml and my imports or source code, if this helps you.
>
> Best regards
> Julius Kolbe
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: NoClassDefFoundError

2014-12-07 Thread Ted Yu
See the following threads:

http://search-hadoop.com/m/JW1q5kjNlK
http://search-hadoop.com/m/JW1q5XqSDk

Cheers

On Sun, Dec 7, 2014 at 9:35 AM, Julius K  wrote:

> Hi everyone,
> I am new to Spark and encountered a problem.
> I want to use an external library in a java project and compiling
> works fine with maven, but during runtime (locally) I get a
> NoClassDefFoundError.
> Do I have to put the jars somewhere, or tell spark where they are?
>
> I can send the pom.xml and my imports or source code, if this helps you.
>
> Best regards
> Julius Kolbe
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Michael Armbrust
It is merged!

On Mon, Nov 3, 2014 at 12:06 PM, Terry Siu  wrote:

>  Thanks, Kousuke. I’ll wait till this pull request makes it into the
> master branch.
>
>  -Terry
>
>   From: Kousuke Saruta 
> Date: Monday, November 3, 2014 at 11:11 AM
> To: Terry Siu , "user@spark.apache.org" <
> user@spark.apache.org>
> Subject: Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build
> with hive-0.13.1 profile
>
>  Hi Terry
>
> I think the issue you mentioned will be resolved by following PR.
> https://github.com/apache/spark/pull/3072
>
> - Kousuke
>
> (2014/11/03 10:42), Terry Siu wrote:
>
> I just built the 1.2 snapshot current as of commit 76386e1a23c using:
>
> $ ./make-distribution.sh --tgz --name my-spark --skip-java-test -DskipTests
> -Phadoop-2.4 -Phive -Phive-0.13.1 -Pyarn
>
>  I drop in my Hive configuration files into the conf directory, launch
> spark-shell, and then create my HiveContext, hc. I then issue a “use ”
> command:
>
>  scala> hc.hql(“use ”)
>
>  and receive the following class-not-found error:
>
>  java.lang.NoClassDefFoundError:
> com/esotericsoftware/shaded/org/objenesis/strategy/InstantiatorStrategy
>
> at
> org.apache.hadoop.hive.ql.exec.Utilities.(Utilities.java:925)
>
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1224)
>
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>
> at
> org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:315)
>
> at
> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:286)
>
> at
> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>
> at
> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>
> at
> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>
> at
> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>
> at
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:424)
>
> at
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:424)
>
> at
> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>
> at org.apache.spark.sql.SchemaRDD.(SchemaRDD.scala:103)
>
> at
> org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:111)
>
> at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:115)
>
> at
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:31)
>
> at
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:36)
>
> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:38)
>
> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:40)
>
> at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:42)
>
> at $iwC$$iwC$$iwC$$iwC$$iwC.(:44)
>
> at $iwC$$iwC$$iwC$$iwC.(:46)
>
> at $iwC$$iwC$$iwC.(:48)
>
> at $iwC$$iwC.(:50)
>
> at $iwC.(:52)
>
> at (:54)
>
> at .(:58)
>
> at .()
>
> at .(:7)
>
> at .()
>
> at $print()
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
>
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIva:43)
>
> at java.lang.reflect.Method.invoke(Method.java:606)
>
> at
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>
> at
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125
>
> at
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>
> at
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>
> at
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:8
>
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>
> at
> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>
> at
> org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>
> at org.apache.spark.repl.SparkILoop.loop(SparkILoop.

Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Terry Siu
Thanks, Kousuke. I’ll wait till this pull request makes it into the master 
branch.

-Terry

From: Kousuke Saruta <saru...@oss.nttdata.co.jp>
Date: Monday, November 3, 2014 at 11:11 AM
To: Terry Siu <terry@smartfocus.com>, "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with
hive-0.13.1 profile

Hi Terry

I think the issue you mentioned will be resolved by following PR.
https://github.com/apache/spark/pull/3072

- Kousuke

(2014/11/03 10:42), Terry Siu wrote:
I just built the 1.2 snapshot current as of commit 76386e1a23c using:

$ ./make-distribution.sh --tgz --name my-spark --skip-java-test -DskipTests
-Phadoop-2.4 -Phive -Phive-0.13.1 -Pyarn

I drop in my Hive configuration files into the conf directory, launch 
spark-shell, and then create my HiveContext, hc. I then issue a “use ” 
command:

scala> hc.hql(“use ”)

and receive the following class-not-found error:


java.lang.NoClassDefFoundError: 
com/esotericsoftware/shaded/org/objenesis/strategy/InstantiatorStrategy

at org.apache.hadoop.hive.ql.exec.Utilities.(Utilities.java:925)

at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1224)

at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)

at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:315)

at 
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:286)

at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)

at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)

at 
org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)

at 
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)

at 
org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:424)

at 
org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:424)

at 
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)

at org.apache.spark.sql.SchemaRDD.(SchemaRDD.scala:103)

at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:111)

at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:115)

at 
$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:31)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:36)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:38)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:40)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:42)

at $iwC$$iwC$$iwC$$iwC$$iwC.(:44)

at $iwC$$iwC$$iwC$$iwC.(:46)

at $iwC$$iwC$$iwC.(:48)

at $iwC$$iwC.(:50)

at $iwC.(:52)

at (:54)

at .(:58)

at .()

at .(:7)

at .()

at $print()

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java

at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIva:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at 
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)

at 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125

at 
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)

at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)

at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)

at 
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)

at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:8

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)

at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)

at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)

at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)

at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILola:968)

at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal

at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal

at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoadla:135)

at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)

at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)

at org.apache.spark.repl.Main$.main(Main.scala:31)

at org.apache.spark.repl.Main.main(Main.scala)

at sun.reflect.NativeMet

Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Kousuke Saruta

Hi Terry

I think the issue you mentioned will be resolved by following PR.
https://github.com/apache/spark/pull/3072

- Kousuke

(2014/11/03 10:42), Terry Siu wrote:

I just built the 1.2 snapshot current as of commit 76386e1a23c using:

$ ./make-distribution.sh --tgz --name my-spark --skip-java-test
-DskipTests -Phadoop-2.4 -Phive -Phive-0.13.1 -Pyarn


I drop in my Hive configuration files into the conf directory, launch 
spark-shell, and then create my HiveContext, hc. I then issue a “use 
” command:


scala> hc.hql(“use ”)

and receive the following class-not-found error:

java.lang.NoClassDefFoundError: 
com/esotericsoftware/shaded/org/objenesis/strategy/InstantiatorStrategy


at 
org.apache.hadoop.hive.ql.exec.Utilities.(Utilities.java:925)


at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1224)

at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)

at 
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:315)


at 
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:286)


at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)


at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)


at 
org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)


at 
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)


at 
org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:424)


at 
org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:424)


at 
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)


at org.apache.spark.sql.SchemaRDD.(SchemaRDD.scala:103)

at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:111)

at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:115)

at 
$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:31)


at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:36)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:38)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:40)

at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:42)

at $iwC$$iwC$$iwC$$iwC$$iwC.(:44)

at $iwC$$iwC$$iwC$$iwC.(:46)

at $iwC$$iwC$$iwC.(:48)

at $iwC$$iwC.(:50)

at $iwC.(:52)

at (:54)

at .(:58)

at .()

at .(:7)

at .()

at $print()

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java


at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIva:43)


at java.lang.reflect.Method.invoke(Method.java:606)

at 
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)


at 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125


at 
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)


at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)

at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)

at 
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)


at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:8


at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)

at 
org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)


at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)

at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)

at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILola:968)


at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal


at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal


at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoadla:135)


at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)

at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)

at org.apache.spark.repl.Main$.main(Main.scala:31)

at org.apache.spark.repl.Main.main(Main.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java


at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIva:43)


at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.lang.ClassNotFoundException: 
com.esotericsoftware.shaded.org.objenesis.strategy.InstantiatorStrategy


at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

at

Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-28 Thread Stephen Boesch
I have checked out from master, cleaned/rebuilt on command line in maven,
then cleaned/rebuilt in intellij many times. This error persists through it
all. Anyone have a solution?








2014-10-23 1:43 GMT-07:00 Stephen Boesch :

> After having checked out from master/head the following error occurs when
> attempting to run any test in Intellij
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/util/concurrent/ThreadFactoryBuilder
> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
>
>
> There appears to be a related issue/JIRA:
>
>
> https://issues.apache.org/jira/browse/SPARK-3217
>
>
> But the conditions described do not apply in my case:
>
>  Did you by any chance do one of the following:
>
>- forget to "clean" after pulling that change
>- mix sbt and mvn built artifacts in the same build
>- set SPARK_PREPEND_CLASSES
>
>
> For reference here is the full stacktrace:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/util/concurrent/ThreadFactoryBuilder
> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
> at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:179)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:119)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:134)
> at
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:62)
> at
> org.apache.spark.sql.hbase.JavaHBaseSQLContext$.main(JavaHBaseSQLContext.scala:45)
> at
> org.apache.spark.sql.hbase.JavaHBaseSQLContext.main(JavaHBaseSQLContext.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
> Caused by: java.lang.ClassNotFoundException:
> com.google.common.util.concurrent.ThreadFactoryBuilder
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 13 more
> Exception in thread "delete Spark temp dirs"
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.spark.util.Utils$
> at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:173)
>
>
>


Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-28 Thread Stephen Boesch
I had an offline with Akhil, but this issue is still not resolved.

2014-10-24 0:18 GMT-07:00 Akhil Das :

> Make sure the guava jar is present in the classpath.
>
> Thanks
> Best Regards
>
> On Thu, Oct 23, 2014 at 2:13 PM, Stephen Boesch  wrote:
>
>> After having checked out from master/head the following error occurs when
>> attempting to run any test in Intellij
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> com/google/common/util/concurrent/ThreadFactoryBuilder
>> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
>>
>>
>> There appears to be a related issue/JIRA:
>>
>>
>> https://issues.apache.org/jira/browse/SPARK-3217
>>
>>
>> But the conditions described do not apply in my case:
>>
>>  Did you by any chance do one of the following:
>>
>>- forget to "clean" after pulling that change
>>- mix sbt and mvn built artifacts in the same build
>>- set SPARK_PREPEND_CLASSES
>>
>>
>> For reference here is the full stacktrace:
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> com/google/common/util/concurrent/ThreadFactoryBuilder
>> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
>> at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:179)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:119)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:134)
>> at
>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:62)
>> at
>> org.apache.spark.sql.hbase.JavaHBaseSQLContext$.main(JavaHBaseSQLContext.scala:45)
>> at
>> org.apache.spark.sql.hbase.JavaHBaseSQLContext.main(JavaHBaseSQLContext.scala)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
>> Caused by: java.lang.ClassNotFoundException:
>> com.google.common.util.concurrent.ThreadFactoryBuilder
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> ... 13 more
>> Exception in thread "delete Spark temp dirs"
>> java.lang.NoClassDefFoundError: Could not initialize class
>> org.apache.spark.util.Utils$
>> at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:173)
>>
>>
>>
>
>


Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-24 Thread Akhil Das
Make sure the guava jar is present in the classpath.

Thanks
Best Regards
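
A sketch of the dependency in question, for Maven builds (the version is an
assumption; Spark in this era built against Guava 14):

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
    </dependency>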

On Thu, Oct 23, 2014 at 2:13 PM, Stephen Boesch  wrote:

> After having checked out from master/head the following error occurs when
> attempting to run any test in Intellij
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/util/concurrent/ThreadFactoryBuilder
> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
>
>
> There appears to be a related issue/JIRA:
>
>
> https://issues.apache.org/jira/browse/SPARK-3217
>
>
> But the conditions described do not apply in my case:
>
>  Did you by any chance do one of the following:
>
>- forget to "clean" after pulling that change
>- mix sbt and mvn built artifacts in the same build
>- set SPARK_PREPEND_CLASSES
>
>
> For reference here is the full stacktrace:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/google/common/util/concurrent/ThreadFactoryBuilder
> at org.apache.spark.util.Utils$.<init>(Utils.scala:648)
> at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:179)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:119)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:134)
> at
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:62)
> at
> org.apache.spark.sql.hbase.JavaHBaseSQLContext$.main(JavaHBaseSQLContext.scala:45)
> at
> org.apache.spark.sql.hbase.JavaHBaseSQLContext.main(JavaHBaseSQLContext.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
> Caused by: java.lang.ClassNotFoundException:
> com.google.common.util.concurrent.ThreadFactoryBuilder
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 13 more
> Exception in thread "delete Spark temp dirs"
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.spark.util.Utils$
> at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:173)
>
>
>


Re: NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass with spark-submit

2014-08-08 Thread Nick Pentreath
By the way, for anyone using elasticsearch-hadoop, there is a fix for this
here: https://github.com/elasticsearch/elasticsearch-hadoop/issues/239

Ryan - using the nightly snapshot build of 2.1.0.BUILD-SNAPSHOT fixed this
for me.


On Thu, Aug 7, 2014 at 3:58 PM, Nick Pentreath 
wrote:

> I'm also getting this - Ryan we both seem to be running into this issue
> with elasticsearch-hadoop :)
>
> I tried spark.files.userClassPathFirst true on the command line and that
> doesn't work.
>
> If I put that line in spark/conf/spark-defaults it works, but now I'm
> getting:
> java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/InputFormat
>
> I think I may need to add hadoop-client to my assembly, but any other ideas
> are welcome.
>
> Ryan, will let you know how I get on
>
>
> On Mon, Aug 4, 2014 at 10:28 AM, Sean Owen  wrote:
>
>> I'm guessing you have the Jackson classes in your assembly but so does
>> Spark. Its classloader wins, and does not contain the class present in
>> your app's version of Jackson. Try spark.files.userClassPathFirst ?
>>
>> On Mon, Aug 4, 2014 at 6:28 AM, Ryan Braley  wrote:
>> > Hi Folks,
>> >
>> >  I have an assembly jar that I am submitting using spark-submit script
>> on a
>> > cluster I created with the spark-ec2 script. I keep running into the
>> > java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
>> > error on my workers even though jar tf clearly shows that class being a
>> part
>> > of my assembly jar. I have the spark program working locally.
>> > Here is the error log:
>> > https://gist.github.com/rbraley/cf5cd3457a89b1c0ac88
>> >
>> > Anybody have any suggestions of things I can try? It seems
>> >
>> http://mail-archives.apache.org/mod_mbox/incubator-spark-user/201406.mbox/%3c1403899110.65393.yahoomail...@web160503.mail.bf1.yahoo.com%3E
>> > that this is a similar error. I am open to recompiling spark to fix
>> this,
>> > but I would like to run my job on my cluster rather than just locally.
>> >
>> > Thanks,
>> > Ryan
>> >
>> >
>> > Ryan Braley  |  Founder
>> > http://traintracks.io/
>> >
>> > US: +1 (206) 866 5661
>> > CN: +86 156 1153 7598
>> > Coding the future. Decoding the game.
>> >
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>


Re: NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass with spark-submit

2014-08-07 Thread Nick Pentreath
I'm also getting this - Ryan we both seem to be running into this issue
with elasticsearch-hadoop :)

I tried spark.files.userClassPathFirst true on the command line and that
doesn't work.

If I put that line in spark/conf/spark-defaults it works, but now I'm
getting:
java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/InputFormat

I think I may need to add hadoop-client to my assembly, but any other ideas
are welcome.

Ryan, will let you know how I get on


On Mon, Aug 4, 2014 at 10:28 AM, Sean Owen  wrote:

> I'm guessing you have the Jackson classes in your assembly but so does
> Spark. Its classloader wins, and does not contain the class present in
> your app's version of Jackson. Try spark.files.userClassPathFirst ?
>
> On Mon, Aug 4, 2014 at 6:28 AM, Ryan Braley  wrote:
> > Hi Folks,
> >
> >  I have an assembly jar that I am submitting using spark-submit script
> on a
> > cluster I created with the spark-ec2 script. I keep running into the
> > java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
> > error on my workers even though jar tf clearly shows that class being a
> part
> > of my assembly jar. I have the spark program working locally.
> > Here is the error log:
> > https://gist.github.com/rbraley/cf5cd3457a89b1c0ac88
> >
> > Anybody have any suggestions of things I can try? It seems
> >
> http://mail-archives.apache.org/mod_mbox/incubator-spark-user/201406.mbox/%3c1403899110.65393.yahoomail...@web160503.mail.bf1.yahoo.com%3E
> > that this is a similar error. I am open to recompiling spark to fix this,
> > but I would like to run my job on my cluster rather than just locally.
> >
> > Thanks,
> > Ryan
> >
> >
> > Ryan Braley  |  Founder
> > http://traintracks.io/
> >
> > US: +1 (206) 866 5661
> > CN: +86 156 1153 7598
> > Coding the future. Decoding the game.
> >
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass with spark-submit

2014-08-04 Thread Sean Owen
I'm guessing you have the Jackson classes in your assembly but so does
Spark. Its classloader wins, and does not contain the class present in
your app's version of Jackson. Try spark.files.userClassPathFirst ?
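
A sketch of setting that from the command line (main class and jar name are
placeholders):

    spark-submit --conf spark.files.userClassPathFirst=true \
      --class com.example.Main my-assembly.jar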

On Mon, Aug 4, 2014 at 6:28 AM, Ryan Braley  wrote:
> Hi Folks,
>
>  I have an assembly jar that I am submitting using spark-submit script on a
> cluster I created with the spark-ec2 script. I keep running into the
> java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
> error on my workers even though jar tf clearly shows that class being a part
> of my assembly jar. I have the spark program working locally.
> Here is the error log:
> https://gist.github.com/rbraley/cf5cd3457a89b1c0ac88
>
> Anybody have any suggestions of things I can try? It seems
> http://mail-archives.apache.org/mod_mbox/incubator-spark-user/201406.mbox/%3c1403899110.65393.yahoomail...@web160503.mail.bf1.yahoo.com%3E
> that this is a similar error. I am open to recompiling spark to fix this,
> but I would like to run my job on my cluster rather than just locally.
>
> Thanks,
> Ryan
>
>
> Ryan Braley  |  Founder
> http://traintracks.io/
>
> US: +1 (206) 866 5661
> CN: +86 156 1153 7598
> Coding the future. Decoding the game.
>
