Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
@Sean Owen, thanks for your advice.
There are still some failing tests on my laptop. I will work on this
issue (the file move) as soon as I figure out the other test-related issues.


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 2:41 PM, Sean Owen wrote:

> Good call -- indeed that same Files class has a move() method that
> will try to use renameTo() and then fall back to copy() and delete()
> if needed for this very reason.
> 
> 
> On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > Hi, I think I have found the cause of the failing tests.
> > 
> > I have two disks on my laptop. The Spark project dir is on an HDD, while
> > the temp dir created by com.google.common.io.Files.createTempDir is under
> > /var/folders/5q/, which is on the system disk, an SSD.
> > The ExecutorClassLoaderSuite test uses the
> > org.apache.spark.TestUtils.createCompiledClass methods.
> > The createCompiledClass method first generates the compiled class in the
> > pwd (spark/repl), then uses renameTo to move
> > the file. The renameTo call fails because the destination file is on a
> > different filesystem from the source file.
> > 
> > I modified TestUtils.scala to first copy the file to the destination and
> > then delete the original; the tests then pass.
> > Should I file a JIRA for this problem? Then I can send a PR on GitHub.
> > 
> 
> 
> 




Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Sean Owen
Good call -- indeed that same Files class has a move() method that
will try to use renameTo() and then fall back to copy() and delete()
if needed for this very reason.
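The fallback described here can be sketched with plain java.nio.file. This is illustrative code under assumed names (`MoveWithFallback`, `move`), not Guava's or Spark's actual implementation:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveWithFallback {
    // Try a plain (atomic) move first; if it fails -- e.g. because source and
    // destination are on different filesystems -- fall back to copy + delete.
    static void move(Path src, Path dest) throws IOException {
        try {
            Files.move(src, dest, StandardCopyOption.ATOMIC_MOVE);
        } catch (IOException e) {
            // Cross-device moves cannot be atomic: copy, then remove the source.
            Files.copy(src, dest, StandardCopyOption.REPLACE_EXISTING);
            Files.delete(src);
        }
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("move-test", ".tmp");
        Path dest = src.resolveSibling("move-test-dest.tmp");
        move(src, dest);
        System.out.println(Files.exists(dest) && !Files.exists(src)); // prints "true"
        Files.delete(dest);
    }
}
```

In the same-directory case above the atomic move succeeds directly; the catch branch is only exercised when the rename itself fails, which is exactly the cross-filesystem situation discussed in this thread.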


On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin  wrote:
> Hi, I think I have found the cause of the failing tests.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while
> the temp dir created by com.google.common.io.Files.createTempDir is under
> /var/folders/5q/, which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the
> org.apache.spark.TestUtils.createCompiledClass methods.
> The createCompiledClass method first generates the compiled class in the
> pwd (spark/repl), then uses renameTo to move
> the file. The renameTo call fails because the destination file is on a
> different filesystem from the source file.
>
> I modified TestUtils.scala to first copy the file to the destination and
> then delete the original; the tests then pass.
> Should I file a JIRA for this problem? Then I can send a PR on GitHub.


Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Aaron Davidson
By all means, it would be greatly appreciated!


On Mon, Apr 14, 2014 at 10:34 PM, Ye Xianjin  wrote:

> Hi, I think I have found the cause of the failing tests.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while
> the temp dir created by com.google.common.io.Files.createTempDir is under
> /var/folders/5q/, which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the
> org.apache.spark.TestUtils.createCompiledClass methods.
> The createCompiledClass method first generates the compiled class in the
> pwd (spark/repl), then uses renameTo to move
> the file. The renameTo call fails because the destination file is on a
> different filesystem from the source file.
>
> I modified TestUtils.scala to first copy the file to the destination and
> then delete the original; the tests then pass.
> Should I file a JIRA for this problem? Then I can send a PR on GitHub.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:
>
> > Well, this is very strange.
> > I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and
> > made small changes to ExecutorClassLoaderSuite.scala (mostly printing some
> > internal variables). After that, when running the repl tests, I noticed that
> > ReplSuite was tested first and its result was OK, but the
> > ExecutorClassLoaderSuite test was weird.
> > Here is the output:
> > [info] ExecutorClassLoaderSuite:
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > Internal error when running tests: java.lang.OutOfMemoryError: PermGen
> space
> > Exception in thread "Thread-3" java.io.EOFException
> > at
> java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> > at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> > at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> > at sbt.React.react(ForkTests.scala:116)
> > at
> sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> > at java.lang.Thread.run(Thread.java:695)
> >
> >
> > I reverted my changes; the test result is the same.
> >
> > After I touched the ReplSuite.scala file (with the touch command), the test
> > order reversed, the same as at the very beginning, and the output is also
> > the same (the result in my first post).
> >
> >
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> >
> >
> > On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
> >
> > > This may have something to do with running the tests on a Mac, as
> there is
> > > a lot of File/URI/URL stuff going on in that test which may just have
> > > happened to work if run on a Linux system (like Jenkins). Note that
> this
> > > suite was added relatively recently:
> > > https://github.com/apache/spark/pull/217
> > >
> > >
> > > On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> > >
> > > > Thank you for your reply.
> > > >
> > > > After building the assembly jar, the repl test still failed. The error
> > > > output is the same as I posted before.
> > > >
> > > > --
> > > > Ye Xianjin
> > > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > >
> > > >
> > > > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > > >
> > > > > I believe you may need an assembly jar to run the ReplSuite.
> "sbt/sbt
> > > > > assembly/assembly".
> > > > >
> > > > > Michael
> > > > >
> > > > >
> > > > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > > >
> > > > > > Hi, everyone:
> > > > > > I am new to Spark development. I downloaded Spark's latest code
> > > > > > from GitHub. After running sbt/sbt assembly,
> > > > > > I began running sbt/sbt test in the Spark source code dir, but it
> > > > > > failed running the repl module test.
> > > > > >
> > > > > > Here are some output details.
> > > > > >
> > > > > > command:
> > > > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > > > output:
> > > > > >
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project
> > > > > > [info] Set current project to root (in build
> > > > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for graphx/test:testOnly
> > > > > > [info] Passed: Tota

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Hi, I think I have found the cause of the failing tests.

I have two disks on my laptop. The Spark project dir is on an HDD, while
the temp dir created by com.google.common.io.Files.createTempDir is under
/var/folders/5q/, which is on the system disk, an SSD.
The ExecutorClassLoaderSuite test uses the
org.apache.spark.TestUtils.createCompiledClass methods.
The createCompiledClass method first generates the compiled class in the
pwd (spark/repl), then uses renameTo to move
the file. The renameTo call fails because the destination file is on a
different filesystem from the source file.

I modified TestUtils.scala to first copy the file to the destination and
then delete the original; the tests then pass.
Should I file a JIRA for this problem? Then I can send a PR on GitHub.

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:

> Well, this is very strange.
> I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made
> small changes to ExecutorClassLoaderSuite.scala (mostly printing some internal
> variables). After that, when running the repl tests, I noticed that ReplSuite
> was tested first and its result was OK, but the ExecutorClassLoaderSuite
> test was weird.
> Here is the output:
> [info] ExecutorClassLoaderSuite:
> [error] Uncaught exception when running 
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
> PermGen space
> [error] Uncaught exception when running 
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
> PermGen space
> Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
> Exception in thread "Thread-3" java.io.EOFException
> at 
> java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> at sbt.React.react(ForkTests.scala:116)
> at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> at java.lang.Thread.run(Thread.java:695)
> 
> 
> I reverted my changes; the test result is the same.
> 
> After I touched the ReplSuite.scala file (with the touch command), the test
> order reversed, the same as at the very beginning, and the output is also the
> same (the result in my first post).
> 
> 
> -- 
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> 
> 
> On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
> 
> > This may have something to do with running the tests on a Mac, as there is
> > a lot of File/URI/URL stuff going on in that test which may just have
> > happened to work if run on a Linux system (like Jenkins). Note that this
> > suite was added relatively recently:
> > https://github.com/apache/spark/pull/217
> > 
> > 
> > On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> > 
> > > Thank you for your reply.
> > > 
> > > After building the assembly jar, the repl test still failed. The error
> > > output is the same as I posted before.
> > > 
> > > --
> > > Ye Xianjin
> > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > 
> > > 
> > > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > > 
> > > > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > > > assembly/assembly".
> > > > 
> > > > Michael
> > > > 
> > > > 
> > > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > > 
> > > > > Hi, everyone:
> > > > > I am new to Spark development. I downloaded Spark's latest code from
> > > > > GitHub. After running sbt/sbt assembly,
> > > > > I began running sbt/sbt test in the Spark source code dir, but it
> > > > > failed running the repl module test.
> > > > > 
> > > > > Here are some output details.
> > > > > 
> > > > > command:
> > > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > > output:
> > > > > 
> > > > > [info] Loading project definition from
> > > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > > [info] Loading project definition from
> > > > > /Volumes/MacintoshHD/github/spark/project
> > > > > [info] Set current project to root (in build
> > > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for graphx/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for bagel/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for streaming/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 

Re: It seems that jenkins for PR is not working

2014-04-14 Thread Nan Zhu
+1….  

--  
Nan Zhu


On Friday, April 11, 2014 at 5:35 PM, DB Tsai wrote:

> I always got
> =
>  
> Could not find Apache license headers in the following files:
> !? /root/workspace/SparkPullRequestBuilder/python/metastore/db.lck
> !? 
> /root/workspace/SparkPullRequestBuilder/python/metastore/service.properties
>  
>  
> Sincerely,
>  
> DB Tsai
> ---
> My Blog: https://www.dbtsai.com
> LinkedIn: https://www.linkedin.com/in/dbtsai
>  
>  
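The failure above is the Apache RAT license check flagging generated Derby metastore files. One common remedy, assuming the build reads a RAT exclude list as Spark's dev scripts historically do with a `.rat-excludes` file, is to exclude those artifacts from the check. The patterns below are illustrative assumptions, not entries taken from Spark's repository:

```shell
# Hypothetical sketch: exclude generated Derby metastore artifacts from the
# Apache RAT license-header check. File name and patterns are assumptions.
cat >> .rat-excludes <<'EOF'
metastore/
*.lck
service.properties
EOF
```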




Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Well, this is very strange.
I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made small
changes to ExecutorClassLoaderSuite.scala (mostly printing some internal
variables). After that, when running the repl tests, I noticed that ReplSuite
was tested first and its result was OK, but the ExecutorClassLoaderSuite
test was weird.
Here is the output:
[info] ExecutorClassLoaderSuite:
[error] Uncaught exception when running 
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
PermGen space
[error] Uncaught exception when running 
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
PermGen space
Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
Exception in thread "Thread-3" java.io.EOFException
at 
java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at sbt.React.react(ForkTests.scala:116)
at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
at java.lang.Thread.run(Thread.java:695)
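A PermGen `OutOfMemoryError` like the one above is usually addressed by giving the sbt JVM (and its forked test JVMs) more permanent-generation space. A minimal sketch; the flags are standard HotSpot options for Java 7 and earlier, and the sizes are illustrative, not values taken from Spark's build scripts:

```shell
# Sketch, not from Spark's build: raise PermGen and heap for sbt before
# re-running the repl tests (HotSpot flags; MaxPermSize is pre-Java-8 only).
export SBT_OPTS="-XX:MaxPermSize=512m -Xmx2g"
sbt/sbt "test-only org.apache.spark.repl.*"
```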


I reverted my changes; the test result is the same.

After I touched the ReplSuite.scala file (with the touch command), the test
order reversed, the same as at the very beginning, and the output is also the
same (the result in my first post).


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:

> This may have something to do with running the tests on a Mac, as there is
> a lot of File/URI/URL stuff going on in that test which may just have
> happened to work if run on a Linux system (like Jenkins). Note that this
> suite was added relatively recently:
> https://github.com/apache/spark/pull/217
> 
> 
> On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> 
> > Thank you for your reply.
> > 
> > After building the assembly jar, the repl test still failed. The error
> > output is the same as I posted before.
> > 
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > 
> > 
> > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > 
> > > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > > assembly/assembly".
> > > 
> > > Michael
> > > 
> > > 
> > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > 
> > > > Hi, everyone:
> > > > I am new to Spark development. I download spark's latest code from
> > > > 
> > > 
> > > 
> > 
> > github.
> > > > After running sbt/sbt assembly,
> > > > I began running sbt/sbt test in the spark source code dir. But it
> > > > 
> > > 
> > 
> > failed
> > > > running the repl module test.
> > > > 
> > > > Here are some output details.
> > > > 
> > > > command:
> > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > output:
> > > > 
> > > > [info] Loading project definition from
> > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > [info] Loading project definition from
> > > > /Volumes/MacintoshHD/github/spark/project
> > > > [info] Set current project to root (in build
> > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for graphx/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for bagel/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for streaming/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for mllib/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for catalyst/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for core/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for assembly/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for sql/test:testOnly
> > > > [info] ExecutorClassLoaderSuite:
> > > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > > > SCDynamicStore
> > > > [info] - child first *** FAILED *** (440 milliseconds)
> > > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > > [info] at
> > > > 
> > > 
> > 
> > org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > [info] at java.lang.ClassLoader.loadClass(Clas

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Aaron Davidson
This may have something to do with running the tests on a Mac, as there is
a lot of File/URI/URL stuff going on in that test which may just have
happened to work if run on a Linux system (like Jenkins). Note that this
suite was added relatively recently:
https://github.com/apache/spark/pull/217


On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin  wrote:

> Thank you for your reply.
>
> After building the assembly jar, the repl test still failed. The error
> output is the same as I posted before.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
>
> > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > assembly/assembly".
> >
> > Michael
> >
> >
> > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> >
> > > Hi, everyone:
> > > I am new to Spark development. I downloaded Spark's latest code from
> > > GitHub.
> > > After running sbt/sbt assembly,
> > > I began running sbt/sbt test in the Spark source code dir, but it
> > > failed running the repl module test.
> > >
> > > Here are some output details.
> > >
> > > command:
> > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > output:
> > >
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project/project
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project
> > > [info] Set current project to root (in build
> > > file:/Volumes/MacintoshHD/github/spark/)
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for graphx/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for bagel/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for streaming/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for mllib/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for catalyst/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for core/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for assembly/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for sql/test:testOnly
> > > [info] ExecutorClassLoaderSuite:
> > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > > SCDynamicStore
> > > [info] - child first *** FAILED *** (440 milliseconds)
> > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > [info] at
> > >
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at
> > >
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at scala.Option.getOrElse(Option.scala:120)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > > [info] at
> > > org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Thank you for your reply. 

After building the assembly jar, the repl test still failed. The error output
is the same as I posted before.

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:

> I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> assembly/assembly".
> 
> Michael
> 
> 
> On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> 
> > Hi, everyone:
> > I am new to Spark development. I downloaded Spark's latest code from GitHub.
> > After running sbt/sbt assembly,
> > I began running sbt/sbt test in the Spark source code dir, but it failed
> > running the repl module test.
> > 
> > Here are some output details.
> > 
> > command:
> > sbt/sbt "test-only org.apache.spark.repl.*"
> > output:
> > 
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project/project
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project
> > [info] Set current project to root (in build
> > file:/Volumes/MacintoshHD/github/spark/)
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for graphx/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for bagel/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for streaming/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for mllib/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for catalyst/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for core/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for assembly/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for sql/test:testOnly
> > [info] ExecutorClassLoaderSuite:
> > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > SCDynamicStore
> > [info] - child first *** FAILED *** (440 milliseconds)
> > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at scala.Option.getOrElse(Option.scala:120)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> > [info] at scala.collection.immutable.List.foreach(List.scala:318)
> > [info] at org.scalatest.SuperEngine.org 
> > (http://org.scalatest.SuperEngine.org)
> > $scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> > [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Michael Armbrust
I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
assembly/assembly".

Michael


On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin  wrote:

> Hi, everyone:
> I am new to Spark development. I downloaded Spark's latest code from GitHub.
> After running sbt/sbt assembly,
> I began running sbt/sbt test in the Spark source code dir, but it failed
> running the repl module test.
>
> Here are some output details.
>
> command:
> sbt/sbt "test-only org.apache.spark.repl.*"
> output:
>
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project/project
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project
> [info] Set current project to root (in build
> file:/Volumes/MacintoshHD/github/spark/)
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for graphx/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for bagel/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for streaming/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for mllib/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for catalyst/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for core/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for assembly/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for sql/test:testOnly
> [info] ExecutorClassLoaderSuite:
> 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> SCDynamicStore
> [info] - child first *** FAILED *** (440 milliseconds)
> [info]   java.lang.ClassNotFoundException: ReplFakeClass2
> [info]   at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> [info]   at
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at scala.Option.getOrElse(Option.scala:120)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> [info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> [info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> [info]   at scala.collection.immutable.List.foreach(List.scala:318)
> [info]   at
> org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> [info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> [info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at
> org.

Re: Akka problem when using scala command to launch Spark applications in the current 0.9.0-SNAPSHOT

2014-04-14 Thread Gary Malouf
Sorry to dig up an old issue.

We built an assembly against spark-0.9.0-RC3 to run on our Spark cluster on
top of Mesos.  When we upgraded to 0.9.0-RC3 from an earlier master cut
from November, we ran into the Akka issues described above.

Is it supported to deploy this jar using the Spark classpath
script and Java?  Putting Scala 2.10.3's libraries on the classpath seems to
break it at runtime.


On Tue, Dec 24, 2013 at 3:49 PM, Patrick Wendell wrote:

> Evan,
>
> This problem also exists for people who write their own applications that
> depend on/include Spark. E.g. they bundle up their app and then launch the
> driver with "scala -cp my-bundle.jar"... I've seen this cause an issue in
> that setting.
>
> - Patrick
>
>
> On Tue, Dec 24, 2013 at 10:50 AM, Evan Chan  wrote:
>
> > Hi Reynold,
> >
> > The default, documented methods of starting Spark all use the assembly
> > jar, and thus java, right?
> >
> > -Evan
> >
> >
> >
> > On Fri, Dec 20, 2013 at 11:36 PM, Reynold Xin 
> wrote:
> >
> >> It took me hours to debug a problem yesterday on the latest master
> branch
> >> (0.9.0-SNAPSHOT), and I would like to share it with the dev list in case
> >> anybody runs into this Akka problem.
> >>
> >> A little background for those of you who haven't followed closely the
> >> development of Spark and YARN 2.2: YARN 2.2 uses protobuf 2.5, and Akka
> >> uses an older version of protobuf that is not binary compatible. In
> order
> >> to have a single build that is compatible for both YARN 2.2 and pre-2.2
> >> YARN/Hadoop, we published a special version of Akka that builds with
> >> protobuf shaded (i.e. using a different package name for the protobuf
> >> stuff).
> >>
> >> However, it turned out Scala 2.10 includes a version of Akka jar in its
> >> default classpath (look at the lib folder in Scala 2.10 binary
> >> distribution). If you use the scala command to launch any Spark
> >> application
> >> on the current master branch, there is a pretty high chance that you
> >> wouldn't be able to create the SparkContext (stack trace at the end of
> the
> >> email). The problem is that the Akka packaged with Scala 2.10 takes
> >> precedence in the classloader over the special Akka version Spark
> >> includes.
> >>
> >> Before we have a good solution for this, the workaround is to use java
> to
> >> launch the application instead of scala. All you need to do is to
> include
> >> the right Scala jars (scala-library and scala-compiler) in the
> classpath.
> >> Note that the scala command is really just a simple script that calls
> java
> >> with the right classpath.
> >>
> >>
> >> Stack trace:
> >>
> >> java.lang.NoSuchMethodException:
> >> akka.remote.RemoteActorRefProvider.<init>(java.lang.String,
> >> akka.actor.ActorSystem$Settings, akka.event.EventStream,
> >> akka.actor.Scheduler, akka.actor.DynamicAccess)
> >> at java.lang.Class.getConstructor0(Class.java:2763)
> >> at java.lang.Class.getDeclaredConstructor(Class.java:2021)
> >> at
> >>
> >>
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:77)
> >> at scala.util.Try$.apply(Try.scala:161)
> >> at
> >>
> >>
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:74)
> >> at
> >>
> >>
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85)
> >> at
> >>
> >>
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85)
> >> at scala.util.Success.flatMap(Try.scala:200)
> >> at
> >>
> >>
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:85)
> >> at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:546)
> >> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> >> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> >> at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:79)
> >> at
> >>
> org.apache.spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:120)
> >> at org.apache.spark.SparkContext.<init>(SparkContext.scala:106)
> >>
> >
> >
> >
> > --
> > --
> > Evan Chan
> > Staff Engineer
> > e...@ooyala.com  |
> >
> >  <
> http://www.linkedin.com/company/ooyala>
> >
> >
>
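[Editor's note: the workaround Reynold describes above — launching with java directly and supplying the Scala runtime jars yourself — might look like the sketch below. SCALA_HOME, the jar paths, and the main class com.example.MyApp are all placeholders, not names from this thread; the script only prints the resulting command rather than running it.]

```shell
# Build a classpath containing the Scala runtime jars plus the Spark assembly,
# then launch with `java` directly instead of the `scala` launcher, whose
# bundled Akka jar takes classloader precedence over Spark's shaded Akka.
SCALA_HOME="${SCALA_HOME:-/opt/scala-2.10.3}"
CP="$SCALA_HOME/lib/scala-library.jar"
CP="$CP:$SCALA_HOME/lib/scala-compiler.jar"
CP="$CP:assembly/target/spark-assembly.jar"
# Paths above are illustrative, so print the command rather than executing it:
echo java -cp "$CP" com.example.MyApp
```

[The scala command itself is essentially such a java invocation with the Scala jars pre-filled on the classpath — which is why replacing it with a direct java call, minus the bundled Akka, sidesteps the conflict.]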


Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Hi, everyone:
I am new to Spark development. I downloaded Spark's latest code from GitHub.
After running sbt/sbt assembly,
I ran sbt/sbt test in the Spark source directory, but it failed
in the repl module tests.

Here are some output details.

command:
sbt/sbt "test-only org.apache.spark.repl.*"
output:

[info] Loading project definition from 
/Volumes/MacintoshHD/github/spark/project/project
[info] Loading project definition from /Volumes/MacintoshHD/github/spark/project
[info] Set current project to root (in build 
file:/Volumes/MacintoshHD/github/spark/)
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for graphx/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for bagel/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for streaming/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for mllib/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for catalyst/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for core/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for assembly/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for sql/test:testOnly
[info] ExecutorClassLoaderSuite:
2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from 
SCDynamicStore
[info] - child first *** FAILED *** (440 milliseconds)
[info]   java.lang.ClassNotFoundException: ReplFakeClass2
[info]   at java.lang.ClassLoader.findClass(ClassLoader.java:364)
[info]   at 
org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[info]   at 
org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
[info]   at 
org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
[info]   at 
org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at 
org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
[info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
[info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
[info]   at 
org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
[info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
[info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
[info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
[info]   at 
org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
[info]   at 
org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
[info]   at 
org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
[info]   at 
org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
[info]   at scala.collection.immutable.List.foreach(List.scala:318)
[info]   at 
org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
[info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
[info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
[info]   at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
[info]   at 
org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scal