Thanks DB,

Feel free to file sub-JIRAs under:
https://issues.apache.org/jira/browse/SPARK-2487

I've been importing the Maven build into IntelliJ; it might be worth
trying that as well to see if it works.

- Patrick

On Mon, Jul 14, 2014 at 4:53 PM, DB Tsai <dbt...@dbtsai.com> wrote:
> I have a clean clone of the Spark master repository, and I generated the
> IntelliJ project files with sbt gen-idea as usual. There are two issues
> we have run into after merging SPARK-1776 (read dependencies from Maven).
>
> 1) After SPARK-1776, sbt gen-idea downloads the dependencies from the
> internet even when those jars are already in the local cache. Before the
> merge, running gen-idea a second time would not download anything and
> would simply use the jars in the cache.
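>
> As a sketch of a possible mitigation (untested, and it is unclear whether
> the gen-idea plugin honors these settings at all), one could try a
> git-ignored local.sbt that uses standard sbt settings to keep resolution
> on the local cache:
>
>   // hypothetical local.sbt override, not part of the Spark build
>   resolvers += Resolver.mavenLocal  // prefer artifacts already in ~/.m2/repository
>   offline := true                   // ask sbt/Ivy to avoid remote repositories when possible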
>
> 2) Tests that use a local Spark context cannot be run from IntelliJ;
> they fail with the following exception.
>
> The current workaround we have is to check out a snapshot from before the
> merge, run gen-idea there, and then switch back to the current master. But
> this will not work once master deviates too much from the latest working
> snapshot.
>
> [ERROR] [07/14/2014 16:27:49.967] [ScalaTest-run] [Remoting] Remoting error: [Startup timed out] [
> akka.remote.RemoteTransportException: Startup timed out
> at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
> at akka.remote.Remoting.start(Remoting.scala:191)
> at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:132)
> at org.apache.spark.mllib.util.LocalSparkContext$class.beforeAll(LocalSparkContext.scala:29)
> at org.apache.spark.mllib.optimization.LBFGSSuite.beforeAll(LBFGSSuite.scala:27)
> at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> at org.apache.spark.mllib.optimization.LBFGSSuite.beforeAll(LBFGSSuite.scala:27)
> at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> at org.apache.spark.mllib.optimization.LBFGSSuite.run(LBFGSSuite.scala:27)
> at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
> at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
> at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
> at org.scalatest.tools.Runner$.run(Runner.scala:883)
> at org.scalatest.tools.Runner.run(Runner.scala)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:141)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:32)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:173)
> ... 35 more
> ]
>
> An exception or error caused a run to abort: Futures timed out after [10000 milliseconds]
> java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:173)
> at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:132)
> at org.apache.spark.mllib.util.LocalSparkContext$class.beforeAll(LocalSparkContext.scala:29)
> at org.apache.spark.mllib.optimization.LBFGSSuite.beforeAll(LBFGSSuite.scala:27)
> at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> at org.apache.spark.mllib.optimization.LBFGSSuite.beforeAll(LBFGSSuite.scala:27)
> at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> at org.apache.spark.mllib.optimization.LBFGSSuite.run(LBFGSSuite.scala:27)
> at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
> at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
> at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
> at org.scalatest.tools.Runner$.run(Runner.scala:883)
> at org.scalatest.tools.Runner.run(Runner.scala)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:141)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:32)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
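>
> For reference, a minimal sketch of the kind of suite that hits this when run
> from IntelliJ (the LocalSparkContext / LBFGSSuite names come from the trace
> above; the suite body here is only illustrative):
>
>   import org.apache.spark.{SparkConf, SparkContext}
>   import org.scalatest.{BeforeAndAfterAll, FunSuite}
>
>   // Mirrors the LocalSparkContext pattern: a shared local SparkContext is
>   // created in beforeAll and stopped in afterAll. The ActorSystem startup
>   // inside the SparkContext constructor is where the timeout above is thrown.
>   class LocalContextSketchSuite extends FunSuite with BeforeAndAfterAll {
>     @transient var sc: SparkContext = _
>
>     override def beforeAll() {
>       super.beforeAll()
>       sc = new SparkContext(new SparkConf().setMaster("local").setAppName("sketch"))
>     }
>
>     override def afterAll() {
>       if (sc != null) sc.stop()
>       super.afterAll()
>     }
>
>     test("runs against a local SparkContext") {
>       assert(sc.parallelize(1 to 10).count() == 10L)
>     }
>   }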
>
>
> Sincerely,
>
> DB Tsai
> -------------------------------------------------------
> My Blog: https://www.dbtsai.com
> LinkedIn: https://www.linkedin.com/in/dbtsai
