Thanks, Manu.

For me, the sample app works only in 'local' mode.
If I try to connect to a Spark cluster (even one running locally:
spark://localhost:7077), I get the following error:

spark.master=spark://localhost:7077
[error] o.a.s.s.c.SparkDeploySchedulerBackend - Application has been
killed. Reason: Master removed our application: FAILED
[error] application -

! @6j8im8dfj - Internal server error, for (GET) [/] ->

play.api.Application$$anon$1: Execution exception[[SparkException: Job
aborted due to stage failure: Master removed our application: FAILED]]
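
For reference, the SparkContext is created roughly like this (a simplified
sketch; object and method names are not the real ones, the master value comes
from the spark.master setting above, and the actual code is in my
play-spark-test repo linked below):

import org.apache.spark.{SparkConf, SparkContext}

// Simplified sketch: "local[*]" works, "spark://localhost:7077" fails with
// the error above.
object SparkFactory {
  def newContext(master: String): SparkContext = {
    val conf = new SparkConf()
      .setMaster(master)
      .setAppName("play-spark-test")
    new SparkContext(conf)
  }
}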



Sujee Maniyam (http://sujee.net | http://www.linkedin.com/in/sujeemaniyam )


On Sat, Aug 16, 2014 at 11:15 PM, Manu Suryavansh <
suryavanshi.m...@gmail.com> wrote:

> Hi,
>
> I tried the Spark(1.0.0)+Play(2.3.3) example from the Knoldus blog -
> http://blog.knoldus.com/2014/06/18/play-with-spark-building-apache-spark-with-play-framework/
> and it worked for me. The project is here -
> https://github.com/knoldus/Play-Spark-Scala
>
> Regards,
> Manu
>
>
> On Sat, Aug 16, 2014 at 11:04 PM, Sujee Maniyam <su...@sujee.net> wrote:
>
>> Hi
>>
>> I am trying to connect to Spark from the Play framework and I am getting
>> the following Akka error:
>>
>>
>> [ERROR] [08/16/2014 17:12:05.249] [spark-akka.actor.default-dispatcher-3] 
>> [ActorSystem(spark)] Uncaught fatal error from thread 
>> [spark-akka.actor.default-dispatcher-3] shutting down ActorSystem [spark]
>>
>> java.lang.AbstractMethodError
>>   at 
>> akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
>>
>>   at 
>> akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
>>   at akka.actor.ActorCell.terminate(ActorCell.scala:369)
>>
>>   at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
>>   at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>>
>>   at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>>   at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>>
>>   at 
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>
>>   at 
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>   at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>
>>   at 
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>
>>
>> Full stack trace: https://gist.github.com/sujee/ff14fd602b76314e693d
>>
>> Source code here: https://github.com/sujee/play-spark-test
>>
>> I have also found this thread mentioning an Akka incompatibility: "How to
>> run Play 2.2.x with Akka 2.3.x?"
>> <http://stackoverflow.com/questions/22779882/how-to-run-play-2-2-x-with-akka-2-3-x>
>>
>> Stack Overflow thread:
>> http://stackoverflow.com/questions/25346657/akka-error-play-framework-2-3-3-and-spark-1-0-2
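>>
>> One thing I have not tried yet is making sure only one copy of the Akka
>> classes ends up on the classpath, for example by excluding Spark's bundled
>> Akka in build.sbt. This is a rough sketch only; the org.spark-project.akka
>> group id is a guess and would need checking against the actual dependency
>> tree:
>>
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2" excludeAll(
>>   // guess: Spark 1.0.x appears to pull in a shaded Akka under this organization
>>   ExclusionRule(organization = "org.spark-project.akka")
>> )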
>>
>> any suggestions?
>>
>> thanks!
>>
>> Sujee Maniyam (http://sujee.net | http://www.linkedin.com/in/sujeemaniyam)
>>
>
>
>
> --
> Manu Suryavansh
>
