Re: Play Scala Spark Example

2015-01-12 Thread Eduardo Cusa
The EC2 version is 1.1.0, and this is my build.sbt:


libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  "org.apache.spark"  %% "spark-core"              % "1.1.0",
  "com.typesafe.akka" %% "akka-actor"              % "2.2.3",
  "com.typesafe.akka" %% "akka-slf4j"              % "2.2.3",
  "org.apache.spark"  %% "spark-streaming-twitter" % "1.1.0",
  "org.apache.spark"  %% "spark-sql"               % "1.1.0",
  "org.apache.spark"  %% "spark-mllib"             % "1.1.0"
)
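
Since the cluster and every Spark artifact have to agree on the version, one way to keep them from drifting apart is to factor the version out into a single val (a sketch of the same dependencies; `sparkVersion` is just an illustrative name, not something from the original build):

```scala
// build.sbt -- keep every Spark artifact pinned to the cluster's version
val sparkVersion = "1.1.0"

libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  "org.apache.spark"  %% "spark-core"              % sparkVersion,
  "org.apache.spark"  %% "spark-streaming-twitter" % sparkVersion,
  "org.apache.spark"  %% "spark-sql"               % sparkVersion,
  "org.apache.spark"  %% "spark-mllib"             % sparkVersion
)
```

That way a cluster upgrade only requires touching one line.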



On Sun, Jan 11, 2015 at 3:01 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 What is the Spark version running on the EC2 cluster? From the build
 file https://github.com/knoldus/Play-Spark-Scala/blob/master/build.sbt
 of your Play application, it appears to use Spark 1.0.1.

 Thanks
 Best Regards

 On Fri, Jan 9, 2015 at 7:17 PM, Eduardo Cusa 
 eduardo.c...@usmediaconsulting.com wrote:






Re: Play Scala Spark Example

2015-01-10 Thread Akhil Das
What is the Spark version running on the EC2 cluster? From the build file
https://github.com/knoldus/Play-Spark-Scala/blob/master/build.sbt of
your Play application, it appears to use Spark 1.0.1.

Thanks
Best Regards

On Fri, Jan 9, 2015 at 7:17 PM, Eduardo Cusa 
eduardo.c...@usmediaconsulting.com wrote:





Play Scala Spark Example

2015-01-09 Thread Eduardo Cusa
Hi guys, I'm running the following example:
https://github.com/knoldus/Play-Spark-Scala on the same machine as the
Spark master, and the Spark cluster was launched with the ec2 script.


I'm stuck with these errors, any idea how to fix them?

Regards
Eduardo


Calling the Play app prints the following exception:


[*error*] a.r.EndpointWriter - AssociationError
[akka.tcp://sparkDriver@ip-10-28-236-122.ec2.internal:47481] ->
[akka.tcp://driverPropsFetcher@ip-10-158-18-250.ec2.internal:52575]:
Error [Shut down address:
akka.tcp://driverPropsFetcher@ip-10-158-18-250.ec2.internal:52575] [
akka.remote.ShutDownAssociation: Shut down address:
akka.tcp://driverPropsFetcher@ip-10-158-18-250.ec2.internal:52575
Caused by: akka.remote.transport.Transport$InvalidAssociationException:
The remote system terminated the association because it is shutting
down.




The master receives the Spark application and generates the following stderr
log page:


15/01/09 13:31:23 INFO Remoting: Remoting started; listening on
addresses :[akka.tcp://sparkExecutor@ip-10-158-18-250.ec2.internal:37856]
15/01/09 13:31:23 INFO Remoting: Remoting now listens on addresses:
[akka.tcp://sparkExecutor@ip-10-158-18-250.ec2.internal:37856]
15/01/09 13:31:23 INFO util.Utils: Successfully started service
'sparkExecutor' on port 37856.
15/01/09 13:31:23 INFO util.AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkDriver@ip-10-28-236-122.ec2.internal:47481/user/MapOutputTracker
15/01/09 13:31:23 INFO util.AkkaUtils: Connecting to
BlockManagerMaster:
akka.tcp://sparkDriver@ip-10-28-236-122.ec2.internal:47481/user/BlockManagerMaster
15/01/09 13:31:23 INFO storage.DiskBlockManager: Created local
directory at /mnt/spark/spark-local-20150109133123-3805
15/01/09 13:31:23 INFO storage.DiskBlockManager: Created local
directory at /mnt2/spark/spark-local-20150109133123-b05e
15/01/09 13:31:23 INFO util.Utils: Successfully started service
'Connection manager for block manager' on port 36936.
15/01/09 13:31:23 INFO network.ConnectionManager: Bound socket to port
36936 with id =
ConnectionManagerId(ip-10-158-18-250.ec2.internal,36936)
15/01/09 13:31:23 INFO storage.MemoryStore: MemoryStore started with
capacity 265.4 MB
15/01/09 13:31:23 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/01/09 13:31:23 INFO storage.BlockManagerMaster: Registered BlockManager
15/01/09 13:31:23 INFO util.AkkaUtils: Connecting to
HeartbeatReceiver:
akka.tcp://sparkDriver@ip-10-28-236-122.ec2.internal:47481/user/HeartbeatReceiver
15/01/09 13:31:54 ERROR executor.CoarseGrainedExecutorBackend: Driver
Disassociated [akka.tcp://sparkExecutor@ip-10-158-18-250.ec2.internal:57671]
-> [akka.tcp://sparkDriver@ip-10-28-236-122.ec2.internal:47481]
disassociated! Shutting down.
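
The executor's "Driver Disassociated" shutdown is consistent with the driver and the executors disagreeing (for example, on the Spark version), or with the driver advertising an address the workers cannot reach back to. When building the context inside the Play app it can help to set the master URL and the driver's host explicitly; a minimal sketch, assuming a standalone master started by the ec2 script (the hostnames and app name below are placeholders, not values from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical setup: point at the standalone master and advertise a
// hostname that the worker nodes can resolve and connect back to.
val conf = new SparkConf()
  .setAppName("PlaySparkExample")                        // illustrative name
  .setMaster("spark://<master-internal-hostname>:7077")  // placeholder master URL
  .set("spark.driver.host", "<driver-internal-hostname>") // placeholder; must be reachable from workers

val sc = new SparkContext(conf)
```

On EC2 the internal (private) hostnames are usually the ones the workers can resolve, so those are what should go in the placeholders.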