akka.actor.ActorNotFound In Spark Streaming on Mesos (using ssc.actorStream)

2015-01-07 Thread Christophe Billiard
Hi all,

I am trying to run this example on mesos:
https://github.com/jaceklaskowski/spark-activator#master

I have Mesos 0.21.0 (instead of 0.18.1; could that be a problem?).
I downloaded the pre-built Spark package spark-1.2.0-bin-hadoop2.4.tgz, untarred it,
and created the conf/spark-env.sh file with the following lines:
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/home/christophe/Development/spark-1.1.1-bin-hadoop2.4.tgz

I created and filled build.sbt (Spark 1.2.0 / Scala 2.11.4),
and I am using src/main/scala/StreamingApp.scala (from the spark-activator)
as my main class in Spark.

When I submit with .setMaster("local[*]"),
the helloer actor is started at
path=akka://sparkDriver/user/Supervisor0/helloer
and it works fine.

But when I submit with .setMaster("mesos://127.0.1.1:5050"),
the helloer actor is started at
path=akka://sparkExecutor/user/Supervisor0/helloer
and I get the following log:
Exception in thread "main" akka.actor.ActorNotFound: Actor not found for:
ActorSelection[Anchor(akka://sparkDriver/), Path(/user/Supervisor0/helloer)]

The problem is probably the new path of my actor.
It can't be reached via the following URL anymore (since its path is now
akka://sparkExecutor/user/Supervisor0/helloer):
val url = s"akka.tcp://sparkDriver@$driverHost:$driverPort/user/Supervisor0/$actorName"

I have tried many systemActor@host:port combinations but I didn't manage to
communicate with my actor.

How can I reach my actor?
Can the mesos 0.21.0 be the source of my problem?
Have I misconfigured anything?
Any ideas?
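Since the supervisor is created by the receiver, it lives in the *executor's* actor system ("sparkExecutor"), not the driver's, so any URL anchored at sparkDriver can no longer match. Below is a minimal sketch of probing a candidate path with actorSelection/resolveOne; the host, port and object name are placeholders you would have to take from the executor logs, not values known to be correct:

```scala
import akka.actor._
import akka.util.Timeout
import scala.concurrent.Await
import scala.concurrent.duration._

object HelloerLookup {
  def main(args: Array[String]): Unit = {
    // Hypothetical values: with ssc.actorStream the receiver (and its
    // supervisor) run inside an executor, so the path is rooted at the
    // executor's "sparkExecutor" actor system on the executor host.
    val host = "executor-host" // placeholder: host where the receiver runs
    val port = 56789           // placeholder: that actor system's Akka port

    val url = s"akka.tcp://sparkExecutor@$host:$port/user/Supervisor0/helloer"

    val system = ActorSystem("probe")
    implicit val timeout: Timeout = Timeout(5.seconds)
    // resolveOne fails with ActorNotFound when the path is wrong, so it is
    // a cheap way to test candidate URLs before sending real messages.
    val helloer = Await.result(system.actorSelection(url).resolveOne(), 10.seconds)
    helloer ! "hello from outside"
    system.shutdown()
  }
}
```

This only probes reachability; the actual host and port of the executor's actor system show up in the executor's stderr logs when the actor is started.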





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/akka-actor-ActorNotFound-In-Spark-Streaming-on-Mesos-using-ssc-actorStream-tp21014.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: NoSuchMethodError: com.typesafe.config.Config.getDuration with akka-http/akka-stream

2015-01-06 Thread Christophe Billiard
Thanks Pankaj for the assembly plugin tip.

Yes, there is a version mismatch of akka-actor between Spark 1.1.1 and
akka-http/akka-stream (2.2.3 versus 2.3.x).

After some digging, I see 4 options for this problem (in case others
encounter it):
1) Upgrade to Spark 1.2.0; the same code will work (not possible for me
since I want to use the Datastax connector)
2) Make a custom build of Spark 1.1.1
3) Use Play with akka-actor 2.2.3 instead (Play 2.2.3 for instance)
4) Wait for the Datastax connector 1.2.0 (due on 31st January 2015)

I am currently trying option 3.

Thank you all for your help



On Sat, Jan 3, 2015 at 4:11 AM, Pankaj Narang [via Apache Spark User List] 
ml-node+s1001560n20950...@n3.nabble.com wrote:

 Like before I get a java.lang.NoClassDefFoundError:
 akka/stream/FlowMaterializer$

 This can be solved using the assembly plugin. You need to enable the
 assembly plugin in your global plugins
 (C:\Users\infoshore\.sbt\0.13\plugins) by adding this line to plugins.sbt:

 addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.0")



  and then add the following lines in build.sbt

 import AssemblyKeys._ // put this at the top of the file

 seq(assemblySettings: _*)

 Also, at the bottom don't forget to add

 assemblySettings

 mergeStrategy in assembly := {
   case m if m.toLowerCase.endsWith("manifest.mf")          => MergeStrategy.discard
   case m if m.toLowerCase.matches("meta-inf.*\\.sf$")      => MergeStrategy.discard
   case "log4j.properties"                                  => MergeStrategy.discard
   case m if m.toLowerCase.startsWith("meta-inf/services/") => MergeStrategy.filterDistinctLines
   case "reference.conf"                                    => MergeStrategy.concat
   case _                                                   => MergeStrategy.first
 }


 Now run sbt assembly; that will create a jar which can be run without the
 --jars option, as it will be an uber jar containing all the dependencies.
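Concretely, the workflow above would look something like this (the jar name is a guess based on the "Simple Project" build.sbt quoted below; sbt-assembly's default is <name>-assembly-<version>.jar):

```shell
# Build one fat jar containing the application and all its dependencies.
sbt assembly

# Submit without --jars: everything needed is already inside the assembly jar.
bin/spark-submit \
  --class SimpleAppStreaming3 \
  --master "local[*]" \
  target/scala-2.10/simple-project-assembly-1.0.jar
```

One caveat: the Spark dependencies themselves are usually marked "provided" in build.sbt so they are not bundled into the assembly, since the cluster already ships them.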



 Also, a NoSuchMethodError is thrown when the compile-time and runtime
 versions of a dependency differ.

 What version of Spark are you using? You need to use the same version in
 build.sbt.


 Here is your build.sbt


 libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
 //exclude("com.typesafe", "config")

 libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.1.1"

 libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.3"

 libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

 libraryDependencies += "org.apache.cassandra" % "cassandra-thrift" % "2.0.5"

 libraryDependencies += "joda-time" % "joda-time" % "2.6"


 and your error is Exception in thread "main" java.lang.NoSuchMethodError:
 com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J

 at
 akka.stream.StreamSubscriptionTimeoutSettings$.apply(FlowMaterializer.scala:256)

  I think there is a version mismatch in the jars you use at runtime


  If you need more help add me on skype pankaj.narang


 ---Pankaj








--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-tp20926p20988.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: NoSuchMethodError: com.typesafe.config.Config.getDuration with akka-http/akka-stream

2015-01-02 Thread Christophe Billiard
NoSuchMethodError: com.typesafe.config.Config.getDuration with akka-http/akka-stream

2014-12-31 Thread Christophe Billiard
Hi all,

I am currently trying to combine Datastax's spark-cassandra-connector and
Typesafe's akka-http-experimental on Spark 1.1.1 (the spark-cassandra-connector
for Spark 1.2.0 is not out yet) and Scala 2.10.4.
I am using the hadoop 2.4 pre-built package. (build.sbt file at the end)

To solve the java.lang.NoClassDefFoundError:
com/datastax/spark/connector/mapper/ColumnMapper
and other NoClassDefFoundErrors, I have to give some jars to Spark
(build.sbt is not enough).
The connector then works fine.

My spark submit looks like:
sbt clean package; bin/spark-submit --class SimpleAppStreaming3 --master "local[*]" --jars
spark-cassandra-connector_2.10-1.1.0.jar,cassandra-driver-core-2.1.3.jar,cassandra-thrift-2.0.5.jar,joda-time-2.6.jar
target/scala-2.10/simple-project_2.10-1.0.jar

Then I am trying to add some akka-http/akka-stream features.
Like before I get a java.lang.NoClassDefFoundError:
akka/stream/FlowMaterializer$
Same solution, I begin to add jars.

Now my spark submit looks like:
sbt clean package; bin/spark-submit --class SimpleAppStreaming3 --master "local[*]" --jars
spark-cassandra-connector_2.10-1.1.0.jar,cassandra-driver-core-2.1.3.jar,cassandra-thrift-2.0.5.jar,joda-time-2.6.jar,akka-stream-experimental_2.10-1.0-M2.jar
target/scala-2.10/simple-project_2.10-1.0.jar

Then I have a new kind of error:
Exception in thread "main" java.lang.NoSuchMethodError:
com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
at
akka.stream.StreamSubscriptionTimeoutSettings$.apply(FlowMaterializer.scala:256)
at akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:185)
at akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:172)
at 
akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
at 
akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
at scala.Option.getOrElse(Option.scala:120)
at akka.stream.FlowMaterializer$.apply(FlowMaterializer.scala:42)
at SimpleAppStreaming3$.main(SimpleAppStreaming3.scala:240)
at SimpleAppStreaming3.main(SimpleAppStreaming3.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:329)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I can't get rid of this error.
I tried:
1) adding several jars (including config-1.2.1.jar)
2) studying the dependency tree (with
https://github.com/jrudolph/sbt-dependency-graph)
3) excluding libraryDependencies (with dependencyOverrides)
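For what it's worth, the combination I would try for 3) is a sketch only, not something verified: it assumes the old config really does come in transitively through Spark's akka 2.2.x dependency, and that nothing in the Spark assembly jar shadows it at runtime. The idea is to exclude config from spark-core and pin 1.2.1 explicitly:

```scala
// build.sbt fragment: force a single com.typesafe:config version.
// Config.getDuration(String, TimeUnit) only exists from config 1.2.0 on,
// while akka 2.2.x (pulled in by Spark 1.1.1) ships an older config.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" exclude("com.typesafe", "config")

libraryDependencies += "com.typesafe" % "config" % "1.2.1"

dependencyOverrides += "com.typesafe" % "config" % "1.2.1"
```

The caveat: if the old config classes are baked into the Spark assembly on the cluster, no build.sbt change on the application side can remove them; in that case the user-classpath-first setting (experimental in 1.1.x, if I remember correctly) is the usual knob to try.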

Any ideas?

Bonus question: Is there a way to avoid adding all these jars with --jars?

*My build.sbt file*

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
//exclude("com.typesafe", "config")

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.1.1"

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.3"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

libraryDependencies += "org.apache.cassandra" % "cassandra-thrift" % "2.0.5"

libraryDependencies += "joda-time" % "joda-time" % "2.6"

libraryDependencies += "com.typesafe.akka" %% "akka-actor"      % "2.3.8"

libraryDependencies += "com.typesafe.akka" %% "akka-testkit"    % "2.3.8"

libraryDependencies += "org.apache.hadoop" %  "hadoop-client"   % "2.4.0"

libraryDependencies += "ch.qos.logback"    %  "logback-classic" % "1.1.2"

libraryDependencies += "org.mockito"       %  "mockito-all"     % "1.10.17"

libraryDependencies += "org.scalatest"     %% "scalatest"       % "2.2.3"

libraryDependencies += "org.slf4j"         %  "slf4j-api"       % "1.7.5"

libraryDependencies += "org.apache.spark"  %% "spark-streaming" % "1.1.1"

libraryDependencies += "com.typesafe.akka" %% "akka-stream-experimental"    % "1.0-M2"

libraryDependencies += "com.typesafe.akka" %% "akka-http-experimental"      % "1.0-M2"

libraryDependencies += "com.typesafe.akka" %% "akka-http-core-experimental" % "1.0-M2"

libraryDependencies += "com.typesafe" % "config" % "1.2.1"

dependencyOverrides += "com.typesafe" % "config" % "1.2.1"




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-tp20926.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
