Hi Akka users,

I am trying to use Akka Camel together with Spark Streaming and I am
getting this error message:

14/10/23 09:31:30 ERROR OneForOneStrategy: No configuration setting found
for key 'akka.camel'
akka.actor.ActorInitializationException: exception during creation

I have followed the pattern for creating an Actor based receiver:

http://spark.apache.org/docs/latest/streaming-custom-receivers.html

My Actor looks like this:

import scala.reflect.ClassTag

import akka.camel.Consumer
import org.apache.spark.streaming.receiver.ActorHelper

class NettyReceiver[T: ClassTag](port: Int) extends Consumer with ActorHelper {
  def endpointUri = "netty:tcp://xyz:" + port
  def receive = {
    case data: T => store(data)
  }
}
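
For reference, the ClassTag context bound is what keeps "case data: T" from being erased into a match-anything pattern at runtime. A minimal standalone sketch of that behaviour (TypedFilter is a made-up name for illustration, not part of the receiver):

```scala
import scala.reflect.ClassTag

// With a ClassTag[T] in scope, `case _: T` compiles into a real runtime
// class check; without the context bound, erasure would match everything.
class TypedFilter[T: ClassTag] {
  def accepts(msg: Any): Boolean = msg match {
    case _: T => true
    case _    => false
  }
}

val stringFilter = new TypedFilter[String]
println(stringFilter.accepts("hello")) // true
println(stringFilter.accepts(42))      // false
```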

And I create a DStream like this:

val dstream = ssc.actorStream[MessageType](
  Props(new NettyReceiver[MessageType](4548)), "msgNettyReceiver")

All good so far.  I use sbt assembly and sbt package to create jar files
for the project and the application and I run it on the server using this
command:

sudo ./spark-submit --class SparkStreamingCamelApp --master spark://xyz:7077 \
  --jars /opt/app/bigProject.jar --total-executor-cores 3 \
  /opt/app/smallApplication.jar
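
One thing I have not tried yet: putting the assembly explicitly on the driver and executor classpaths, in case --jars alone is not enough for the executor's ActorSystem to see the reference.conf. Something like this (same paths as above; the flags are standard spark-submit options, so this is just a guess at a workaround, not something I know fixes it):

```shell
sudo ./spark-submit --class SparkStreamingCamelApp --master spark://xyz:7077 \
  --jars /opt/app/bigProject.jar \
  --conf spark.executor.extraClassPath=/opt/app/bigProject.jar \
  --driver-class-path /opt/app/bigProject.jar \
  --total-executor-cores 3 /opt/app/smallApplication.jar
```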

The streaming application runs without errors but in the Spark worker log I
see these errors:

akka.actor.ActorInitializationException: exception during creation
Caused by: java.lang.reflect.InvocationTargetException
Caused by: akka.actor.InvalidActorNameException: actor name
[camel-supervisor] is not unique!
14/10/23 09:31:30 ERROR OneForOneStrategy: No configuration setting found
for key 'akka.camel'
akka.actor.ActorInitializationException: exception during creation

I have researched the issue and found that Patrik Nordwall said this error
"indicates that the reference.conf for akka-camel is not loaded":

http://grokbase.com/t/gg/akka-user/13bp25kd7f/akka-camel-osgi

If I run the following command on the assembled bigProject.jar, the
reference.conf is there:

[user@xyz tmp]$ jar tvf bigProject.jar | grep reference.conf
 81115 Thu Oct 23 15:29:10 BST 2014 reference.conf

If I do the same check on the driver application's smallApplication.jar, the
reference.conf file is not there.  Is this the issue?  I think not: both
jars are passed to the Spark workers and are present in the work directory.
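
In case it is relevant: I know sbt-assembly's default merge strategy can clobber reference.conf when several Akka modules each ship one; the pattern from the sbt-assembly docs to concatenate them is below (I am not sure whether my build still needs it, since the akka.camel section does appear in the assembled file):

```scala
// sbt-assembly 0.x syntax: concatenate all reference.conf files so that
// every module's section (including akka.camel) survives the merge.
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
  {
    case "reference.conf" => MergeStrategy.concat
    case x                => old(x)
  }
}
```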

If I check the contents of reference.conf using the following command:

unzip -p bigProject.jar reference.conf

I find the following Akka Camel section:

akka {
  camel {
    # Whether JMX should be enabled or disabled for the Camel Context
    jmx = off
    # enable/disable streaming cache on the Camel Context
    streamingCache = on
    consumer {
      # Configured setting which determines whether one-way communications
      # between an endpoint and this consumer actor
      # should be auto-acknowledged or application-acknowledged.
      # This flag has only effect when exchange is in-only.
      auto-ack = on

      # When endpoint is out-capable (can produce responses) reply-timeout is the
      # maximum time the endpoint can take to send the response before the message
      # exchange fails. This setting is used for out-capable, in-only,
      # manually acknowledged communication.
      reply-timeout = 1m

      # The duration of time to await activation of an endpoint.
      activation-timeout = 10s
    }

    #Scheme to FQCN mappings for CamelMessage body conversions
    conversions {
      "file" = "java.io.InputStream"
    }
  }
}

(The file is much bigger, with other sections, of course).

So the file is there but I still get the "No configuration setting found
for key 'akka.camel'" error.

I am using Scala 2.10.4 and Akka 2.2.3, as I believe these are the versions
that Spark 1.1 uses.

Patrik Nordwall also says "akka-osgi_2.10-2.1.4.jar should replace
akka-actor_2.10-2.1.4.jar in an osgi environment".

I changed my build.sbt so that akka-actor is "provided", like so:

  "com.typesafe.akka" % "akka-camel_2.10" % "2.2.3",
  "com.typesafe.akka" % "akka-osgi_2.10" % "2.2.3",
  "com.typesafe.akka" % "akka-cluster_2.10" % "2.2.3",
  "com.typesafe.akka" % "akka-actor_2.10" % "2.2.3" % "provided",
  "org.apache.camel" % "camel-netty" % "2.12.3"

I have checked and akka-actor is not in the assembled jar.  But more than
likely Spark itself will be loading the akka-actor jar anyway, right?

Any ideas how to get Spark Streaming and Akka Camel working together?  Am I
missing something obvious?  Any help greatly appreciated!

Best regards,
Patrick

-- 
>>>>>>>>>>      Read the docs: http://akka.io/docs/
>>>>>>>>>>      Check the FAQ: 
>>>>>>>>>> http://doc.akka.io/docs/akka/current/additional/faq.html
>>>>>>>>>>      Search the archives: https://groups.google.com/group/akka-user