[Note: I answered my own question while writing this up, but thought I
would share my findings anyway, since the discovery process was rather
painful.]
I get the following error when trying to run Spark in standalone mode
when I package up my code in a jar file:
No configuration setting found for key 'akka.remote.log-received-messages'
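(For context: that key normally comes from akka's bundled reference.conf,
where it sits under the akka.remote block. Roughly like this, with the
surrounding settings omitted and the value illustrative:)

```
akka {
  remote {
    # whether remoting should log every message it receives
    log-received-messages = off
  }
}
```

If no reference.conf ends up on the classpath, the Config library has
nothing to fall back on and throws the error above.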
I believe this is exactly the same issue that is discussed here:
https://groups.google.com/forum/#!topic/spark-users/FmYGAP57MMk
I followed this thread closely, updated my pom.xml accordingly, and
filled in a couple of details that are not explicit in the thread. In
particular, I added the filter tag as described here
https://www.assembla.com/spaces/akka/tickets/2634-akka-remote-prevents-application-from--shading--%28standalone-jar-creation%29-due-to-config-issues-in---?comment=183467573#comment:183467573
(as linked from the thread), and took the (now obvious) step of adding a
reference.conf file to my project's src/main/resources directory so that
it would get picked up. My problem was that I couldn't figure out where
to find a reference.conf file! I tried a number of things, including
extracting example reference.conf files out of various akka jar files in
my local Maven repo. Short story long, I did a Google search on "spark
reference.conf" and came across this:
https://spark-project.atlassian.net/browse/SPARK-395
which led me to this:
http://letitcrash.com/post/21025950392/howto-sbt-assembly-vs-reference-conf
which led me to realize that I needed to pull the reference.conf out of
spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar. It's right
there!
Summary: a good akka reference.conf file can be found at the top level
of spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar and can be
placed in src/main/resources before packaging up your application. My
problems went away.
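For reference, the shading approach that the assembla ticket points at
can also be expressed directly in the pom, so you don't have to copy the
file by hand. This is a sketch of what I believe the maven-shade-plugin
configuration looks like; the plugin version here is illustrative:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.1</version> <!-- illustrative version -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- Concatenate every reference.conf on the classpath instead
               of letting one jar's copy clobber the others -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```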
Any thoughts about how this could be simplified?
HTH!
Philip