I have a small Spark "launcher" app that instantiates a service via a Spring
XML application context and then broadcasts it in order to make it
available on the remote nodes.

I assume that when a Spring service is instantiated, all of its dependencies
are instantiated and injected at the same time, so broadcasting the service
should ship the whole object graph to the remote nodes. My DAOs do not access
a remote database; instead they use inner collections that are loaded at
startup from XML files (in the DAO constructors).
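For reference, here is a minimal sketch of the kind of DAO I mean (class, field, and file names are made up for illustration): a Serializable bean whose HashMap is filled in the constructor, so a serialization round trip, which is essentially what a broadcast does to the object, should carry the map's contents along:

```java
import java.io.*;
import java.util.HashMap;
import java.util.Map;

// Hypothetical DAO: the inner map is an instance field filled in the
// constructor, so standard Java serialization should carry its contents.
class CountryDao implements Serializable {
    private static final long serialVersionUID = 1L;
    private final Map<String, String> countries = new HashMap<>();

    CountryDao() {
        // In the real app this is parsed from an XML file at startup.
        countries.put("FR", "France");
        countries.put("DE", "Germany");
    }

    String lookup(String code) { return countries.get(code); }
}

public class BroadcastLikeRoundTrip {
    // Serialize then deserialize, mimicking what broadcasting does locally.
    static <T extends Serializable> T roundTrip(T obj) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            T copy = (T) ois.readObject();
            return copy;
        }
    }

    public static void main(String[] args) throws Exception {
        CountryDao copy = roundTrip(new CountryDao());
        System.out.println(copy.lookup("FR")); // prints "France"
    }
}
```

Run locally, the instance map does survive the round trip, which is why I expected the broadcast to behave the same way.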

This works fine when I launch the app via spark-submit in "local" mode, but
in "yarn-cluster" mode I get an exception (after all the jobs have been
launched) saying that the inner collections inside my DAOs are empty.

All my objects are Serializable and the inner collections are mostly maps
(HashMap). I have tried declaring the collections as "static", but that has
no effect on the broadcast...
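One thing I suspect about the "static" attempt (sketch in plain Java, names hypothetical): static fields belong to the class, not the instance, so Java serialization skips them entirely, and a deserialized copy in a fresh executor JVM would see whatever the class initializer left there. A local round trip can show the skip by resetting the static map between serialize and deserialize:

```java
import java.io.*;
import java.util.HashMap;
import java.util.Map;

// Hypothetical DAO variant with a *static* collection.
class StaticDao implements Serializable {
    private static final long serialVersionUID = 1L;
    static Map<String, String> data = new HashMap<>();

    StaticDao() {
        data.put("FR", "France");
    }
}

public class StaticNotSerialized {
    static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    static Object deserialize(byte[] bytes) throws Exception {
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize(new StaticDao());
        // Simulate a fresh executor JVM, where the static map starts empty.
        StaticDao.data.clear();
        deserialize(bytes);
        // Still empty: serialization never captured the static field.
        System.out.println(StaticDao.data.isEmpty()); // prints "true"
    }
}
```

So making the collections static does not make them travel with the broadcast; if anything it seems to guarantee they are empty on the remote side.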

Could someone tell me what is happening here? Is there a maximum depth for
broadcasting?

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-share-a-spring-singleton-service-with-Spark-tp22997.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

