[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14139163#comment-14139163 ]
Matei Zaharia commented on SPARK-2593:
--------------------------------------

Sure, it would be great to do this for streaming. For the other stuff, by the way, my concern isn't so much upgrades as locking users into a specific Akka version. For example, suppose we add an API involving classes from Akka 2.2 today, and 2.3 or 2.4 later changes those classes in an incompatible way. Then, to keep our API stability promises to our users, we would have to stay on Akka 2.2, and users simply couldn't use a newer Akka in the same application as Spark. With Akka in particular, you sometimes have to upgrade just to use a new Scala version, which is another reason we can't lock it into our API. Basically, in these cases it's just very risky to expose fast-moving third-party classes if you want a stable API. We've been bitten by exposing things as mundane as Guava or Google Protobufs (!) because of incompatible changes in minor versions. We care a lot about API stability within Spark, and in particular about shielding our users from the fast-moving APIs in distributed-systems land.

> Add ability to pass an existing Akka ActorSystem into Spark
> ------------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer, I want to pass an existing ActorSystem into StreamingContext
> at load time so that I do not have two actor systems running on a node in an
> Akka application.
> This would mean running Spark's actor system on its own named dispatchers, as
> well as exposing the currently private creation of its actor system.
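To make the lock-in concern from the comment concrete, here is a minimal illustrative sketch in Scala. The trait and method below are invented for illustration and are not real Spark API; they only show how exposing a third-party type in a public signature ties the whole API to that dependency's version:

    import akka.actor.ActorSystem

    // Hypothetical public interface, for illustration only.
    trait SparkPublicApi {
      // Because ActorSystem appears in a published signature, every
      // Spark release must keep compiling against the same Akka line;
      // an incompatible change in Akka 2.3 or 2.4 would turn a routine
      // dependency upgrade into a user-visible breaking change.
      def actorSystem: ActorSystem
    }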
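And for reference, a minimal sketch of the API shape the issue description requests. This is hypothetical: the ActorSystem-accepting StreamingContext constructor shown here does not exist in Spark; today StreamingContext always creates its own actor system internally.

    import akka.actor.ActorSystem
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object SharedSystemSketch {
      def main(args: Array[String]): Unit = {
        // The application's existing actor system, which Spark would
        // reuse instead of starting a second one on the node.
        val appSystem = ActorSystem("my-app")

        val conf = new SparkConf().setAppName("shared-actor-system")

        // Hypothetical constructor overload taking the existing
        // ActorSystem; only the (SparkConf, Duration) form exists today.
        val ssc = new StreamingContext(conf, Seconds(1), appSystem)

        // ... define streams, then ssc.start() and ssc.awaitTermination()
      }
    }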