Hi,

I have a mixed Java/Scala project, and I have already been using Spark from
the Scala code in local mode. Now some new team members, who are not
familiar with Scala, need to develop functionality that uses Spark from Java
code. I know it's not possible to have two Spark contexts in the same
application, and that having a JavaSparkContext instance in addition to the
Scala SparkContext instance would therefore not work, but I'm wondering
whether there is some workaround for this.
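
To make the question concrete, here is a rough sketch of what the new Java
code would look like if it created its own context (the class name, app name
and the commented-out work are just placeholders); as I understand it, this
second context is exactly what we cannot have next to the existing Scala
SparkContext:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class NewJavaFeature {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("new-java-feature")  // placeholder name
                    .setMaster("local[*]");

            // This second context is what I understand would clash with the
            // SparkContext that the existing Scala code already creates:
            JavaSparkContext jsc = new JavaSparkContext(conf);

            // ... the new Java-only functionality would use jsc here ...

            jsc.stop();
        }
    }

Ideally the Java code would somehow share the context that the Scala side
already owns, rather than creating its own.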

Thanks,
Zoran
