JavaSparkContext has a wrapper constructor that takes the "scala"
SparkContext. In that case, all you need to do is declare a
SparkContext that is accessible from both the Java and Scala sides of
your project and wrap that context in a JavaSparkContext.
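A minimal sketch of the wrapping step (the object and app names are just
illustrative, not from the original message): one Scala SparkContext is
created and then handed to the JavaSparkContext wrapper constructor.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.api.java.JavaSparkContext

object SharedSpark {
  // The single underlying "scala" SparkContext.
  val conf = new SparkConf().setAppName("shared-context").setMaster("local[*]")
  val sc: SparkContext = new SparkContext(conf)

  // JavaSparkContext is a thin wrapper around the Scala context,
  // so Java code can use the Java-friendly API on the same context.
  val jsc: JavaSparkContext = new JavaSparkContext(sc)
}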

Search for "Java source compatibility with Scala" for more information on
how to interface Java with Scala (the other way around is trivial).
Essentially, as long as you declare your SparkContext either in Java
or as a val/var/def in a plain Scala class, you are good (see the
sketch below).
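As a rough illustration of the plain-Scala-class pattern (the class and
member names are hypothetical): a `val` in a plain Scala class compiles
to a field plus a no-argument accessor method, so Java code can reach the
context through an ordinary method call.

import org.apache.spark.SparkContext
import org.apache.spark.api.java.JavaSparkContext

// Java sees the vals below as methods: holder.sc() and holder.jsc().
class ContextHolder(val sc: SparkContext) {
  val jsc: JavaSparkContext = new JavaSparkContext(sc)
}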
