Hi Jakob,

Thanks a lot for your help. I'll try this.

Zoran

On Wed, Jan 27, 2016 at 10:49 AM, Jakob Odersky <ja...@odersky.com> wrote:

> JavaSparkContext has a wrapper constructor for the Scala
> SparkContext. In this case all you need to do is declare a
> SparkContext that is accessible from both the Java and Scala sides of
> your project and wrap the context with a JavaSparkContext.
>
> Search for "Java source compatibility with Scala" for more information
> on how to interface Java with Scala (the other way around is trivial).
> Essentially, as long as you declare your SparkContext either in Java
> or as a val/var/def in a plain Scala class, you are good.
>
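For reference, here is a minimal sketch of what Jakob describes, in Scala.
The SparkContextHolder object name, app name, and local master are
placeholder assumptions, not anything from the thread:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.api.java.JavaSparkContext

    // Hypothetical holder; any plain Scala class or object works.
    object SparkContextHolder {
      // Declared as a val so it is reachable from Java as a method call.
      val sc: SparkContext = new SparkContext(
        new SparkConf().setAppName("shared-context").setMaster("local[*]"))

      // JavaSparkContext has a constructor that wraps the Scala SparkContext.
      val jsc: JavaSparkContext = new JavaSparkContext(sc)
    }

From the Java side you would then obtain the wrapped context with
SparkContextHolder.jsc(), since Scala emits static forwarder methods for
vals declared on a top-level object.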
