Check 
https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
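The short version of that answer: py4j gateway objects live only on the driver, so a JVM object can't be called from inside a Python lambda that runs on the executors. The usual workaround is to expose the Scala logic as a Java UDF and register it from PySpark. A minimal sketch (untested here; the class name `MyMapFunction` and the fixed addend are assumptions for illustration, and the jar must already be on the Spark classpath):

```python
# Sketch only: assumes a jar on the classpath containing a Scala class like
# (hypothetical name):
#
#   class MyMapFunction extends org.apache.spark.sql.api.java.UDF1[Int, Int] {
#     override def call(x: Int): Int = x + 5  // addend baked in at build time
#   }
#
from pyspark.sql import SparkSession
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()

# Register the JVM-side function under a SQL name. Registration happens on
# the driver, but the UDF itself executes in the JVM on the workers, so
# there is no py4j round trip per row.
spark.udf.registerJavaFunction("myMapFunction", "MyMapFunction", IntegerType())

df = spark.createDataFrame([(1,), (2,)], ["x"])
df.selectExpr("myMapFunction(x) AS y").show()
```

The key point is that the per-row work stays entirely in the JVM; only the registration call goes through the py4j gateway.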

Sent with ProtonMail Secure Email.

‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐

On July 15, 2018 8:01 AM, Mohit Jaggi <mohitja...@gmail.com> wrote:

> Trying again…anyone know how to make this work?
> 
> > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitja...@gmail.com wrote:
> > 
> > Folks,
> > 
> > I am writing some Scala/Java code and want it to be usable from pyspark.
> > 
> > For example:
> > 
> > class MyStuff(addend: Int) {
> >   def myMapFunction(x: Int) = x + addend
> > }
> > 
> > I want to call it from pyspark as:
> > 
> > df = ...
> > mystuff = sc._jvm.MyStuff(5)
> > df['x'].map(lambda x: mystuff.myMapFunction(x))
> > 
> > How can I do this?
> > 
> > Mohit.
> 
> --
> 
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

