If you want to see some examples, the sparklingml library shows one way to
do it: https://github.com/sparklingpandas/sparklingml. High Performance
Spark also talks about it.
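One note on why the lambda in the original question fails: the Python lambda runs on the executors, where the Py4J gateway to the driver JVM is not available, so `mystuff.myMapFunction(x)` cannot be called per row from Python. A common workaround is to keep the function in the JVM by exposing it as a SQL UDF and calling it by name from PySpark. A minimal sketch, assuming Spark 2.x+; the `MyStuffUdf` wrapper object and its `register` helper are hypothetical names, not from the original code:

```scala
// Compile into a jar and ship it with --jars / spark.jars.
import org.apache.spark.sql.SparkSession

class MyStuff(addend: Int) extends Serializable {
  def myMapFunction(x: Int): Int = x + addend
}

object MyStuffUdf {
  // Register the method as a named SQL UDF so any language binding
  // (including PySpark) can call it without a per-row Python<->JVM hop.
  def register(spark: SparkSession, addend: Int): Unit = {
    val stuff = new MyStuff(addend)
    spark.udf.register("myMapFunction", (x: Int) => stuff.myMapFunction(x))
  }
}
```

From the PySpark side you could then register it through the Py4J gateway, e.g. `sc._jvm.MyStuffUdf.register(spark._jsparkSession, 5)`, and apply it with `df.selectExpr("myMapFunction(x)")`. This keeps the computation entirely in the JVM, which is also much faster than round-tripping each row through Python.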

On Sun, Jul 15, 2018, 11:57 AM <0xf0f...@protonmail.com.invalid> wrote:

> Check
> https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
>
> Sent with ProtonMail Secure Email.
>
> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>
> On July 15, 2018 8:01 AM, Mohit Jaggi <mohitja...@gmail.com> wrote:
>
> > Trying again…anyone know how to make this work?
> >
> > > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitja...@gmail.com wrote:
> > >
> > > Folks,
> > >
> > > I am writing some Scala/Java code and want it to be usable from
> pyspark.
> > >
> > > For example:
> > >
> > > class MyStuff(addend: Int) {
> > >   def myMapFunction(x: Int) = x + addend
> > > }
> > >
> > > I want to call it from pyspark as:
> > >
> > > df = ...
> > >
> > > mystuff = sc._jvm.MyStuff(5)
> > >
> > > df['x'].map(lambda x: mystuff.myMapFunction(x))
> > >
> > > How can I do this?
> > >
> > > Mohit.
> >
> > --
> >
> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
>
>
