Hi all,
I read the 'Passing Functions to Spark' section in the Spark Programming Guide.
It recommends using anonymous functions or static methods (i.e. methods in a
global singleton object) so that the whole enclosing class instance does not
get shipped to the cluster.
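
For reference, my understanding of the two recommended patterns is roughly the
following (just a sketch; MyFunctions, addOne, recommended and the RDD[Int]
parameter are placeholder names I made up):

import org.apache.spark.rdd.RDD

object MyFunctions {
  // defined in a global singleton object, so no enclosing instance is serialized
  def addOne(x: Int): Int = x + 1
}

def recommended(rdd: RDD[Int]): Unit = {
  val viaObjectMethod = rdd.map(MyFunctions.addOne)   // pass a method of a singleton object
  val viaAnonymous    = rdd.map(x => x + 1)           // pass an anonymous function literal
}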
So, I wonder if I can also pass functions in the following two ways.

case 1: define func2 inside func1

class MyClass {                      // rdd is assumed to be in scope
  def func1() = {
    def func2() = { ... }            // local function defined inside func1
    rdd.map(x => func2())
  }
}

case 2: define an inner object inside the class

class MyClass {                      // rdd is assumed to be in scope
  def func1() = {
    rdd.map(x => MyFunc.func2())
  }

  object MyFunc {
    def func2() = { ... }
  }
}

Thanks in advance.

Kevin
