Hi Tapan,

Perhaps this would work: take the range 0..99, create an RDD from it, and call X(i) on each element. The X(i) calls should then execute on the workers in parallel.
Scala:

    val results = sc.parallelize(0 until 100).map(idx => X(idx)).collect()

Python:

    results = sc.parallelize(range(100)).map(lambda idx: X(idx)).collect()

Note the .collect() at the end: map() is lazy and leaves the results distributed across the cluster, so you need collect() to bring them back to the driver as the list of objects you asked for.

-sujit

On Wed, Sep 23, 2015 at 6:46 AM, Tapan Sharma <tapan.sha...@gmail.com> wrote:
> Hi All,
>
> I want to call a method X(int i) from my Spark program for different values
> of i. This means:
> X(1), X(2), ..., X(n).
> Each call returns one object.
> Currently I am doing this sequentially.
>
> Is there any way to run these in parallel and get back the list of
> objects? Sorry for this basic question.
>
> Regards
> Tapan
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Calling-a-method-parallel-tp24786.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
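One more thought: if the goal is simply to parallelize the X(i) calls and collect the results in order, and a cluster isn't actually required, the same fan-out pattern works with Python's standard library alone. This is just a sketch, not Spark; the X below is a hypothetical stand-in for your method that returns one object per index.

```python
from concurrent.futures import ThreadPoolExecutor

def X(i):
    # Hypothetical stand-in for your method: returns one object per index.
    return {"index": i, "value": i * i}

# Fan the calls out across a thread pool. pool.map preserves input
# order, so results[i] corresponds to X(i), just like the RDD version.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(X, range(100)))
```

For CPU-bound work in CPython you would swap in ProcessPoolExecutor to sidestep the GIL, but the calling code stays the same.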