Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-20 Thread Davies Liu
I think this is a general multi-threading question; Queue is the right direction to go. Have you tried something like this?

    results = Queue.Queue()

    def run_job(f, args):
        r = f(*args)
        results.put(r)

    # start multiple threads to run jobs
    threading.Thread(target=run_job, args=(f, args)).start()
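A runnable sketch of that pattern, filled out with assumptions not in the original message (the SparkContext setup and the sum_range job are illustrative):

    import threading
    import Queue  # renamed to 'queue' in Python 3

    from pyspark import SparkContext

    # Assumed setup; in the pyspark shell `sc` already exists.
    sc = SparkContext("local[4]", "threaded-jobs")

    results = Queue.Queue()

    def run_job(f, args):
        # run one Spark job and push its result onto the shared queue
        results.put(f(*args))

    def sum_range(n):
        # illustrative job: sum the integers 0..n-1 as a Spark action
        return sc.parallelize(range(n)).sum()

    threads = [threading.Thread(target=run_job, args=(sum_range, (n,)))
               for n in (10000, 20000, 30000)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    while not results.empty():
        print(results.get())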

Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-20 Thread MEETHU MATHEW
Hi Davies, Thank you for pointing to Spark streaming. I am confused about how to return the result after running a function via a thread. I tried using a Queue to collect the results and print them at the end, but that way I only see the results after all the threads have finished. How can I get the result of each thread as soon as it finishes?
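A sketch (not from the thread) of one way to do that with the same Queue approach: block on results.get() once per submitted job, so each value prints as soon as its thread produces it. The slow_square function is a made-up stand-in for a Spark action.

    import threading
    import time
    import Queue  # renamed to 'queue' in Python 3

    results = Queue.Queue()

    def run_job(f, args):
        results.put(f(*args))

    def slow_square(x):
        # stand-in for a Spark action; the sleep simulates uneven job lengths
        time.sleep(x)
        return x * x

    jobs = [(slow_square, (3,)), (slow_square, (1,)), (slow_square, (2,))]
    for f, args in jobs:
        threading.Thread(target=run_job, args=(f, args)).start()

    # results.get() blocks until some thread has put a value, so results
    # print in completion order (1, 4, 9) rather than start order.
    for _ in jobs:
        print(results.get())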

Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-18 Thread Davies Liu
SparkContext can be used in multiple threads (Spark streaming works with multiple threads), for example:

    import threading
    import time

    def show(x):
        time.sleep(1)
        print x

    def job():
        sc.parallelize(range(100)).foreach(show)

    threading.Thread(target=job).start()
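(A note on this example: `sc` is assumed to come from the pyspark shell or a prior SparkContext(...) call, and the print inside foreach runs on the executors, so on a real cluster the output lands in the executors' stdout rather than the driver's.)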

Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-18 Thread ayan guha
Hi, So to be clear: do you want to run one operation in multiple threads within a function, or do you want to run multiple jobs using multiple threads? I am wondering why the Python threading module can't be used? Or have you already given it a try?

Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-17 Thread MEETHU MATHEW
Hi Akhil, The Python wrapper for Spark Job Server did not help me. I actually need a pyspark code sample which shows how I can call a function from 2 threads and execute it simultaneously. Thanks & Regards, Meethu M
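A minimal sketch of the kind of sample being asked for, calling one function from two threads at once (the count_multiples job, its inputs, and the SparkContext setup are illustrative assumptions):

    import threading

    from pyspark import SparkContext

    # Assumed setup; in the pyspark shell `sc` already exists.
    sc = SparkContext("local[4]", "two-threads")

    def count_multiples(k):
        # one Spark job per call: count the multiples of k in 0..99999
        return sc.parallelize(range(100000)).filter(lambda x: x % k == 0).count()

    def worker(k):
        print("multiples of %d: %d" % (k, count_multiples(k)))

    # both threads call the same function simultaneously
    t1 = threading.Thread(target=worker, args=(2,))
    t2 = threading.Thread(target=worker, args=(3,))
    t1.start(); t2.start()
    t1.join(); t2.join()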

Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-14 Thread Akhil Das
Did you happen to have a look at the Spark Job Server? Someone wrote a Python wrapper around it; give it a try. Thanks, Best Regards

How to run multiple jobs in one sparkcontext from separate threads in pyspark?

2015-05-13 Thread MEETHU MATHEW
Hi all, Quote: "Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads." How can I run multiple jobs in one SparkContext using separate threads in pyspark? I found some examples in Scala and Java, but could not find one for pyspark.
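One detail worth noting alongside that quote (from the same "Job Scheduling" docs page): the in-application scheduler is FIFO by default, so later jobs may wait behind a large first job; the FAIR scheduler lets concurrent jobs share resources. A sketch of enabling it, with an illustrative app name:

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("threaded-jobs")           # illustrative name
            .set("spark.scheduler.mode", "FAIR"))  # default is FIFO
    sc = SparkContext(conf=conf)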