I have 7 Spark workers and set SPARK_WORKER_CORES=12, so up to 84 tasks of one
job can run simultaneously. I call the set of tasks in a job that start almost
simultaneously a "wave".
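To make the arithmetic concrete, here is a small illustrative sketch (not from the original thread; the numbers and the `num_waves` helper are assumptions for illustration) of how the number of waves follows from the cluster's task slots:

```python
import math

# Assumed cluster shape from the message above: 7 workers, 12 cores each.
WORKERS = 7
SPARK_WORKER_CORES = 12
slots = WORKERS * SPARK_WORKER_CORES  # 84 tasks can run at the same time


def num_waves(total_tasks: int) -> int:
    """Number of 'waves' needed to run total_tasks given the available slots."""
    return math.ceil(total_tasks / slots)


print(slots)           # 84 concurrent task slots
print(num_waves(84))   # 1 wave: everything starts together
print(num_waves(200))  # 3 waves: tasks start in three batches
```

With 84 slots, a stage of 84 tasks is a single wave, while a stage of 200 tasks needs three waves.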

While inserting, there is only one job running on Spark; I am not inserting
from multiple programs concurrently.

— 
Best Regards!
Yijie Shen

On April 2, 2015 at 2:05:31 AM, Michael Armbrust (mich...@databricks.com) wrote:

When only a few waves (1 or 2) are involved in a job, LoadApp finishes after a
few failures and retries. But when more waves (3) are involved, the job
terminates abnormally.

Can you clarify what you mean by "waves"? Are you inserting from multiple
programs concurrently?
