Re: Question about Parallel Stages in Spark

2017-06-27 Thread satish lalam
On Tue, Jun 27, 2017 at 9:17 AM, 萝卜丝炒饭 <1427357...@qq.com> wrote: My words caused a misunderstanding. Step 1: A is submitted to Spark. Step 2: B is submitted to Spark. Spark gets two independent jobs. The FAIR …
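For context on the FAIR scheduler the truncated message refers to: in Spark it is enabled per application via the `spark.scheduler.mode` property. A minimal configuration sketch (the application class and jar names are hypothetical, not from the thread):

```shell
# Switch the in-application job scheduler from the default FIFO to FAIR,
# so jobs A and B, once submitted concurrently, share cluster resources
# instead of queueing one behind the other.
spark-submit \
  --conf spark.scheduler.mode=FAIR \
  --class com.example.JobAB \
  job-ab.jar
```

Note that FAIR scheduling only matters once the driver actually submits the two jobs concurrently; on its own it does not parallelize jobs submitted one after another from a single thread.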

Re: Question about Parallel Stages in Spark

2017-06-27 Thread Bryan Jeffrey
… A and B. Jeffrey's code did not cause two submits. ---Original--- From: "Pralabh Kumar" <pralabhku...@gmail.com> Date: 2017/6/27 12:09:27 To: "萝卜丝炒饭" <1427357...@qq.com>; Cc: "user" <user@spark.apache.org>; "satishl" <satish.la...@gmail.com>; …

Re: Question about Parallel Stages in Spark

2017-06-27 Thread satish lalam

Re: Question about Parallel Stages in Spark

2017-06-26 Thread Pralabh Kumar
… use two submits. …

Re: Question about Parallel Stages in Spark

2017-06-26 Thread 萝卜丝炒饭

Re: Question about Parallel Stages in Spark

2017-06-26 Thread Pralabh Kumar
… Hello. The driver is running the individual operations in series, but each operation is parallelized internally. If you want them run in parallel …

Re: Question about Parallel Stages in Spark

2017-06-26 Thread 萝卜丝炒饭
… Hello. The driver is running the individual operations in series, but each operation is parallelized internally. If you want them run in parallel you need to provide the driver a mechanism to thread the job scheduling out …

Re: Question about Parallel Stages in Spark

2017-06-26 Thread Bryan Jeffrey
Hello. The driver is running the individual operations in series, but each operation is parallelized internally. If you want them run in parallel you need to provide the driver a mechanism to thread the job scheduling out:

    val rdd1 = sc.parallelize(1 to 10)
    val rdd2 = sc.parallelize(1 to …
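Bryan's "thread the job scheduling out" suggestion can be sketched with Scala Futures. The sketch below uses plain Scala sums as stand-ins for the Spark actions, so it runs without a cluster; in a real driver you would wrap actions such as `rdd1.reduce(_ + _)` in the Futures instead. The object name and the stand-in computations are illustrative assumptions, not code from the thread:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ParallelJobs extends App {
  // Each Future submits one independent "job" from its own thread,
  // so the driver no longer runs them one after the other.
  val jobA = Future { (1 to 10).sum }   // stand-in for rdd1.reduce(_ + _)
  val jobB = Future { (1 to 20).sum }   // stand-in for rdd2.reduce(_ + _)

  // Wait for both jobs; they execute concurrently while we block here.
  val results = Await.result(Future.sequence(Seq(jobA, jobB)), 10.seconds)
  println(results)   // List(55, 210)
}
```

Combined with `spark.scheduler.mode=FAIR`, this is the usual pattern for getting two independent jobs to share the cluster from a single driver, rather than running two separate spark-submit processes.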