On Tue, Jun 27, 2017 at 9:17 AM, 萝卜丝炒饭 <1427357...@qq.com> wrote:
>
> My words caused a misunderstanding.
> Step 1: A is submitted to Spark.
> Step 2: B is submitted to Spark.
>
> Spark gets two independent jobs. The FAIR scheduler should then schedule
> A and B.
> Jeffrey's code did not cause two submits.
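
For context on the FAIR scheduler mentioned above: Spark schedules jobs FIFO by default, so FAIR mode must be enabled explicitly. A minimal sketch using the standard spark.scheduler.mode configuration key in spark-defaults.conf:

```
# spark-defaults.conf: switch job scheduling from the default FIFO to FAIR
spark.scheduler.mode  FAIR
```

The same key can equally be set on a SparkConf before the SparkContext is created.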

---Original---
From: "Pralabh Kumar"<pralabhku...@gmail.com>
Date: 2017/6/27 12:09:27
To: "萝卜丝炒饭"<1427357...@qq.com>;
Cc: "user"<user@spark.apache.org>;"satishl"<satish.la...@gmail.com>;"Bryan Jeffrey"<bryan.jeff...@gmail.com>;
Subject: Re: Question about Parallel Stages in Spark

Hi

---Original---
From: "Bryan Jeffrey"<bryan.jeff...@gmail.com>
Cc: "user"<user@spark.apache.org>;
Subject: Re: Question about Parallel Stages in Spark

Hello.
The driver is running the individual operations in series, but each
operation is parallelized internally. If you want them run in parallel you
need to provide the driver a mechanism to thread the job scheduling out:
val rdd1 = sc.parallelize(1 to 10)
val rdd2 = sc.parallelize(1 to 10)
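
The "mechanism to thread the job scheduling out" can be sketched with scala.concurrent.Future. This is a minimal illustration, not Bryan's actual code: the Future bodies below are plain local computations standing in for Spark actions such as rdd1.count() and rdd2.count(), so the snippet runs without a SparkContext.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ParallelJobs {
  // Submit two independent "jobs" concurrently and combine their results.
  // With a real SparkContext each Future body would call a Spark action
  // (e.g. rdd1.count(), rdd2.count()); plain sums stand in for them here.
  def run(): Int = {
    val jobA = Future { (1 to 10).sum }            // job A
    val jobB = Future { (1 to 10).map(_ * 2).sum } // job B
    // Both futures are now in flight on the thread pool; block for both.
    Await.result(jobA.zip(jobB).map { case (a, b) => a + b }, 30.seconds)
  }

  def main(args: Array[String]): Unit =
    println(run()) // prints 165
}
```

Because each action is submitted from its own thread, the driver hands both jobs to the scheduler at once instead of running them back to back.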