Hi
Thanks.
I am new to Spark development, so could you provide some help with writing
a custom partitioner to achieve this?
If you have any link or example showing how to write a custom partitioner,
please share it with me.
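For the record: since the goal here is one record per partition and the 30 keys are distinct, one possible approach (a minimal sketch, not official Spark API; the helper name `make_exact_partitioner` and the sample keys are my own) is to precompute a key-to-partition index and use it as the partition function. In PySpark, `rdd.partitionBy` accepts such a function directly; in Scala you would instead subclass `org.apache.spark.Partitioner` and implement `numPartitions` and `getPartition`.

```python
def make_exact_partitioner(keys):
    """Return a partition function that sends each distinct key
    to its own partition index (0 .. len(keys)-1)."""
    index = {k: i for i, k in enumerate(keys)}
    def partition_func(key):
        # Every known key maps to a unique partition.
        return index[key]
    return partition_func

# Pure-Python illustration with 30 hypothetical keys:
keys = ["key-%d" % i for i in range(30)]
part = make_exact_partitioner(keys)
assignments = [part(k) for k in keys]
# assignments covers each partition 0..29 exactly once.

# With PySpark (sketch only, assuming a pair RDD of (key, value)):
#   partitioned = rdd.partitionBy(30, make_exact_partitioner(distinct_keys))
```

Note this only guarantees one record per partition; whether each partition lands on a separate executor is up to the scheduler and locality settings.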

On Mon, Nov 16, 2015 at 6:13 PM, Sabarish Sasidharan <
[email protected]> wrote:

> You can write your own custom partitioner to achieve this
>
> Regards
> Sab
> On 17-Nov-2015 1:11 am, "prateek arora" <[email protected]>
> wrote:
>
>> Hi
>>
>> I have an RDD with 30 records (key/value pairs) and am running 30
>> executors. I want to repartition this RDD into 30 partitions so that
>> every partition gets exactly one record and is assigned to one executor.
>>
>> When I use rdd.repartition(30), it repartitions my RDD into 30
>> partitions, but some partitions get 2 records, some get 1 record, and
>> some get none.
>>
>> Is there any way in Spark to evenly distribute my records across all
>> partitions?
>>
>> Regards
>> Prateek
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/how-can-evenly-distribute-my-records-in-all-partition-tp25394.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
