GitHub user WeichenXu123 opened a pull request: https://github.com/apache/spark/pull/15612
[SPARK-18078] Add option to customize zipPartitions task preferred locations

## What changes were proposed in this pull request?

Add an option to `RDD.zipPartitions` so that the preferred locations of the zipped tasks can be made consistent with a specific parent RDD (usually the RDD that is much larger than the other zipped RDDs).

I tested this improvement on spark-tfocs, multiplying a `DMatrix` by a `DVector`:
https://github.com/WeichenXu123/spark-tfocs/blob/master/src/main/scala/org/apache/spark/mllib/optimization/tfocs/fs/dvector/vector/LinopMatrixAdjoint.scala

I generated the `DMatrix` RDD with 100 partitions, each holding about 100 MB of data, and the `DVector` RDD also with 100 partitions, each holding about 1 MB of data. I ran the job with 10 executors of 4 cores each and set `spark.locality.wait` to 3000 ms. With this option enabled, the result shows about a 30% performance improvement. (A usage sketch follows the commit details below.)

## How was this patch tested?

Manual tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/WeichenXu123/spark customize_zipPartition_task_preferredLocation

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15612.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #15612

----

commit e6d069008b931de7479819e84458645e9049254d
Author: WeichenXu <weichenxu...@outlook.com>
Date: 2016-10-24T15:19:25Z

    customize_zipPartition_task_preferredLocation

----
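For context, here is a minimal Scala sketch of the scheduling situation this option targets. It uses only the existing `zipPartitions` API; the `preferredLocationRDDIndex` parameter mentioned in the final comment is an illustrative name for the proposed option, not the actual API defined by this patch:

```scala
// Sketch only: shows why zipped-task locality matters when one parent RDD is
// much larger than the other.
import org.apache.spark.{SparkConf, SparkContext}

object ZipPartitionsLocalitySketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("zip-locality"))

    // Large parent: 100 partitions with a heavy per-partition payload
    // (stand-in for the ~100 MB DMatrix partitions in the benchmark).
    val large = sc.parallelize(0 until 100, 100)
      .map(i => Array.fill(10000)(i.toDouble))

    // Small parent: 100 partitions with a tiny payload (stand-in for DVector).
    val small = sc.parallelize(0 until 100, 100).map(_.toDouble)

    // Today, each zipped task's preferred locations combine both parents'
    // locations (their intersection when non-empty, otherwise their union),
    // so the scheduler may place a task near a block of `small` and pull the
    // large partition over the network.
    val zipped = large.zipPartitions(small) { (rows, scalars) =>
      val s = scalars.next()          // one scalar per partition in this sketch
      rows.map(row => row.sum * s)
    }

    // With the proposed option (hypothetical signature), the preferred
    // locations would be pinned to the larger parent, e.g.:
    //   large.zipPartitions(small, preferredLocationRDDIndex = 0) { ... }

    println(zipped.sum())
    sc.stop()
  }
}
```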