Repository: spark
Updated Branches:
refs/heads/branch-2.0 bb2bdb440 -> 5c2bc8360
[SPARK-17521] Error when I use sparkContext.makeRDD(Seq())
## What changes were proposed in this pull request?
When I use sc.makeRDD as below:
```
val data3 = sc.makeRDD(Seq())
println(data3.partitions.length)
```
I get the following error:
```
Exception in thread "main" java.lang.IllegalArgumentException: Positive number of slices required
```
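The exception is raised by the slicing logic in ParallelCollectionRDD, which requires at least one slice; with an empty Seq, seq.size is 0 and the check fires. Roughly (a paraphrase of the guard, not the exact source):
```
// Paraphrased guard inside ParallelCollectionRDD.slice: numSlices
// comes from seq.size in makeRDD, so an empty Seq passes in 0.
if (numSlices < 1) {
  throw new IllegalArgumentException("Positive number of slices required")
}
```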
We can fix this bug by modifying the last line of makeRDD to guard against an empty seq with a check on seq.size:
```
def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
  assertNotStopped()
  val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
  // Fall back to defaultParallelism when seq is empty, so that
  // ParallelCollectionRDD always gets a positive number of slices.
  new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, defaultParallelism), indexToPrefs)
}
```
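With the guard in place, an empty Seq falls back to defaultParallelism slices instead of throwing. A minimal check of the intended behavior, assuming a spark-shell session where `sc` is already bound:
```
// After the proposed change: no exception; the empty RDD gets
// math.max(0, defaultParallelism) = defaultParallelism partitions.
val data3 = sc.makeRDD(Seq.empty[(Int, Seq[String])])
println(data3.partitions.length)
println(data3.collect().length) // 0
```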
## How was this patch tested?
manual tests
Author: codlife <[email protected]>
Closes #15077 from codlife/master.
(cherry picked from commit 647ee05e5815bde361662a9286ac602c44b4d4e6)
Signed-off-by: Sean Owen <[email protected]>
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5c2bc836
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5c2bc836
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5c2bc836
Branch: refs/heads/branch-2.0
Commit: 5c2bc8360019fb08e2e62e50bb261f7ce19b231e
Parents: bb2bdb4
Author: codlife <[email protected]>
Authored: Thu Sep 15 09:38:13 2016 +0100
Committer: Sean Owen <[email protected]>
Committed: Thu Sep 15 09:38:22 2016 +0100
----------------------------------------------------------------------
core/src/main/scala/org/apache/spark/SparkContext.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/5c2bc836/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 71511b8..214758f 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -788,7 +788,7 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
   def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
     assertNotStopped()
     val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
-    new ParallelCollectionRDD[T](this, seq.map(_._1), seq.size, indexToPrefs)
+    new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, 1), indexToPrefs)
   }

   /**
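Note that the committed change floors the slice count at 1 rather than defaultParallelism, so an empty Seq now yields a single empty partition. A quick sanity check of that behavior (a sketch, again assuming spark-shell's `sc`):
```
val empty = sc.makeRDD(Seq.empty[(String, Seq[String])])
assert(empty.partitions.length == 1) // math.max(0, 1) slices
assert(empty.collect().isEmpty)
```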