[jira] [Updated] (SPARK-28854) Zipping iterators in mapPartitions will fail
[ https://issues.apache.org/jira/browse/SPARK-28854?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-28854:
---------------------------------
Description:

{code}
scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(2*).zip(xs)).collect.foreach(println)
...
java.util.NoSuchElementException: next on empty iterator
{code}

Workaround - implement zip with mapping to tuple:

{code}
scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(x => (x * 2, x))).collect.foreach(println)
(2,1)
(4,2)
(6,3)
{code}

was:

scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(2*).zip(xs)).collect.foreach(println)
warning: there was one feature warning; re-run with -feature for details
19/08/22 21:13:18 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.util.NoSuchElementException: next on empty iterator

Workaround - implement zip with mapping to tuple:

scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(x => (x * 2, x))).collect.foreach(println)
(2,1)
(4,2)
(6,3)

> Zipping iterators in mapPartitions will fail
> --------------------------------------------
>
>                 Key: SPARK-28854
>                 URL: https://issues.apache.org/jira/browse/SPARK-28854
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.3
>            Reporter: Hao Yang Ang
>            Priority: Minor
>
> {code}
> scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(2*).zip(xs)).collect.foreach(println)
> ...
> java.util.NoSuchElementException: next on empty iterator
> {code}
>
> Workaround - implement zip with mapping to tuple:
>
> {code}
> scala> sc.parallelize(Seq(1, 2, 3)).mapPartitions(xs => xs.map(x => (x * 2, x))).collect.foreach(println)
> (2,1)
> (4,2)
> (6,3)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
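The failure above can be reproduced with plain Scala collections, no Spark required: a Scala `Iterator` is single-pass, so `xs.map(2*)` lazily wraps the same underlying iterator that is then passed to `zip`, and both sides of the zip drain it. A minimal sketch of this behavior (identifier names here are illustrative, not from the issue):

{code}
import scala.util.Try

val xs = Iterator(1, 2, 3)
val doubled = xs.map(_ * 2)              // lazily wraps xs; does not copy it
val result = Try(doubled.zip(xs).toList) // each pull on the left consumes an
                                         // element the right side also needed
assert(result.isFailure)                 // java.util.NoSuchElementException

// The workaround makes a single pass, pairing inside one map call:
val safe = Iterator(1, 2, 3).map(x => (x * 2, x)).toList
assert(safe == List((2, 1), (4, 2), (6, 3)))
{code}

This is why the workaround holds in general: any rewrite that touches each element of the partition iterator exactly once is safe, while any expression that traverses the same iterator twice is not.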