[ https://issues.apache.org/jira/browse/SPARK-18003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Weichen Xu updated SPARK-18003:
-------------------------------
Description:
RDD zipWithIndex generates wrong results when one partition contains more than Int.MaxValue records.

When an RDD contains a partition with more than 2147483647 records, the per-partition index overflows. For example, if partition-0 has more than 2147483647 records, the indices become:
0, 1, ..., 2147483647, -2147483648, -2147483647, -2147483646, ...

Operations such as repartition or coalesce can produce partitions this large, so this bug should be fixed.

was:
RDD zipWithIndex generates wrong results when one partition contains more than Int.MaxValue records.

When an RDD contains a partition with more than 2147483647 records, the per-partition index overflows. For example, if partition-0 has more than 2147483647 records, the indices become:
0, 1, ..., 2147483647, -2147483648, -2147483647, -2147483646, ...

> RDD zipWithIndex generates wrong results when one partition contains more
> than 2147483647 records.
> -------------------------------------------------------------------------
>
>                 Key: SPARK-18003
>                 URL: https://issues.apache.org/jira/browse/SPARK-18003
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Weichen Xu
>   Original Estimate: 24h
>  Remaining Estimate: 24h

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
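The index sequence reported above (0, 1, ..., 2147483647, -2147483648, ...) is exactly what 32-bit two's-complement wraparound produces on the JVM. A minimal sketch of that wraparound, in plain Java rather than Spark code (Java int and Scala Int share the same semantics); the class name is illustrative, not from the Spark source:

```java
public class ZipWithIndexOverflow {
    public static void main(String[] args) {
        // A per-partition counter held in a 32-bit int wraps past 2147483647.
        int index = Integer.MAX_VALUE;  // 2147483647, i.e. Int.MaxValue
        System.out.println(index + 1);  // -2147483648, wraps to Int.MinValue

        // Widening the counter to a 64-bit long before incrementing
        // avoids the wrap, matching the Long indices zipWithIndex returns.
        long safeIndex = (long) index + 1;
        System.out.println(safeIndex);  // 2147483648
    }
}
```

This is why counting records with an Int inside a single partition breaks once the partition exceeds 2147483647 records, even though the final index type is Long.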