[ https://issues.apache.org/jira/browse/SPARK-1040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14904571#comment-14904571 ]
Glenn Strycker commented on SPARK-1040:
---------------------------------------

My ticket SPARK-10762 may have just been a user error, but it was interesting nonetheless: evidently Scala (or Spark) was not reporting the runtime type of an ArrayBuffer as ArrayBuffer[Any], but instead claimed it had been successfully cast to ArrayBuffer[(Int, String)].

> Collect as Map throws a casting exception when run on a JavaPairRDD object
> --------------------------------------------------------------------------
>
>                 Key: SPARK-1040
>                 URL: https://issues.apache.org/jira/browse/SPARK-1040
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 0.9.0
>            Reporter: Kevin Mader
>            Assignee: Josh Rosen
>            Priority: Minor
>             Fix For: 0.9.1
>
>
> The error that arises:
> {code}
> Exception in thread "main" java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
>     at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:427)
>     at org.apache.spark.api.java.JavaPairRDD.collectAsMap(JavaPairRDD.scala:409)
> {code}
> The code being executed:
> {code:java}
> public static String ImageSummary(final JavaPairRDD<Integer, int[]> inImg) {
>     String outString = "";  // declaration was missing from the original snippet
>     final Set<Integer> keyList = inImg.collectAsMap().keySet();
>     for (Integer cVal : keyList) outString += cVal + ",";
>     return outString;
> }
> {code}
> Lines 426-427 of PairRDDFunctions.scala:
> {code:java}
> def collectAsMap(): Map[K, V] = {
>   val data = self.toArray()
> {code}

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
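The failure mode in the stack trace can be shown without Spark at all: in Java, casting an array checks the array's runtime component type, not the types of its elements, so an Object[] can never be downcast to a more specific array type even when every element would fit. A minimal sketch (the class name ArrayCastDemo and the String payloads are illustrative, not from the ticket; Spark's toArray() producing Object[] plays the role of `collected` here):

```java
import java.util.Arrays;

public class ArrayCastDemo {
    public static void main(String[] args) {
        // Stand-in for what a generic toArray() returns: the runtime type is
        // Object[], even though every element happens to be a String.
        Object[] collected = new Object[] { "a=1", "b=2" };

        // A direct downcast checks the array's runtime type, not its contents,
        // so this fails the same way as "[Ljava.lang.Object; cannot be cast to
        // [Lscala.Tuple2;" in the reported stack trace.
        boolean castFailed = false;
        try {
            String[] typed = (String[]) collected;
        } catch (ClassCastException e) {
            castFailed = true;
        }
        System.out.println("direct cast failed: " + castFailed);

        // The safe pattern is to copy into a freshly allocated array of the
        // desired component type rather than cast the Object[] itself.
        String[] copy = Arrays.copyOf(collected, collected.length, String[].class);
        System.out.println("typed copy: " + Arrays.toString(copy));
    }
}
```

The same idea applies inside collectAsMap: the collected Object[] must be copied element-by-element into a Tuple2[] (or iterated directly) instead of being cast wholesale.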