[ https://issues.apache.org/jira/browse/SPARK-18189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15623869#comment-15623869 ]
Apache Spark commented on SPARK-18189:
--------------------------------------

User 'seyfe' has created a pull request for this issue:
https://github.com/apache/spark/pull/15706

> task not serializable with groupByKey() + mapGroups() + map
> -----------------------------------------------------------
>
>                 Key: SPARK-18189
>                 URL: https://issues.apache.org/jira/browse/SPARK-18189
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Yang Yang
>
> Just run the following code:
>
> val a = spark.createDataFrame(sc.parallelize(Seq((1,2),(3,4)))).as[(Int,Int)]
> val grouped = a.groupByKey({x:(Int,Int)=>x._1})
> val mappedGroups = grouped.mapGroups((k,x)=>{(k,1)})
> val yyy = sc.broadcast(1)
> val last = mappedGroups.rdd.map(xx=>{
>   val simpley = yyy.value
>   1
> })
>
> Spark fails with "Task not serializable".

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
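For context on why this error appears: before shipping a task to executors, Spark serializes the task's closure with Java serialization, and any non-serializable object reachable from that closure raises "Task not serializable". The sketch below reproduces that mechanism outside Spark, under stated assumptions: `SerializableCheck`, `Ok`, `NotSer`, and `Wrapper` are illustrative names invented here, not Spark APIs, and `Wrapper(inner: NotSer)` stands in for a closure that captures a non-serializable reference.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Illustrative helper, not a Spark API: mimics what the task scheduler
// does when it Java-serializes a closure before sending it to executors.
object SerializableCheck {
  def isSerializable(obj: AnyRef): Boolean =
    try {
      val out = new ObjectOutputStream(new ByteArrayOutputStream())
      out.writeObject(obj)
      out.close()
      true
    } catch {
      case _: NotSerializableException => false
    }
}

// Case classes extend Serializable, so this one round-trips fine.
case class Ok(n: Int)

// A plain class that does NOT extend Serializable.
class NotSer(val n: Int)

// Serializable on the surface, but it drags a NotSer field along,
// the same shape as a closure capturing a non-serializable reference.
case class Wrapper(inner: NotSer)

object Main extends App {
  println(SerializableCheck.isSerializable(Ok(1)))                  // true
  println(SerializableCheck.isSerializable(Wrapper(new NotSer(1)))) // false
}
```

In the reported snippet the broadcast variable itself is serializable; the failure comes from what else the `mapGroups` closure chain ends up referencing, which is what the linked pull request addresses.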