[ https://issues.apache.org/jira/browse/SPARK-1199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13981187#comment-13981187 ]

Andrew Kerr commented on SPARK-1199:
------------------------------------

I have something of a workaround:

{code}
object MyTypes {
  case class TestClass(a: Int)
}

object MyLogic {
  import MyTypes._
  def fn(b: TestClass) = TestClass(b.a * 2)
  val result = Seq(TestClass(1)).map(fn)
}

MyLogic.result
// Seq[MyTypes.TestClass] = List(TestClass(2))
{code}

Still can't access TestClass outside an object.
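
For reference, here is a sketch of applying the same wrapping trick to the RDD example from the issue, assuming the usual {{sc}} from the spark-shell is in scope; the {{MyTypes}}/{{MyLogic}} names are just illustrative:

{code}
// Hypothetical spark-shell session; assumes sc is available as usual.
object MyTypes {
  case class TestClass(a: String)
}

object MyLogic {
  import MyTypes._
  def itemFunc(x: TestClass): TestClass = x
  // Build the RDD and map over it entirely inside the object,
  // so TestClass is never referenced at the top level of the shell.
  def run(sc: org.apache.spark.SparkContext) =
    sc.parallelize(Seq("a")).map(TestClass(_)).map(itemFunc).collect()
}

MyLogic.run(sc)
// Array[MyTypes.TestClass] = Array(TestClass(a))
{code}

The key seems to be keeping both the case class and every function that mentions it inside named objects.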

> Type mismatch in Spark shell when using case class defined in shell
> -------------------------------------------------------------------
>
>                 Key: SPARK-1199
>                 URL: https://issues.apache.org/jira/browse/SPARK-1199
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Andrew Kerr
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> Define a class in the shell:
> {code}
> case class TestClass(a:String)
> {code}
> and an RDD:
> {code}
> val data = sc.parallelize(Seq("a")).map(TestClass(_))
> {code}
> Define a function on it and map over the RDD:
> {code}
> def itemFunc(a:TestClass):TestClass = a
> data.map(itemFunc)
> {code}
> Error:
> {code}
> <console>:19: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               data.map(itemFunc)
> {code}
> Similarly with mapPartitions:
> {code}
> def partitionFunc(a:Iterator[TestClass]):Iterator[TestClass] = a
> data.mapPartitions(partitionFunc)
> {code}
> {code}
> <console>:19: error: type mismatch;
>  found   : Iterator[TestClass] => Iterator[TestClass]
>  required: Iterator[TestClass] => Iterator[?]
> Error occurred in an application involving default arguments.
>               data.mapPartitions(partitionFunc)
> {code}
> The behavior is the same whether in local mode or on a cluster.
> This isn't specific to RDDs. A Scala collection in the Spark shell has the 
> same problem.
> {code}
> scala> Seq(TestClass("foo")).map(itemFunc)
> <console>:15: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               Seq(TestClass("foo")).map(itemFunc)
>                                         ^
> {code}
> When run in the Scala console (not the Spark shell), there are no type 
> mismatch errors.


