But the idea is to keep it as RDD[SuperType], since I have an
implicit conversion that adds custom functionality to RDD. Like here:
http://blog.madhukaraphatak.com/extending-spark-api/
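
For reference, a minimal sketch of that pattern applied to RDD[SuperType]
(the wrapper class and the countByChildType method are illustrative names
of mine, not from the post):

import org.apache.spark.rdd.RDD

// Wrapper class that adds custom functionality to any RDD[SuperType].
class SuperTypeFunctions(rdd: RDD[SuperType]) extends Serializable {
  // Example custom operation: count records per concrete child type.
  // Note it inspects the runtime class of the elements, because the
  // RDD's type parameter itself is erased at runtime.
  def countByChildType(): Map[String, Long] =
    rdd.map(_.getClass.getSimpleName).countByValue().toMap
}

// Implicit conversion (e.g. placed in a package object or companion)
// that makes the extra methods available on any RDD[SuperType].
implicit def toSuperTypeFunctions(rdd: RDD[SuperType]): SuperTypeFunctions =
  new SuperTypeFunctions(rdd)

With that in scope, rdd1.countByChildType() works directly on an
RDD[SuperType].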

Cheers

On 27 December 2015 at 19:13, Ted Yu <yuzhih...@gmail.com> wrote:

> Have you tried declaring RDD[ChildTypeOne] and writing separate functions
> for each sub-type?
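>
> A sketch of what I mean (handleOne and handleTwo are placeholder names):
>
> def handleOne(rdd: RDD[ChildTypeOne]): Unit = println("ChildTypeOne")
> def handleTwo(rdd: RDD[ChildTypeTwo]): Unit = println("ChildTypeTwo")
>
> // the compiler then selects the right function statically:
> val one: RDD[ChildTypeOne] = sc./*some code*/.map(r => ChildTypeOne(r))
> handleOne(one)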
>
> Cheers
>
> On Sun, Dec 27, 2015 at 10:08 AM, pkhamutou <p.khamu...@gmail.com> wrote:
>
>> Hello,
>>
>> I have the following situation:
>>
>> abstract class SuperType { ... }
>> case class ChildTypeOne(x: String) extends SuperType { ... }
>> case class ChildTypeTwo(x: String) extends SuperType { ... }
>>
>> then I have:
>>
>> val rdd1: RDD[SuperType] = sc./*some code*/.map(r => ChildTypeOne(r))
>> val rdd2: RDD[SuperType] = sc./*some code*/.map(r => ChildTypeTwo(r))
>>
>> but when I try:
>> def someFunction(rdd: RDD[SuperType]) = rdd match {
>>   case rdd: RDD[ChildTypeOne] => println("ChildTypeOne")
>>   case rdd: RDD[ChildTypeTwo] => println("ChildTypeTwo")
>> }
>>
>>
>> I get:
>>
>> Error:(60, 15) pattern type is incompatible with expected type;
>>  found   : org.apache.spark.rdd.RDD[ChildTypeOne]
>>  required: org.apache.spark.rdd.RDD[SuperType]
>> Note: ChildTypeOne <: SuperType, but class RDD is invariant in type T.
>> You may wish to define T as +T instead. (SLS 4.5)
>>       case rdd: RDD[ChildTypeOne] => println("ChildTypeOne")
>>               ^
>>
>> So how can I work around this? In some situations I need to distinguish
>> between them.
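>>
>> (From the note it sounds like even with +T this match could not work,
>> since the RDD's type parameter is erased at runtime. A sketch of the
>> kind of workaround I can think of, assuming each RDD is non-empty and
>> homogeneous, is to inspect an element instead:
>>
>> def someFunction(rdd: RDD[SuperType]): Unit = rdd.first() match {
>>   case _: ChildTypeOne => println("ChildTypeOne")
>>   case _: ChildTypeTwo => println("ChildTypeTwo")
>> }
>>
>> though calling first() launches a small Spark job just to inspect the
>> type.)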
>>
>> Best regards,
>> Pavel Khamutou
