[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14152018#comment-14152018 ]

Patrick Wendell commented on SPARK-2331:
----------------------------------------

Yeah, we could have made this a wider type in the public signature. However, it 
is not possible to change it now while maintaining compatibility (callers may be 
relying on it returning an EmptyRDD).

For now, though, you can safely cast it to work around this:

{code}
scala> sc.emptyRDD[String].asInstanceOf[RDD[String]]
res7: org.apache.spark.rdd.RDD[String] = EmptyRDD[3] at emptyRDD at <console>:14
{code}
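
If you'd rather avoid the cast, a plain type ascription should also work, since EmptyRDD extends RDD. A minimal sketch, assuming an active SparkContext {{sc}}:

{code}
import org.apache.spark.rdd.RDD

// ascription widens the static type from EmptyRDD[String] to RDD[String]
val empty: RDD[String] = sc.emptyRDD[String]
{code}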

> SparkContext.emptyRDD has wrong return type
> -------------------------------------------
>
>                 Key: SPARK-2331
>                 URL: https://issues.apache.org/jira/browse/SPARK-2331
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Ian Hummel
>
> The return type for SparkContext.emptyRDD is EmptyRDD[T].
> It should be RDD[T].  That means you have to add extra type annotations to 
> code like the following (which builds a union of RDDs over some subset of 
> paths in a folder):
> {code}
> val rdds = Seq("a", "b", "c").foldLeft[RDD[String]](sc.emptyRDD[String]) {
>   (rdd, path) ⇒ rdd.union(sc.textFile(path))
> }
> {code}
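
For reference, a sketch of the fold from the quoted description with the seed widened up front, so no explicit type parameter on foldLeft is needed (assumes an active SparkContext {{sc}}; the paths "a", "b", "c" are placeholders):

{code}
import org.apache.spark.rdd.RDD

// widening the seed lets foldLeft infer RDD[String] instead of EmptyRDD[String]
val rdds: RDD[String] = Seq("a", "b", "c").foldLeft(sc.emptyRDD[String]: RDD[String]) {
  (rdd, path) => rdd.union(sc.textFile(path))
}
{code}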


