> [...] work out of the box? Shouldn't lazy evaluation and garbage collection
> prevent the program from running out of memory? I could manually split the
> Iterator into chunks and serialize each chunk, but that feels wrong. What is
> going wrong here?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Convert-Iterable-to-RDD-tp16882p26211.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
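The manual chunking the poster describes can be sketched with `Iterator.grouped`. This is only a sketch under assumptions not stated in the thread: a SparkContext named `sc` is available, and `chunkSize` is a hypothetical tuning parameter.

```scala
// Sketch: distribute a large Iterator in bounded chunks instead of
// materializing it all at once with toSeq (assumes a SparkContext `sc`;
// `chunkSize` is a hypothetical tuning parameter).
val chunkSize = 10000
val it: Iterator[Int] = Iterator.range(0, 1000000)

val rdd = it.grouped(chunkSize)          // Iterator[Seq[Int]]
  .map(chunk => sc.parallelize(chunk))   // one small RDD per chunk
  .reduce(_ union _)                     // union the chunks into one RDD
```

Note that `parallelize` still buffers each chunk on the driver, and a union over many chunks grows the RDD lineage; this bounds peak driver memory per chunk but does not eliminate driver-side materialization.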
> *From:* Dai, Kevin [mailto:yun...@ebay.com]
> *Sent:* October 21, 2014 10:58
> *To:* user@spark.apache.org
> *Subject:* Convert Iterable to RDD
>
> Hi, All
>
> Is there any way to convert an Iterable to an RDD?
>
> Thanks,
>
> Kevin.
>
In addition, how can I convert an Iterable[Iterable[T]] to an RDD[T]?
Thanks,
Kevin.
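One way to get from Iterable[Iterable[T]] to RDD[T] (a sketch, assuming the collection fits in driver memory and a SparkContext named `sc` is available) is to flatten on the driver before distributing, or to distribute the outer collection and flatten on the executors:

```scala
val nested: Iterable[Iterable[Int]] = List(List(1, 2), List(3, 4))

// Option 1: flatten on the driver, then distribute.
val rdd1 = sc.parallelize(nested.flatten.toSeq)

// Option 2: distribute the outer collection, flatten on the executors.
val rdd2 = sc.parallelize(nested.toSeq).flatMap(identity)
```

Option 2 keeps the flattening work off the driver, which matters when the inner collections are large; both options still require the outer collection to be materialized on the driver.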
From: Dai, Kevin [mailto:yun...@ebay.com]
Sent: October 21, 2014 10:58
To: user@spark.apache.org
Subject: Convert Iterable to RDD
Hi, All
Is there any way to convert an Iterable to an RDD?
Thanks,
Kevin.
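The usual answer on this list is SparkContext.parallelize, which takes a Seq, so an Iterable can be materialized first with toSeq. A minimal sketch, assuming a running SparkContext named `sc` (e.g. in spark-shell):

```scala
import org.apache.spark.rdd.RDD

// parallelize accepts a Seq, so materialize the Iterable first.
// Caveat: toSeq pulls the whole collection into driver memory.
val iterable: Iterable[Int] = List(1, 2, 3, 4)
val rdd: RDD[Int] = sc.parallelize(iterable.toSeq)
```

Because the data is materialized on the driver, this is only appropriate for collections that comfortably fit in driver memory.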