This is a known issue.
https://issues.apache.org/jira/browse/SPARK-3200
Prashant Sharma
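[Editor's note: a commonly suggested workaround for this REPL capture problem is to mark the alias `@transient`, so the generated shell wrapper does not try to serialize the context along with closures. The sketch below is an assumption, not from the thread: `FakeCtx` stands in for the non-serializable SparkContext, and `Wrapper` models the object the REPL generates around each input line.]

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for SparkContext, which is not serializable.
class FakeCtx

// Stand-in for the object the REPL generates around each input line.
// Marking the aliased context @transient keeps it out of serialized closures.
class Wrapper extends Serializable {
  @transient val newSC = new FakeCtx                  // like `@transient val newSC = sc`
  val offset = 1                                      // some other shell-defined variable
  val addOffset: Int => Int = (x: Int) => x + offset  // captures `this` (the Wrapper)
}

object TransientDemo {
  // Returns true if `obj` survives Java serialization.
  def serializes(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    // The closure drags in the Wrapper, but the transient FakeCtx is skipped.
    println(serializes((new Wrapper).addOffset)) // prints "true"
  }
}
```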
On Thu, Mar 3, 2016 at 9:01 AM, Rahul Palamuttam wrote:
Thank you Jeff.
I have filed a JIRA under the following link:
https://issues.apache.org/jira/browse/SPARK-13634
For some reason the SparkContext is being pulled into the enclosing
environment of the closure.
I also had no problems with batch jobs.
On Wed, Mar 2, 2016 at 7:18 PM, Jeff Zhang wrote:
I can reproduce it in spark-shell, but it works for batch jobs. Looks like
a Spark REPL issue.
On Thu, Mar 3, 2016 at 10:43 AM, Rahul Palamuttam wrote:
Hi All,
We recently came across this issue when using the spark-shell and Zeppelin.
If we assign the SparkContext variable (sc) to a new variable and reference
another variable in an RDD lambda expression, we get a task-not-serializable
exception.
The following three lines of code illustrate this:
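[Editor's note: the snippet itself was cut off in the archive. Below is a sketch of the kind of code that triggers the exception (names are illustrative, not the original three lines), followed by a self-contained model of the mechanism: the shell wraps each input line in a generated object, so a closure over any variable from such a line captures the whole wrapper, and serializing the wrapper then fails on the non-serializable SparkContext alias stored next to it.]

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// In spark-shell the failing pattern looks roughly like (illustrative):
//   val newSC  = sc        // alias the SparkContext
//   val offset = 1         // any other shell-defined variable
//   sc.parallelize(1 to 3).map(_ + offset).collect()
//   // => org.apache.spark.SparkException: Task not serializable

// Self-contained model of why it fails:
class FakeCtx // stand-in for the non-serializable SparkContext

// Stand-in for the generated REPL line object holding both variables.
class ReplLine extends Serializable {
  val newSC  = new FakeCtx                            // the alias
  val offset = 1                                      // the other variable
  val addOffset: Int => Int = (x: Int) => x + offset  // captures `this`
}

object CaptureDemo {
  // Returns true if `obj` survives Java serialization.
  def serializes(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    // The closure only uses `offset`, but capturing `this` drags in FakeCtx.
    println(serializes((new ReplLine).addOffset)) // prints "false"
  }
}
```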