[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2019-05-20 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-1866:

Labels: bulk-closed  (was: )

> Closure cleaner does not null shadowed fields when outer scope is referenced
> 
>
> Key: SPARK-1866
> URL: https://issues.apache.org/jira/browse/SPARK-1866
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.0.0
>Reporter: Aaron Davidson
>Priority: Major
>  Labels: bulk-closed
>
> Take the following example:
> {code}
> val x = 5
> val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
> sc.parallelize(0 until 10).map { _ =>
>   val instances = 3
>   (instances, x)
> }.collect
> {code}
> This produces a "java.io.NotSerializableException: 
> org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
> actually used within the closure. If you change the name of the outer 
> variable instances to something else, the code executes correctly, indicating 
> that it is the fact that the two variables share a name that causes the issue.
> Additionally, if the outer scope is not used (i.e., we do not reference "x" 
> in the above example), the issue does not appear.
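> As the report itself notes, renaming either variable sidesteps the capture. A 
> minimal sketch of that workaround (assuming a live SparkContext "sc", as in 
> the example above; "outerPath" is a hypothetical name chosen to avoid the 
> shadowing):
> {code}
> val x = 5
> val outerPath = new org.apache.hadoop.fs.Path("/") /* still non-serializable, but no longer shadowed */
> sc.parallelize(0 until 10).map { _ =>
>   val instances = 3
>   (instances, x) /* closure references only x; outerPath is not dragged in */
> }.collect
> {code}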



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2018-05-09 Thread Marcelo Vanzin (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcelo Vanzin updated SPARK-1866:
--
Priority: Major  (was: Critical)

> Closure cleaner does not null shadowed fields when outer scope is referenced
> 
>
> Key: SPARK-1866
> URL: https://issues.apache.org/jira/browse/SPARK-1866
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.0.0
>Reporter: Aaron Davidson
>Assignee: Kan Zhang
>Priority: Major
>
> Take the following example:
> {code}
> val x = 5
> val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
> sc.parallelize(0 until 10).map { _ =>
>   val instances = 3
>   (instances, x)
> }.collect
> {code}
> This produces a "java.io.NotSerializableException: 
> org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
> actually used within the closure. If you change the name of the outer 
> variable instances to something else, the code executes correctly, indicating 
> that it is the fact that the two variables share a name that causes the issue.
> Additionally, if the outer scope is not used (i.e., we do not reference "x" 
> in the above example), the issue does not appear.





[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2015-02-08 Thread Sean Owen (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-1866:
-
Component/s: Spark Core

 Closure cleaner does not null shadowed fields when outer scope is referenced
 

 Key: SPARK-1866
 URL: https://issues.apache.org/jira/browse/SPARK-1866
 Project: Spark
  Issue Type: Bug
  Components: Spark Core
Affects Versions: 1.0.0
Reporter: Aaron Davidson
Assignee: Kan Zhang
Priority: Critical

 Take the following example:
 {code}
 val x = 5
 val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
 sc.parallelize(0 until 10).map { _ =>
   val instances = 3
   (instances, x)
 }.collect
 {code}
 This produces a "java.io.NotSerializableException: 
 org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
 actually used within the closure. If you change the name of the outer 
 variable instances to something else, the code executes correctly, indicating 
 that it is the fact that the two variables share a name that causes the issue.
 Additionally, if the outer scope is not used (i.e., we do not reference "x" 
 in the above example), the issue does not appear.





[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2014-12-26 Thread Patrick Wendell (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Patrick Wendell updated SPARK-1866:
---
Fix Version/s: (was: 1.2.0)
   1.3.0

 Closure cleaner does not null shadowed fields when outer scope is referenced
 

 Key: SPARK-1866
 URL: https://issues.apache.org/jira/browse/SPARK-1866
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.0.0
Reporter: Aaron Davidson
Assignee: Kan Zhang
Priority: Critical
 Fix For: 1.0.1, 1.3.0


 Take the following example:
 {code}
 val x = 5
 val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
 sc.parallelize(0 until 10).map { _ =>
   val instances = 3
   (instances, x)
 }.collect
 {code}
 This produces a "java.io.NotSerializableException: 
 org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
 actually used within the closure. If you change the name of the outer 
 variable instances to something else, the code executes correctly, indicating 
 that it is the fact that the two variables share a name that causes the issue.
 Additionally, if the outer scope is not used (i.e., we do not reference "x" 
 in the above example), the issue does not appear.





[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2014-12-26 Thread Patrick Wendell (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Patrick Wendell updated SPARK-1866:
---
Fix Version/s: (was: 1.3.0)
   (was: 1.0.1)

 Closure cleaner does not null shadowed fields when outer scope is referenced
 

 Key: SPARK-1866
 URL: https://issues.apache.org/jira/browse/SPARK-1866
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.0.0
Reporter: Aaron Davidson
Assignee: Kan Zhang
Priority: Critical

 Take the following example:
 {code}
 val x = 5
 val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
 sc.parallelize(0 until 10).map { _ =>
   val instances = 3
   (instances, x)
 }.collect
 {code}
 This produces a "java.io.NotSerializableException: 
 org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
 actually used within the closure. If you change the name of the outer 
 variable instances to something else, the code executes correctly, indicating 
 that it is the fact that the two variables share a name that causes the issue.
 Additionally, if the outer scope is not used (i.e., we do not reference "x" 
 in the above example), the issue does not appear.





[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2014-09-15 Thread Patrick Wendell (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Patrick Wendell updated SPARK-1866:
---
Fix Version/s: (was: 1.1.0)
   1.2.0

 Closure cleaner does not null shadowed fields when outer scope is referenced
 

 Key: SPARK-1866
 URL: https://issues.apache.org/jira/browse/SPARK-1866
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.0.0
Reporter: Aaron Davidson
Assignee: Kan Zhang
Priority: Critical
 Fix For: 1.0.1, 1.2.0


 Take the following example:
 {code}
 val x = 5
 val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
 sc.parallelize(0 until 10).map { _ =>
   val instances = 3
   (instances, x)
 }.collect
 {code}
 This produces a "java.io.NotSerializableException: 
 org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
 actually used within the closure. If you change the name of the outer 
 variable instances to something else, the code executes correctly, indicating 
 that it is the fact that the two variables share a name that causes the issue.
 Additionally, if the outer scope is not used (i.e., we do not reference "x" 
 in the above example), the issue does not appear.





[jira] [Updated] (SPARK-1866) Closure cleaner does not null shadowed fields when outer scope is referenced

2014-05-16 Thread Aaron Davidson (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-1866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aaron Davidson updated SPARK-1866:
--

Description: 
Take the following example:
{code}
val x = 5
val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
sc.parallelize(0 until 10).map { _ =>
  val instances = 3
  (instances, x)
}.collect
{code}

This produces a "java.io.NotSerializableException: org.apache.hadoop.fs.Path", 
despite the fact that the outer instances is not actually used within the 
closure. If you change the name of the outer variable instances to something 
else, the code executes correctly, indicating that it is the fact that the two 
variables share a name that causes the issue.

Additionally, if the outer scope is not used (i.e., we do not reference "x" in 
the above example), the issue does not appear.

  was:
Take the following example:
{code}
val x = 5
val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
sc.parallelize(0 until 10).map { _ =>
  val instances = 3
  (instances, x)
}.collect
{code}

This produces a "java.io.NotSerializableException: org.apache.hadoop.fs.Path", 
despite the fact that the outer instances is not actually used within the 
closure. If you change the name of the outer variable instances to something 
else, the code executes correctly, indicating that it is the fact that the two 
variables share a name that causes the issue.


 Closure cleaner does not null shadowed fields when outer scope is referenced
 

 Key: SPARK-1866
 URL: https://issues.apache.org/jira/browse/SPARK-1866
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.0.0
Reporter: Aaron Davidson
Priority: Critical
 Fix For: 1.1.0, 1.0.1


 Take the following example:
 {code}
 val x = 5
 val instances = new org.apache.hadoop.fs.Path("/") /* non-serializable */
 sc.parallelize(0 until 10).map { _ =>
   val instances = 3
   (instances, x)
 }.collect
 {code}
 This produces a "java.io.NotSerializableException: 
 org.apache.hadoop.fs.Path", despite the fact that the outer instances is not 
 actually used within the closure. If you change the name of the outer 
 variable instances to something else, the code executes correctly, indicating 
 that it is the fact that the two variables share a name that causes the issue.
 Additionally, if the outer scope is not used (i.e., we do not reference "x" 
 in the above example), the issue does not appear.


