[ 
https://issues.apache.org/jira/browse/SPARK-20706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-20706:
------------------------------------

    Assignee:     (was: Apache Spark)

> Spark-shell not overriding method/variable definition
> -----------------------------------------------------
>
>                 Key: SPARK-20706
>                 URL: https://issues.apache.org/jira/browse/SPARK-20706
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.0, 2.1.1, 2.2.0
>         Environment: Linux, Scala 2.11.8
>            Reporter: Raphael Roth
>         Attachments: screenshot-1.png
>
>
> !screenshot-1.png!
> In the following example, the definition of myMethod is not correctly 
> updated:
> ------------------------------
> def myMethod()  = "first definition"
> val tmp = myMethod(); val out = tmp
> println(out) // prints "first definition"
> def myMethod()  = "second definition" // override above myMethod
> val tmp = myMethod(); val out = tmp 
> println(out) // should be "second definition" but is "first definition"
> ------------------------------
> I'm using semicolons to force two statements to be compiled together; the 
> behavior can also be reproduced using :paste.
> So if I redefine myMethod, the new implementation is not picked up in this 
> case. I figured out that the second-last statement (val out = tmp) causes 
> this behavior: if it is moved into a separate block, the code works just 
> fine, as shown in the sketch below.
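> For illustration, here is a sketch of that workaround (evaluating 
> val out = tmp as its own statement rather than combining it with the 
> previous one):
> ------------------------------
> def myMethod() = "first definition"
> val tmp = myMethod(); val out = tmp
> println(out) // prints "first definition"
> def myMethod() = "second definition" // redefine myMethod
> val tmp = myMethod()
> val out = tmp // evaluated separately (or in its own :paste block)
> println(out) // now prints "second definition" as expected
> ------------------------------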
> EDIT:
> The same behavior can be seen when declaring variables:
> ------------------------------
> val a = 1
> val b = a; val c = b;
> println(b) // prints "1"
> val a = 2 // override a
> val b = a; val c = b;
> println(b) // prints "1" instead of "2"
> ------------------------------
> Interestingly, if the second-last line "val b = a; val c = b;" is executed 
> a second time, I get the expected result (see the sketch below).
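> For completeness, a sketch of what I mean by executing that line twice:
> ------------------------------
> val a = 1
> val b = a; val c = b;
> println(b) // prints "1"
> val a = 2 // redefine a
> val b = a; val c = b; // first execution still sees the old a
> println(b) // prints "1" instead of "2"
> val b = a; val c = b; // executing the same line a second time
> println(b) // now prints "2" as expected
> ------------------------------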



