[ https://issues.apache.org/jira/browse/SPARK-14146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15316383#comment-15316383 ]

Prashant Sharma commented on SPARK-14146:
-----------------------------------------

I am having difficulty fixing this. Unless there is a way to know in advance which 
implicit imports a particular statement requires, we need to import everything. 
That leads to several issues, which were addressed by the change I made to import 
exactly what a particular REPL expression needs in order to execute.

If we go back to importing everything for each executed expression, bugs like 
SPARK-1199 will resurface. I just felt that bug is much more difficult to deal 
with than this one.
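
To make the risk concrete, here is a small, purely illustrative Scala sketch of the kind of trouble blanket wildcard imports can cause. None of these names come from the Spark REPL, and this is not necessarily the SPARK-1199 failure mode; it only shows that once two earlier lines each happen to provide an implicit conversion between the same types, importing everything from both makes resolution ambiguous:

{code}
import scala.language.implicitConversions

// Illustrative only: stand-ins for two previously executed REPL lines,
// each of which happens to define an implicit Int => String conversion.
object line1 { implicit def intToStrA(i: Int): String = s"one:$i" }
object line2 { implicit def intToStrB(i: Int): String = s"two:$i" }

object ImportEverything {
  // "Import everything" strategy: wildcard-import both previous lines.
  import line1._
  import line2._

  // val s: String = 1   // does not compile: ambiguous implicit conversions
}
{code}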

Here we can somewhat reduce the impact of this bug by perhaps adding `import 
spark.implicits._` in the wrappers, the same way we did for `addOuterScope`.
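
To sketch what a wrapper-level import could look like, here is a minimal pure-Scala example built from the classes in the issue description below. The wrapper object and its name are stand-ins for whatever the REPL actually generates, and `c._` stands in for `spark.implicits._`; the point is only that re-importing the implicits inside the per-line wrapper makes the conversion visible again even when a class definition shares the statement:

{code}
import scala.language.implicitConversions

class I(i: Int) { def double: Int = i * 2 }
class Context  { implicit def toI(i: Int): I = new I(i) }

object WrapperSketch {
  val c = new Context

  // Stand-in for one generated per-line wrapper; names are illustrative only.
  object line1Wrapper {
    import c._            // re-imported inside the wrapper, as suggested above
    class A               // the class definition that used to break resolution
    val res = 1.double    // resolves via toI, so this compiles
  }

  def main(args: Array[String]): Unit = println(line1Wrapper.res) // prints 2
}
{code}

The open question is of course when such an import is safe to inject unconditionally, which is exactly the tension described above.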

> Imported implicits can't be found in Spark REPL in some cases
> -------------------------------------------------------------
>
>                 Key: SPARK-14146
>                 URL: https://issues.apache.org/jira/browse/SPARK-14146
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.0.0
>            Reporter: Wenchen Fan
>
> {code}
> class I(i: Int) {
>   def double: Int = i * 2
> }
> class Context {
>   implicit def toI(i: Int): I = new I(i)
> }
> val c = new Context
> import c._
> // OK
> 1.double
> // Fail
> class A; 1.double
> {code}
> The above code snippet works in the plain Scala REPL, however.
> This will affect our Dataset functionality, for example:
> {code}
> class A; Seq(1 -> "a").toDS() // fail
> {code}
> or in paste mode:
> {code}
> :paste
> class A
> Seq(1 -> "a").toDS() // fail
> {code}


