Github user bersprockets commented on the issue:

    https://github.com/apache/spark/pull/20505
  
    I am trying to understand the path the compiler follows to satisfy the 
implicit Encoder parameter.
    
    The Dataset’s map function is declared as follows:
    
    <code>def map[U : Encoder](func: T => U): Dataset[U]</code>
    
    There is an implicit parameter of type Encoder[U], where U is the return 
type of the passed function.
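
    For reference, the context bound <code>U : Encoder</code> is just sugar; the 
declaration desugars to roughly this (the parameter name below is mine):

    <pre>
    def map[U](func: T => U)(implicit encoder: Encoder[U]): Dataset[U]
    </pre>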
    
    Then there is my application code:
    
    <pre>
    df.map(row => row.getValuesMap[Any](List("stationName", "year"))).collect 
    </pre>
    
    This causes the compiler to look for an instance of Encoder[Map[String, 
Any]] in scope.
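
    (For context: as I read it, Row’s getValuesMap is declared roughly as 
<code>def getValuesMap[T](fieldNames: Seq[String]): Map[String, T]</code>, so 
<code>getValuesMap[Any]</code> gives the lambda a return type of Map[String, Any].)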
    
    Before your change, the compiler used to find:
    
    <pre>
    implicit def newMapEncoder[T <: Map[_, _] : TypeTag]: Encoder[T] = ExpressionEncoder()
    </pre>
    
    This matches any Map type, but it never verifies that the map’s type 
parameters (in this case, String and Any) themselves have encoders. As a result 
the code compiles, but the map() method fails at runtime with a mystery exception.
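
    To convince myself, I mocked this up outside Spark with a toy Enc trait 
standing in for Encoder (none of this is Spark’s real code). Pasted into the 
REPL, the unchecked shape happily resolves for Map[String, Any]:

    <pre>
    import scala.reflect.runtime.universe.TypeTag

    trait Enc[A]  // toy stand-in for Encoder
    object Unchecked {
      implicit def stringEnc: Enc[String] = new Enc[String] {}
      // old shape: requires only a TypeTag, never an Enc for the key/value types
      implicit def mapEnc[T <: Map[_, _] : TypeTag]: Enc[T] = new Enc[T] {}
    }
    import Unchecked._

    implicitly[Enc[Map[String, Any]]]  // compiles, though no Enc[Any] exists
    </pre>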
    
    You removed the “implicit” keyword from the definition above, so now the 
compiler finds your new definition instead:
    
    <pre>
    implicit def newCheckedMapEncoder[T[_, _], K : Encoder, V : Encoder]
        (implicit ev: T[K, V] <:< Map[K, V], tag: TypeTag[T[K, V]]): Encoder[T[K, V]]
    </pre>
    
    This method is parameterized over any type constructor T that takes two type 
parameters. However, your evidence parameter <code>ev: T[K, V] <:< Map[K, 
V]</code> restricts the match to Map types with key K and value V.
    
    The context bounds on K and V also give you two unnamed implicit parameters 
of type Encoder[K] and Encoder[V], which cause the compiler to start a new 
search for instances of Encoder[K] and Encoder[V] in scope.
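
    If I desugar those context bounds by hand (scalac generates evidence$1-style 
names and appends them to the explicit implicit list; the names below are mine), 
the full implicit parameter list looks roughly like:

    <pre>
    implicit def newCheckedMapEncoder[T[_, _], K, V]
        (implicit ev: T[K, V] <:< Map[K, V],
         tag: TypeTag[T[K, V]],
         kEnc: Encoder[K],   // was the context bound K : Encoder
         vEnc: Encoder[V]    // was the context bound V : Encoder
        ): Encoder[T[K, V]]
    </pre>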
    
    The compiler readily finds an instance of Encoder[String], but it fails to 
find an instance of Encoder[Any], so compilation fails.
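
    Again in toy form (same Enc trait as in my sketch above), the checked shape 
only resolves when both element encoders exist:

    <pre>
    object Checked {
      implicit def intEnc: Enc[Int] = new Enc[Int] {}
      implicit def stringEnc: Enc[String] = new Enc[String] {}
      implicit def checkedMapEnc[T[_, _], K : Enc, V : Enc]
          (implicit ev: T[K, V] <:< Map[K, V]): Enc[T[K, V]] = new Enc[T[K, V]] {}
    }
    import Checked._

    implicitly[Enc[Map[String, Int]]]    // resolves: Enc[String] and Enc[Int] in scope
    // implicitly[Enc[Map[String, Any]]] // rejected: no Enc[Any] in scope
    </pre>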
    
    I assume the compiler prints that odd error message (“diverging implicit 
expansion”) because the failure occurred during this second, nested search, 
rather than in the search that started the whole thing.
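
    For what it’s worth, “diverging implicit expansion” classically means the 
search started recursing through ever-growing types. A minimal standalone 
example (nothing to do with Spark):

    <pre>
    trait Enc[A]
    // to find Enc[A] the compiler needs Enc[List[A]], which needs
    // Enc[List[List[A]]], and so on -- the search never shrinks
    implicit def diverge[A](implicit inner: Enc[List[A]]): Enc[A] = new Enc[A] {}

    implicitly[Enc[Int]]  // error: diverging implicit expansion for type Enc[Int]
    </pre>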

