Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19553#discussion_r148464415
  
    --- Diff: core/src/main/scala/org/apache/spark/api/java/JavaUtils.scala ---
    @@ -43,10 +43,16 @@ private[spark] object JavaUtils {
     
         override def size: Int = underlying.size
     
    -    override def get(key: AnyRef): B = try {
    -      underlying.getOrElse(key.asInstanceOf[A], null.asInstanceOf[B])
    -    } catch {
    -      case ex: ClassCastException => null.asInstanceOf[B]
    +    // Delegate to implementation because AbstractMap implementation iterates over whole key set
    +    override def containsKey(key: AnyRef): Boolean = {
    +      key.isInstanceOf[A] && underlying.contains(key.asInstanceOf[A])
    --- End diff ---
    
    Ah, yeah right, that won't work without a ClassTag.
    
    ```
    [error] [warn] /home/jenkins/workspace/SparkPullRequestBuilder/core/src/main/scala/org/apache/spark/api/java/JavaUtils.scala:48: abstract type A is unchecked since it is eliminated by erasure
    [error] [warn]       key.isInstanceOf[A] && underlying.contains(key.asInstanceOf[A])
    [error] [warn] 
    [error] [warn] /home/jenkins/workspace/SparkPullRequestBuilder/core/src/main/scala/org/apache/spark/api/java/JavaUtils.scala:52: abstract type A is unchecked since it is eliminated by erasure
    [error] [warn]       if (key.isInstanceOf[A]) {
    [error] [warn] 
    ```
    
    I think that's why the implementation originally just caught a ClassCastException, but that's ugly.
    
    So the type `A` in the method declaration becomes `A : ClassTag`, and the type check becomes `classTag[A].runtimeClass.isAssignableFrom(key.getClass)`.
    @Whoosh would you mind giving that a try? I think that should nail it.

