Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3208#discussion_r20392753
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypes.scala ---
    @@ -55,7 +55,11 @@ case class GetItem(child: Expression, ordinal: Expression) extends Expression {
               // TODO: consider using Array[_] for ArrayType child to avoid
               // boxing of primitives
               val baseValue = value.asInstanceOf[Seq[_]]
    -          val o = key.asInstanceOf[Int]
    +          val o = key match {
    +            case k if k.isInstanceOf[Byte] => k.asInstanceOf[Byte]
    +            case k if k.isInstanceOf[Short] => k.asInstanceOf[Short]
    +            case k => k.asInstanceOf[Int]
    --- End diff ---
    
    I'm not sure that runtime type checking is really the best way to handle this.  It adds a reflective cost to each invocation, and it means that whenever we add new numeric types we need to revisit this code.  Also, what about `Long`?
    
    If we want to support using bytes/shorts/etc. as ordinals, I think we should instead add a rule to analysis (probably in `FunctionArgumentConversion`) that adds a cast as needed.
    
    Existing problem: `resolved` should not be true unless `ordinal` is of type 
`IntegerType`

