Josh Rosen created SPARK-48081:
----------------------------------

             Summary: Fix ClassCastException in NTile.checkInputDataTypes()
                 Key: SPARK-48081
                 URL: https://issues.apache.org/jira/browse/SPARK-48081
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 4.0.0
            Reporter: Josh Rosen
            Assignee: Josh Rosen


{code:scala}
sql("select ntile(99.9) OVER (order by id) from range(10)")
{code}

results in

{code}
java.lang.ClassCastException: class org.apache.spark.sql.types.Decimal cannot be cast to class java.lang.Integer (org.apache.spark.sql.types.Decimal is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')
  at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:99)
  at org.apache.spark.sql.catalyst.expressions.NTile.checkInputDataTypes(windowExpressions.scala:877)
  at org.apache.spark.sql.catalyst.expressions.Expression.resolved$lzycompute(Expression.scala:267)
  at org.apache.spark.sql.catalyst.expressions.Expression.resolved(Expression.scala:267)
  at org.apache.spark.sql.catalyst.expressions.Expression.$anonfun$childrenResolved$1(Expression.scala:279)
  at org.apache.spark.sql.catalyst.expressions.Expression.$anonfun$childrenResolved$1$adapted(Expression.scala:279)
  at scala.collection.IterableOnceOps.forall(IterableOnce.scala:633)
  at scala.collection.IterableOnceOps.forall$(IterableOnce.scala:630)
  at scala.collection.AbstractIterable.forall(Iterable.scala:935)
  at org.apache.spark.sql.catalyst.expressions.Expression.childrenResolved(Expression.scala:279)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$22$$anonfun$applyOrElse$157.applyOrElse(Analyzer.scala:2243)
{code}

instead of the intended user-facing error message: the non-integer literal 99.9 evaluates to a Decimal, and NTile.checkInputDataTypes() unboxes the evaluated value to an Int before the argument's data type has been validated (the trace shows the failure at BoxesRunTime.unboxToInt). This is a minor bug that was introduced in a previous error class refactoring PR.
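
For reference, below is a minimal standalone sketch of the check-before-unbox pattern that avoids this class of failure. Literal, TypeCheckResult, and the error strings are simplified stand-ins for the real Catalyst types, not Spark's actual implementation; the point is only the ordering: validate the data type first, and unbox only once the argument is known to be an integer.

{code:scala}
// Standalone sketch (not Spark's actual code) of validating an argument's
// data type before unboxing its evaluated value.
object NTileCheckSketch {
  sealed trait TypeCheckResult
  case object TypeCheckSuccess extends TypeCheckResult
  case class TypeCheckFailure(message: String) extends TypeCheckResult

  // Stand-in for a foldable expression: a literal value plus its SQL type name.
  case class Literal(value: Any, dataType: String)

  def checkBuckets(buckets: Literal): TypeCheckResult = {
    if (buckets.dataType != "int") {
      // ntile(99.9): the decimal argument is rejected here, before any unboxing,
      // so the user sees a type error instead of a ClassCastException.
      TypeCheckFailure(s"parameter 'buckets' requires int type, got ${buckets.dataType}")
    } else {
      val n = buckets.value.asInstanceOf[Int] // safe: dataType was checked first
      if (n > 0) TypeCheckSuccess
      else TypeCheckFailure("'buckets' must be a positive integer")
    }
  }

  def main(args: Array[String]): Unit = {
    println(checkBuckets(Literal(10, "int")))                        // TypeCheckSuccess
    println(checkBuckets(Literal(BigDecimal(99.9), "decimal(3,1)"))) // TypeCheckFailure, no CCE
  }
}
{code}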
