[ https://issues.apache.org/jira/browse/SPARK-45106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
ASF GitHub Bot updated SPARK-45106:
-----------------------------------
    Labels: pull-request-available  (was: )

> percentile_cont gets internal error when user input fails runtime
> replacement's input type check
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-45106
>                 URL: https://issues.apache.org/jira/browse/SPARK-45106
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.1, 3.5.0, 4.0.0
>            Reporter: Bruce Robbins
>            Priority: Major
>              Labels: pull-request-available
>
> This query throws an internal error rather than producing a useful error
> message:
> {noformat}
> select percentile_cont(b) WITHIN GROUP (ORDER BY a DESC) as x
> from (values (12, 0.25), (13, 0.25), (22, 0.25)) as (a, b);
>
> [INTERNAL_ERROR] Cannot resolve the runtime replaceable expression
> "percentile_cont(a, b)". The replacement is unresolved: "percentile(a, b, 1)".
> org.apache.spark.SparkException: [INTERNAL_ERROR] Cannot resolve the runtime
> replaceable expression "percentile_cont(a, b)". The replacement is
> unresolved: "percentile(a, b, 1)".
>     at org.apache.spark.SparkException$.internalError(SparkException.scala:92)
>     at org.apache.spark.SparkException$.internalError(SparkException.scala:96)
>     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis0$6(CheckAnalysis.scala:313)
>     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis0$6$adapted(CheckAnalysis.scala:277)
>     ...
> {noformat}
> It should instead inform the user that the percentage expression must be
> foldable. {{PercentileCont}} does not validate the user's input itself; when
> the runtime replacement (an instance of {{Percentile}}) rejects that input,
> the replacement ends up unresolved and analysis fails with an internal error.
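For reference, when the percentage argument is a foldable constant such as 0.25, percentile_cont resolves normally and computes a linearly interpolated percentile. A minimal sketch of that interpolation using only the Python standard library (not Spark), over the values 12, 13, 22 from the query above; the 'inclusive' method of statistics.quantiles matches SQL's PERCENTILE_CONT semantics:

```python
import statistics

# Column "a" from the report's repro query.
values = [12, 13, 22]

# statistics.quantiles returns the n-1 cut points dividing the data
# into n intervals; with n=4 the first cut point is the 0.25 quantile,
# computed by linear interpolation like SQL's PERCENTILE_CONT(0.25).
q1 = statistics.quantiles(values, n=4, method="inclusive")[0]
print(q1)  # 12.5
```

This is only an illustration of the intended semantics; the bug itself concerns the analysis-time error message, not the computed result.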
--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org