[ https://issues.apache.org/jira/browse/SPARK-44973?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17788564#comment-17788564 ]
Mark Jarvin commented on SPARK-44973:
-------------------------------------

Hmm, possibly it goes back to Spark 2.1.0 due to this line: [https://github.com/apache/spark/commit/95db8a44f3e2d79913cbe0d29297796b4c3b0d1b#diff-924be5a0a35024a5f63a1411b1a4c3000150356ab59f12eda84fada0659514a2R135]
{code:java}
val temp = new Array[Byte](64)
{code}
I was looking only for the earliest version with [https://github.com/apache/spark/commit/c5b0cb2d945437a998c35917bbc9d653883244db#diff-924be5a0a35024a5f63a1411b1a4c3000150356ab59f12eda84fada0659514a2R131]
{code:java}
val temp = new Array[Byte](Math.max(n.length, 64))
{code}

> Fix ArrayIndexOutOfBoundsException in conv()
> --------------------------------------------
>
>                 Key: SPARK-44973
>                 URL: https://issues.apache.org/jira/browse/SPARK-44973
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.3, 3.3.3, 3.4.1, 3.5.0
>            Reporter: Gera Shegalov
>            Assignee: Mark Jarvin
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 3.4.2, 4.0.0, 3.5.1, 3.3.4
>
> {code:scala}
> scala> sql(s"SELECT CONV('${Long.MinValue}', 10, -2)").show(false)
> java.lang.ArrayIndexOutOfBoundsException: -1
>   at org.apache.spark.sql.catalyst.util.NumberConverter$.convert(NumberConverter.scala:183)
>   at org.apache.spark.sql.catalyst.expressions.Conv.nullSafeEval(mathExpressions.scala:463)
>   at org.apache.spark.sql.catalyst.expressions.TernaryExpression.eval(Expression.scala:821)
>   at org.apache.spark.sql.catalyst.expressions.ToPrettyString.eval(ToPrettyString.scala:57)
>   at org.apache.spark.sql.catalyst.optimizer.ConstantFolding$.org$apache$spark$sql$catalyst$optimizer$ConstantFolding$$constantFolding(expressions.scala:81)
>   at org.apache.spark.sql.catalyst.optimizer.ConstantFolding$.$anonfun$constantFolding$4(expressions.scala:91)
> {code}
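A short sketch of why the fixed 64-byte buffer is too small for this input (a hedged illustration, not the Spark code itself; it only demonstrates the digit-count arithmetic behind the repro above):

{code:java}
public class ConvBufferDemo {
    public static void main(String[] args) {
        // Long.MIN_VALUE, viewed as an unsigned 64-bit value, needs
        // exactly 64 binary digits -- completely filling a 64-byte buffer.
        int digits = Long.toBinaryString(Long.MIN_VALUE).length();
        System.out.println(digits); // prints 64

        // A negative target radix means the result also carries a '-' sign,
        // so one more byte than the fixed 64 is required; writing the sign
        // moves the start index to -1, matching the AIOOBE in the repro.
        System.out.println(digits + 1); // prints 65
    }
}
{code}

Sizing the buffer as `Math.max(n.length, 64)` (the second commit linked above) leaves room for the sign byte whenever the input string is long enough to need it.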