[ https://issues.apache.org/jira/browse/SPARK-33420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-33420:
------------------------------------

    Assignee:     (was: Apache Spark)

> Broadcast join failure when join keys contain a cast from DateType to String
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-33420
>                 URL: https://issues.apache.org/jira/browse/SPARK-33420
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1
>         Environment: Spark 3.0.1, Hadoop 2.9.2
>            Reporter: qinyu
>            Priority: Major
>
> When running the following Spark code:
> spark.sql(
>   """create table table1(a1 INT, a2 STRING)
>     |using parquet
>     |""".stripMargin).show()
> spark.sql(
>   """create table table2(b1 INT, b2 STRING)
>     |using parquet
>     |""".stripMargin).show()
> spark.sql(
>   """select /*+ BROADCAST(a) */ * from table1 a join table2 b
>     |on cast(to_date(a.a2) as string) = b.b2
>     |""".stripMargin).show()
> The following exception is thrown:
> java.util.NoSuchElementException: None.get
>   at scala.None$.get(Option.scala:529)
>   at scala.None$.get(Option.scala:527)
>   at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
>   at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
>   at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
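>
> The trace bottoms out in TimeZoneAwareExpression.zoneId, which calls
> timeZoneId.get, so the Cast used as the broadcast-side join key is
> apparently evaluated before its timeZoneId has been filled in. A minimal
> sketch of that mechanism against Spark's internal catalyst API (not the
> public API; the expression is constructed by hand here, not taken from
> the failing plan):
>
>   import org.apache.spark.sql.catalyst.expressions.{Cast, Literal}
>   import org.apache.spark.sql.types.{DateType, StringType}
>
>   // A Cast from DateType to StringType is timezone-aware: evaluating it
>   // while timeZoneId is still None fails just like the trace above.
>   val key = Cast(Literal(0, DateType), StringType, timeZoneId = None)
>   key.eval() // java.util.NoSuchElementException: None.get
>
> Until this is fixed, dropping the hint and disabling broadcast joins
> (e.g. SET spark.sql.autoBroadcastJoinThreshold=-1) seems to avoid the
> failing code path.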


