[
https://issues.apache.org/jira/browse/SPARK-33306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
EdisonWang updated SPARK-33306:
-------------------------------
Description:
A simple way to reproduce this is
```
spark-shell --conf spark.sql.legacy.typeCoercion.datetimeToString.enabled=true
scala> sql("""
select a.d1 from
(select to_date(concat('2000-01-0', id)) as d1 from range(1, 2)) a
join
(select concat('2000-01-0', id) as d2 from range(1, 2)) b
on a.d1 = b.d2
""").show
```
it will throw
```
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529)
at scala.None$.get(Option.scala:527)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter(Cast.scala:287)
```
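
The trace points at the implicit date-to-string Cast that the legacy type coercion injects for the `a.d1 = b.d2` comparison: the cast is a TimeZoneAwareExpression but never receives a `timeZoneId`, so its `zoneId` ends up calling `None.get`. Below is a minimal sketch of that failure mode (an illustration under this assumption, not the actual Spark patch; the "UTC" zone is hard-coded only for the example):
```
// Minimal sketch, assuming spark-catalyst 3.0.x on the classpath.
import org.apache.spark.sql.catalyst.expressions.{Cast, Literal}
import org.apache.spark.sql.types.{DateType, StringType}

// A date literal in its internal representation: days since epoch (0 = 1970-01-01).
val date = Literal(0, DateType)

// Cast(Date -> String) is time-zone aware; with timeZoneId = None its lazy
// dateFormatter resolves zoneId via None.get and throws NoSuchElementException.
val unzoned = Cast(date, StringType, timeZoneId = None)

// With an explicit zone id the same cast evaluates normally, which is what
// the coercion rule would have to arrange when it inserts the cast.
val zoned = Cast(date, StringType, timeZoneId = Some("UTC"))
println(zoned.eval()) // 1970-01-01
```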
was:
A simple way to reproduce this is
```
spark-shell --conf spark.sql.legacy.typeCoercion.datetimeToString.enabled=true
>> sql("""
select a.d1 from
(select to_date(concat('2000-01-0', id)) as d1 from range(1, 2)) a
join
(select concat('2000-01-0', id) as d2 from range(1, 2)) b
on a.d1 = b.d2
""").show
```
it will throw
```
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529)
at scala.None$.get(Option.scala:527)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter(Cast.scala:287)
```
> TimezoneID is needed when casting from Date to String
> -----------------------------------------------------
>
> Key: SPARK-33306
> URL: https://issues.apache.org/jira/browse/SPARK-33306
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: EdisonWang
> Priority: Major
>
> A simple way to reproduce this is
> ```
> spark-shell --conf spark.sql.legacy.typeCoercion.datetimeToString.enabled=true
> scala> sql("""
> select a.d1 from
> (select to_date(concat('2000-01-0', id)) as d1 from range(1, 2)) a
> join
> (select concat('2000-01-0', id) as d2 from range(1, 2)) b
> on a.d1 = b.d2
> """).show
> ```
>
> it will throw
> ```
> java.util.NoSuchElementException: None.get
> at scala.None$.get(Option.scala:529)
> at scala.None$.get(Option.scala:527)
> at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
> at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
> at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
> at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
> at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
> at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter(Cast.scala:287)
> ```