gaogaotiantian commented on PR #53161:
URL: https://github.com/apache/spark/pull/53161#issuecomment-3580089750

   > Where do we do the conversion? At least for the UDF case the conversion 
should all happen within an active query which belongs to a session?
   
   We have to do it everywhere.
   
   ```python
   df = spark.createDataFrame([(datetime.datetime(1990, 8, 10, 0, 0),)], ["ts"])
   ```
   
   Here we are trying to create a `TimestampType` column from a naive datetime. 
How could we determine the timezone info? It's not correct to assume it belongs 
to any particular timezone.
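   To make the ambiguity concrete, here is a plain-Python sketch (the zone 
names are arbitrary examples, not anything Spark chooses): the same naive 
wall-clock value denotes different instants depending on which timezone we 
attach to it.
   
   ```python
   import datetime
   from zoneinfo import ZoneInfo
   
   naive = datetime.datetime(1990, 8, 10, 0, 0)
   
   # Attach two different zones to the same naive wall-clock time.
   as_utc = naive.replace(tzinfo=ZoneInfo("UTC"))
   as_la = naive.replace(tzinfo=ZoneInfo("America/Los_Angeles"))
   
   # They are different instants: midnight in Los Angeles (PDT, UTC-7)
   # is 07:00 UTC, so the epoch values differ by 7 hours.
   delta = as_la.timestamp() - as_utc.timestamp()
   print(delta)  # 25200.0
   ```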
   
   There are two correct ways to handle this:
   1. For every single conversion, use the session-local timezone and assume 
the naive datetime is in that timezone.
   2. Throw an error when users try to convert a naive datetime to 
`TimestampType` and suggest that they use `TimestampNTZType` instead.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

