dan-blanchard edited a comment on issue #18139: [SPARK-20787][PYTHON] PySpark can't handle datetimes before 1900
URL: https://github.com/apache/spark/pull/18139#issuecomment-454155032

A backward-compatible fix would be to use the `tzlocal` library to get the local timezone info:

```py
In [26]: import calendar, time

In [27]: import datetime as dt

In [28]: import tzlocal

In [29]: local_tz = tzlocal.get_localzone()

In [30]: dt1 = dt.datetime.now()

In [31]: calendar.timegm(local_tz.localize(dt1).utctimetuple())
Out[31]: 1547498530

In [32]: calendar.timegm(local_tz.localize(dt1).utctimetuple()) == time.mktime(dt1.timetuple())
Out[32]: True

In [33]: dt2 = dt.datetime(1899, 1, 1)

In [34]: calendar.timegm(local_tz.localize(dt2).utctimetuple())
Out[34]: -2240507040
```

You'd also probably want to cache the result of `tzlocal.get_localzone()` somewhere so you don't have to pay the cost of looking it up every time you call `toInternal`.
