[GitHub] [spark] MaxGekk commented on pull request #39239: [SPARK-41730][PYTHON] Set tz to UTC while converting of timestamps to python's datetime

2023-03-15 Thread via GitHub
MaxGekk commented on PR #39239 (URL: https://github.com/apache/spark/pull/39239#issuecomment-1470569694): > So the problem here would be implementation detail. @HyukjinKwon I think this is not an implementation detail but a fundamental problem of `datetime`, especially in the corner case of …


2023-03-12 Thread via GitHub
MaxGekk commented on PR #39239 (URL: https://github.com/apache/spark/pull/39239#issuecomment-1465148669): @HyukjinKwon @cloud-fan A problem with PySpark's timestamp_ltz is that it is a local timestamp, not the physical timestamp that timestamp_ltz is supposed to be. Let's see, even Java 7 …
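The local-vs-physical distinction raised above can be sketched with plain `datetime` (illustrative only, not PySpark API): a physical timestamp is anchored to an instant on the epoch timeline, while a local (naive) one is just a wall-clock reading whose meaning depends on an external time zone.

```python
# Physical vs. local timestamps, sketched with stdlib datetime.
from datetime import datetime, timezone

# Physical: tz-aware and anchored to UTC -- identifies exactly one instant.
physical = datetime(2022, 12, 27, 12, 0, tzinfo=timezone.utc)

# Local: naive -- the same wall-clock value denotes different instants
# depending on which time zone an interpreter later attaches to it.
local = datetime(2022, 12, 27, 12, 0)

print(physical.timestamp())  # well-defined offset from the epoch
# local.timestamp(), by contrast, silently interprets the value in the
# *system* time zone, so its result varies from machine to machine.
```

This is the hazard for a type like timestamp_ltz: if the Python-side value comes back naive, the physical instant it denotes is no longer fixed.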


2022-12-27 Thread GitBox
MaxGekk commented on PR #39239 (URL: https://github.com/apache/spark/pull/39239#issuecomment-1366021084): @HyukjinKwon @itholic All 4 failed tests are related to Pandas, and it seems the Pandas code suffers from the issue too, but I cannot find where the conversion happens. Could you point me to …