ueshin commented on code in PR #41240:
URL: https://github.com/apache/spark/pull/41240#discussion_r1204706187
##########
python/pyspark/sql/pandas/conversion.py:
##########
@@ -375,22 +379,105 @@ def _convert_from_pandas(
assert isinstance(self, SparkSession)
if timezone is not None:
-        from pyspark.sql.pandas.types import _check_series_convert_timestamps_tz_local
+        from pyspark.sql.pandas.types import (
+            _check_series_convert_timestamps_tz_local,
+            _get_local_timezone,
+        )
         from pandas.core.dtypes.common import is_datetime64tz_dtype, is_timedelta64_dtype
copied = False
if isinstance(schema, StructType):
-            for field in schema:
-                # TODO: handle nested timestamps, such as ArrayType(TimestampType())?
-                if isinstance(field.dataType, TimestampType):
-                    s = _check_series_convert_timestamps_tz_local(pdf[field.name], timezone)
-                    if s is not pdf[field.name]:
-                        if not copied:
-                            # Copy once if the series is modified to prevent the original
-                            # Pandas DataFrame from being updated
-                            pdf = pdf.copy()
-                            copied = True
-                        pdf[field.name] = s
+
+            def _create_converter(data_type: DataType) -> Callable[[pd.Series], pd.Series]:
Review Comment:
Yes, I had, but I dropped the idea because, in the series of my recent PRs,
I wanted to consolidate the conversion logic across various cases, such as
with/without Arrow, Spark Connect, and conversions of array, map, and struct
types.
Now we can reuse `_create_converter_to_pandas` and
`_create_converter_from_pandas` from the `pyspark.sql.pandas.types` package in
many cases.
The function you pointed out was an unfortunate case; I couldn't consolidate
only that part. We might need to revisit it here.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]