HyukjinKwon commented on code in PR #39469:
URL: https://github.com/apache/spark/pull/39469#discussion_r1066531328


##########
python/pyspark/sql/connect/session.py:
##########
@@ -215,7 +215,38 @@ def createDataFrame(
         _inferred_schema: Optional[StructType] = None
 
         if isinstance(data, pd.DataFrame):
-            _table = pa.Table.from_pandas(data)
+            from pandas.api.types import (  # type: ignore[attr-defined]
+                is_datetime64_dtype,
+                is_datetime64tz_dtype,
+            )
+            from pyspark.sql.pandas.types import (
+                _check_series_convert_timestamps_internal,
+                _get_local_timezone,
+            )
+
+            # Copying the frame to avoid modifying it.
+            data_copy = data.copy()
+            # We need double conversions for the truncation: first, truncate to microseconds.
+            for col in data_copy:
+                if is_datetime64tz_dtype(data_copy[col].dtype):
+                    data_copy[col] = _check_series_convert_timestamps_internal(
+                        data_copy[col], _get_local_timezone()
+                    ).astype("datetime64[us, UTC]")
+                elif is_datetime64_dtype(data_copy[col].dtype):
+                    data_copy[col] = data_copy[col].astype("datetime64[us]")
+
+            # Create a new schema with the types changed to truncated microseconds.
+            pd_schema = pa.Schema.from_pandas(data_copy)

Review Comment:
   Hm .. I wonder if we could cast the type after creating the Arrow batch here. Calling `pa.Schema.from_pandas` already copies and converts the pandas DataFrame.
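   
   For illustration only, a minimal sketch of that direction (assuming `data` is the pandas DataFrame from the snippet above; the `safe=False` cast and the metadata handling are my assumptions, not code from this PR):
   
   ```python
   import pyarrow as pa
   
   # Sketch: convert once with from_pandas, then truncate any nanosecond
   # timestamp columns to microseconds by casting on the Arrow side.
   _table = pa.Table.from_pandas(data)
   fields = [
       field.with_type(pa.timestamp("us", tz=field.type.tz))
       if pa.types.is_timestamp(field.type) and field.type.unit == "ns"
       else field
       for field in _table.schema
   ]
   # safe=False permits the lossy ns -> us truncation; re-attach the pandas
   # metadata that from_pandas put on the original schema.
   _table = _table.cast(pa.schema(fields, metadata=_table.schema.metadata), safe=False)
   ```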


