ueshin commented on code in PR #40525:
URL: https://github.com/apache/spark/pull/40525#discussion_r1158858408


##########
python/pyspark/pandas/data_type_ops/boolean_ops.py:
##########
@@ -238,8 +241,8 @@ def __and__(self, left: IndexOpsLike, right: Any) -> SeriesOrIndex:
             return right.__and__(left)
         else:
 
-            def and_func(left: Column, right: Any) -> Column:
-                if not isinstance(right, Column):
+            def and_func(left: GenericColumn, right: Any) -> GenericColumn:
+                if not isinstance(right, (Column, ConnectColumn)):

Review Comment:
   I guess `isinstance(right, GenericColumn.__args__)` works?
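   For context, a `typing.Union` alias exposes its member types through `__args__` as a plain tuple, which is exactly what `isinstance` accepts. A minimal sketch with stand-in classes (hypothetical, not the real pyspark `Column`/`ConnectColumn`):

   ```python
   from typing import Union

   class Column: ...          # stand-in for pyspark.sql.Column
   class ConnectColumn: ...   # stand-in for the Spark Connect column class

   GenericColumn = Union[Column, ConnectColumn]

   # Union.__args__ is the tuple (Column, ConnectColumn), so isinstance can
   # use it directly instead of spelling the member types out a second time.
   assert GenericColumn.__args__ == (Column, ConnectColumn)
   assert isinstance(Column(), GenericColumn.__args__)
   assert not isinstance(42, GenericColumn.__args__)
   ```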



##########
python/pyspark/pandas/internal.py:
##########
@@ -962,9 +1025,9 @@ def field_for(self, label: Label) -> InternalField:
             raise KeyError(name_like_string(label))
 
     @property
-    def spark_frame(self) -> SparkDataFrame:
+    def spark_frame(self) -> LegacyDataFrame:

Review Comment:
   `GenericDataFrame`?



##########
python/pyspark/pandas/internal.py:
##########
@@ -1179,7 +1242,7 @@ def resolved_copy(self) -> "InternalFrame":
 
     def with_new_sdf(
         self,
-        spark_frame: SparkDataFrame,
+        spark_frame: LegacyDataFrame,

Review Comment:
   ditto?



##########
python/pyspark/pandas/internal.py:
##########
@@ -25,7 +25,13 @@
 import pandas as pd
 from pandas.api.types import CategoricalDtype  # noqa: F401
 from pyspark._globals import _NoValue, _NoValueType
-from pyspark.sql import functions as F, Column, DataFrame as SparkDataFrame, Window
+from pyspark.sql import (
+    functions as F,
+    Column,
+    DataFrame as LegacyDataFrame,
+    Window,
+    SparkSession as PySparkSession,
+)

Review Comment:
   Should we consistently use either `LegacyXxx` or `PySparkXxx`, rather than mixing both?



##########
python/pyspark/pandas/internal.py:
##########
@@ -616,8 +628,7 @@ def __init__(
         >>> internal.column_label_names
         [('column_labels_a',), ('column_labels_b',)]
         """
-
-        assert isinstance(spark_frame, SparkDataFrame)
+        assert isinstance(spark_frame, (LegacyDataFrame, ConnectDataFrame))

Review Comment:
   `isinstance(spark_frame, GenericDataFrame.__args__)` should work?



##########
python/pyspark/pandas/internal.py:
##########
@@ -1037,7 +1100,7 @@ def data_fields(self) -> List[InternalField]:
         return self._data_fields
 
     @lazy_property
-    def to_internal_spark_frame(self) -> SparkDataFrame:
+    def to_internal_spark_frame(self) -> LegacyDataFrame:

Review Comment:
   ditto?



##########
python/pyspark/pandas/spark/accessors.py:
##########
@@ -64,7 +68,7 @@ def column(self) -> Column:
         """
         return self._data._internal.spark_column_for(self._data._column_label)
 
-    def transform(self, func: Callable[[Column], Column]) -> IndexOpsLike:
+    def transform(self, func: Callable[[Column], Union[Column, ConnectColumn]]) -> IndexOpsLike:

Review Comment:
   `Callable[[GenericColumn], GenericColumn]`?
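   A sketch of what `Callable[[GenericColumn], GenericColumn]` buys here, again using hypothetical stand-in classes rather than the real pyspark types: the single alias covers callbacks over either column implementation.

   ```python
   from typing import Callable, Union

   class Column: ...          # stand-in for the legacy pyspark.sql.Column
   class ConnectColumn: ...   # stand-in for the Spark Connect column class

   GenericColumn = Union[Column, ConnectColumn]

   def transform(
       col: GenericColumn, func: Callable[[GenericColumn], GenericColumn]
   ) -> GenericColumn:
       # One signature accepts callbacks over either column implementation.
       return func(col)

   legacy = transform(Column(), lambda c: c)
   connect = transform(ConnectColumn(), lambda c: c)
   ```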



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

