BeishaoCao-db commented on code in PR #40907:
URL: https://github.com/apache/spark/pull/40907#discussion_r1179547179


##########
python/pyspark/sql/dataframe.py:
##########
@@ -3008,6 +3008,34 @@ def __getattr__(self, name: str) -> Column:
         jc = self._jdf.apply(name)
         return Column(jc)
 
+    def __dir__(self) -> List[str]:
+        """
+        Examples
+        --------
+        >>> from pyspark.sql.functions import lit
+
+        Create a dataframe with a column named 'id'.
+
+        >>> df = spark.range(3)
+        >>> [attr for attr in dir(df) if attr[0] == 'i'][:7] # Includes column id
+        ['id', 'inputFiles', 'intersect', 'intersectAll', 'isEmpty', 'isLocal', 'isStreaming']
+
+        Add a column named 'i_like_pancakes'.
+
+        >>> df = df.withColumn('i_like_pancakes', lit(1))
+        >>> [attr for attr in dir(df) if attr[0] == 'i'][:7] # Includes columns i_like_pancakes, id
+        ['i_like_pancakes', 'id', 'inputFiles', 'intersect', 'intersectAll', 'isEmpty', 'isLocal']
+
+        Add a column named 'inputFiles', which is already an existing attribute.
+
+        >>> df = df.withColumn('inputFiles', lit(2))
+        >>> [attr for attr in dir(df) if attr[0] == 'i'][:7] # Doesn't duplicate inputFiles
+        ['i_like_pancakes', 'id', 'inputFiles', 'intersect', 'intersectAll', 'isEmpty', 'isLocal']
+        """
+        attrs = list(super().__dir__())
+        attrs.extend(attr for attr in self.columns if attr not in attrs)

Review Comment:
   I don't see the point of using `hasattr`: our `__getattr__` implementation already checks whether the column exists, and we would still need to check that `attr not in attrs`.
   
   Also, I changed this to use a set, so we don't need to check whether the column is already there.
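   A minimal sketch of that set-based approach, outside of Spark (the `FrameWithColumns` class and its `columns` list are illustrative stand-ins, not the actual PR code):
   
   ```python
   from typing import List


   class FrameWithColumns:
       """Toy stand-in for a DataFrame that exposes its column names via dir()."""

       def __init__(self, columns: List[str]) -> None:
           self.columns = columns

       def __dir__(self) -> List[str]:
           # The set union collapses any overlap between regular attributes and
           # column names, so no explicit `attr not in attrs` check is needed.
           return sorted(set(super().__dir__()) | set(self.columns))


   df = FrameWithColumns(["id", "i_like_pancakes", "inputFiles"])
   print([attr for attr in dir(df) if attr.startswith("i")])
   # ['i_like_pancakes', 'id', 'inputFiles']
   ```
   
   The `sorted` call is optional since `dir()` sorts its result anyway; the point is only that the membership check disappears.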



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

