zhengruifeng opened a new pull request, #41180:
URL: https://github.com/apache/spark/pull/41180

   ### What changes were proposed in this pull request?
   Make `DataFrame.drop` in Spark Connect support an empty column list, i.e. calling `df.drop()` with no arguments.
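
   A minimal sketch of the relaxed check (illustrative only: the real method lives in `pyspark/sql/connect/dataframe.py`, and the exact keyword arguments of `plan.Drop` are elided in the traceback below, so the `columns=` keyword here is an assumption):

   ```python
   from typing import Union

   from pyspark.errors import PySparkTypeError
   from pyspark.sql.connect import plan
   from pyspark.sql.connect.column import Column


   def drop(self, *cols: Union[Column, str]) -> "DataFrame":
       # Type validation is unchanged: every argument must be a Column or str.
       if any(not isinstance(c, (Column, str)) for c in cols):
           raise PySparkTypeError(
               error_class="NOT_COLUMN_OR_STR",
               message_parameters={"arg_name": "cols", "arg_type": type(cols).__name__},
           )
       _cols = list(cols)
       # The former `len(_cols) == 0` guard that raised CANNOT_BE_EMPTY is
       # removed: an empty list simply builds a Drop plan that drops nothing.
       return DataFrame.withPlan(
           plan.Drop(child=self._plan, columns=_cols),  # `columns=` keyword is assumed
           session=self._session,
       )
   ```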
   
   
   ### Why are the changes needed?
   To be consistent with vanilla PySpark, where calling `df.drop()` with no columns is a no-op rather than an error.
   
   
   ### Does this PR introduce _any_ user-facing change?
   Yes.
   
   ```
   In [1]: df = spark.createDataFrame([(1, 21), (2, 30)], ("id", "age"))
   
   In [2]: df.drop()
   ```
   
   Before:
   ```
   In [2]: df.drop()
   ---------------------------------------------------------------------------
   PySparkValueError                         Traceback (most recent call last)
   Cell In[2], line 1
   ----> 1 df.drop()
   
   File ~/Dev/spark/python/pyspark/sql/connect/dataframe.py:449, in DataFrame.drop(self, *cols)
       444     raise PySparkTypeError(
       445         error_class="NOT_COLUMN_OR_STR",
       446         message_parameters={"arg_name": "cols", "arg_type": type(cols).__name__},
       447     )
       448 if len(_cols) == 0:
   --> 449     raise PySparkValueError(
       450         error_class="CANNOT_BE_EMPTY",
       451         message_parameters={"item": "cols"},
       452     )
       454 return DataFrame.withPlan(
       455     plan.Drop(
       456         child=self._plan,
      (...)
       459     session=self._session,
       460 )
   
   PySparkValueError: [CANNOT_BE_EMPTY] At least one cols must be specified.
   ```
   
   After:
   ```
   In [2]: df.drop()
   Out[2]: DataFrame[id: bigint, age: bigint]
   ```
   
   
   ### How was this patch tested?
   Enabled the related unit test.
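
   For reference, a hedged sketch of the kind of assertion the enabled test makes (the test name and `spark` fixture are illustrative, not the actual test in the repo):

   ```python
   def test_drop_empty_column_list(spark):
       # `spark` is assumed to be a SparkSession pytest fixture (hypothetical).
       df = spark.createDataFrame([(1, 21), (2, 30)], ("id", "age"))
       # Dropping no columns should leave both the schema and the rows unchanged.
       dropped = df.drop()
       assert dropped.columns == ["id", "age"]
       assert dropped.collect() == df.collect()
   ```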

