zhengruifeng opened a new pull request, #42743:
URL: https://github.com/apache/spark/pull/42743
### What changes were proposed in this pull request?
Adds `CalendarIntervalType` support to the Python Spark Connect client.
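As an illustrative sketch (not Spark's actual implementation), a Connect-style client can be thought of as dispatching on the type name carried in the analyzed proto schema and mapping it to a PySpark type class; this change registers `calendar_interval` in that mapping. All class and function names below are hypothetical:

```python
# Hypothetical sketch of a proto-to-Python type mapping in a Connect-style
# client. Names are illustrative only, not Spark's real internal API.

class DataType:
    def __repr__(self) -> str:
        return f"{type(self).__name__}()"

class LongType(DataType):
    pass

class CalendarIntervalType(DataType):
    """An interval of months, days, and microseconds (not comparable)."""
    pass

# The converter dispatches on the field name set in the proto's type oneof.
_PROTO_TO_TYPE = {
    "long": LongType,
    "calendar_interval": CalendarIntervalType,  # newly registered by this PR
}

def proto_to_data_type(kind: str) -> DataType:
    """Resolve a proto type name to a Python type instance, or fail loudly."""
    try:
        return _PROTO_TO_TYPE[kind]()
    except KeyError:
        # Mirrors the "Unsupported data type ..." error shown below.
        raise Exception(f"Unsupported data type {kind}") from None
```

Before the mapping entry existed, the lookup fell through to the "Unsupported data type calendar_interval" error reproduced in the traceback below.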
### Why are the changes needed?
For feature parity with the classic PySpark API, which already supports `CalendarIntervalType`.
### Does this PR introduce _any_ user-facing change?
Yes.

Before this PR, fetching the schema of a query that produces a `CalendarIntervalType` column failed:
```
In [1]: from pyspark.sql import functions as sf
In [2]: spark.range(1).select(sf.make_interval(sf.lit(1))).schema
---------------------------------------------------------------------------
Exception Traceback (most recent call last)
Cell In[2], line 1
----> 1 spark.range(1).select(sf.make_interval(sf.lit(1))).schema
File ~/Dev/spark/python/pyspark/sql/connect/dataframe.py:1687, in
DataFrame.schema(self)
1685 if self._session is None:
1686 raise Exception("Cannot analyze without SparkSession.")
-> 1687 return self._session.client.schema(query)
1688 else:
1689 raise Exception("Empty plan.")
...
Exception: Unsupported data type calendar_interval
```
After this PR, the schema resolves correctly:
```
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/__ / .__/\_,_/_/ /_/\_\ version 4.0.0.dev0
/_/
Using Python version 3.10.11 (main, May 17 2023 14:30:36)
Client connected to the Spark Connect server at localhost
SparkSession available as 'spark'.
In [1]: from pyspark.sql import functions as sf
In [2]: spark.range(1).select(sf.make_interval(sf.lit(1))).schema
Out[2]: StructType([StructField('make_interval(1, 0, 0, 0, 0, 0, 0)',
CalendarIntervalType(), True)])
```
### How was this patch tested?
Added a unit test.
### Was this patch authored or co-authored using generative AI tooling?
No.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]