heyihong commented on code in PR #52321:
URL: https://github.com/apache/spark/pull/52321#discussion_r2346760811


##########
python/pyspark/sql/connect/expressions.py:
##########
@@ -436,11 +442,48 @@ def _to_value(
             assert dataType is None or isinstance(dataType, DayTimeIntervalType)
             return DayTimeIntervalType().fromInternal(literal.day_time_interval)
         elif literal.HasField("array"):
-            elementType = proto_schema_to_pyspark_data_type(literal.array.element_type)
+            if literal.array.HasField("data_type"):

Review Comment:
   To minimize changes, it is not necessary to support these new data type fields: they are still under development and not fully stabilized yet. Currently, the new data type fields are used only in requests from the Spark Connect Scala Client.
   
   If we really want to support them, I think we need to consider:
   - How can we enable the new data type fields in the responses while maintaining backward compatibility? (A hedged sketch of one possible fallback follows below.)
   - How is this code path tested?
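   
   For illustration only, here is a minimal sketch of the kind of fallback the first bullet asks about. It assumes the new `data_type` field on the array literal exposes an `element_type` sub-field mirroring the legacy field; that proto shape is an assumption, not confirmed by this PR.
   
   ```python
   # Hedged sketch, not the PR's implementation: assumes
   # literal.array.data_type carries an element_type sub-message.
   if literal.array.HasField("data_type"):
       # Newer clients (currently only the Spark Connect Scala Client)
       # populate the new data_type field.
       elementType = proto_schema_to_pyspark_data_type(
           literal.array.data_type.element_type
       )
   else:
       # Older clients send only the legacy element_type field, so keep
       # the original path for backward compatibility.
       elementType = proto_schema_to_pyspark_data_type(
           literal.array.element_type
       )
   ```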



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

