grundprinzip commented on code in PR #39103:
URL: https://github.com/apache/spark/pull/39103#discussion_r1051537745


##########
python/pyspark/sql/connect/types.py:
##########
@@ -75,6 +75,15 @@ def pyspark_types_to_proto_types(data_type: DataType) -> pb2.DataType:
     elif isinstance(data_type, DayTimeIntervalType):
         ret.day_time_interval.start_field = data_type.startField
         ret.day_time_interval.end_field = data_type.endField
+    elif isinstance(data_type, StructType):
+        fields = []
+        for field in data_type.fields:
+            struct_field = pb2.DataType.StructField()
+            struct_field.name = field.name
+            struct_field.data_type.CopyFrom(pyspark_types_to_proto_types(field.dataType))
+            struct_field.nullable = field.nullable
+            fields.append(struct_field)
+        ret.struct.fields.extend(fields)

Review Comment:
   This does not need the extra copy; you can append directly to the repeated field on the return type.
   
   ```suggestion
                ret.struct.fields.append(struct_field)
   ```
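   
   For context, a sketch of how the whole branch from this hunk reads with the suggestion applied (same names as in the diff above, no new helpers assumed):
   
   ```python
       elif isinstance(data_type, StructType):
           for field in data_type.fields:
               struct_field = pb2.DataType.StructField()
               struct_field.name = field.name
               struct_field.data_type.CopyFrom(pyspark_types_to_proto_types(field.dataType))
               struct_field.nullable = field.nullable
               # Append each field directly to the repeated proto field;
               # no intermediate Python list or extend() call is needed.
               ret.struct.fields.append(struct_field)
   ```
   
   If avoiding even the proto-level copy mattered, `ret.struct.fields.add()` would return a sub-message that can be populated in place, though that goes beyond what this suggestion changes.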


