LuciferYang commented on code in PR #47664:
URL: https://github.com/apache/spark/pull/47664#discussion_r1708808529


##########
sql/api/src/main/scala/org/apache/spark/sql/catalyst/encoders/RowEncoder.scala:
##########
@@ -106,7 +106,7 @@ object RowEncoder extends DataTypeErrorsBase {
       UDTEncoder(udt, udtClass.asInstanceOf[Class[_ <: UserDefinedType[_]]])
     case ArrayType(elementType, containsNull) =>
       IterableEncoder(
-        classTag[mutable.ArraySeq[_]],
+        classTag[immutable.ArraySeq[_]],

Review Comment:
   
https://github.com/apache/spark/blob/af70aafd330fdbb6ce0d5b3efbcb180cda488695/sql/api/src/main/scala/org/apache/spark/sql/Row.scala#L317-L325
   
   This change avoids the frequent collection copying (`mutable.Seq` -> `immutable.Seq`) performed when calling `row.getSeq`, which can be costly, especially when the collection itself is large.
   
   
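   A minimal, self-contained sketch (not part of this PR; object name and sample data are illustrative) of the Scala 2.13 behavior the comment refers to: converting a `mutable.ArraySeq` to an `immutable.Seq` allocates a copy, while an `immutable.ArraySeq` can be handed back as-is.
   
   ```scala
   import scala.collection.{immutable, mutable}
   
   object ArraySeqCopyDemo {
     def main(args: Array[String]): Unit = {
       val values = Array(1, 2, 3)
   
       // Scala 2.13: a mutable.ArraySeq is not an immutable.Seq, so `.toSeq`
       // (the kind of conversion Row.getSeq performs) builds a brand-new
       // immutable collection, copying every element.
       val asMutable: mutable.ArraySeq[Int] = mutable.ArraySeq.make(values)
       val copied: immutable.Seq[Int] = asMutable.toSeq
       println(copied eq asMutable)   // false: a new collection was allocated
   
       // An immutable.ArraySeq already is an immutable.Seq, so `.toSeq`
       // returns the same instance; no copy, no allocation.
       val asImmutable: immutable.ArraySeq[Int] = immutable.ArraySeq.unsafeWrapArray(values)
       val reused: immutable.Seq[Int] = asImmutable.toSeq
       println(reused eq asImmutable) // true: zero-copy
     }
   }
   ```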



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

