wankunde commented on code in PR #44351:
URL: https://github.com/apache/spark/pull/44351#discussion_r1426902797


##########
sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/ParquetDictionary.java:
##########
@@ -70,7 +70,7 @@ public byte[] decodeToBinary(int id) {
       long signed = dictionary.decodeToLong(id);
       return new BigInteger(Long.toUnsignedString(signed)).toByteArray();
     } else {
-      return dictionary.decodeToBinary(id).getBytes();
+      return dictionary.decodeToBinary(id).getBytesUnsafe();

Review Comment:
   The Javadoc of the `getBytesUnsafe()` method:
   ```
     /**
      * Variant of getBytes() that avoids copying backing data structure by returning
      * backing byte[] of the Binary. Do not modify backing byte[] unless you know what
      * you are doing.
      *
      * @return backing byte[] of correct size, with an offset of 0, if possible, else
      * returns result of getBytes()
      */
     public abstract byte[] getBytesUnsafe();
   ```
   Spark copies those cached bytes before using them, so I think this is safe for Spark.
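   For illustration, here is a minimal sketch (a hypothetical `BinarySketch` class, not Parquet's real `Binary`) of the trade-off: the "unsafe" accessor returns the backing array directly, so any caller that mutates the result must copy it first, which is exactly the discipline the comment above relies on:
   ```java
   import java.util.Arrays;

   // Hypothetical stand-in for Parquet's Binary, showing why getBytesUnsafe()
   // is zero-copy but requires copy-before-mutate discipline from callers.
   public class BinarySketch {
       private final byte[] backing;

       public BinarySketch(byte[] backing) {
           this.backing = backing;
       }

       // Defensive copy: safe to mutate, but costs an allocation per call.
       public byte[] getBytes() {
           return Arrays.copyOf(backing, backing.length);
       }

       // Zero-copy: returns the backing array itself; do not modify it.
       public byte[] getBytesUnsafe() {
           return backing;
       }

       public static void main(String[] args) {
           BinarySketch b = new BinarySketch(new byte[]{1, 2, 3});

           byte[] safe = b.getBytes();
           safe[0] = 9;                          // mutating the copy ...
           System.out.println(b.getBytes()[0]);  // ... leaves the backing data intact: 1

           byte[] unsafe = b.getBytesUnsafe();
           unsafe[0] = 9;                        // mutating the unsafe view ...
           System.out.println(b.getBytes()[0]);  // ... changes the backing data: 9
       }
   }
   ```
   Since the decoded dictionary value is copied on the Spark side before mutation, the cheaper zero-copy accessor avoids one allocation per decoded value.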



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

