muyihao commented on issue #39261:
URL: https://github.com/apache/arrow/issues/39261#issuecomment-1859057759

   I write each row of data to Parquet. The data has more than 1,000 
fields and each batch has more than 100 rows. Is there a good processing 
approach that would help me avoid memory leaks? I have already tried 
try-with-resources to ensure the VectorSchemaRoot is closed.
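
   A common source of apparent leaks in this situation is allocating a fresh 
root (or fresh vectors) for every batch or row instead of reusing one root 
and releasing its buffers after each batch is written. In Arrow Java the 
usual lifecycle is: create the allocator and `VectorSchemaRoot` once, fill 
and write per batch, call `clear()` after each batch, and close both at the 
end (the allocator will report any still-outstanding buffers when closed). 
The sketch below illustrates only that lifecycle with pure-JDK stand-ins — 
`Allocator` and `Root` here are hypothetical mocks, not Arrow classes:

   ```java
   import java.util.ArrayList;
   import java.util.List;

   // Hypothetical stand-in for Arrow's BufferAllocator: counts outstanding
   // buffers and, like the real allocator, fails on close() if any leaked.
   class Allocator implements AutoCloseable {
       int outstanding = 0;
       byte[] buffer(int size) { outstanding++; return new byte[size]; }
       void release() { outstanding--; }
       @Override public void close() {
           if (outstanding != 0)
               throw new IllegalStateException("leak: " + outstanding + " buffers");
       }
   }

   // Hypothetical stand-in for VectorSchemaRoot: holds one buffer per field.
   class Root implements AutoCloseable {
       final Allocator alloc;
       final List<byte[]> fieldBuffers = new ArrayList<>();
       Root(Allocator a) { this.alloc = a; }
       void allocateBatch(int fields) {          // allocate per batch, not per row
           for (int i = 0; i < fields; i++) fieldBuffers.add(alloc.buffer(128));
       }
       void clear() {                            // release buffers once the batch is written
           for (int i = 0; i < fieldBuffers.size(); i++) alloc.release();
           fieldBuffers.clear();
       }
       @Override public void close() { clear(); }
   }

   public class BatchLifecycle {
       public static void main(String[] args) {
           try (Allocator alloc = new Allocator();
                Root root = new Root(alloc)) {   // one root reused for every batch
               for (int batch = 0; batch < 10; batch++) {
                   root.allocateBatch(1000);     // ~1,000 fields, per the question
                   // ... fill vectors and hand the root to the writer here ...
                   root.clear();                 // release before the next batch
               }
           }                                     // close() verifies nothing leaked
           System.out.println("no leaks");
       }
   }
   ```

   If the real code creates a new `VectorSchemaRoot` (or new vectors) inside 
the batch loop, each iteration's buffers stay allocated until that root is 
closed, which looks like a leak under load even with try-with-resources on 
the outermost root.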


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
