Kimahriman commented on PR #13815:
URL: https://github.com/apache/arrow/pull/13815#issuecomment-1383199650

   While working on https://github.com/apache/spark/pull/39572 to support the 
large variable-width vectors in Spark, I found that this PR effectively 
limits the regular-width variable vectors to 1 GiB of data. The pre-PR 
behavior was definitely a bug, but now whenever you try to add data beyond 
1 GiB, the vector tries to double itself to the next power of two, which 
would be `2147483648`. That is greater than `Integer.MAX_VALUE` 
(`2147483647`), so it throws an `OversizedAllocationException`.
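
   To make the arithmetic concrete, here is a minimal sketch of the power-of-two doubling described above (`nextPowerOfTwo` is an illustrative helper, not Arrow's actual internal method): growing a buffer past 1 GiB rounds up to `2^31`, which no longer fits in an `int`.

```java
public class DoublingLimitDemo {
    // Illustrative round-up-to-next-power-of-two, mirroring the doubling
    // behavior described in the comment (hypothetical helper name).
    static long nextPowerOfTwo(long size) {
        long highBit = Long.highestOneBit(size);
        return (highBit == size) ? size : highBit << 1;
    }

    public static void main(String[] args) {
        long oneGiB = 1L << 30;                       // 1073741824 bytes
        long grown = nextPowerOfTwo(oneGiB + 1);      // one byte past 1 GiB
        System.out.println(grown);                    // 2147483648, i.e. 2^31
        // 2^31 exceeds Integer.MAX_VALUE (2^31 - 1), so a vector whose
        // capacity is tracked as an int cannot represent this allocation --
        // the situation that triggers OversizedAllocationException.
        System.out.println(grown > Integer.MAX_VALUE); // true
    }
}
```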


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
