domoritz commented on PR #35780:
URL: https://github.com/apache/arrow/pull/35780#issuecomment-1837198871

   I updated things based on the comments so far and mentioned that we would need
to support larger arrays. I plan to look into the following:
   * We support BigInt64Array as offset buffers. The offsets point into the data
buffer, but the offset buffer itself can only hold 2^32−2 entries, because typed
arrays are indexed by (32-bit) numbers (see the first sketch after this list).
   * The offsets point into the data buffer, which now needs to be indexed by a
bigint. Since normal JS arrays are indexed by (32-bit) numbers, we need to
implement a new data structure that chunks the data across multiple buffers (see
the second sketch after this list). I really wonder, though, whether there is
ever a use case for arrays this large in JS, since they would probably exceed the
available browser memory, and a serialized buffer cannot be that large either.
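   To make the first bullet concrete, here is a minimal sketch (my own
illustration, not code from this PR) of a BigInt64Array offset buffer paired with
a data buffer. The offsets it stores are 64-bit bigints, but the buffer itself is
still indexed by plain numbers, which is where the 2^32−2 entry limit comes from.
getValue and the variable names are made up for the example.
   
   ```ts
   // Offsets for the three values "foo", "", "barbaz".
   const valueOffsets = new BigInt64Array([0n, 3n, 3n, 9n]);
   const data = new TextEncoder().encode('foobarbaz');
   
   function getValue(index: number): string {
     // The offset buffer is indexed by a plain number, so it can hold at most
     // ~2^32 entries, even though each stored offset is a 64-bit bigint.
     const start = valueOffsets[index];
     const end = valueOffsets[index + 1];
     // The bigint offsets are narrowed to numbers to slice the data buffer;
     // this only works while the data buffer fits in a single typed array.
     return new TextDecoder().decode(data.subarray(Number(start), Number(end)));
   }
   
   console.log(getValue(0)); // "foo"
   console.log(getValue(2)); // "barbaz"
   ```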
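   For the second bullet, this is a rough sketch of what a data buffer chunked
across multiple typed arrays and indexed by a bigint could look like.
ChunkedBytes, CHUNK_SIZE, and the byte-by-byte slice are illustrative assumptions,
not an existing or proposed Arrow JS API; a real implementation would copy whole
chunk spans rather than single bytes.
   
   ```ts
   const CHUNK_SIZE = 1n << 30n; // 1 GiB per chunk, purely illustrative
   
   class ChunkedBytes {
     constructor(private chunks: Uint8Array[]) {}
   
     /** Read a single byte at a logical bigint position. */
     byteAt(position: bigint): number {
       const chunkIndex = Number(position / CHUNK_SIZE);
       const offsetInChunk = Number(position % CHUNK_SIZE);
       return this.chunks[chunkIndex][offsetInChunk];
     }
   
     /** Copy the logical range [start, end) into a contiguous Uint8Array. */
     slice(start: bigint, end: bigint): Uint8Array {
       const out = new Uint8Array(Number(end - start));
       for (let i = 0n; i < end - start; i++) {
         out[Number(i)] = this.byteAt(start + i);
       }
       return out;
     }
   }
   
   // The bigint offsets from a BigInt64Array offset buffer could then index
   // into the chunked store without first being narrowed to 32-bit numbers.
   const store = new ChunkedBytes([new TextEncoder().encode('foobarbaz')]);
   console.log(new TextDecoder().decode(store.slice(3n, 9n))); // "barbaz"
   ```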
   
   Thoughts? Especially @kylebarron, since you mentioned above that you think
supporting deserialization and actually supporting massive strings are separate
issues.

