R-JunmingChen commented on issue #35268:
URL: https://github.com/apache/arrow/issues/35268#issuecomment-1576946749

   Hi @westonpace, I am stuck on reading the data back. I need to read data back
from the spillover files at a specific offset and batch size.
   Currently, my offset/batch size is expressed in number of rows, and I use the
Parquet format for temporary spillover.
   However, our Parquet library doesn't support reading a file at a row offset
(only byte offsets are supported). Also, the RecordBatchReader, which is a
generator, supports a batch_size in rows, but it doesn't support an offset.
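   
   To make the problem concrete, the only workaround I can see with the current API is roughly the following sketch (the file name and the sequential skip are just placeholders for illustration): it emulates a row offset by reading batches in order and slicing the first batch that crosses the offset, so it still has to decode and discard every row before the offset.
   
   ```cpp
   // Sketch only: emulate a row offset on top of a Parquet RecordBatchReader
   // by skipping and slicing batches.
   #include <iostream>
   #include <numeric>
   #include <vector>
   
   #include <arrow/api.h>
   #include <arrow/io/api.h>
   #include <parquet/arrow/reader.h>
   #include <parquet/file_reader.h>
   #include <parquet/properties.h>
   
   arrow::Status ReadFromRowOffset(const std::string& path, int64_t row_offset,
                                   int64_t batch_size) {
     ARROW_ASSIGN_OR_RAISE(auto input, arrow::io::ReadableFile::Open(path));
   
     // The batch size can be configured, but the offset has to be done by hand.
     parquet::ArrowReaderProperties props;
     props.set_batch_size(batch_size);
   
     std::unique_ptr<parquet::arrow::FileReader> reader;
     ARROW_RETURN_NOT_OK(parquet::arrow::FileReader::Make(
         arrow::default_memory_pool(), parquet::ParquetFileReader::Open(input),
         props, &reader));
   
     std::vector<int> row_groups(reader->num_row_groups());
     std::iota(row_groups.begin(), row_groups.end(), 0);
     std::shared_ptr<arrow::RecordBatchReader> batch_reader;
     ARROW_RETURN_NOT_OK(reader->GetRecordBatchReader(row_groups, &batch_reader));
   
     int64_t to_skip = row_offset;
     std::shared_ptr<arrow::RecordBatch> batch;
     while (true) {
       ARROW_RETURN_NOT_OK(batch_reader->ReadNext(&batch));
       if (batch == nullptr) break;  // end of stream
       if (to_skip >= batch->num_rows()) {
         // This whole batch lies before the requested offset; discard it.
         to_skip -= batch->num_rows();
         continue;
       }
       std::shared_ptr<arrow::RecordBatch> sliced = batch->Slice(to_skip);
       to_skip = 0;
       // ... consume `sliced` here ...
       std::cout << "read batch with " << sliced->num_rows() << " rows\n";
     }
     return arrow::Status::OK();
   }
   ```
   
   Skipping whole row groups up front using the row counts in the file metadata would avoid decoding some of that data, but within a row group I still don't see a way to seek to a specific row.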
   
   Maybe AsyncGenerator is a good way to drive the RecordBatchReader, but I
can't find any examples of reading a Parquet file with it; its test cases are
all in-memory operations. If AsyncGenerator could solve this problem, could you
please show me a simple example?

