adrien-grl opened a new issue, #20435:
URL: https://github.com/apache/datafusion/issues/20435

   ### Is your feature request related to a problem or challenge?
   
   I am trying to run a SQL query against a .feather file. The file cannot be
   materialized into RAM (it is too big), but each `pa.RecordBatch` can be.
   I would like to be able to register the .feather file directly, or
   preferably via a `pa.ipc.RecordBatchFileReader`, and then iterate over the
   filtered record batches via `df.execute_stream()`.
   
   I haven't found a way to do that right now that does not involve a
   `pa.dataset.Dataset`.
   This is tied to the fact that a `pa.dataset.Dataset` cannot be instantiated
   from a `pa.ipc.RecordBatchFileReader`.
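
   For context, the per-batch access I need already works with plain pyarrow;
   the sketch below (untested, and `archive.feather` is just an example path)
   adapts a `pa.ipc.RecordBatchFileReader` into a streaming
   `pa.RecordBatchReader` using `pa.RecordBatchReader.from_batches`:
   
   ```python
   import pyarrow as pa
   
   path = "archive.feather"  # example path
   
   # Memory-map the file so batches are paged in only as they are read.
   handle = pa.memory_map(path, "rb")
   file_reader = pa.ipc.open_file(handle)  # a RecordBatchFileReader
   
   def iter_batches():
       # Yield one record batch at a time; never materialize the whole file.
       for i in range(file_reader.num_record_batches):
           yield file_reader.get_batch(i)
   
   stream = pa.RecordBatchReader.from_batches(file_reader.schema, iter_batches())
   ```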
   
   ### Describe the solution you'd like
   
   ```python
   import datafusion
   import pyarrow as pa
   
   query = "SELECT * FROM archive WHERE foo = 'bar'"
   path = "archive.feather"  # example path
   
   # Memory-map the file so batches are read lazily, not loaded up front.
   handle = pa.memory_map(path, "rb")
   reader = pa.ipc.RecordBatchFileReader(handle)
   
   ctx = datafusion.SessionContext()
   # Proposed API: expose the reader as a queryable table.
   ctx.register_batch_reader("archive", reader)
   df = ctx.sql(query)
   
   for batch in df.execute_stream():
       # do something
       pass
   ```
   
   ### Describe alternatives you've considered
   
   Right now, I am doing:
   ```python
   import datafusion
   import pyarrow.dataset as pdata
   
   query = "SELECT * FROM archive WHERE foo = 'bar'"
   path = "archive.feather"  # example path
   # Feather v2 files are in the Arrow IPC file format, hence format="ipc".
   dataset = pdata.dataset(path, format="ipc")
   
   ctx = datafusion.SessionContext()
   ctx.register_dataset("archive", dataset)
   df = ctx.sql(query)
   
   for batch in df.execute_stream():
       # do something
       pass
   ```
   This works, but it forces me to go through a `pa.dataset.Dataset` to read
   the .feather file.
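
   Another direction, which I have not fully verified: recent datafusion-python
   exposes `SessionContext.from_arrow`, which accepts any object implementing
   `__arrow_c_stream__` (a `pa.RecordBatchReader` does, with pyarrow >= 14).
   Whether `from_arrow` pulls batches lazily or eagerly collects the whole
   stream is an assumption I could not confirm, so this may not solve the
   memory problem:
   
   ```python
   import datafusion
   import pyarrow as pa
   
   path = "archive.feather"  # example path
   
   handle = pa.memory_map(path, "rb")
   file_reader = pa.ipc.open_file(handle)
   
   def iter_batches():
       for i in range(file_reader.num_record_batches):
           yield file_reader.get_batch(i)
   
   stream = pa.RecordBatchReader.from_batches(file_reader.schema, iter_batches())
   
   ctx = datafusion.SessionContext()
   # Assumption: from_arrow consumes the Arrow C stream; if it eagerly
   # collects every batch, this does not help with the RAM constraint.
   df = ctx.from_arrow(stream, name="archive")
   
   for batch in df.execute_stream():
       # do something
       pass
   ```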
   
   ### Additional context
   
   _No response_

