Ishan created ARROW-9958:
----------------------------

             Summary: Error writing record batches to IPC streaming format
                 Key: ARROW-9958
                 URL: https://issues.apache.org/jira/browse/ARROW-9958
             Project: Apache Arrow
          Issue Type: Bug
          Components: GLib, Python
    Affects Versions: 1.0.1
         Environment: pyarrow - Version: 1.0.1
Python - Version: 3.7.6
Operating system - CentOS Linux release 7.8.2003 (Core)
            Reporter: Ishan
         Attachments: example1.py, example2.py

Writing record batches to the Arrow IPC streaming format with on-the-fly 
compression and then reading them back generally raises one of the two 
errors below. 

Please find attached (example1.py, example2.py) the code that produces each 
error. I can't reproduce it with smaller batch sizes, so it probably has to 
do with the size of each record batch. It also does not seem specific to 
pyarrow, since I see a similar issue with the C GLib API. 
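
For reference, a minimal sketch of the pattern that triggers the problem 
(the actual reproducers are the attached example1.py and example2.py; the 
column contents, row counts, and the choice of LZ4 here are illustrative 
placeholders, not taken from the attachments):

```
import pyarrow as pa

# Build a reasonably large record batch; the failure does not show up with
# small batches, so the row count here is just an illustrative guess.
batch = pa.record_batch(
    [pa.array(range(1_000_000)), pa.array(["x" * 32] * 1_000_000)],
    names=["ints", "strings"],
)

# Enable on-the-fly compression for the IPC stream (LZ4 as an example).
options = pa.ipc.IpcWriteOptions(compression="lz4")

sink = pa.BufferOutputStream()
writer = pa.ipc.new_stream(sink, batch.schema, options=options)
for _ in range(10):
    writer.write_batch(batch)
writer.close()

# Reading the compressed stream back is where the errors below are raised.
reader = pa.ipc.open_stream(sink.getvalue())
for b in reader:
    pass
```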


# Error case 1

```
~/py376/lib/python3.7/site-packages/pyarrow/ipc.pxi in pyarrow.lib._CRecordBatchReader.read_next_batch()

~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()

OSError: Truncated compressed stream
```

# Error case 2

```
~/py376/lib/python3.7/site-packages/pyarrow/ipc.pxi in pyarrow.lib._RecordBatchStreamReader._open()

~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.pyarrow_internal_check_status()

~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()

ArrowInvalid: Tried reading schema message, was null or length 0
```

--
This message was sent by Atlassian Jira
(v8.3.4#803005)