[ https://issues.apache.org/jira/browse/ARROW-7156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16973713#comment-16973713 ]

Anthony Abate commented on ARROW-7156:
--------------------------------------

[~npr] - do you know if an individual RecordBatch can exceed 2 GB (int32 max)?

This might not be an Arrow C++ issue, but another bug in the C# library that I 
used to generate the file.
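As a rough back-of-the-envelope check of the int32 ceiling the question refers to: any single Arrow buffer whose byte length must fit in a signed 32-bit integer tops out at 2,147,483,647 bytes, just under 2 GiB. The sketch below (plain Python, illustrative only; the 300-million-row float64 column is a hypothetical example, not the schema of the actual file) shows how quickly a batch crosses that line:

```python
# INT32_MAX is the ceiling for any length stored as a signed 32-bit
# integer -- the "2 gigs (int32 max)" limit mentioned above.
INT32_MAX = 2**31 - 1  # 2147483647 bytes, just under 2 GiB

def batch_bytes(n_rows, bytes_per_row):
    """Rough lower bound on a batch's data size: rows x fixed row width."""
    return n_rows * bytes_per_row

# Hypothetical example: 300 million rows of a single float64 column (8 bytes each).
size = batch_bytes(300_000_000, 8)
print(size > INT32_MAX)  # the batch's data buffer would not fit in an int32
```

If the generating library does not guard against this, splitting the data into more, smaller record batches at write time keeps every buffer safely under the limit.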

> [R] [C++] Large Batches Cause Error / Crashes
> ---------------------------------------------
>
>                 Key: ARROW-7156
>                 URL: https://issues.apache.org/jira/browse/ARROW-7156
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: C++, R
>    Affects Versions: 0.14.1, 0.15.1
>            Reporter: Anthony Abate
>            Priority: Major
>
> I have a 30 GB Arrow file with 100 batches. The largest batch in the file 
> causes get_batch to fail; all other batches load fine. In 0.14.1 the 
> individual batch errors; in 0.15.1 the batch crashes RStudio when it is used.
> *0.14.1*
> {code:java}
> > rbn <- data_rbfr$get_batch(x)
> Error in ipc__RecordBatchFileReader_ReadRecordBatch(self, i) : 
>   Invalid: negative malloc size
> {code}
> *0.15.1*
> {code:java}
> rbn <- data_rbfr$get_batch(x)   # works
> df <- as.data.frame(rbn)        # crashes RStudio
> {code}
>  
> Update:
> I extracted the data in the failing batch into a separate file. That file is 
> over 2 GB.
> Using 0.15.1, trying to load this entire file via read_arrow also 
> fails:
> {code:java}
> ar <- arrow::read_arrow("e:\\temp\\file.arrow") 
> Error in Table__from_RecordBatchFileReader(batch_reader) :
>   Invalid: negative malloc size
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
