EB80 edited a comment on issue #11665:
URL: https://github.com/apache/arrow/issues/11665#issuecomment-969113711


   Thanks for the reply.
   
   Additional information:
   - The feather file was written with feather::write_feather, not arrow::write_feather.
   - I receive the same error with older versions of arrow (I am currently running 6.0.0.2).
   - I actually encounter a separate issue when attempting to write with arrow::write_feather: roughly 78MB of the 32GB file is written, and then the write hangs indefinitely without progressing or crashing. (I used feather::write_feather for this reason.)
   - I am writing to a network drive, not a local disk.
   - The data frame is 26M rows by 150 columns.
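
   For reference, a minimal sketch of the calls involved (the path is illustrative, not the real file; `compression = "uncompressed"` is only a suggestion to help isolate whether compressed-codec support is the problem, since the traceback goes through `read_compressed_error`):
   
   ```r
   library(arrow)
   
   # Reading the file written by feather::write_feather fails here:
   df <- arrow::read_feather("path/to/file.feather")
   
   # Writing with arrow::write_feather is where the separate hang occurs;
   # writing uncompressed may help rule out the compression codec:
   arrow::write_feather(df, "path/to/out.feather", compression = "uncompressed")
   ```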
   
   Traceback:
   6: stop(e)
   5: value[[3L]](cond)
   4: tryCatchOne(expr, names, parentenv, handlers[[1L]])
   3: tryCatchList(expr, classes, parentenv, handlers)
   2: tryCatch(reader$Read(columns), error = read_compressed_error)
   1: arrow::read_feather("Archive/2021-06-23/2021-06-23 - Pre-Processed 
DECKPLATE - June 2021.feather")
   
   Thank you,
   Edward


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
