Hi Arrow experts,

I have what I think should be a standard problem, but I haven't been able to
find the right solution.

I have data in a nonstandard format (NIfTI neuroimaging files) that I can
load into R and transform into a single-row dataframe (about 30K columns).
In a small example I can load about 80 of these into a single dataframe and
save it as feather or parquet without any problem. I'd like to handle the
case where I have thousands of files.
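
For concreteness, the small case looks roughly like this (nifti_to_row here
is just a stand-in for my real NIfTI reader, which returns a one-row
data.frame):

library(arrow)

# stand-in for my real NIfTI reader; returns a 1-row, ~30K-column data.frame
nifti_to_row <- function(path) as.data.frame(matrix(rnorm(30000), nrow = 1))

files <- list.files("niftis", pattern = "\\.nii(\\.gz)?$", full.names = TRUE)

# ~80 files: bind everything into one dataframe in memory, then write it out
df <- do.call(rbind, lapply(files, nifti_to_row))
write_feather(df, "small_example.feather")
# write_parquet(df, "small_example.parquet") also works fine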

Loading a batch of files (e.g. 10) into a dataframe, saving it under a
Hive-style partition name, and repeating does work, but it doesn't seem like
the right way to do this.
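
Concretely, that workaround looks something like the following (the chunk
size and directory layout are just my own choices, and nifti_to_row again
stands in for my real reader):

library(arrow)

# stand-in for my real NIfTI reader; returns a 1-row data.frame
nifti_to_row <- function(path) as.data.frame(matrix(rnorm(30000), nrow = 1))

files  <- list.files("niftis", pattern = "\\.nii(\\.gz)?$", full.names = TRUE)
chunks <- split(files, ceiling(seq_along(files) / 10))

for (i in seq_along(chunks)) {
  df  <- do.call(rbind, lapply(chunks[[i]], nifti_to_row))
  dir <- file.path("out", sprintf("chunk=%d", i))   # Hive-style partition dir
  dir.create(dir, recursive = TRUE, showWarnings = FALSE)
  write_parquet(df, file.path(dir, "part-0.parquet"))
}

# the result reads back fine with open_dataset("out")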

Is there a way to stream data, one row at a time, into a feather or parquet
file?
I've tried using write_feather with a FileOutputStream sink, but without
luck.
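
From the docs I suspect what I'm after might look like the sketch below,
using RecordBatchFileWriter on the FileOutputStream (my understanding is
that Feather V2 is the Arrow IPC file format, so read_feather should be able
to read the result). Is that the intended approach, and is there an
equivalent for parquet?

library(arrow)

# stand-in for my real NIfTI reader; returns a 1-row data.frame
nifti_to_row <- function(path) as.data.frame(matrix(rnorm(30000), nrow = 1))

files <- list.files("niftis", pattern = "\\.nii(\\.gz)?$", full.names = TRUE)

# take the schema from the first row; every later batch has to match it
first_batch <- record_batch(nifti_to_row(files[1]))

sink   <- FileOutputStream$create("all_rows.arrow")
writer <- RecordBatchFileWriter$create(sink, first_batch$schema)

writer$write_batch(first_batch)
for (f in files[-1]) {
  writer$write_batch(record_batch(nifti_to_row(f)))   # one row at a time
}

writer$close()
sink$close()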
