Hi, I know this may not be the best place to ask, but I would like to try anyway, as it is quite hard for me to find a good example of this online.
My use case: I'd like to convert streaming data (using Scala) into Arrow format in a memory-mapped file, and then have my parquet-cpp program write it to disk as a Parquet file. My understanding is that the Java Parquet library only implements an HDFS writer, which doesn't fit my use case (I'm not using Hadoop), and parquet-cpp is much more succinct. My question: does my use case make sense, or is there a better way? Thanks, -- Alex Wang, Open vSwitch developer
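P.S. For context, here is a minimal sketch of the Scala producer side I have in mind, using the Arrow Java library (org.apache.arrow) to write an Arrow IPC file that a separate parquet-cpp process could then read and convert. The column name and output path are just placeholders for illustration:

```scala
import java.io.FileOutputStream
import org.apache.arrow.memory.RootAllocator
import org.apache.arrow.vector.{IntVector, VectorSchemaRoot}
import org.apache.arrow.vector.ipc.ArrowFileWriter

object ArrowProducerSketch extends App {
  val allocator = new RootAllocator(Long.MaxValue)

  // Hypothetical single-column schema; real streaming data would have more fields.
  val vec = new IntVector("measurements", allocator)
  vec.allocateNew(3)
  Seq(10, 20, 30).zipWithIndex.foreach { case (v, i) => vec.setSafe(i, v) }
  vec.setValueCount(3)

  val root = VectorSchemaRoot.of(vec)

  // Write one record batch in the Arrow IPC file format; "stream.arrow"
  // is a placeholder path that the C++ side would open.
  val out    = new FileOutputStream("stream.arrow")
  val writer = new ArrowFileWriter(root, null, out.getChannel)
  writer.start()
  writer.writeBatch()
  writer.end()

  writer.close(); out.close(); root.close(); allocator.close()
}
```

A streaming version would reuse the same VectorSchemaRoot, refilling the vectors and calling writeBatch() per micro-batch, rather than writing a single batch as above.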