luffyd commented on issue #1913:
URL: https://github.com/apache/hudi/issues/1913#issuecomment-670039732


   Thanks for the input @bvaradar 
   The "Too many open files" IOException also seems to be correlated with having 2G as the max file limit.
   Will confirm the parquet version.
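   As a side note, a quick way to inspect the open-file limits on the executor hosts (a minimal sketch for a Linux shell; the commands below are generic and not taken from this thread):

   ```shell
   # Soft limit: the number of open file descriptors a process can actually hit
   ulimit -Sn

   # Hard limit: the ceiling the soft limit can be raised to without root
   ulimit -Hn

   # Count file descriptors currently held by this shell process
   ls /proc/$$/fd | wc -l
   ```

   If the soft limit is low relative to the number of parquet files being written concurrently, raising it (e.g. via `ulimit -n` or `/etc/security/limits.conf`) is one avenue to rule out.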
   
   Regarding the tests, you are correct: the write is touching all 1000 partitions. I am simulating the prod write pattern.
   Will check the Spark Structured Streaming options.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
