n3nash commented on issue #654: Many small files
URL: https://github.com/apache/incubator-hudi/issues/654#issuecomment-489742356
 
 
   @zk19930911 How much data do you ingest per batch? If each batch is small, try reducing the following parallelisms from 1500 to a lower value, say 300:
   
   `hoodie.bulkinsert.shuffle.parallelism` & `hoodie.insert.shuffle.parallelism`
   
   As a rule of thumb, set the parallelism to roughly the batch size divided by the target file size: if you want to write 512MB files and each batch carries about 1 GB of data, a parallelism of 3 - 5 might suffice. This will obviously increase the runtime, since you are writing larger parquet files.
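   
   For concreteness, here is a minimal sketch (mine, not from the comment) of passing these two settings through the Spark datasource writer. The batch size, table name, key fields, and base path are all hypothetical placeholders, and older incubator builds used the `com.uber.hoodie` format name instead of `org.apache.hudi`:
   
   ```scala
   import org.apache.spark.sql.{DataFrame, SaveMode}
   
   // Rough sizing rule from the comment: parallelism ~= batch size / target file size.
   val batchSizeBytes      = 1L * 1024 * 1024 * 1024 // assumed ~1 GB per ingested batch
   val targetFileSizeBytes = 512L * 1024 * 1024      // target ~512 MB parquet files
   val parallelism         = math.max(1, (batchSizeBytes / targetFileSizeBytes).toInt)
   
   def writeBatch(df: DataFrame): Unit = {
     df.write
       .format("org.apache.hudi") // "com.uber.hoodie" on older incubator builds
       .option("hoodie.table.name", "my_table")                  // hypothetical table name
       .option("hoodie.datasource.write.recordkey.field", "id")  // hypothetical record key
       .option("hoodie.datasource.write.precombine.field", "ts") // hypothetical ordering field
       .option("hoodie.insert.shuffle.parallelism", parallelism.toString)
       .option("hoodie.bulkinsert.shuffle.parallelism", parallelism.toString)
       .mode(SaveMode.Append)
       .save("/path/to/hudi/table")                              // hypothetical base path
   }
   ```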
   
