pranotishanbhag commented on issue #2414:
URL: https://github.com/apache/hudi/issues/2414#issuecomment-758042202


   Hi,
   
   I tried copy_on_write with the insert operation for a 4.6 TB dataset, and it
is failing with lost nodes (a previous bulk_insert of the same dataset worked
fine). I tried tweaking the executor memory and also changed the config
"hoodie.copyonwrite.insert.split.size" to 100k, but it still failed. Later I
added the config "hoodie.insert.shuffle.parallelism" and set it to a higher
value, and I still see failures. Can you please tell me what is going wrong here?
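   
   For reference, below is a minimal sketch of how these options are applied on
the Spark DataFrame write. The table name, record key and partition fields, the
S3 paths, and the parallelism value of 1500 are placeholders rather than the
actual job settings.
   
   ```scala
   import org.apache.spark.sql.{SaveMode, SparkSession}
   
   val spark = SparkSession.builder().appName("hudi-insert-job").getOrCreate()
   
   // Placeholder read of the source dataset (the real job reads the 4.6 TB input).
   val inputDf = spark.read.parquet("s3://my-bucket/path/to/source/")
   
   inputDf.write
     .format("hudi")
     .option("hoodie.table.name", "my_table")                                // placeholder table name
     .option("hoodie.datasource.write.table.type", "COPY_ON_WRITE")
     .option("hoodie.datasource.write.operation", "insert")
     .option("hoodie.datasource.write.recordkey.field", "record_key")        // placeholder record key field
     .option("hoodie.datasource.write.partitionpath.field", "partition_col") // placeholder partition field
     .option("hoodie.copyonwrite.insert.split.size", "100000")               // the 100k split size mentioned above
     .option("hoodie.insert.shuffle.parallelism", "1500")                    // "higher value"; 1500 is only an example
     .mode(SaveMode.Append)
     .save("s3://my-bucket/path/to/hudi/table/")                             // placeholder base path
   ```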
   
   Thanks,
   Pranoti


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

