Re: [I] "SparkRDDWriteClient:Committing stats: " job running only 1 task although it has 20 executor cores available [hudi]

Wed, 25 Feb 2026 10:27:48 -0800


Sahil333 commented on issue #18246:
URL: https://github.com/apache/hudi/issues/18246#issuecomment-3961177586

   The DAG for the "Doing partition and writing data"/"Committing stats" job in 
Hudi 0.15.0 differs from that of "Committing stats" in Hudi 1.1.0. In 
particular, Hudi 1.1.0 has a "coalesce + union" task, which could be causing 
my "1 task" issue.
   
   Screenshot from Hudi 0.15.0
   
   <img width="1250" height="771" alt="Image" src="https://github.com/user-attachments/assets/91098be2-6c91-4d53-88a5-a51dcb2dcfd6" />
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
