ruanwenjun commented on issue #16906:
URL: 
https://github.com/apache/dolphinscheduler/issues/16906#issuecomment-2589782507

   > > But you said when you manually execute the xx.kill, there is no error?
   > 
   > I executed the xx.kill file manually with no error, and without executing 
/usr/hdp/current/hadoop/libexec/yarn-config.sh. The `.kill` file only cds into a 
directory and runs `yarn application -kill <applicationId>`. The root cause of `Cannot 
execute /usr/hdp/current/hadoop/libexec/yarn-config.sh` was a misconfigured 
`HADOOP_HOME` variable in my environment. After I fixed it, the stop job 
function worked perfectly. The `yarn-config.sh` script sets various environment 
variables. Is it necessary for dolphinscheduler to stop a Spark job?
   
   This is strange; DolphinScheduler does not load the `yarn-config.sh` file itself.
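
   A minimal sketch of how a misconfigured `HADOOP_HOME` produces that exact error: the stock `yarn` launcher script sources `$HADOOP_HOME/libexec/yarn-config.sh` before running any subcommand, so a wrong `HADOOP_HOME` fails there even though the `.kill` file itself never references that path. The directory below is a placeholder, not taken from the reporter's cluster:

   ```shell
   #!/bin/sh
   # Simulate what the yarn launcher does when HADOOP_HOME is wrong.
   # /nonexistent/hdp/current/hadoop is an assumed bad value for illustration.
   HADOOP_HOME=/nonexistent/hdp/current/hadoop
   CONF_SCRIPT="$HADOOP_HOME/libexec/yarn-config.sh"
   if [ -x "$CONF_SCRIPT" ]; then
     # Normal case: pull in YARN environment variables.
     . "$CONF_SCRIPT"
   else
     # Failure case: this mirrors the error message seen in the issue.
     echo "Cannot execute $CONF_SCRIPT"
   fi
   ```

   So the `.kill` file running cleanly by hand is consistent with the bug: the failure only surfaces once `yarn` itself tries to source its config from the bad `HADOOP_HOME`.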


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
