SaiUtturkar opened a new issue, #14245:
URL: https://github.com/apache/dolphinscheduler/issues/14245

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and 
found no similar feature requirement.
   
   
   ### Description
   
   **Description:**
   Add support for exposing the application_id (received after submitting a 
Spark job), tracking_url, and diagnostic message to downstream tasks within the 
same workflow as the Spark task.
   
   **Current accessibility**
   Currently we must either fetch the app_id from the MySQL database (table: 
t_ds_task_instance, column: app_link) or call the API and fetch the workflow 
instance info, which includes app_link.
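   As an illustration of the current workaround, the app_link can be read 
directly from the task instance table. A minimal sketch, using SQLite as a 
stand-in for MySQL; the table and column names are the real DolphinScheduler 
ones mentioned above, but the `name` filter and helper function are 
hypothetical:

   ```python
   import sqlite3

   def fetch_app_link(conn, task_name):
       """Return the app_link recorded for a task instance, or None.

       t_ds_task_instance / app_link are DolphinScheduler's actual
       table and column; the WHERE clause is illustrative only.
       """
       row = conn.execute(
           "SELECT app_link FROM t_ds_task_instance WHERE name = ?",
           (task_name,),
       ).fetchone()
       return row[0] if row else None

   # Demo against an in-memory stand-in for the MySQL table.
   conn = sqlite3.connect(":memory:")
   conn.execute("CREATE TABLE t_ds_task_instance (name TEXT, app_link TEXT)")
   conn.execute(
       "INSERT INTO t_ds_task_instance VALUES (?, ?)",
       ("spark_etl", "application_1700000000000_0001"),
   )
   print(fetch_app_link(conn, "spark_etl"))
   ```

   Having the scheduler inject this value into downstream task context would 
remove the need for this extra query entirely.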
   
   ### Use case
   
   In case of a Spark job failure, opening the DS UI every time to read the 
diagnostic message in the logs is tedious when many Spark workflows are 
running. Users may want to track the Spark job separately and send 
status/diagnostic messages to an external application. If the 
app_id/tracking_url were made available in downstream tasks, users could 
retrieve Spark logs as needed instead of repeatedly visiting the log file in 
the UI. Even including the diagnostic message received from YARN in the alert 
info would be helpful.
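   Once a downstream task has the app_id, the diagnostics can be pulled from 
the YARN ResourceManager REST API (`GET /ws/v1/cluster/apps/{appid}`). A 
sketch, assuming a reachable ResourceManager; the `summarize` helper and the 
sample payload are illustrative, while the field names match the YARN apps 
response:

   ```python
   import json
   from urllib.request import urlopen

   def summarize(payload):
       """Extract the fields a downstream task would forward to an
       external application from a YARN apps API response."""
       app = payload["app"]
       return {
           "state": app.get("state"),
           "finalStatus": app.get("finalStatus"),
           "trackingUrl": app.get("trackingUrl"),
           "diagnostics": app.get("diagnostics"),
       }

   def app_status(rm_base_url, app_id):
       """Fetch and summarize one application's status from the RM."""
       url = f"{rm_base_url}/ws/v1/cluster/apps/{app_id}"
       with urlopen(url) as resp:  # hypothetical cluster endpoint
           return summarize(json.load(resp))

   # Payload shaped like the YARN /ws/v1/cluster/apps/{appid} response.
   sample = {"app": {
       "state": "FAILED",
       "finalStatus": "FAILED",
       "trackingUrl": "http://rm:8088/proxy/application_1700000000000_0001/",
       "diagnostics": "Container exited with a non-zero exit code 1",
   }}
   print(summarize(sample)["diagnostics"])
   ```

   With the app_id surfaced in the workflow, a downstream shell or Python task 
could run this kind of lookup and push the result to any alerting system.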
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
