kyle-winkelman opened a new issue, #29733:
URL: https://github.com/apache/airflow/issues/29733

   ### Description
   
   Allow an Airflow DAG to define a Databricks job with the /2.1/jobs/create 
(or /2.1/jobs/reset) endpoint and then run that same job with the 
/2.1/jobs/run-now endpoint. This would give capabilities similar to the 
DatabricksSubmitRun operator, but the /2.1/jobs/create endpoint supports 
parameters that /2.1/jobs/runs/submit doesn't (e.g. job_clusters, email 
notifications, etc.).
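   A minimal sketch of the create-or-reset-then-run flow described above. The 
helper names here are purely illustrative (they are not part of the Airflow 
Databricks provider); only the endpoint payload shapes follow the Databricks 
Jobs 2.1 API:

```python
# Hypothetical payload builders for the proposed operator's logic.
# /2.1/jobs/create accepts fields like job_clusters and email_notifications
# that /2.1/jobs/runs/submit does not.

def build_create_payload(job_name, tasks, job_clusters=None,
                         email_notifications=None):
    """Payload for POST /api/2.1/jobs/create."""
    payload = {"name": job_name, "tasks": tasks}
    if job_clusters:
        payload["job_clusters"] = job_clusters
    if email_notifications:
        payload["email_notifications"] = email_notifications
    return payload


def build_reset_payload(job_id, new_settings):
    """Payload for POST /api/2.1/jobs/reset, which replaces all
    settings of an existing job with new_settings."""
    return {"job_id": job_id, "new_settings": new_settings}


def build_run_now_payload(job_id, notebook_params=None):
    """Payload for POST /api/2.1/jobs/run-now."""
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return payload
```

   An operator built this way could create the job on first run, reset it on 
subsequent runs (keeping the Databricks UI job definition in sync with the 
DAG), and then trigger it via run-now.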
   
   ### Use case/motivation
   
   Create and run a Databricks job entirely within the Airflow DAG. Currently, 
the DatabricksSubmitRun operator uses the /2.1/jobs/runs/submit endpoint, 
which doesn't support all job features and creates runs that aren't tied to a 
job in the Databricks UI. The DatabricksRunNow operator, meanwhile, requires 
you to define the job either directly in the Databricks UI or through a 
separate CI/CD pipeline, causing the headache of having to change code in 
multiple places.
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
