gaborratky-db commented on code in PR #43004:
URL: https://github.com/apache/airflow/pull/43004#discussion_r1801022097
##########
providers/src/airflow/providers/databricks/operators/databricks.py:
##########
@@ -711,12 +717,15 @@ class DatabricksRunNowOperator(BaseOperator):
Currently the named parameters that ``DatabricksRunNowOperator`` supports
are
- ``job_id``
- ``job_name``
+ - ``job_parameters``
- ``json``
+ - ``dbt_commands``
- ``notebook_params``
- ``python_params``
- ``python_named_parameters``
- ``jar_params``
- ``spark_submit_params``
+ - ``sql_params``
Review Comment:
We shouldn't add support for `sql_params` at this point. They are
[deprecated in the
API](https://docs.databricks.com/api/workspace/jobs/runnow#sql_params), and
passing `job_parameters` to the job run will automatically push them to every
SQL task in the job.
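
   To illustrate the suggested direction, here is a minimal sketch of the two request shapes. The field names come from the Databricks Jobs `run-now` REST API; the exact operator surface in this PR may differ, so treat the payloads as illustrative only.

   ```python
   # Preferred: pass `job_parameters` at the job level. Databricks forwards
   # these to every SQL task in the job, so no task-level override is needed.
   run_now_payload = {
       "job_id": 123,  # illustrative job id
       "job_parameters": {"warehouse": "prod", "run_date": "2024-10-15"},
   }

   # Deprecated style this review advises against adding operator support for:
   deprecated_payload = {
       "job_id": 123,
       "sql_params": {"warehouse": "prod", "run_date": "2024-10-15"},
   }
   ```

   With the preferred shape, a single `job_parameters` dict replaces per-task-type fields like `sql_params` for SQL tasks.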
##########
providers/src/airflow/providers/databricks/operators/databricks.py:
##########
@@ -731,6 +740,17 @@ class DatabricksRunNowOperator(BaseOperator):
   Exactly one job with the specified name must exist.
``job_id`` and ``job_name`` are mutually exclusive.
This field will be templated.
+
+ :param job_parameters: A dict from keys to values that override or augment
the job's
Review Comment:
Should we also add similar docs for `dbt_commands`?
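
   For reference, a minimal sketch of what a `dbt_commands` value might look like, assuming it mirrors the `dbt_commands` field of the Databricks Jobs `run-now` API (a list of dbt CLI commands run in order); the docstring wording itself is up to the PR author.

   ```python
   # Hypothetical run-now payload using `dbt_commands`: an ordered list of
   # dbt CLI commands executed by the job's dbt task.
   run_now_payload = {
       "job_id": 123,  # illustrative job id
       "dbt_commands": ["dbt deps", "dbt seed", "dbt run"],
   }
   ```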
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]