rawwar opened a new issue, #41816: URL: https://github.com/apache/airflow/issues/41816
### Apache Airflow version

main (development)

### If "Other Airflow 2 version" selected, which one?

_No response_

### What happened?

According to the Databricks API documentation, `task_key` has a maximum length of 100 characters: [Link](https://docs.databricks.com/api/workspace/jobs/getrun#tasks-task_key). When the combined DAG ID and task ID strings are long enough, we create a `task_key` of more than 100 characters. This limit is not enforced during job creation, so the job is created with the full name. But when fetching the job run details through the [getrun](https://docs.databricks.com/api/workspace/jobs/getrun#tasks-task_key) endpoint, Databricks truncates the `task_key`. The truncated key no longer matches the one we generated, which causes a `KeyError` at the following line of code: [Link](https://github.com/apache/airflow/blob/c018a479546ccc5d46eaf6c9aaf68f0d98f330cd/airflow/providers/databricks/operators/databricks.py#L1067)

### What you think should happen instead?

The task key should be unique. Hence, we could include a UUID instead of using `dag_id` + `task_id`. (One possible direction is sketched below.)

### How to reproduce

Use a `dag_id` and `task_id` that are together longer than 100 characters with the `DatabricksNotebookOperator` (see the minimal reproduction sketch at the end).

### Operating System

Debian GNU/Linux 12 (bookworm)

### Versions of Apache Airflow Providers

apache-airflow-providers-databricks==6.8.0

### Deployment

Astronomer

### Deployment details

_No response_

### Anything else?

_No response_

### Are you willing to submit PR?

- [X] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
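As a rough illustration of the proposed fix, here is a minimal sketch of a length-bounded key builder. The helper name `make_task_key` and the `__` separator are hypothetical, not the provider's actual code. The issue suggests a UUID; this sketch instead appends a short deterministic hash, so the key stays stable across retries while still disambiguating truncated names:

```python
import hashlib


def make_task_key(dag_id: str, task_id: str, max_length: int = 100) -> str:
    """Build a task_key that is unique yet within Databricks' 100-char limit."""
    raw = f"{dag_id}__{task_id}"
    if len(raw) <= max_length:
        return raw
    # Keep a readable prefix and append 8 hex chars of SHA-256 over the full
    # name; unlike a random uuid4, the result is deterministic across calls.
    digest = hashlib.sha256(raw.encode()).hexdigest()[:8]
    return f"{raw[: max_length - 9]}_{digest}"
```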

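For completeness, a minimal reproduction sketch under stated assumptions: the notebook path and connection id are placeholders, and a cluster spec (`new_cluster` or `existing_cluster_id`) would also be needed to actually run the task. The point is only that `dag_id` plus `task_id` exceed 100 characters:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksNotebookOperator

with DAG(
    dag_id="a_very_long_dag_identifier_" + "x" * 60,  # long dag_id
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    DatabricksNotebookOperator(
        task_id="an_equally_long_task_identifier_" + "y" * 40,  # long task_id
        notebook_path="/Shared/example_notebook",  # hypothetical path
        source="WORKSPACE",
        databricks_conn_id="databricks_default",
    )
```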