karthi-keyan-n commented on PR #45066: URL: https://github.com/apache/airflow/pull/45066#issuecomment-2644747627
> checkout airflow repo
> install breeze: `uv pip install -e ./dev/breeze`
> `breeze release-management prepare-provider-packages databricks --version-suffix-for-pypi dev0`
> Install resulting package that will be placed in dist.

Hey @potiuk, I followed your steps to install and validate, but got stuck after that:

1. I executed the commands you shared, and they generated a `.whl` file in the `dist` directory: `dist/apache_airflow_providers_databricks-7.0.0.dev0-py3-none-any.whl`
2. I googled and found the command `pip install dist/apache_airflow_providers_databricks-7.0.0.dev0-py3-none-any.whl` to install wheel packages.
3. I executed that command in the same directory (the Airflow source checkout), and I assume it installed Airflow there.

Other things I tried:

1. I built a Docker image on my dev machine and substituted it for the `apache/airflow` image in my custom Airflow setup to bring up the environment. After that, I triggered the DAG and saw only the notebook params field in the `Rendered Template` section.
   <img width="672" alt="image" src="https://github.com/user-attachments/assets/20897eb5-8abd-4eea-a8cd-b0ec5b42982e" />
2. I did some digging through the Dockerfile and found `ARG AIRFLOW_VERSION="2.10.4"` in it.
3. I also went through the `INSTALL` doc in the Airflow source repo to try to bring up a test environment.

If you don't mind, could you point me in the right direction so I can set this up locally and validate? It would be great if you could share a doc on substituting the official Airflow image with a custom image built from source, which would make it much easier to validate quickly.
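For reference, here is roughly the direction I was planning to try for the image substitution (just a sketch based on the "extend the official image" pattern; the Dockerfile contents, the `my-airflow:dev` tag, and the base tag are my assumptions, not something from this PR):

```dockerfile
# Extend the official image instead of rebuilding Airflow from scratch;
# the base tag pins Airflow core to 2.10.4, matching the ARG AIRFLOW_VERSION I found.
FROM apache/airflow:2.10.4

# Copy the locally built provider wheel into the image and install it,
# which should replace the released databricks provider bundled with the image.
COPY dist/apache_airflow_providers_databricks-7.0.0.dev0-py3-none-any.whl /tmp/
RUN pip install --no-cache-dir /tmp/apache_airflow_providers_databricks-7.0.0.dev0-py3-none-any.whl
```

My understanding is I would then build it with something like `docker build -t my-airflow:dev .` from the Airflow checkout (so `dist/` is in the build context) and point my compose file at `my-airflow:dev` instead of `apache/airflow:2.10.4`. Is that the right approach, and would `airflow providers list` inside the container be the way to confirm the `dev0` provider actually got picked up?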
