casra-developers commented on pull request #16110:
URL: https://github.com/apache/airflow/pull/16110#issuecomment-934400461


   Today I finally had a few hours to spare and looked into it again. The hint 
about the cloudpickle and distributed versions was the solution to getting the 
Airflow version from the main branch working, thanks @Anurag-Shetty :)
   Sadly, I was not able to run the git rebase without issues, so I ended up 
cloning the main branch and re-implementing the necessary changes from our 
forked repository locally.
   As it stands, the changes required to use a Windows-based Dask-Worker are 
actually quite minimal (rough sketches of each change follow after the list):
   
   - airflow/utils/platform.py: Added the IS_WINDOWS constant
   - airflow/utils/process_utils.py: Made some imports conditional
   - airflow/utils/timeout.py: Added a second, thread-based timeout 
implementation for Windows systems
   - airflow/utils/configuration.py: Added a check to avoid calling the 
os.fchmod function on Windows
   - airflow/task/task_runner/base_task_runner.py: Made imports conditional
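
   To make the list above more concrete, here are rough sketches of the 
individual changes. First, the IS_WINDOWS constant; this is a minimal sketch 
assuming it is derived from platform.system(), so the exact definition in my 
local changes may differ:

```python
# Sketch of airflow/utils/platform.py (assumed shape, not the exact diff).
import platform

# True on Windows hosts; used elsewhere to guard Unix-only code paths.
IS_WINDOWS = platform.system() == "Windows"
```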
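
   The conditional imports in process_utils.py (and similarly in 
base_task_runner.py) guard the POSIX-only modules. I believe pty, termios and 
tty are the relevant ones here, but treat the exact list as an assumption:

```python
# Sketch of the import guard in airflow/utils/process_utils.py.
from airflow.utils.platform import IS_WINDOWS

if not IS_WINDOWS:
    # These modules only exist on POSIX systems; importing them
    # unconditionally makes the whole module fail to import on Windows.
    import pty
    import termios
    import tty
```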
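
   The thread-based timeout is needed because signal.SIGALRM, which the 
existing signal-based implementation relies on, does not exist on Windows. 
Here is a minimal sketch of the idea using a daemon Timer thread and 
_thread.interrupt_main(); the class name and details are placeholders, not 
my exact local change:

```python
# Sketch of a thread-based timeout context manager for Windows.
import _thread
import threading


class TimeoutWindows:
    """Interrupt the main thread if the block runs longer than `seconds`.

    signal.SIGALRM is unavailable on Windows, so a daemon timer thread
    calls _thread.interrupt_main(), which raises KeyboardInterrupt in
    the main thread, roughly mirroring the signal-based version.
    """

    def __init__(self, seconds=1, error_message="Timeout"):
        self.seconds = seconds
        self.error_message = error_message
        self._timer = None

    def handle_timeout(self):
        # Runs on the timer thread; only the main thread gets interrupted.
        _thread.interrupt_main()

    def __enter__(self):
        self._timer = threading.Timer(self.seconds, self.handle_timeout)
        self._timer.daemon = True
        self._timer.start()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None
```

   It is used the same way as the signal-based version, e.g. 
`with TimeoutWindows(seconds=5): ...`. One caveat of this approach is that 
interrupt_main() only interrupts the main thread, which matches the 
signal-based behaviour.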
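
   Finally, the os.fchmod guard in configuration.py: os.fchmod is documented 
as Unix-only, so it simply has to be skipped on Windows. The function below 
is a trimmed, hypothetical shape of tmp_configuration_copy; the real function 
takes more parameters:

```python
# Sketch of the guard in airflow/utils/configuration.py.
import json
import os
from tempfile import mkstemp

from airflow.utils.platform import IS_WINDOWS


def tmp_configuration_copy(cfg_dict):
    """Dump a config dict to a private temp file and return its path."""
    temp_fd, cfg_path = mkstemp()
    with os.fdopen(temp_fd, "w") as temp_file:
        if not IS_WINDOWS:
            # os.fchmod only exists on Unix, so it must be skipped
            # on Windows.
            os.fchmod(temp_fd, 0o600)
        json.dump(cfg_dict, temp_file)
    return cfg_path
```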
   
   Hacks for running the DAGs from another path are not necessary anymore. 
Logs also no longer need to be copied; they can be served via the **python 
-m http.server** command, which can be run as a service on the Dask-Worker 
(a sketch of the invocation is below).
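
   In case it helps others, the log-serving command could look roughly like 
this, assuming the default worker log-server port 8793 and a placeholder log 
path:

```
python -m http.server 8793 --directory C:\path\to\airflow\logs
```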
   
   The question now is how to integrate those few changes into the main 
branch. Should I just overwrite the entire code in the original branch that 
was reviewed in this thread? Or should I create a new pull request from the 
newly cloned repository?
   
   @potiuk I assume you are quite familiar with merging procedures on GitHub, 
so I will do whatever you think is best.

