tatiana commented on PR #59558:
URL: https://github.com/apache/airflow/pull/59558#issuecomment-3767781150

   It’s great to see renewed interest and progress around improving how Ray 
runs in Airflow — thanks for the work here, @VladaZakharova!
   
   I agree with @raphaelauv that Ray operators, sensors, and hooks would be 
better suited to a dedicated provider (either `pytorch/ray` or `ray`) rather 
than living in the Google provider. We’ve seen similar separations work well 
elsewhere, for example:
   * [Apache 
Spark](https://github.com/apache/airflow/tree/main/providers/apache/spark) & 
[Databricks](https://github.com/apache/airflow/tree/main/providers/databricks)
   * [Apache 
Beam](https://github.com/apache/airflow/tree/main/providers/apache/beam) & 
[Dataflow](https://github.com/apache/airflow/tree/main/providers/google/src/airflow/providers/google)
   
   Any new provider should also align with 
[AIP-95](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-95+Provider+lifecycle+update+proposal).
   
   Between June and October 2024, Astronomer worked with a few customers on a 
proof-of-concept to expose Ray in Airflow, as referenced by @jscheffl in 
https://github.com/astronomer/astro-provider-ray. This was actually our second 
attempt in this space; there was an [earlier company-led initiative back in 
2021](https://airflowsummit.org/sessions/2021/airflow-ray/) as well.
   
   Unfortunately, we (Astronomer, including @pankajkoti and @pankajastro) don’t 
currently have the bandwidth to continue this work. That said, if 
@VladaZakharova or others are interested in picking it up, we’d be happy to 
hand over what we’ve built so far. We’d also be supportive of either this PR or 
the Astronomer provider serving as a starting point for Ray support as a new 
provider in the Apache Airflow repository, with the understanding that we 
wouldn’t be able to actively steer the effort.
   
   We discussed a potential donation of this provider with the Anyscale team 
last Friday, but they don’t have the capacity to take it on right now. Given 
that [PyTorch has recently adopted 
Ray](https://pytorch.org/blog/pytorch-foundation-welcomes-ray-to-deliver-a-unified-open-source-ai-compute-stack/),
 one possible next step could be reaching out to the PyTorch Foundation to see 
whether they are interested in supporting this workstream as well.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
