Answering as one of the Airflow developers contributing to the Task SDK.

Q1: If you map Engine to the execution side and API to the server side, the
analogy is comparable. The goal of the Task SDK is to decouple
Dag authoring from Airflow internals and to provide a stable,
version-agnostic interface for writing Dags.
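
For a concrete picture, a minimal Dag written purely against the Task SDK
surface looks roughly like this (a sketch; `airflow.sdk` is the stable
authoring import path in Airflow 3, and the Dag/task names are just
illustrative):

from airflow.sdk import dag, task

@dag(schedule=None)
def example_pipeline():
    @task
    def extract():
        # Authoring code only touches the SDK surface, never
        # scheduler or executor internals.
        return {"rows": 42}

    @task
    def load(payload):
        print(payload["rows"])

    load(extract())

example_pipeline()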

Q2: Yes, that's the intention. Custom executors might require some
adaptation when first adopting Airflow 3,
because Airflow 3 deals in *workloads
<https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/index.html#workloads>*
rather than the CLI commands used in < 3.0.
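
Roughly, the shift for a custom executor looks like this (a simplified
sketch, not the exact base-class signatures; `spawn_process` and
`send_to_worker` are made-up helpers):

# Airflow 2.x: the executor was handed a CLI command to run, e.g.
# ["airflow", "tasks", "run", "my_dag", "my_task", "manual__2025-01-01"]
def execute_async(self, key, command, queue=None, executor_config=None):
    self.spawn_process(command)

# Airflow 3.x: the executor is handed a workload object describing the
# task instance (e.g. ExecuteTask), and a worker runs it via the Task SDK.
def queue_workload(self, workload, session=None):
    self.send_to_worker(workload)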

Q3: You can compare / draw relations, provided the comparison stays in
the context of the server-client separation and of
future-proofing consumers against internal changes.

Thanks & Regards,
Amogh Desai


On Sat, Nov 22, 2025 at 10:10 AM Kyungjun Lee <[email protected]>
wrote:

> Hi Airflow developers,
>
> I’ve been studying the Airflow *Task SDK* in detail, and I find its
> direction very interesting—especially the idea of introducing a stable,
> user-facing API layer that is decoupled from the internal executor,
> scheduler, and runtime behavior.
>
> While going through the design notes and recent changes around the Task
> SDK, it reminded me of the architectural philosophy behind *Apache Spark
> Connect*, which also emphasizes:
>
>    - separating user-facing APIs from the underlying execution engine
>    - providing a stable long-term public API surface
>    - enabling flexible execution models
>    - reducing coupling between API definitions and the actual runtime
>      environment
>
> This made me wonder whether the philosophical direction is similar or if I
> am drawing an incorrect analogy.
> I would like to ask a few questions to better understand Airflow’s
> long-term intent:
> ------------------------------
> *Q1.*
>
> Is the Task SDK intentionally aiming for a form of *API–engine decoupling*
> similar to Spark Connect?
> Or is the motivation fundamentally different?
> *Q2.*
>
> Is the long-term vision that tasks will be defined through a stable Task
> SDK interface while the underlying scheduler/executor implementations
> evolve independently without breaking user code?
> *Q3.*
>
> *https://issues.apache.org/jira/browse/SPARK-39375*  # spark-connect
>
> From the perspective of the Airflow dev community, does it make sense to
> compare Task SDK ↔ Spark Connect, or is the architectural direction of
> Airflow fundamentally different?
> ------------------------------
>
> I’m asking these questions because I want to *better understand the
> philosophy that Airflow is trying to pursue*, and confirm whether my
> interpretation of the Task SDK direction is accurate.
>
> Any insights or clarifications would be greatly appreciated.
> Thank you for your continued work on Airflow.
>
> Best regards,
> *Kyungjun Lee*
>
