kaxil commented on issue #51545: URL: https://github.com/apache/airflow/issues/51545#issuecomment-2980286062
> And I do not think we need a spike. For me it's just "common sense" (pun intended).

Determining what is shared is as important, and is part of the spike, as the title mentions :)

> task-sdk only. Task SDK should not depend on common-util. Some duplication is likely there, but we should handle it at the pre-commit level, where task-sdk code (serialization) is copied to common-util, for example.

That's the whole point of this discussion though! The client here is the Task SDK: do we want to duplicate the serialization code and module between common and the Task SDK? I don't think so, I disagree.

> I'd say airflow-*: common-util, scheduler, api-server, triggerer, task-sdk (for all kinds of workers) is probably the best approach. And "apache-airflow" as a metadata package installing all of them together.

That's where we disagree too. The server IMO should consist of the Scheduler and the API server. They don't need to be separate packages.

> There should not be combinations. All packages should have the same version and should be released together - even if there are no changes in a particular distribution. They should have == pinned dependencies to other apache-airflow things.

If all packages have pinned versions, you can't independently update the client or the server. "Dependency conflicts for administrators supporting data teams using different versions of providers, libraries, or python packages", like Ash mentioned.
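To make the `==` pinning point concrete, here is a rough sketch of what a hypothetical `apache-airflow` meta-distribution with lockstep pins could look like. The distribution names are taken from the proposal quoted above; the `3.1.0` version and the file itself are illustrative only, not an agreed layout:

```toml
# Hypothetical pyproject.toml for an "apache-airflow" meta-distribution.
# Names and version are illustrative, not a decided package layout.
[project]
name = "apache-airflow"
version = "3.1.0"
# With == pins, every sub-distribution is locked to the same release,
# so a single component (e.g. the Task SDK in a worker environment)
# cannot be upgraded without cutting a new release of everything.
dependencies = [
    "apache-airflow-common-util==3.1.0",
    "apache-airflow-scheduler==3.1.0",
    "apache-airflow-api-server==3.1.0",
    "apache-airflow-triggerer==3.1.0",
    "apache-airflow-task-sdk==3.1.0",
]
```

By contrast, a range pin on the client side (for example `apache-airflow-task-sdk>=3.1,<4`) would let worker environments move independently of the server, which is the flexibility the quoted "dependency conflicts for administrators" concern is about.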
