Lee-W opened a new issue, #41641:
URL: https://github.com/apache/airflow/issues/41641

   # Description
   
   ## Why
   As we're introducing breaking changes to the main branch, we should start recording which of those changes migration tools can cover, to help our users migrate from Airflow 2 to 3.
   
   The breaking changes can be found at 
https://github.com/apache/airflow/pulls?q=is%3Apr+label%3Aairflow3.0%3Abreaking 
and through 
[newsfragments/.*.significant.rst](https://github.com/apache/airflow/tree/main/newsfragments)
   
   ## What
    * Add migration rules to ruff and to `airflow config lint` / `airflow config update`
        * Implement the rules listed by `uv run scripts/ci/pre_commit/significant_newsfragments_checker.py --list-todo`
   
   * Code: AIR201
       * https://github.com/apache/airflow/issues/48389 
    * Code: AIR301
        * #45961
            * https://github.com/astral-sh/ruff/pull/22850
        * #45960
            * https://github.com/astral-sh/ruff/pull/22850
        * #41348
            * https://github.com/astral-sh/ruff/pull/22850
        * #46415
            * https://github.com/astral-sh/ruff/pull/22850
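
For reference, the ruff side of these checks can be opted into with a configuration like the following. This is only a sketch: the AIR rules are preview rules, so which ones are available depends on the installed ruff version.

```toml
# pyproject.toml (sketch) — enable the Airflow 3 migration rules in ruff
[tool.ruff]
preview = true

[tool.ruff.lint]
select = ["AIR301", "AIR302"]
```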
   
    ## AIR3 rules available as of ruff 0.11.13
   
   <details>
       <summary> AIR301 </summary>
   
   ```python
   # airflow.PY\d{1,2}
   from airflow import PY36 # → Use `sys.version_info` instead
   from airflow import PY37 # → Use `sys.version_info` instead
   from airflow import PY38 # → Use `sys.version_info` instead
   from airflow import PY39 # → Use `sys.version_info` instead
   from airflow import PY310 # → Use `sys.version_info` instead
   from airflow import PY311 # → Use `sys.version_info` instead
   from airflow import PY312 # → Use `sys.version_info` instead
   
   # airflow.api_connexion.security
   from airflow.api_connexion.security import requires_access # → Use 
`airflow.api_fastapi.core_api.security.requires_access_*` instead
   from airflow.api_connexion.security import requires_access_dataset # → from 
airflow.api_fastapi.core_api.security import requires_access_asset
   
   # airflow.auth.managers
   from airflow.auth.managers.models.resource_details import DatasetDetails # → 
from airflow.api_fastapi.auth.managers.models.resource_details import 
AssetDetails
   
   # airflow.configuration
   from airflow.configuration import as_dict # → from airflow.configuration 
import conf; conf.as_dict
   from airflow.configuration import get # → from airflow.configuration import 
conf; conf.get
   from airflow.configuration import getboolean # → from airflow.configuration 
import conf; conf.getboolean
   from airflow.configuration import getfloat # → from airflow.configuration 
import conf; conf.getfloat
   from airflow.configuration import getint # → from airflow.configuration 
import conf; conf.getint
    from airflow.configuration import has_option # → from airflow.configuration import conf; conf.has_option
   from airflow.configuration import remove_option # → from 
airflow.configuration import conf; conf.remove_option
   from airflow.configuration import set # → from airflow.configuration import 
conf; conf.set
   
   # airflow.contrib.*
   from airflow import contrib # → The whole `airflow.contrib` module has been 
removed.
   
   # airflow.datasets.manager
   from airflow.datasets.manager import DatasetManager # → from 
airflow.assets.manager import AssetManager
   from airflow.datasets.manager import dataset_manager # → from 
airflow.assets.manager import asset_manager
   from airflow.datasets.manager import resolve_dataset_manager # → from 
airflow.assets.manager import resolve_asset_manager
   # airflow.datasets
   from airflow.datasets import DatasetAliasEvent # → None
   
   # airflow.hooks
    from airflow.hooks.base_hook import BaseHook # → from airflow.hooks.base import BaseHook
   
   # airflow.lineage.hook
    from airflow.lineage.hook import DatasetLineageInfo # → from airflow.lineage.hook import AssetLineageInfo
   
   # airflow.listeners.spec
   from airflow.listeners.spec.dataset import on_dataset_created # → from 
airflow.listeners.spec.asset import on_asset_created
   from airflow.listeners.spec.dataset import on_dataset_changed # → from 
airflow.listeners.spec.asset import on_asset_changed
   
   # airflow.metrics.validators
   from airflow.metrics.validators import AllowListValidator # → from 
airflow.metrics.validators import PatternAllowListValidator
   from airflow.metrics.validators import BlockListValidator # → from 
airflow.metrics.validators import PatternBlockListValidator
   
   # airflow.notifications
   from airflow.notifications.basenotifier import BaseNotifier # → from 
airflow.sdk.bases.notifier import BaseNotifier
   
   # airflow.operators
    from airflow.operators import subdag # → The whole `airflow.operators.subdag` module has been removed.
   from airflow.operators.python import get_current_context # → from 
airflow.sdk import get_current_context
   
   # airflow.secrets
   from airflow.secrets.local_filesystem import load_connections # → from 
airflow.secrets.local_filesystem import load_connections_dict
   
   # airflow.security
   from airflow.security.permissions import RESOURCE_DATASET # → from 
airflow.security.permissions import RESOURCE_ASSET
   
   # airflow.sensors
   from airflow.sensors.base_sensor_operator import BaseSensorOperator # → from 
airflow.sdk.bases.sensor import BaseSensorOperator
   
   # airflow.timetables
   from airflow.timetables.simple import DatasetTriggeredTimetable # → from 
airflow.timetables.simple import AssetTriggeredTimetable
   
   # airflow.triggers
   from airflow.triggers.external_task import TaskStateTrigger # → None
   
   # airflow.utils
   # airflow.utils.dag_cycle_tester
   from airflow.utils.dag_cycle_tester import test_cycle # → None
   
   # airflow.utils.db
   from airflow.utils.db import create_session # → None
   
   # airflow.utils.decorators
    from airflow.utils.decorators import apply_defaults # → `apply_defaults` is now applied unconditionally and can be safely removed.
   # airflow.utils.dates
   from airflow.utils.dates import date_range # → None
   from airflow.utils.dates import days_ago # → Use 
`pendulum.today('UTC').add(days=-N, ...)` instead
   
   from airflow.utils.dates import parse_execution_date # → None
   from airflow.utils.dates import round_time # → None
   from airflow.utils.dates import scale_time_units # → None
   from airflow.utils.dates import infer_time_unit # → None
   
   # airflow.utils.file
   from airflow.utils.file import TemporaryDirectory # → from tempfile import 
TemporaryDirectory
    from airflow.utils.file import mkdirs # → Use `pathlib.Path(path).mkdir` instead
   
   # airflow.utils.helpers
   from airflow.utils.helpers import chain # → from airflow.sdk import chain
   from airflow.utils.helpers import cross_downstream # → from airflow.sdk 
import cross_downstream
   
   # airflow.utils.log.secrets_masker
    from airflow.utils.log import secrets_masker # → from airflow.sdk.execution_time import secrets_masker
   
   # airflow.utils.state
   from airflow.utils.state import SHUTDOWN # → None
   from airflow.utils.state import terminating_states # → None
   
   # airflow.utils.trigger_rule
   from airflow.utils.trigger_rule import TriggerRule
   
   TriggerRule.DUMMY # → None
   TriggerRule.NONE_FAILED_OR_SKIPPED # → None
   
   # airflow.www
   from airflow.www.auth import has_access # → None
   from airflow.www.auth import has_access_dataset # → None
   from airflow.www.utils import get_sensitive_variables_fields # → None
   from airflow.www.utils import should_hide_value_for_key # → None
   
   # airflow.providers.amazon
   from airflow.providers.amazon.aws.datasets.s3 import create_dataset # → from 
airflow.providers.amazon.aws.assets.s3 import create_asset
   from airflow.providers.amazon.aws.datasets.s3 import 
convert_dataset_to_openlineage # → from airflow.providers.amazon.aws.assets.s3 
import convert_asset_to_openlineage
   from airflow.providers.amazon.aws.datasets.s3 import sanitize_uri # → from 
airflow.providers.amazon.aws.assets.s3 import sanitize_uri
   
   from airflow.providers.amazon.aws.auth_manager.avp.entities import 
AvpEntities; AvpEntities.DATASET # → from 
airflow.providers.amazon.aws.auth_manager.avp.entities import AvpEntities; 
AvpEntities.ASSET
   
   # airflow.providers.common.io
   # airflow.providers.common.io.datasets.file
   from airflow.providers.common.io.datasets.file import create_dataset # → 
from airflow.providers.common.io.assets.file import create_asset
   from airflow.providers.common.io.datasets.file import 
convert_dataset_to_openlineage # → from airflow.providers.common.io.assets.file 
import convert_asset_to_openlineage
   from airflow.providers.common.io.datasets.file import sanitize_uri # → from 
airflow.providers.common.io.assets.file import sanitize_uri
   
   # airflow.providers.google
   # airflow.providers.google.datasets
    from airflow.providers.google.datasets.bigquery import create_dataset # → from airflow.providers.google.assets.bigquery import create_asset
    from airflow.providers.google.datasets.gcs import create_dataset # → from airflow.providers.google.assets.gcs import create_asset
    from airflow.providers.google.datasets.gcs import convert_dataset_to_openlineage # → from airflow.providers.google.assets.gcs import convert_asset_to_openlineage
    from airflow.providers.google.datasets.gcs import sanitize_uri # → from airflow.providers.google.assets.gcs import sanitize_uri
   
   # airflow.providers.mysql
   from airflow.providers.mysql.datasets.mysql import sanitize_uri # → from 
airflow.providers.mysql.assets.mysql import sanitize_uri
   
   # airflow.providers.postgres
   from airflow.providers.postgres.datasets.postgres import sanitize_uri # → 
from airflow.providers.postgres.assets.postgres import sanitize_uri
   
   # airflow.providers.openlineage
   # airflow.providers.openlineage.utils.utils
   from airflow.providers.openlineage.utils.utils import DatasetInfo # → from 
airflow.providers.openlineage.utils.utils import AssetInfo
   from airflow.providers.openlineage.utils.utils import 
translate_airflow_dataset # → from airflow.providers.openlineage.utils.utils 
import translate_airflow_asset
   
   # airflow.providers.trino
   from airflow.providers.trino.datasets.trino import sanitize_uri # → from 
airflow.providers.trino.assets.trino import sanitize_uri
   ```
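
For example, the first group above (the removed `airflow.PY3x` constants) has a drop-in stdlib replacement. A minimal sketch:

```python
import sys

# Airflow 2 exposed version flags such as `airflow.PY38`; Airflow 3 removes
# them in favor of checking sys.version_info directly.
PY38 = sys.version_info >= (3, 8)
PY312 = sys.version_info >= (3, 12)
```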
   
   * airflow plugin extension
   
   ```python
   from airflow.plugins_manager import AirflowPlugin
   
   class AirflowTestPlugin(AirflowPlugin):
       operators = [PluginOperator] # This extension should just be imported as 
a regular python module.
       sensors = [PluginSensorOperator] # This extension should just be 
imported as a regular python module.
       hooks = [PluginHook] # This extension should just be imported as a 
regular python module.
       executors = [PluginExecutor] # This extension should just be imported as 
a regular python module.
   ```
   
   * class attribute/method
   
   ```python
   from airflow.assets.manager import AssetManager
   
   manager = AssetManager()
   manager.register_dataset_change # → manager.register_asset_change
   manager.create_datasets # → manager.create_assets
   manager.notify_dataset_created # → manager.notify_asset_created
   manager.notify_dataset_changed # → manager.notify_asset_changed
   manager.notify_dataset_alias_created # → manager.notify_asset_alias_created
   
   from airflow.lineage.hook import HookLineageCollector
   
   hook_lineage_collector = HookLineageCollector()
   
   hook_lineage_collector.create_dataset # → hook_lineage_collector.create_asset
   hook_lineage_collector.add_input_dataset # → 
hook_lineage_collector.add_input_asset
   hook_lineage_collector.add_output_dataset # → 
hook_lineage_collector.add_output_asset
   hook_lineage_collector.collected_datasets # → 
hook_lineage_collector.collected_assets
   
   
   from airflow.providers.amazon.aws.auth_manager.aws_auth_manager import 
AwsAuthManager
   
   aws_auth_manager = AwsAuthManager()
   aws_auth_manager.is_authorized_dataset # → 
aws_auth_manager.is_authorized_asset
   
   from airflow.providers.fab.auth_manager.fab_auth_manager import 
FabAuthManager
   
   fab_auth_manager = FabAuthManager()
   fab_auth_manager.is_authorized_dataset # → 
fab_auth_manager.is_authorized_asset
   
   
   from airflow.providers_manager import ProvidersManager
   pm = ProvidersManager()
   pm.initialize_providers_dataset_uri_resources # → 
pm.initialize_providers_asset_uri_resources
   pm.dataset_factories # → pm.asset_factories
   pm.dataset_uri_handlers # → pm.asset_uri_handlers
   pm.dataset_to_openlineage_converters # → pm.asset_to_openlineage_converters
   
   from airflow.secrets.local_filesystem import LocalFilesystemBackend
   
   lfb = LocalFilesystemBackend()
   lfb.get_connections("test") # → lfb.get_connection("test")
   
   
   from airflow.datasets import Dataset
   
   dataset = Dataset("test")
    dataset.iter_datasets() # → dataset.iter_assets()
   dataset.iter_dataset_aliases() # → dataset.iter_asset_aliases()
   
   from airflow.secrets.base_secrets import BaseSecretsBackend
   
   base_secret_backend = BaseSecretsBackend()
   base_secret_backend.get_conn_uri("test") # → 
base_secret_backend.get_conn_value("test")
    base_secret_backend.get_connections("test") # → base_secret_backend.get_connection("test")
   
   from airflow.providers.standard.hooks.filesystem import FSHook
   fs_hook = FSHook()
   fs_hook.get_connections("test") # → fs_hook.get_connection("test")
   
   from airflow.lineage.hook import AssetLineageInfo
   
   asset_lineage_info = AssetLineageInfo("test", "test", "test")
   asset_lineage_info.dataset # → asset_lineage_info.asset
   ```
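
Most of the renames above are mechanical 1:1 replacements, so a migration tool can drive them from a lookup table. A sketch of that idea (not the actual ruff implementation; the mapping covers only a few of the methods listed above):

```python
# dataset → asset method renames taken from the list above (1:1 replacements)
METHOD_RENAMES = {
    "register_dataset_change": "register_asset_change",
    "create_datasets": "create_assets",
    "notify_dataset_created": "notify_asset_created",
    "notify_dataset_changed": "notify_asset_changed",
    "notify_dataset_alias_created": "notify_asset_alias_created",
    "is_authorized_dataset": "is_authorized_asset",
}

def migrate_attribute(name: str) -> str:
    """Return the Airflow 3 name for a renamed attribute, unchanged otherwise."""
    return METHOD_RENAMES.get(name, name)
```

For instance, `migrate_attribute("create_datasets")` returns `"create_assets"`, while unrelated names pass through untouched.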
   
   * arguments
   
   ```python
   from airflow.sdk import DAG
   DAG(
       dag_id="dag_id",
       fail_stop=..., # → fail_fast
       schedule_interval=..., # → schedule
       timetable=..., # → schedule
       default_view=..., # → None
       orientation=..., # → None
   )
   
   from airflow.providers.fab.auth_manager.fab_auth_manager import 
FabAuthManager
   fab_auth_manager = FabAuthManager(
        None # → The constructor now takes no parameters
   )
   
   from airflow.utils.log.file_task_handler import FileTaskHandler
   
   handler = FileTaskHandler(
       "test",
       filename_template="test" # → None
   )
   
   from airflow.operators.trigger_dagrun import TriggerDagRunOperator
   trigger_dagrun_op2 = TriggerDagRunOperator(
       task_id="trigger_dagrun_op2",
       trigger_dag_id="test",
       execution_date="2024-12-04" # → None
   )
   
   
   from airflow.operators.datetime import BranchDateTimeOperator
   
   from airflow.operators.weekday import BranchDayOfWeekOperator 
   from airflow.sensors.weekday import DayOfWeekSensor
   
   dof_task_sensor2 = DayOfWeekSensor(
       task_id="dof_task_sensor2",
       week_day=1,
       use_task_execution_day=True, # → None
   )
   
   
   bdow_op2 = BranchDayOfWeekOperator(
       task_id="bdow_op2",
       follow_task_ids_if_false=None,
       follow_task_ids_if_true=None,
       week_day=1,
       use_task_execution_day=True, # → None
   )
   
   
   branch_dt_op2 = BranchDateTimeOperator(
       task_id="branch_dt_op2",
       follow_task_ids_if_false=None,
       follow_task_ids_if_true=None,
       target_lower=0,
       target_upper=1,
        use_task_execution_date=True, # → None
   )
   ```
   
   * context
   
   ```python
   from airflow.decorators import task
   
   @task
   def task1():
       context = get_current_context()
   
       removed_keys = [
           "conf", # → None
           "execution_date", # → None
           "next_ds", # → None
           "next_ds_nodash", # → None
           "next_execution_date", # → None
           "prev_ds", # → None
           "prev_ds_nodash", # → None
           "prev_execution_date", # → None
           "prev_execution_date_success", # → None
           "tomorrow_ds", # → None
           "yesterday_ds", # → None
           "yesterday_ds_nodash", # → None
       ]
   
       for key in removed_keys:
           context[key]
           context.get(key)
   
   
   @task
   def task2(**context):
       removed_keys = [
           "conf", # → None
           "execution_date", # → None
           "next_ds", # → None
           "next_ds_nodash", # → None
           "next_execution_date", # → None
           "prev_ds", # → None
           "prev_ds_nodash", # → None
           "prev_execution_date", # → None
           "prev_execution_date_success", # → None
           "tomorrow_ds", # → None
           "yesterday_ds", # → None
           "yesterday_ds_nodash", # → None
       ]
   
       for key in removed_keys:
           context[key]
           context.get(key)
   
   
   @task
   def task3(
       conf, # → None
       execution_date, # → None
       next_ds, # → None
       next_ds_nodash, # → None
       next_execution_date, # → None
       prev_ds, # → None
       prev_ds_nodash, # → None
       prev_execution_date, # → None
       prev_execution_date_success, # → None
       tomorrow_ds, # → None
       yesterday_ds, # → None
       yesterday_ds_nodash, # → None
   ):
       pass
   ```
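
The removed context keys above can likewise be captured in a small helper that a lint pass might use to flag DAGs still reading them. A sketch only, not part of the proposed tooling:

```python
# Context keys removed in Airflow 3, taken from the list above
REMOVED_CONTEXT_KEYS = frozenset({
    "conf", "execution_date", "next_ds", "next_ds_nodash",
    "next_execution_date", "prev_ds", "prev_ds_nodash",
    "prev_execution_date", "prev_execution_date_success",
    "tomorrow_ds", "yesterday_ds", "yesterday_ds_nodash",
})

def removed_keys_used(keys_used):
    """Return the removed Airflow 2 context keys present in keys_used, sorted."""
    return sorted(REMOVED_CONTEXT_KEYS.intersection(keys_used))
```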
   
   </details>
   
   ---
   
   <details>
     <summary> AIR302 </summary>
   
   ```python
   # apache-airflow-providers-amazon
   from airflow.hooks.S3_hook import S3Hook # → from 
airflow.providers.amazon.aws.hooks.s3 import S3Hook
   from airflow.hooks.S3_hook import provide_bucket_name # → from 
airflow.providers.amazon.aws.hooks.s3 import provide_bucket_name
   
   from airflow.operators.gcs_to_s3 import GCSToS3Operator # → from 
airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
   
   from airflow.operators.google_api_to_s3_transfer import 
GoogleApiToS3Operator # → from 
airflow.providers.amazon.aws.transfers.google_api_to_s3 import 
GoogleApiToS3Operator
   from airflow.operators.google_api_to_s3_transfer import 
GoogleApiToS3Transfer # → from 
airflow.providers.amazon.aws.transfers.google_api_to_s3 import 
GoogleApiToS3Operator
              
   from airflow.operators.redshift_to_s3_operator import RedshiftToS3Operator  
# → from airflow.providers.amazon.aws.transfers.redshift_to_s3 import 
RedshiftToS3Operator
   from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer # 
→ from airflow.providers.amazon.aws.transfers.redshift_to_s3 import 
RedshiftToS3Operator
              
   from airflow.operators.s3_file_transform_operator import 
S3FileTransformOperator # → from airflow.providers.amazon.aws.operators.s3 
import S3FileTransformOperator
   
   from airflow.operators.s3_to_redshift_operator import S3ToRedshiftOperator # 
→ from airflow.providers.amazon.aws.transfers.s3_to_redshift import 
S3ToRedshiftOperator
   from airflow.operators.s3_to_redshift_operator import S3ToRedshiftTransfer # 
→ from airflow.providers.amazon.aws.transfers.s3_to_redshift import 
S3ToRedshiftOperator
               
   from airflow.sensors.s3_key_sensor import S3KeySensor # → from 
airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
   
   
   # apache-airflow-providers-celery
   from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG # 
→ from airflow.providers.celery.executors.default_celery import 
DEFAULT_CELERY_CONFIG
   
   from airflow.executors.celery_executor import app # → from 
airflow.providers.celery.executors.celery_executor_utils import app
   from airflow.executors.celery_executor import CeleryExecutor # → from 
airflow.providers.celery.executors.celery_executor import CeleryExecutor
   
   from airflow.executors.celery_kubernetes_executor import 
CeleryKubernetesExecutor # → from 
airflow.providers.celery.executors.celery_kubernetes_executor import 
CeleryKubernetesExecutor
   
   
   # apache-airflow-providers-common-sql
   from airflow.hooks.dbapi import ConnectorProtocol # → from 
airflow.providers.common.sql.hooks.sql import ConnectorProtocol
   from airflow.hooks.dbapi import DbApiHook # → from 
airflow.providers.common.sql.hooks.sql import DbApiHook
   
   from airflow.hooks.dbapi_hook import DbApiHook # → from 
airflow.providers.common.sql.hooks.sql import DbApiHook
   
   from airflow.operators.check_operator import SQLCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.sql import SQLCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.check_operator import CheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.druid_check_operator import CheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.presto_check_operator import CheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.druid_check_operator import DruidCheckOperator # → 
from airflow.providers.common.sql.operators.sql import SQLCheckOperator
   from airflow.operators.presto_check_operator import PrestoCheckOperator # → 
from airflow.providers.common.sql.operators.sql import SQLCheckOperator
   
   from airflow.operators.check_operator import IntervalCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
   from airflow.operators.check_operator import SQLIntervalCheckOperator # → 
from airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
   from airflow.operators.presto_check_operator import IntervalCheckOperator # 
→ from airflow.providers.common.sql.operators.sql import 
SQLIntervalCheckOperator
   from airflow.operators.presto_check_operator import 
PrestoIntervalCheckOperator # → from airflow.providers.common.sql.operators.sql 
import SQLIntervalCheckOperator
   from airflow.operators.sql import SQLIntervalCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLIntervalCheckOperator
               
   from airflow.operators.check_operator import SQLThresholdCheckOperator # → 
from airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
   from airflow.operators.check_operator import ThresholdCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
   from airflow.operators.sql import SQLThresholdCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLThresholdCheckOperator
              
   from airflow.operators.check_operator import SQLValueCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
   from airflow.operators.check_operator import ValueCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
   from airflow.operators.presto_check_operator import PrestoValueCheckOperator 
# → from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
   from airflow.operators.presto_check_operator import ValueCheckOperator # → 
from airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
   from airflow.operators.sql import SQLValueCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLValueCheckOperator
              
   from airflow.operators.sql import BaseSQLOperator # → from 
airflow.providers.common.sql.operators.sql import BaseSQLOperator
   from airflow.operators.sql import BranchSQLOperator # → from 
airflow.providers.common.sql.operators.sql import BranchSQLOperator
   from airflow.operators.sql import SQLTableCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLTableCheckOperator
                  
   from airflow.operators.sql import SQLColumnCheckOperator # → from 
airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
   from airflow.operators.sql import _convert_to_float_if_possible # → from 
airflow.providers.common.sql.operators.sql import _convert_to_float_if_possible
   from airflow.operators.sql import parse_boolean # → from 
airflow.providers.common.sql.operators.sql import parse_boolean
   from airflow.sensors.sql import SqlSensor # → from 
airflow.providers.common.sql.sensors.sql import SqlSensor
   from airflow.sensors.sql_sensor import SqlSensor # → from 
airflow.providers.common.sql.sensors.sql import SqlSensor
   
   from airflow.operators.jdbc_operator import JdbcOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.operators.mssql_operator import MsSqlOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.operators.mysql_operator import MySqlOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.operators.oracle_operator import OracleOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.operators.postgres_operator import PostgresOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.operators.sqlite_operator import SqliteOperator # → from 
airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   
   
   # apache-airflow-providers-daskexecutor
   from airflow.executors.dask_executor import DaskExecutor # → from 
airflow.providers.daskexecutor.executors.dask_executor import DaskExecutor
   
   
   # apache-airflow-providers-docker
   from airflow.hooks.docker_hook import DockerHook # → from 
airflow.providers.docker.hooks.docker import DockerHook
   from airflow.operators.docker_operator import DockerOperator # → from 
airflow.providers.docker.operators.docker import DockerOperator
   
   
   # apache-airflow-providers-apache-druid
   from airflow.hooks.druid_hook import DruidDbApiHook # → from 
airflow.providers.apache.druid.hooks.druid import DruidDbApiHook
   from airflow.hooks.druid_hook import DruidHook # → from 
airflow.providers.apache.druid.hooks.druid import DruidHook
   
   from airflow.operators.hive_to_druid import HiveToDruidOperator # → from 
airflow.providers.apache.druid.transfers.hive_to_druid import 
HiveToDruidOperator
   from airflow.operators.hive_to_druid import HiveToDruidTransfer # → from 
airflow.providers.apache.druid.transfers.hive_to_druid import 
HiveToDruidOperator
   
   
   # apache-airflow-providers-fab
   from airflow.api.auth.backend.basic_auth import CLIENT_AUTH # → from 
airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import 
CLIENT_AUTH
   from airflow.api.auth.backend.basic_auth import init_app # → from 
airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import init_app
   from airflow.api.auth.backend.basic_auth import auth_current_user # → from 
airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import 
auth_current_user
   from airflow.api.auth.backend.basic_auth import requires_authentication # → 
from airflow.providers.fab.auth_manager.api.auth.backend.basic_auth import 
requires_authentication
   
   from airflow.api.auth.backend.kerberos_auth import log # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import log
   from airflow.api.auth.backend.kerberos_auth import CLIENT_AUTH # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
CLIENT_AUTH
   from airflow.api.auth.backend.kerberos_auth import find_user # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
find_user
   from airflow.api.auth.backend.kerberos_auth import init_app # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
init_app
   from airflow.api.auth.backend.kerberos_auth import requires_authentication # 
→ from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
requires_authentication
   
   from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import log # → 
from airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
log
   from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import 
CLIENT_AUTH # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
CLIENT_AUTH
   from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import 
find_user # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
find_user
   from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import 
init_app # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
init_app
   from airflow.auth.managers.fab.api.auth.backend.kerberos_auth import 
requires_authentication # → from 
airflow.providers.fab.auth_manager.api.auth.backend.kerberos_auth import 
requires_authentication
   
   from airflow.auth.managers.fab.fab_auth_manager import FabAuthManager # → 
from airflow.providers.fab.auth_manager.fab_auth_manager import FabAuthManager
   
   from airflow.auth.managers.fab.security_manager.override import 
MAX_NUM_DATABASE_USER_SESSIONS # → from 
airflow.providers.fab.auth_manager.security_manager.override import 
MAX_NUM_DATABASE_USER_SESSIONS
   
   from airflow.auth.managers.fab.security_manager.override import 
FabAirflowSecurityManagerOverride # → from 
airflow.providers.fab.auth_manager.security_manager.override import 
FabAirflowSecurityManagerOverride
   from airflow.www.security import FabAirflowSecurityManagerOverride # → from 
airflow.providers.fab.auth_manager.security_manager.override import 
FabAirflowSecurityManagerOverride
   
   
   # apache-airflow-providers-apache-hdfs
    from airflow.hooks.webhdfs_hook import WebHDFSHook # → from airflow.providers.apache.hdfs.hooks.webhdfs import WebHDFSHook
   from airflow.sensors.web_hdfs_sensor import WebHdfsSensor # → from 
airflow.providers.apache.hdfs.sensors.web_hdfs import WebHdfsSensor
   
   
   # apache-airflow-providers-apache-hive
   from airflow.hooks.hive_hooks import HiveCliHook # → from 
airflow.providers.apache.hive.hooks.hive import HiveCliHook
   from airflow.hooks.hive_hooks import HiveMetastoreHook # → from 
airflow.providers.apache.hive.hooks.hive import HiveMetastoreHook
   from airflow.hooks.hive_hooks import HiveServer2Hook # → from 
airflow.providers.apache.hive.hooks.hive import HiveServer2Hook
   from airflow.hooks.hive_hooks import HIVE_QUEUE_PRIORITIES # → from 
airflow.providers.apache.hive.hooks.hive import HIVE_QUEUE_PRIORITIES
   from airflow.macros.hive import closest_ds_partition # → from 
airflow.providers.apache.hive.macros.hive import closest_ds_partition
   from airflow.macros.hive import max_partition # → from 
airflow.providers.apache.hive.macros.hive import max_partition
   
   from airflow.operators.hive_operator import HiveOperator # → from 
airflow.providers.apache.hive.operators.hive import HiveOperator
   
   from airflow.operators.hive_stats_operator import 
HiveStatsCollectionOperator # → from 
airflow.providers.apache.hive.operators.hive_stats import 
HiveStatsCollectionOperator
   
   from airflow.operators.hive_to_mysql import HiveToMySqlOperator # → from 
airflow.providers.apache.hive.transfers.hive_to_mysql import HiveToMySqlOperator
   from airflow.operators.hive_to_mysql import HiveToMySqlTransfer # → from 
airflow.providers.apache.hive.transfers.hive_to_mysql import HiveToMySqlOperator
   
   from airflow.operators.hive_to_samba_operator import HiveToSambaOperator # → 
from airflow.providers.apache.hive.transfers.hive_to_samba import 
HiveToSambaOperator
   
   from airflow.operators.mssql_to_hive import MsSqlToHiveOperator # → from 
airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
   from airflow.operators.mssql_to_hive import MsSqlToHiveTransfer # → from 
airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
   
   from airflow.operators.mysql_to_hive import MySqlToHiveOperator # → from 
airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
   from airflow.operators.mysql_to_hive import MySqlToHiveTransfer # → from 
airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
   
   from airflow.operators.s3_to_hive_operator import S3ToHiveOperator # → from 
airflow.providers.apache.hive.transfers.s3_to_hive import S3ToHiveOperator
   from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer # → from 
airflow.providers.apache.hive.transfers.s3_to_hive import S3ToHiveOperator
   
   from airflow.sensors.hive_partition_sensor import HivePartitionSensor # → 
from airflow.providers.apache.hive.sensors.hive_partition import 
HivePartitionSensor
   
   from airflow.sensors.metastore_partition_sensor import 
MetastorePartitionSensor # → from 
airflow.providers.apache.hive.sensors.metastore_partition import 
MetastorePartitionSensor
   
   from airflow.sensors.named_hive_partition_sensor import 
NamedHivePartitionSensor # → from 
airflow.providers.apache.hive.sensors.named_hive_partition import 
NamedHivePartitionSensor
   
   
   # apache-airflow-providers-http
   from airflow.hooks.http_hook import HttpHook # → from 
airflow.providers.http.hooks.http import HttpHook
   
   from airflow.operators.http_operator import SimpleHttpOperator # → from 
airflow.providers.http.operators.http import HttpOperator
   
   from airflow.sensors.http_sensor import HttpSensor # → from 
airflow.providers.http.sensors.http import HttpSensor
   
   
   # apache-airflow-providers-jdbc
   from airflow.hooks.jdbc_hook import JdbcHook # → from 
airflow.providers.jdbc.hooks.jdbc import JdbcHook
   from airflow.hooks.jdbc_hook import jaydebeapi # → from 
airflow.providers.jdbc.hooks.jdbc import jaydebeapi
   
   
   # apache-airflow-providers-cncf-kubernetes
   from airflow.executors.kubernetes_executor_types import ALL_NAMESPACES # → 
from airflow.providers.cncf.kubernetes.executors.kubernetes_executor_types 
import ALL_NAMESPACES
   from airflow.executors.kubernetes_executor_types import 
POD_EXECUTOR_DONE_KEY # → from 
airflow.providers.cncf.kubernetes.executors.kubernetes_executor_types import 
POD_EXECUTOR_DONE_KEY
   
   from airflow.kubernetes.k8s_model import K8SModel # → from 
airflow.providers.cncf.kubernetes.k8s_model import K8SModel
   from airflow.kubernetes.k8s_model import append_to_pod # → from 
airflow.providers.cncf.kubernetes.k8s_model import append_to_pod
   
   from airflow.kubernetes.kube_client import _disable_verify_ssl # → from 
airflow.providers.cncf.kubernetes.kube_client import _disable_verify_ssl
   from airflow.kubernetes.kube_client import _enable_tcp_keepalive # → from 
airflow.providers.cncf.kubernetes.kube_client import _enable_tcp_keepalive
   from airflow.kubernetes.kube_client import get_kube_client # → from 
airflow.providers.cncf.kubernetes.kube_client import get_kube_client
   
   from airflow.kubernetes.kubernetes_helper_functions import add_pod_suffix # 
→ from airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
add_unique_suffix
   
   from airflow.kubernetes.kubernetes_helper_functions import create_pod_id # → 
from airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
create_unique_id
   
   from airflow.kubernetes.kubernetes_helper_functions import 
annotations_for_logging_task_metadata # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
annotations_for_logging_task_metadata
   from airflow.kubernetes.kubernetes_helper_functions import 
annotations_to_key # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
annotations_to_key
   from airflow.kubernetes.kubernetes_helper_functions import 
get_logs_task_metadata # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
get_logs_task_metadata
   from airflow.kubernetes.kubernetes_helper_functions import rand_str # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import rand_str
   
   from airflow.kubernetes.pod import Port # → from kubernetes.client.models 
import V1ContainerPort
   from airflow.kubernetes.pod import Resources # → from 
kubernetes.client.models import V1ResourceRequirements
   
   from airflow.kubernetes.pod_generator import 
datetime_to_label_safe_datestring # → from 
airflow.providers.cncf.kubernetes.pod_generator import 
datetime_to_label_safe_datestring
   from airflow.kubernetes.pod_generator import extend_object_field # → from 
airflow.providers.cncf.kubernetes.pod_generator import extend_object_field
   from airflow.kubernetes.pod_generator import 
label_safe_datestring_to_datetime # → from 
airflow.providers.cncf.kubernetes.pod_generator import 
label_safe_datestring_to_datetime
   from airflow.kubernetes.pod_generator import make_safe_label_value # → from 
airflow.providers.cncf.kubernetes.pod_generator import make_safe_label_value
   from airflow.kubernetes.pod_generator import merge_objects # → from 
airflow.providers.cncf.kubernetes.pod_generator import merge_objects
   from airflow.kubernetes.pod_generator import PodGenerator # → from 
airflow.providers.cncf.kubernetes.pod_generator import PodGenerator
   from airflow.kubernetes.pod_generator import PodDefaults  # → from 
airflow.providers.cncf.kubernetes.utils.xcom_sidecar import PodDefaults
   from airflow.kubernetes.pod_generator import PodGeneratorDeprecated  # → 
from airflow.providers.cncf.kubernetes.pod_generator import PodGenerator
   from airflow.kubernetes.pod_generator import add_pod_suffix  # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import 
add_unique_suffix
   from airflow.kubernetes.pod_generator import rand_str # → from 
airflow.providers.cncf.kubernetes.kubernetes_helper_functions import rand_str
   
   from airflow.kubernetes.pod_generator_deprecated import 
make_safe_label_value # → from airflow.providers.cncf.kubernetes.pod_generator 
import make_safe_label_value
   from airflow.kubernetes.pod_generator_deprecated import PodGenerator # → 
from airflow.providers.cncf.kubernetes.pod_generator import PodGenerator
    
   from airflow.kubernetes.pod_generator_deprecated import PodDefaults # → from 
airflow.providers.cncf.kubernetes.utils.xcom_sidecar import PodDefaults
   from airflow.kubernetes.pod_launcher_deprecated import PodDefaults # → from 
airflow.providers.cncf.kubernetes.utils.xcom_sidecar import PodDefaults
   
   from airflow.kubernetes.pod_launcher_deprecated import get_kube_client # → 
from airflow.providers.cncf.kubernetes.kube_client import get_kube_client
   
   from airflow.kubernetes.pod_launcher import PodLauncher # → from 
airflow.providers.cncf.kubernetes.utils.pod_manager import PodManager
   from airflow.kubernetes.pod_launcher_deprecated import PodLauncher # → from 
airflow.providers.cncf.kubernetes.utils.pod_manager import PodManager
   
   from airflow.kubernetes.pod_launcher import PodStatus # → from  
airflow.providers.cncf.kubernetes.utils.pod_manager import PodPhase
   from airflow.kubernetes.pod_launcher_deprecated import PodStatus # → from  
airflow.providers.cncf.kubernetes.utils.pod_manager import PodPhase
   
   from airflow.kubernetes.pod_runtime_info_env import PodRuntimeInfoEnv # → 
from kubernetes.client.models import V1EnvVar
   
   from airflow.kubernetes.secret import K8SModel # → from 
airflow.providers.cncf.kubernetes.k8s_model import K8SModel
   from airflow.kubernetes.secret import Secret # → from 
airflow.providers.cncf.kubernetes.secret import Secret
   
   from airflow.kubernetes.volume import Volume # → from 
kubernetes.client.models import V1Volume
   from airflow.kubernetes.volume_mount import VolumeMount # → from 
kubernetes.client.models import V1VolumeMount
   
   
   # apache-airflow-providers-microsoft-mssql
   from airflow.hooks.mssql_hook import MsSqlHook # → from 
airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
   
   
   # apache-airflow-providers-mysql
   from airflow.hooks.mysql_hook import MySqlHook # → from 
airflow.providers.mysql.hooks.mysql import MySqlHook
   
   from airflow.operators.presto_to_mysql import PrestoToMySqlOperator # → from 
airflow.providers.mysql.transfers.presto_to_mysql import PrestoToMySqlOperator
   from airflow.operators.presto_to_mysql import PrestoToMySqlTransfer # → from 
airflow.providers.mysql.transfers.presto_to_mysql import PrestoToMySqlOperator
   
   
   # apache-airflow-providers-oracle
   from airflow.hooks.oracle_hook import OracleHook # → from 
airflow.providers.oracle.hooks.oracle import OracleHook
   
   
   # apache-airflow-providers-papermill
   from airflow.operators.papermill_operator import PapermillOperator # → from 
airflow.providers.papermill.operators.papermill import PapermillOperator
   
   
   # apache-airflow-providers-apache-pig
   from airflow.hooks.pig_hook import PigCliHook # → from 
airflow.providers.apache.pig.hooks.pig import PigCliHook
   from airflow.operators.pig_operator import PigOperator # → from 
airflow.providers.apache.pig.operators.pig import PigOperator
   
   
   # apache-airflow-providers-postgres
   from airflow.hooks.postgres_hook import PostgresHook # → from 
airflow.providers.postgres.hooks.postgres import PostgresHook
   from airflow.operators.postgres_operator import Mapping # → None 
   
   
   # apache-airflow-providers-presto
   from airflow.hooks.presto_hook import PrestoHook # → from 
airflow.providers.presto.hooks.presto import PrestoHook
   
   
   # apache-airflow-providers-samba
   from airflow.hooks.samba_hook import SambaHook # → from 
airflow.providers.samba.hooks.samba import SambaHook
   
   # apache-airflow-providers-slack
   from airflow.hooks.slack_hook import SlackHook # → from 
airflow.providers.slack.hooks.slack import SlackHook
   from airflow.operators.slack_operator import SlackAPIOperator # → from 
airflow.providers.slack.operators.slack import SlackAPIOperator
   from airflow.operators.slack_operator import SlackAPIPostOperator # → from 
airflow.providers.slack.operators.slack import SlackAPIPostOperator
   
   
   # apache-airflow-providers-smtp
   from airflow.operators.email_operator import EmailOperator # → from 
airflow.providers.smtp.operators.smtp import EmailOperator
   from airflow.operators.email import EmailOperator # → from 
airflow.providers.smtp.operators.smtp import EmailOperator
   
   
   # apache-airflow-providers-sqlite
   from airflow.hooks.sqlite_hook import SqliteHook # → from 
airflow.providers.sqlite.hooks.sqlite import SqliteHook
   
   # apache-airflow-providers-zendesk
   from airflow.hooks.zendesk_hook import ZendeskHook # → from 
airflow.providers.zendesk.hooks.zendesk import ZendeskHook
   
   # apache-airflow-providers-standard
   from airflow.hooks.subprocess import SubprocessResult # → from 
airflow.providers.standard.hooks.subprocess import SubprocessResult
   from airflow.hooks.subprocess import working_directory # → from 
airflow.providers.standard.hooks.subprocess import working_directory
   
   from airflow.operators.bash_operator import BashOperator # → from 
airflow.providers.standard.operators.bash import BashOperator
   
   from airflow.operators.dagrun_operator import TriggerDagRunLink # → from 
airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunLink
   from airflow.operators.dagrun_operator import TriggerDagRunOperator # → from 
airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator
   
   from airflow.operators.trigger_dagrun import TriggerDagRunLink # → from 
airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunLink
   
   from airflow.operators.datetime import target_times_as_dates # → from 
airflow.providers.standard.operators.datetime import target_times_as_dates
   
   from airflow.operators.dummy import DummyOperator # → from 
airflow.providers.standard.operators.empty import EmptyOperator
   from airflow.operators.dummy import EmptyOperator # → from 
airflow.providers.standard.operators.empty import EmptyOperator
   from airflow.operators.dummy_operator import DummyOperator # → from 
airflow.providers.standard.operators.empty import EmptyOperator
   from airflow.operators.dummy_operator import EmptyOperator # → from 
airflow.providers.standard.operators.empty import EmptyOperator
   
   from airflow.operators.latest_only_operator import LatestOnlyOperator # → 
from airflow.providers.standard.operators.latest_only import LatestOnlyOperator
   
   from airflow.operators.python_operator import BranchPythonOperator # → from 
airflow.providers.standard.operators.python import BranchPythonOperator
   from airflow.operators.python_operator import PythonOperator # → from 
airflow.providers.standard.operators.python import PythonOperator
   from airflow.operators.python_operator import PythonVirtualenvOperator # → 
from airflow.providers.standard.operators.python import PythonVirtualenvOperator
   from airflow.operators.python_operator import ShortCircuitOperator # → from 
airflow.providers.standard.operators.python import ShortCircuitOperator
   
   from airflow.sensors.external_task import ExternalTaskSensorLink # → from 
airflow.providers.standard.sensors.external_task import ExternalDagLink
   
   from airflow.sensors.external_task_sensor import ExternalTaskMarker # → from 
airflow.providers.standard.sensors.external_task import ExternalTaskMarker
   from airflow.sensors.external_task_sensor import ExternalTaskSensor # → from 
airflow.providers.standard.sensors.external_task import ExternalTaskSensor
   from airflow.sensors.external_task_sensor import ExternalTaskSensorLink # → 
from airflow.providers.standard.sensors.external_task import ExternalDagLink
   
   from airflow.sensors.time_delta import WaitSensor # → from 
airflow.providers.standard.sensors.time_delta import WaitSensor
   ```
   
   </details>
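The AIR301 mappings above are plain import rewrites (old module/name → new module/name), which is why they lend themselves to automated tooling. A minimal standard-library sketch of how such a checker can flag them — the `DEPRECATED` dict below is a two-entry excerpt of the table above, not the full rule set, and the function name is illustrative:

```python
import ast

# Small excerpt of the AIR301 mapping table (old import → suggested import).
DEPRECATED = {
    ("airflow.operators.dummy_operator", "DummyOperator"):
        ("airflow.providers.standard.operators.empty", "EmptyOperator"),
    ("airflow.operators.bash_operator", "BashOperator"):
        ("airflow.providers.standard.operators.bash", "BashOperator"),
}


def find_deprecated_imports(source: str):
    """Return (lineno, old import, suggested import) for each match."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ImportFrom) and node.module:
            for alias in node.names:
                new = DEPRECATED.get((node.module, alias.name))
                if new:
                    hits.append((
                        node.lineno,
                        f"from {node.module} import {alias.name}",
                        f"from {new[0]} import {new[1]}",
                    ))
    return hits


dag_src = "from airflow.operators.dummy_operator import DummyOperator\n"
for lineno, old, new in find_deprecated_imports(dag_src):
    print(f"line {lineno}: {old} -> use: {new}")
```

The real ruff rules additionally track aliased imports, attribute access, and renamed symbols, but the core of each AIR301 fix is this kind of table lookup.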
   
   ---
   
   
   <details>
       <summary> AIR311 </summary>
   
   * Calling `HookLineageCollector.create_asset` with positional arguments 
should raise an error
   
   </details>
   
   ---
   
   <details>
       <summary> AIR311 </summary>
   
   ```python
   # airflow.datasets.metadata
   from airflow.datasets.metadata import Metadata # → from airflow.sdk import 
Metadata
   
   # airflow.datasets
   from airflow import Dataset # → from airflow.sdk import Asset
   from airflow.datasets import Dataset # → from airflow.sdk import Asset
   from airflow.datasets import DatasetAlias # → from airflow.sdk import 
AssetAlias
   from airflow.datasets import DatasetAll # → from airflow.sdk import AssetAll
   from airflow.datasets import DatasetAny # → from airflow.sdk import AssetAny
   from airflow.datasets import expand_alias_to_datasets # → from 
airflow.models.asset import expand_alias_to_assets
   
   # airflow.decorators
   from airflow.decorators.base import DecoratedOperator # → from 
airflow.sdk.bases.decorator import DecoratedOperator
   from airflow.decorators.base import DecoratedMappedOperator # → from 
airflow.sdk.bases.decorator import DecoratedMappedOperator
   from airflow.decorators.base import DecoratedOperator # → from 
airflow.sdk.bases.decorator import DecoratedOperator
   from airflow.decorators.base import TaskDecorator # → from 
airflow.sdk.bases.decorator import TaskDecorator
   from airflow.decorators.base import get_unique_task_id # → from 
airflow.sdk.bases.decorator import get_unique_task_id
   from airflow.decorators.base import task_decorator_factory # → from 
airflow.sdk.bases.decorator import task_decorator_factory
   from airflow.decorators import dag # → from airflow.sdk import dag
   from airflow.decorators import task # → from airflow.sdk import task
   from airflow.decorators import task_group # → from airflow.sdk import 
task_group
   from airflow.decorators import setup # → from airflow.sdk import setup
   from airflow.decorators import teardown # → from airflow.sdk import teardown
   
   # airflow.io
   from airflow.io.path import ObjectStoragePath # → from airflow.sdk import 
ObjectStoragePath
   from airflow.io.store import attach # → from airflow.sdk.io import attach
   
   # airflow.models.baseoperator
   from airflow.models.baseoperator import chain # → from airflow.sdk import 
chain
   from airflow.models.baseoperator import chain_linear # → from airflow.sdk 
import chain_linear
   from airflow.models.baseoperator import cross_downstream # → from 
airflow.sdk import cross_downstream
   from airflow.models.baseoperatorlink import BaseOperatorLink # → from 
airflow.sdk import BaseOperatorLink
   
   # airflow.model
   from airflow.models import Connection # → from airflow.sdk import Connection
   from airflow.models import DAG # → from airflow.sdk import DAG
   from airflow.models import Variable # → from airflow.sdk import Variable
   from airflow.models.param import Param # → from 
airflow.sdk.definitions.param import Param
   
   # airflow.sensor.base
   from airflow.sensors.base import BaseSensorOperator # → from 
airflow.sdk.bases.sensor import BaseSensorOperator
   from airflow.sensors.base import PokeReturnValue # → from 
airflow.sdk.bases.sensor import PokeReturnValue
   from airflow.sensors.base import poke_mode_only # → from 
airflow.sdk.bases.sensor import poke_mode_only
   
   # airflow.timetables
   from airflow.timetables.datasets import DatasetOrTimeSchedule # → from 
airflow.timetables.assets import AssetOrTimeSchedule
   
   # airflow.utils
   from airflow.utils.dag_parsing_context import get_parsing_context # → from 
airflow.sdk import get_parsing_context
   ```
   
   * Arguments
   
   ```python
   from airflow.sdk import DAG
   
   DAG(
     dag_id="dag_id",
     sla_miss_callback=None # → None
   )
   
   from airflow.timetables.datasets import DatasetOrTimeSchedule
   
   DatasetOrTimeSchedule(datasets=[]) # → assets
   ```
   
   </details>
   
   ---
   
   <details>
       <summary> AIR312 </summary>
   
   ```python
   from airflow.hooks.filesystem import FSHook # → from 
airflow.providers.standard.hooks.filesystem import FSHook
   from airflow.hooks.package_index import PackageIndexHook # → from 
airflow.providers.standard.hooks.package_index import PackageIndexHook
   from airflow.hooks.subprocess import SubprocessHook # → from 
airflow.providers.standard.hooks.subprocess import SubprocessHook
   from airflow.operators.bash import BashOperator # → from 
airflow.providers.standard.operators.bash import BashOperator
   from airflow.operators.datetime import BranchDateTimeOperator # → from 
airflow.providers.standard.operators.datetime import BranchDateTimeOperator
   from airflow.operators.trigger_dagrun import TriggerDagRunOperator # → from 
airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator
   from airflow.operators.empty import EmptyOperator # → from 
airflow.providers.standard.operators.empty import EmptyOperator
   from airflow.operators.latest_only import LatestOnlyOperator # → from 
airflow.providers.standard.operators.latest_only import LatestOnlyOperator
   from airflow.operators.python import BranchPythonOperator # → from 
airflow.providers.standard.operators.python import BranchPythonOperator
   from airflow.operators.python import PythonOperator # → from 
airflow.providers.standard.operators.python import PythonOperator
   from airflow.operators.python import PythonVirtualenvOperator # → from 
airflow.providers.standard.operators.python import PythonVirtualenvOperator
   from airflow.operators.python import ShortCircuitOperator # → from 
airflow.providers.standard.operators.python import ShortCircuitOperator
   from airflow.operators.weekday import BranchDayOfWeekOperator # → from 
airflow.providers.standard.operators.weekday import BranchDayOfWeekOperator
   from airflow.sensors.bash import BashSensor # → from 
airflow.providers.standard.sensors.bash import BashSensor
   from airflow.sensors.date_time import DateTimeSensor # → from 
airflow.providers.standard.sensors.date_time import DateTimeSensor
   from airflow.sensors.date_time import DateTimeSensorAsync # → from 
airflow.providers.standard.sensors.date_time import DateTimeSensorAsync
   from airflow.sensors.external_task import ExternalTaskMarker # → from 
airflow.providers.standard.sensors.external_task import ExternalTaskMarker
   from airflow.sensors.external_task import ExternalTaskSensor # → from 
airflow.providers.standard.sensors.external_task import ExternalTaskSensor
   from airflow.sensors.filesystem import FileSensor # → from 
airflow.providers.standard.sensors.filesystem import FileSensor
   from airflow.sensors.python import PythonSensor # → from 
airflow.providers.standard.sensors.python import PythonSensor
   from airflow.sensors.time_sensor import TimeSensor # → from 
airflow.providers.standard.sensors.time import TimeSensor
   from airflow.sensors.time_sensor import TimeSensorAsync # → from 
airflow.providers.standard.sensors.time import TimeSensorAsync
   from airflow.sensors.time_delta import TimeDeltaSensor # → from 
airflow.providers.standard.sensors.time_delta import TimeDeltaSensor
   from airflow.sensors.time_delta import TimeDeltaSensorAsync # → from 
airflow.providers.standard.sensors.time_delta import TimeDeltaSensorAsync
   from airflow.sensors.weekday import DayOfWeekSensor # → from 
airflow.providers.standard.sensors.weekday import DayOfWeekSensor
   from airflow.triggers.external_task import DagStateTrigger # → from 
airflow.providers.standard.triggers.external_task import DagStateTrigger
   from airflow.triggers.external_task import WorkflowTrigger # → from 
airflow.providers.standard.triggers.external_task import WorkflowTrigger
   from airflow.triggers.file import FileTrigger # → from 
airflow.providers.standard.triggers.file import FileTrigger
   from airflow.triggers.temporal import DateTimeTrigger # → from 
airflow.providers.standard.triggers.temporal import DateTimeTrigger
   from airflow.triggers.temporal import TimeDeltaTrigger # → from 
airflow.providers.standard.triggers.temporal import TimeDeltaTrigger
   ```
   
   </details>
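All of the AIR3 rules above can already be run against a DAG folder with ruff itself. A hedged CLI sketch — this assumes ruff ≥ 0.11, where the AIR rules are preview rules and so need `--preview` to be selectable; the `dags/` path is a placeholder:

```shell
# Report Airflow 3 migration findings (removed, moved-to-SDK, moved-to-provider).
ruff check --preview --select AIR301,AIR311,AIR312 dags/

# Apply the suggested import rewrites where ruff offers an automatic fix.
ruff check --preview --select AIR301,AIR311,AIR312 --fix dags/
```
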
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

