Taragolis opened a new issue, #38270: URL: https://github.com/apache/airflow/issues/38270
### Body

Follow-up task for https://github.com/apache/airflow/pull/38219. We enable this rule to avoid accidentally missing asserts inside `pytest.raises()` blocks.

### Easy way to find all linting problems in a module

1. Remove the line for the particular module from the `[tool.ruff.lint.per-file-ignores]` section of [`pyproject.toml`](https://github.com/apache/airflow/blob/main/pyproject.toml#L1490)
2. Run ruff via the [pre-commit](https://github.com/apache/airflow/blob/main/contributing-docs/08_static_code_checks.rst#pre-commit-hooks) hook
3. [Run the tests locally](https://github.com/apache/airflow/blob/main/contributing-docs/testing/unit_tests.rst#running-unit-tests)

```console
❯ pre-commit run ruff --all-files
Run 'ruff' for extremely fast Python linting.............................Failed
- hook id: ruff
- exit code: 1

tests/providers/amazon/aws/hooks/test_base_aws.py:1126:5: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/hooks/test_datasync.py:410:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/hooks/test_eks.py:792:13: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/hooks/test_redshift_data.py:80:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/hooks/test_s3.py:495:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/operators/test_emr_serverless.py:435:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/operators/test_redshift_data.py:300:9: PT012 `pytest.raises()` block should contain a single simple statement
Found 7 errors.
tests/providers/amazon/aws/sensors/test_glacier.py:91:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/sensors/test_glue.py:129:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/sensors/test_lambda_function.py:100:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/system/utils/test_helpers.py:88:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/system/utils/test_helpers.py:125:13: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/transfers/test_redshift_to_s3.py:377:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/triggers/test_ecs.py:55:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/triggers/test_ecs.py:74:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/waiters/test_neptune.py:56:9: PT012 `pytest.raises()` block should contain a single simple statement
tests/providers/amazon/aws/waiters/test_neptune.py:77:9: PT012 `pytest.raises()` block should contain a single simple statement
Found 10 errors.
```

> [!TIP]
> Feel free to ask for suggestions/help in the Slack channels `#contributors` or `#new-contributors`

> [!NOTE]
> There is no problem with marking a specific case that cannot be easily resolved by adding `# noqa: PT012 reason why it should be skipped from the check`

# Regular Unit Tests

## Provider [amazon](https://github.com/apache/airflow/tree/main/airflow/providers/amazon)

- [ ] `tests/providers/amazon/aws/hooks/test_base_aws.py`
- [ ] `tests/providers/amazon/aws/hooks/test_datasync.py`
- [ ] `tests/providers/amazon/aws/hooks/test_eks.py`
- [ ] `tests/providers/amazon/aws/hooks/test_redshift_data.py`
- [ ] `tests/providers/amazon/aws/hooks/test_s3.py`
- [ ] `tests/providers/amazon/aws/operators/test_emr_serverless.py`
- [ ] `tests/providers/amazon/aws/operators/test_redshift_data.py`
- [ ] `tests/providers/amazon/aws/sensors/test_glacier.py`
- [ ] `tests/providers/amazon/aws/sensors/test_glue.py`
- [ ] `tests/providers/amazon/aws/sensors/test_lambda_function.py`
- [ ] `tests/providers/amazon/aws/system/utils/test_helpers.py`
- [ ] `tests/providers/amazon/aws/transfers/test_redshift_to_s3.py`
- [ ] `tests/providers/amazon/aws/triggers/test_ecs.py`
- [ ] `tests/providers/amazon/aws/waiters/test_neptune.py`

## Provider [apache.beam](https://github.com/apache/airflow/tree/main/airflow/providers/apache/beam)

- [ ] `tests/providers/apache/beam/hooks/test_beam.py`

## Provider [apache.hive](https://github.com/apache/airflow/tree/main/airflow/providers/apache/hive)

- [ ] `tests/providers/apache/hive/hooks/test_hive.py`
- [ ] `tests/providers/apache/hive/sensors/test_named_hive_partition.py`

## Provider [apache.spark](https://github.com/apache/airflow/tree/main/airflow/providers/apache/spark)

- [ ] `tests/providers/apache/spark/hooks/test_spark_sql.py`

## Provider [celery](https://github.com/apache/airflow/tree/main/airflow/providers/celery)

- [ ] `tests/providers/celery/sensors/test_celery_queue.py`

## Provider [cncf.kubernetes](https://github.com/apache/airflow/tree/main/airflow/providers/cncf/kubernetes)

- [ ] `tests/providers/cncf/kubernetes/executors/test_kubernetes_executor.py`
- [ ] `tests/providers/cncf/kubernetes/hooks/test_kubernetes.py`
- [ ] `tests/providers/cncf/kubernetes/operators/test_pod.py`
- [ ] `tests/providers/cncf/kubernetes/utils/test_k8s_resource_iterator.py`
- [ ] `tests/providers/cncf/kubernetes/utils/test_pod_manager.py`

## Provider [common](https://github.com/apache/airflow/tree/main/airflow/providers/common)

- [ ] `tests/providers/common/sql/operators/test_sql.py`

## Provider [databricks](https://github.com/apache/airflow/tree/main/airflow/providers/databricks)

- [ ] `tests/providers/databricks/hooks/test_databricks.py`
- [ ] `tests/providers/databricks/operators/test_databricks.py`
- [ ] `tests/providers/databricks/operators/test_databricks_repos.py`
- [ ] `tests/providers/databricks/sensors/test_databricks_partition.py`

## Provider [datadog](https://github.com/apache/airflow/tree/main/airflow/providers/datadog)

- [ ] `tests/providers/datadog/sensors/test_datadog.py`

## Provider [dbt](https://github.com/apache/airflow/tree/main/airflow/providers/dbt)

- [ ] `tests/providers/dbt/cloud/operators/test_dbt.py`

## Provider [fab](https://github.com/apache/airflow/tree/main/airflow/providers/fab)

- [ ] `tests/providers/fab/auth_manager/cli_commands/test_user_command.py`

## Provider [ftp](https://github.com/apache/airflow/tree/main/airflow/providers/ftp)

- [ ] `tests/providers/ftp/operators/test_ftp.py`

## Provider [google](https://github.com/apache/airflow/tree/main/airflow/providers/google)

- [ ] `tests/providers/google/cloud/hooks/test_bigquery.py`
- [ ] `tests/providers/google/cloud/hooks/test_cloud_storage_transfer_service.py`
- [ ] `tests/providers/google/cloud/hooks/test_dataflow.py`
- [ ] `tests/providers/google/cloud/hooks/test_dataprep.py`
- [ ] `tests/providers/google/cloud/hooks/test_kubernetes_engine.py`
- [ ] `tests/providers/google/cloud/hooks/test_pubsub.py`
- [ ] `tests/providers/google/cloud/operators/test_bigtable.py`
- [ ] `tests/providers/google/cloud/operators/test_cloud_sql.py`
- [ ] `tests/providers/google/cloud/operators/test_cloud_storage_transfer_service.py`
- [ ] `tests/providers/google/cloud/operators/test_compute.py`
- [ ] `tests/providers/google/cloud/operators/test_datafusion.py`
- [ ] `tests/providers/google/cloud/operators/test_dataproc.py`
- [ ] `tests/providers/google/cloud/operators/test_functions.py`
- [ ] `tests/providers/google/cloud/operators/test_kubernetes_engine.py`
- [ ] `tests/providers/google/cloud/operators/test_mlengine.py`
- [ ] `tests/providers/google/cloud/operators/test_spanner.py`
- [ ] `tests/providers/google/cloud/sensors/test_datafusion.py`
- [ ] `tests/providers/google/cloud/sensors/test_dataproc.py`
- [ ] `tests/providers/google/cloud/sensors/test_gcs.py`
- [ ] `tests/providers/google/cloud/sensors/test_pubsub.py`
- [ ] `tests/providers/google/cloud/transfers/test_gcs_to_bigquery.py`
- [ ] `tests/providers/google/cloud/utils/test_credentials_provider.py`
- [ ] `tests/providers/google/common/hooks/test_base_google.py`

## Provider [jenkins](https://github.com/apache/airflow/tree/main/airflow/providers/jenkins)

- [ ] `tests/providers/jenkins/sensors/test_jenkins.py`

## Provider [microsoft.azure](https://github.com/apache/airflow/tree/main/airflow/providers/microsoft/azure)

- [ ] `tests/providers/microsoft/azure/hooks/test_data_factory.py`
- [ ] `tests/providers/microsoft/azure/hooks/test_wasb.py`

## Provider [microsoft.psrp](https://github.com/apache/airflow/tree/main/airflow/providers/microsoft/psrp)

- [ ] `tests/providers/microsoft/psrp/hooks/test_psrp.py`

## Provider [openai](https://github.com/apache/airflow/tree/main/airflow/providers/openai)

- [ ] `tests/providers/openai/operators/test_openai.py`

## Provider [oracle](https://github.com/apache/airflow/tree/main/airflow/providers/oracle)

- [ ] `tests/providers/oracle/hooks/test_oracle.py`

## Provider [papermill](https://github.com/apache/airflow/tree/main/airflow/providers/papermill)

- [ ] `tests/providers/papermill/operators/test_papermill.py`

## Provider [sftp](https://github.com/apache/airflow/tree/main/airflow/providers/sftp)

- [ ] `tests/providers/sftp/hooks/test_sftp.py`
- [ ] `tests/providers/sftp/operators/test_sftp.py`
- [ ] `tests/providers/sftp/sensors/test_sftp.py`
- [ ] `tests/providers/sftp/triggers/test_sftp.py`

## Provider [ssh](https://github.com/apache/airflow/tree/main/airflow/providers/ssh)

- [ ] `tests/providers/ssh/hooks/test_ssh.py`
- [ ] `tests/providers/ssh/operators/test_ssh.py`

## Provider [telegram](https://github.com/apache/airflow/tree/main/airflow/providers/telegram)

- [ ] `tests/providers/telegram/hooks/test_telegram.py`
- [ ] `tests/providers/telegram/operators/test_telegram.py`

# K8S Tests

For more detail, have a look at [Kubernetes tests](https://github.com/apache/airflow/blob/main/contributing-docs/testing/k8s_tests.rst) in the contributor documentation.

https://github.com/apache/airflow/blob/5a612dac4a6ebdf44d73d5d23fed5419e2519eb1/pyproject.toml#L1406

- [ ] `kubernetes_tests/test_kubernetes_pod_operator.py`

### Committer

- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project.
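For contributors picking up one of the files above, here is a minimal before/after sketch of the kind of change PT012 asks for. The function and test names are hypothetical, not taken from the Airflow test suite; the point is that only the single statement expected to raise should remain inside the `pytest.raises()` block:

```python
import pytest


def divide(a: float, b: float) -> float:
    """Toy function used only for illustration."""
    return a / b


# Flagged by PT012: the `pytest.raises()` block wraps more than one statement,
# so an unexpected exception from the setup line would also satisfy the check
# and silently hide a broken test.
def test_divide_by_zero_flagged():
    with pytest.raises(ZeroDivisionError):
        numerator = 1.0  # setup inside the block -- this is the problem
        divide(numerator, 0.0)


# Compliant: setup is moved out, and the block contains a single simple
# statement -- the one call that is expected to raise.
def test_divide_by_zero_compliant():
    numerator = 1.0
    with pytest.raises(ZeroDivisionError):
        divide(numerator, 0.0)
```

If a block genuinely cannot be reduced to a single statement, the `# noqa: PT012` escape hatch mentioned in the note above applies.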
