potiuk commented on PR #41437:
URL: https://github.com/apache/airflow/pull/41437#issuecomment-2293710232

   Likely a side-effect of other tests that was masked or avoided 
before the change - i.e. some setup/teardown used to remove the side-effect.
   
   You can reproduce the set of tests run in CI with `breeze testing db-tests 
--test-type CLI` locally (for example, for the CLI tests) - that should run the 
tests in the same sequence as in each of the parallel CI runs, and they should 
then fail reproducibly as well. It could also be caused by a new version of a 
dependency (look at the `generate constraints` output of your build), but that 
is rather unlikely - the `canary` build at 
https://github.com/apache/airflow/actions/runs/10420774982/ just 
got green and updated constraints without any test failures (however, you can 
always rebase and see if it fails in the same way).
   
   Generally, this kind of side-effect is best investigated by a bit of guessing 
and bisecting - running a smaller and smaller subset of tests until 
you find the one that is the culprit. At least that's what I did in the past.
   
   Start by looking at the pytest command that was run for the 
original test type - unfold the `red` failing test type in the CI logs and you will see:
   
   ```
     Starting the tests with those pytest arguments: tests/cli --verbosity=0 
--strict-markers --durations=100 --maxfail=50 --color=yes 
--junitxml=/files/test_result-cli-postgres.xml --timeouts-order moi 
--setup-timeout=60 --execution-timeout=60 --teardown-timeout=60 
--disable-warnings -rfEX --run-db-tests-only --ignore=tests/system 
--ignore=tests/integration 
--warning-output-path=/files/warnings-cli-postgres.txt --ignore=helm_tests 
--with-db-init --no-cov
     
     ============================= test session starts 
==============================
     platform linux -- Python 3.8.19, pytest-8.3.2, pluggy-1.5.0
     rootdir: /opt/airflow
     configfile: pyproject.toml
     plugins: icdiff-0.9, timeouts-1.2.1, instafail-0.5.0, 
custom-exit-code-0.3.0, rerunfailures-14.0, asyncio-0.23.8, 
time-machine-2.15.0, anyio-4.4.0, requests-mock-1.12.1, cov-5.0.0, mock-3.14.0, 
xdist-3.6.1
     asyncio: mode=strict
     setup timeout: 60.0s, execution timeout: 60.0s, teardown timeout: 60.0s
     collected 406 items
     
     tests/cli/commands/test_celery_command.py ..........                     [ 
 2%]
     tests/cli/commands/test_cheat_sheet_command.py s                         [ 
 2%]
     tests/cli/commands/test_config_command.py ssssssssssssssssss             [ 
 7%]
     tests/cli/commands/test_connection_command.py .......................... [ 
13%]
     .....................                                                    [ 
18%]
     tests/cli/commands/test_dag_command.py ................................. [ 
26%]
     ....................                                                     [ 
31%]
     tests/cli/commands/test_dag_processor_command.py .                       [ 
32%]
     tests/cli/commands/test_db_command.py .................................. [ 
40%]
     .....................................                                    [ 
49%]
     tests/cli/commands/test_info_command.py sssssssss..s                     [ 
52%]
     tests/cli/commands/test_internal_api_command.py ssss...                  [ 
54%]
     tests/cli/commands/test_jobs_command.py ......                           [ 
55%]
     tests/cli/commands/test_kerberos_command.py ....                         [ 
56%]
     tests/cli/commands/test_kubernetes_command.py ..........                 [ 
59%]
     tests/cli/commands/test_legacy_commands.py sss                           [ 
59%]
     tests/cli/commands/test_plugins_command.py ...                           [ 
60%]
     tests/cli/commands/test_pool_command.py ...........                      [ 
63%]
     tests/cli/commands/test_rotate_fernet_key_command.py ..                  [ 
63%]
     tests/cli/commands/test_scheduler_command.py ...................         [ 
68%]
     tests/cli/commands/test_standalone_command.py ssssssssssssss             [ 
71%]
     tests/cli/commands/test_task_command.py ................................ [ 
79%]
     .F............                                                           [ 
83%]
     tests/cli/commands/test_triggerer_command.py ..                          [ 
83%]
     tests/cli/commands/test_variable_command.py ...........                  [ 
86%]
     tests/cli/commands/test_version_command.py s                             [ 
86%]
     tests/cli/commands/test_webserver_command.py sssssssssss....             [ 
90%]
     tests/cli/test_cli_parser.py ..................................s....     
[100%]
   ```
   
   If your test succeeds when run separately but fails when run as part of 
`tests/cli`, then a side-effect is almost certainly the root cause. You can 
try to guess which test is producing the side-effect and run only that test 
together with the failing one to confirm your guess, or you can bisect it:
   
   In this case you might convert the single command:
   
   * `pytest --run-db-tests-only tests/cli` (which should fail locally for you 
as well)
   
   into (looking at the output): 
   
   * `pytest --run-db-tests-only  tests/cli/commands/test_celery_command.py 
tests/cli/commands/test_cheat_sheet_command.py ... 
tests/cli/commands/test_task_command.py` 
   
   Then remove half of the modules from the list and run it again - that tells 
you whether the side-effect comes from the removed half or the remaining half. 
Continue down that path, halving each time, until you are down to the single 
test that causes the side-effect. Fixing it is then usually trivial: add the 
missing setup/teardown, or change the test so that it patches and restores any 
state it touches.
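   The halving loop can be sketched in Python. This is a hypothetical helper, 
not Airflow tooling: `run_batch` is a stub standing in for running `pytest 
--run-db-tests-only <batch>` plus the failing test, and here it simply pretends 
that `test_db_command.py` is the culprit so the sketch is self-contained:

```python
# Self-contained sketch of the bisecting approach described above.
# run_batch is a STUB: in reality you would shell out to
# "pytest --run-db-tests-only <batch> <failing test>" and check the exit code.
def run_batch(modules):
    # Returns True when the batch passes, False when it reproduces the failure.
    # Stubbed so that test_db_command.py is the (pretend) source of the side-effect.
    return "tests/cli/commands/test_db_command.py" not in modules

modules = [
    "tests/cli/commands/test_celery_command.py",
    "tests/cli/commands/test_config_command.py",
    "tests/cli/commands/test_db_command.py",
    "tests/cli/commands/test_pool_command.py",
]

# Keep halving, always retaining the half that still reproduces the failure.
while len(modules) > 1:
    mid = len(modules) // 2
    first, second = modules[:mid], modules[mid:]
    modules = first if not run_batch(first) else second

print("culprit:", modules[0])
```

   With a real `run_batch` the loop needs only log2(N) pytest runs to narrow 
N modules down to one.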
   
   It's slow and tedious, yes, but this is the approach I have used successfully 
in the past to trace the root causes of similar issues, and I don't know of a 
faster way to do it.
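
   As an illustration of the "patch and restore" fix, here is a minimal, 
self-contained sketch. The `settings` class below is a stand-in for any shared 
module-level state (think `airflow.settings`), not real Airflow code; pytest's 
built-in `monkeypatch` fixture does the same save-and-restore for you:

```python
import contextlib

class settings:
    """Stand-in for shared module-level state (e.g. airflow.settings)."""
    executor = "SequentialExecutor"

def leaky_test():
    # BAD: mutates shared state and never restores it, so every
    # test that runs later in the same process sees the change.
    settings.executor = "CeleryExecutor"

@contextlib.contextmanager
def restored(obj, attr, value):
    # GOOD: save the old value and put it back on teardown -
    # the same idea as pytest's monkeypatch fixture.
    old = getattr(obj, attr)
    setattr(obj, attr, value)
    try:
        yield
    finally:
        setattr(obj, attr, old)

def clean_test():
    with restored(settings, "executor", "CeleryExecutor"):
        assert settings.executor == "CeleryExecutor"

leaky_test()
print(settings.executor)  # the side-effect leaked: CeleryExecutor

settings.executor = "SequentialExecutor"  # reset for the demo
clean_test()
print(settings.executor)  # SequentialExecutor - nothing leaked
```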


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
