mik-laj commented on pull request #8556:
URL: https://github.com/apache/airflow/pull/8556#issuecomment-620609408
The example DAG works end-to-end (full system test log below).
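For reference, the log exercises a three-task pipeline: `create_airflow_test_dataset` >> `gcs_to_bigquery_example` >> `delete_airflow_test_dataset`. The snippet below is a minimal sketch of that shape, reconstructed from the DAG and task ids in the log rather than copied from `example_gcs_to_bigquery.py`; the bucket, object, dataset, and table names are assumptions, and the import paths follow the provider layout of this era (later releases move `GCSToBigQueryOperator` to `airflow.providers.google.cloud.transfers.gcs_to_bigquery`).

```python
# A minimal sketch of the DAG shape shown in the log, NOT the exact contents of
# example_gcs_to_bigquery.py: create a BigQuery dataset, load a CSV from GCS
# into a table, then delete the dataset. Bucket/object/dataset/table names are
# illustrative assumptions.
from airflow import models
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCreateEmptyDatasetOperator,
    BigQueryDeleteDatasetOperator,
)
from airflow.providers.google.cloud.operators.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.utils.dates import days_ago

DATASET_NAME = "airflow_test"  # assumed dataset name

with models.DAG(
    dag_id="example_gcs_to_bigquery_operator",  # DAG id from the log
    start_date=days_ago(1),
    schedule_interval=None,
    tags=["example"],
) as dag:
    create_airflow_test_dataset = BigQueryCreateEmptyDatasetOperator(
        task_id="create_airflow_test_dataset",
        dataset_id=DATASET_NAME,
    )

    gcs_to_bigquery_example = GCSToBigQueryOperator(
        task_id="gcs_to_bigquery_example",
        bucket="cloud-samples-data",  # assumed public sample bucket
        source_objects=["bigquery/us-states/us-states.csv"],  # assumed object
        destination_project_dataset_table=f"{DATASET_NAME}.gcs_to_bq_table",
        write_disposition="WRITE_TRUNCATE",
    )

    delete_airflow_test_dataset = BigQueryDeleteDatasetOperator(
        task_id="delete_airflow_test_dataset",
        dataset_id=DATASET_NAME,
        delete_contents=True,
    )

    # Linear dependency chain, matching the trigger-rule waits in the log.
    create_airflow_test_dataset >> gcs_to_bigquery_example >> delete_airflow_test_dataset
```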
<details>
<summary>Full test log</summary>

```
root@d8cf57dc3068:/opt/airflow# pytest tests/providers/google/cloud/operators/test_gcs_to_bigquery.py --system google -s
============================= test session starts ==============================
platform linux -- Python 3.6.10, pytest-5.4.1, py-1.8.1, pluggy-0.13.1 -- /usr/local/bin/python
cachedir: .pytest_cache
rootdir: /opt/airflow, inifile: pytest.ini
plugins: flaky-3.6.1, rerunfailures-9.0, forked-1.1.3, instafail-0.4.1.post0, requests-mock-1.7.0, xdist-1.31.0, timeout-1.3.4, celery-4.4.2, cov-2.8.1
collected 3 items
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryExample::test_run_example_dag_gcs_to_bigquery_operator
========================= AIRFLOW ==========================
Home of the user: /root
Airflow home /root/airflow
Skipping initializing of the DB as it was initialized already.
You can re-initialize the database by adding --with-db-init flag when
running tests.
Removing all log files except previous_runs
[2020-04-28 13:29:00,246] {logging_command_executor.py:33} INFO - Executing:
'gcloud auth activate-service-account
--key-file=/files/airflow-breeze-config/keys/gcp_bigquery.json'
[2020-04-28 13:29:01,254] {logging_command_executor.py:40} INFO - Stdout:
[2020-04-28 13:29:01,256] {logging_command_executor.py:41} INFO - Stderr:
Activated service account credentials for:
[[email protected]]
[2020-04-28 13:29:01,257] {system_tests_class.py:137} INFO - Looking for
DAG: example_gcs_to_bigquery_operator in
/opt/airflow/airflow/providers/google/cloud/example_dags
[2020-04-28 13:29:01,257] {dagbag.py:368} INFO - Filling up the DagBag from
/opt/airflow/airflow/providers/google/cloud/example_dags
[2020-04-28 13:29:03,882] {system_tests_class.py:151} INFO - Attempting to
run DAG: example_gcs_to_bigquery_operator
[2020-04-28 13:29:04,565] {taskinstance.py:718} INFO - Dependencies all met
for <TaskInstance: example_gcs_to_bigquery_operator.create_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>
[2020-04-28 13:29:04,582] {base_executor.py:75} INFO - Adding to queue:
['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'create_airflow_test_dataset', '2020-04-26T00:00:00+00:00', '--local',
'--pool', 'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpye7zceo1']
[2020-04-28 13:29:04,602] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:04,628] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:05,035] {local_executor.py:66} INFO - QueuedLocalWorker
running ['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'create_airflow_test_dataset', '2020-04-26T00:00:00+00:00', '--local',
'--pool', 'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpye7zceo1']
[2020-04-28 13:29:05,056] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:05,088] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:05,118] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:06,042] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:06,060] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:06,093] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:07,047] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:07,068] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:07,100] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:07,717] {dagbag.py:368} INFO - Filling up the DagBag from
/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py
[2020-04-28 13:29:08,071] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:08,142] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:08,208] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
Running <TaskInstance:
example_gcs_to_bigquery_operator.create_airflow_test_dataset
2020-04-26T00:00:00+00:00 [None]> on host d8cf57dc3068
[2020-04-28 13:29:09,063] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:09,085] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:09,112] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:10,069] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:10,094] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'create_airflow_test_dataset'}
[2020-04-28 13:29:10,121] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:11,078] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 2 | succeeded: 1 | running: 0 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 2
[2020-04-28 13:29:11,103] {taskinstance.py:718} INFO - Dependencies all met
for <TaskInstance: example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26 00:00:00+00:00 [scheduled]>
[2020-04-28 13:29:11,114] {base_executor.py:75} INFO - Adding to queue:
['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'gcs_to_bigquery_example', '2020-04-26T00:00:00+00:00', '--local', '--pool',
'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpuzkczu3l']
[2020-04-28 13:29:11,137] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:12,035] {local_executor.py:66} INFO - QueuedLocalWorker
running ['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'gcs_to_bigquery_example', '2020-04-26T00:00:00+00:00', '--local', '--pool',
'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpuzkczu3l']
[2020-04-28 13:29:12,037] {backfill_job.py:262} WARNING -
('example_gcs_to_bigquery_operator', 'create_airflow_test_dataset',
datetime.datetime(2020, 4, 26, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00,
STD]>), 3) state success not in running=dict_values([<TaskInstance:
example_gcs_to_bigquery_operator.gcs_to_bigquery_example 2020-04-26
00:00:00+00:00 [queued]>])
[2020-04-28 13:29:12,053] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:12,074] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:13,047] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:13,070] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:14,064] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:14,114] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:14,751] {dagbag.py:368} INFO - Filling up the DagBag from
/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py
[2020-04-28 13:29:15,064] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:15,108] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
Running <TaskInstance:
example_gcs_to_bigquery_operator.gcs_to_bigquery_example
2020-04-26T00:00:00+00:00 [None]> on host d8cf57dc3068
[2020-04-28 13:29:16,062] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:16,077] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:17,064] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:17,081] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:18,068] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:18,083] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:19,075] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:19,093] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:20,077] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:20,092] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:21,090] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:21,107] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:22,096] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 1 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:22,112] {taskinstance.py:712} INFO - Dependencies not met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>, dependency 'Trigger Rule' FAILED:
Task's trigger rule 'all_success' requires all upstream tasks to have
succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1,
'successes': 0, 'skipped': 0, 'failed': 0, 'upstream_failed': 0, 'done': 0},
upstream_task_ids={'gcs_to_bigquery_example'}
[2020-04-28 13:29:23,107] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 1 | succeeded: 2 | running: 0 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 1
[2020-04-28 13:29:23,133] {taskinstance.py:718} INFO - Dependencies all met
for <TaskInstance: example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26 00:00:00+00:00 [scheduled]>
[2020-04-28 13:29:23,140] {base_executor.py:75} INFO - Adding to queue:
['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'delete_airflow_test_dataset', '2020-04-26T00:00:00+00:00', '--local',
'--pool', 'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpknrawrfh']
[2020-04-28 13:29:24,101] {local_executor.py:66} INFO - QueuedLocalWorker
running ['airflow', 'tasks', 'run', 'example_gcs_to_bigquery_operator',
'delete_airflow_test_dataset', '2020-04-26T00:00:00+00:00', '--local',
'--pool', 'default_pool', '--subdir',
'/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py',
'--cfg-path', '/tmp/tmpknrawrfh']
[2020-04-28 13:29:24,117] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 2 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
[2020-04-28 13:29:25,103] {backfill_job.py:262} WARNING -
('example_gcs_to_bigquery_operator', 'gcs_to_bigquery_example',
datetime.datetime(2020, 4, 26, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00,
STD]>), 2) state success not in running=dict_values([<TaskInstance:
example_gcs_to_bigquery_operator.delete_airflow_test_dataset 2020-04-26
00:00:00+00:00 [queued]>])
[2020-04-28 13:29:25,134] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 2 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
[2020-04-28 13:29:26,116] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 2 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
[2020-04-28 13:29:26,484] {dagbag.py:368} INFO - Filling up the DagBag from
/opt/airflow/airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py
[2020-04-28 13:29:27,117] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 2 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
Running <TaskInstance:
example_gcs_to_bigquery_operator.delete_airflow_test_dataset
2020-04-26T00:00:00+00:00 [None]> on host d8cf57dc3068
[2020-04-28 13:29:28,124] {backfill_job.py:379} INFO - [backfill progress] |
finished run 0 of 1 | tasks waiting: 0 | succeeded: 2 | running: 1 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
[2020-04-28 13:29:29,132] {dagrun.py:336} INFO - Marking run <DagRun
example_gcs_to_bigquery_operator @ 2020-04-26 00:00:00+00:00:
backfill__2020-04-26T00:00:00+00:00, externally triggered: False> successful
[2020-04-28 13:29:29,139] {backfill_job.py:379} INFO - [backfill progress] |
finished run 1 of 1 | tasks waiting: 0 | succeeded: 3 | running: 0 | failed: 0
| skipped: 0 | deadlocked: 0 | not ready: 0
[2020-04-28 13:29:29,348] {backfill_job.py:830} INFO - Backfill done. Exiting.
Saving all log files to /root/airflow/logs/previous_runs/2020-04-28_13_29_29
PASSED
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryOperator::test_execute_explicit_project
SKIPPED
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryOperator::test_execute_explicit_project_legacy
SKIPPED
=============================== warnings summary ===============================
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryExample::test_run_example_dag_gcs_to_bigquery_operator
/opt/airflow/airflow/providers/google/cloud/example_dags/example_mlengine.py:82:
DeprecationWarning: This operator is deprecated. Consider using operators for
specific operations: MLEngineCreateModelOperator, MLEngineGetModelOperator.
"name": MODEL_NAME,
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryExample::test_run_example_dag_gcs_to_bigquery_operator
/opt/airflow/airflow/providers/google/cloud/example_dags/example_mlengine.py:91:
DeprecationWarning: This operator is deprecated. Consider using operators for
specific operations: MLEngineCreateModelOperator, MLEngineGetModelOperator.
"name": MODEL_NAME,
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryExample::test_run_example_dag_gcs_to_bigquery_operator
/usr/local/lib/python3.6/site-packages/future/standard_library/__init__.py:65:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
import imp
tests/providers/google/cloud/operators/test_gcs_to_bigquery.py::TestGoogleCloudStorageToBigQueryExample::test_run_example_dag_gcs_to_bigquery_operator
/opt/airflow/airflow/providers/google/cloud/example_dags/example_datacatalog.py:26:
DeprecationWarning: This module is deprecated. Please use
`airflow.operators.bash`.
from airflow.operators.bash_operator import BashOperator
-- Docs: https://docs.pytest.org/en/latest/warnings.html
=========================== short test summary info ============================
SKIPPED [1] /opt/airflow/tests/conftest.py:238: The test is skipped because
it does not have the right system marker. Only tests marked with
pytest.mark.system(SYSTEM) are run with SYSTEM being one of ['google'].
<TestCaseFunction test_execute_explicit_project>
SKIPPED [1] /opt/airflow/tests/conftest.py:238: The test is skipped because
it does not have the right system marker. Only tests marked with
pytest.mark.system(SYSTEM) are run with SYSTEM being one of ['google'].
<TestCaseFunction test_execute_explicit_project_legacy>
=================== 1 passed, 2 skipped, 4 warnings in 30.77s ==================
```
</details>