[https://issues.apache.org/jira/browse/AIRFLOW-3118?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16628735#comment-16628735]
Brylie Christopher Oxley commented on AIRFLOW-3118:
---------------------------------------------------
Here is the output from {{airflow list_dags -r}}:
{noformat}
-------------------------------------------------------------------
DAGS
-------------------------------------------------------------------
example_bash_operator
example_branch_dop_operator_v3
example_branch_operator
example_http_operator
example_kubernetes_executor
example_passing_params_via_test_command
example_python_operator
example_short_circuit_operator
example_skip_dag
example_subdag_operator
example_subdag_operator.section-1
example_subdag_operator.section-2
example_trigger_controller_dag
example_trigger_target_dag
example_xcom
latest_only
latest_only_with_trigger
test_utils
tutorial
-------------------------------------------------------------------
DagBag loading stats for /home/brylie/airflow/dags
-------------------------------------------------------------------
Number of DAGs: 0
Total task number: 0
DagBag parsing time: 0
None
{noformat}
I am trying to run any of the example DAGs that ship with Airflow.
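Note that the example DAGs are listed above, yet the DagBag stats for /home/brylie/airflow/dags report 0 DAGs and 0 tasks. For what it's worth, here is a minimal Python sketch (assuming a stock 1.10.0 install with load_examples = True; nothing here is specific to my setup) to check what the DagBag actually parses and whether any import errors are raised:
{code:python}
# Minimal check of what the scheduler/webserver would load (illustrative only).
from airflow.models import DagBag

# With no arguments this parses the configured dags_folder and, when
# load_examples is enabled, the example DAGs bundled with the airflow package.
dagbag = DagBag()

print("DAGs parsed:", len(dagbag.dags))        # dag_id -> DAG mapping
print("Import errors:", dagbag.import_errors)  # file path -> traceback, if any
{code}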
> DAGs not successful on new installation
> ---------------------------------------
>
> Key: AIRFLOW-3118
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3118
> Project: Apache Airflow
> Issue Type: Bug
> Components: DAG
> Affects Versions: 1.10.0
> Environment: Ubuntu 18.04
> Python 3.6
> Reporter: Brylie Christopher Oxley
> Priority: Blocker
> Attachments: image-2018-09-26-12-39-03-094.png
>
>
> When trying out Airflow on localhost, none of the DAG runs reach the 'success' state. They stay stuck in 'running' unless I manually mark them as failed:
> !image-2018-09-26-12-39-03-094.png!
> h2. Steps to reproduce
> # create new conda environment
> ** conda create -n airflow
> ** source activate airflow
> # install airflow
> ** pip install apache-airflow
> # initialize Airflow db
> ** airflow initdb
> # disable default paused setting in airflow.cfg
> ** dags_are_paused_at_creation = False
> # run airflow webserver and airflow scheduler (in separate terminals)
> ** airflow scheduler
> ** airflow webserver
> # unpause example_bash_operator
> ** airflow unpause example_bash_operator
> # log in to Airflow UI
> # turn on example_bash_operator
> # click "Trigger DAG" in the `example_bash_operator` row (consolidated command list below)
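> A consolidated command list for the steps above (a rough sketch; the airflow.cfg edit is done by hand, and `airflow trigger_dag` is the CLI equivalent of clicking "Trigger DAG" in the UI):
> {noformat}
> conda create -n airflow
> source activate airflow
> pip install apache-airflow
> airflow initdb
> # edit ~/airflow/airflow.cfg: dags_are_paused_at_creation = False
> airflow scheduler      # terminal 1
> airflow webserver      # terminal 2
> airflow unpause example_bash_operator
> airflow trigger_dag example_bash_operator
> {noformat}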
> h2. Observed result
> The `example_bash_operator` never leaves the "running" state.
> h2. Expected result
> The `example_bash_operator` would quickly enter the "success" state.
>