[ 
https://issues.apache.org/jira/browse/AIRFLOW-4461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

JudyXie updated AIRFLOW-4461:
-----------------------------
    Description: 
*[Basic Info]:*
 * Airflow web and scheduler Pod server timezone:  
{code:java}
[Etc/GMT+7]{code}

 * Airflow sqlalchemy backend database is *mysql*, whose timezone is
{code:java}
[Etc/GMT+7]{code}

 * Airflow Executor and default timezone info:

 
{code:java}
[2019-05-05 21:15:19,172] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-05-05 21:15:19,359] {jobs.py:1500} INFO - Starting the scheduler
[2019-05-05 21:15:19,359] {jobs.py:1508} INFO - Running execute loop for -1 seconds
[2019-05-05 21:15:19,360] {jobs.py:1509} INFO - Processing each file at most -1 times
[2019-05-05 21:15:19,360] {jobs.py:1512} INFO - Searching for files in /usr/local/airflow/dags
[2019-05-05 21:15:19,367] {jobs.py:1514} INFO - There are 16 files in /usr/local/airflow/dags
[2019-05-05 21:15:19,373] {kubernetes_executor.py:690} INFO - Start Kubernetes executor
[2019-05-05 21:15:19,397] {kubernetes_executor.py:627} INFO - When executor started up, found 0 queued task instances
[2019-05-05 21:15:19,398] {kubernetes_executor.py:304} INFO - Event: and now my watch begins starting at resource_version: 0
[2019-05-05 21:15:19,398] {jobs.py:1559} INFO - Resetting orphaned tasks for active dag runs
[2019-05-05 21:15:19,405] {dag_processing.py:515} INFO - Launched DagFileProcessorManager with pid: 140
[2019-05-05 21:15:19,409] {settings.py:53} INFO - Configured default timezone <Timezone [Etc/GMT+7]>
{code}
*[UI Bug]:*

Here *Execution_Date* always shows the *[Etc/GMT+7] timestamp*, which causes the *Run Id* hyperlink not to work:

graph?dag_id=example_bash_operator&run_id=manual__2019-05-06T04%3A29%3A21.874850%2B00%3A00&*execution_date=2019-05-05+21%3A29%3A21.874850%2B*
 => The log path will not be found, because the *log path folder* uses {color:#FF0000}*UTC*{color} (*execution_date=2019-05-{color:#FF0000}06+04{color}%3A29%3A21.874850%2B)*

!image-2019-05-06-12-33-25-265.png!
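The seven-hour offset above can be reproduced with plain Python; below is a minimal sketch (using the standard-library zoneinfo module, with the timestamp taken from the run_id in the URL above). Note that the POSIX-style zone name Etc/GMT+7 actually means UTC-7, so the webserver's local rendering of execution_date lags UTC by seven hours and lands on the previous calendar day, while the UTC value matches the run_id and the log folder name.
{code:python}
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The webserver renders execution_date in its local zone.
# NB: "Etc/GMT+7" follows the POSIX sign convention and is UTC-7.
local = datetime(2019, 5, 5, 21, 29, 21, 874850, tzinfo=ZoneInfo("Etc/GMT+7"))
utc = local.astimezone(timezone.utc)

print(local.isoformat())  # 2019-05-05T21:29:21.874850-07:00  (what the UI shows)
print(utc.isoformat())    # 2019-05-06T04:29:21.874850+00:00  (what the run_id/log path use)
{code}
This is why the execution_date query parameter built from the UI value (2019-05-05 21:29) never matches the UTC-keyed run (2019-05-06 04:29).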

 

  was:
*[Basic Info]:*
 * Airflow web and scheduler Pod server timezone:  
{code:java}
[Etc/GMT+7]{code}

 * Airflow sqlalchemy backend database is *mysql*, whose timezone is
{code:java}
[Etc/GMT+7]{code}

 * Airflow Executor and default timezone info:

{code:java}
[2019-05-05 21:15:19,172] {__init__.py:51} INFO - Using executor 
KubernetesExecutor

[2019-05-05 21:15:19,359] {jobs.py:1500} INFO - Starting the scheduler

[2019-05-05 21:15:19,359] {jobs.py:1508} INFO - Running execute loop for -1 
seconds

[2019-05-05 21:15:19,360] {jobs.py:1509} INFO - Processing each file at most -1 
times

[2019-05-05 21:15:19,360] {jobs.py:1512} INFO - Searching for files in 
/usr/local/airflow/dags

[2019-05-05 21:15:19,367] {jobs.py:1514} INFO - There are 16 files in 
/usr/local/airflow/dags

[2019-05-05 21:15:19,373] {kubernetes_executor.py:690} INFO - Start Kubernetes 
executor

[2019-05-05 21:15:19,397] {kubernetes_executor.py:627} INFO - When executor 
started up, found 0 queued task instances

[2019-05-05 21:15:19,398] {kubernetes_executor.py:304} INFO - Event: and now my 
watch begins starting at resource_version: 0

[2019-05-05 21:15:19,398] {jobs.py:1559} INFO - Resetting orphaned tasks for 
active dag runs

[2019-05-05 21:15:19,405] {dag_processing.py:515} INFO - Launched 
DagFileProcessorManager with pid: 140

[2019-05-05 21:15:19,409] {settings.py:53} INFO - Configured default timezone 
<Timezone [Etc/GMT+7]>{code}


> Execution_Date from UI not convert to utc in List Dag Run Page
> --------------------------------------------------------------
>
>                 Key: AIRFLOW-4461
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4461
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: ui
>    Affects Versions: 1.10.3
>            Reporter: JudyXie
>            Priority: Major
>         Attachments: image-2019-05-06-12-33-25-265.png
>
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
