[jira] [Created] (AIRFLOW-7059) Pass hive_conf to get_pandas_df in HiveServer2Hook
Ping Zhang created AIRFLOW-7059: --- Summary: Pass hive_conf to get_pandas_df in HiveServer2Hook Key: AIRFLOW-7059 URL: https://issues.apache.org/jira/browse/AIRFLOW-7059 Project: Apache Airflow Issue Type: Improvement Components: hooks Affects Versions: 1.10.9 Reporter: Ping Zhang code: [https://github.com/apache/airflow/blob/97a429f9d0cf740c5698060ad55f11e93cb57b55/airflow/providers/apache/hive/hooks/hive.py#L973] -- This message was sent by Atlassian Jira (v8.3.4#803005)
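The improvement can be sketched as follows. This is a hypothetical, self-contained illustration of threading `hive_conf` through `get_pandas_df` (the class and helper below are stand-ins, not the real `HiveServer2Hook`): each `hive_conf` entry typically becomes a `SET key=value` statement executed before the query, so dropping the parameter means those session settings never apply.

```python
def _hive_conf_to_statements(hive_conf):
    """Turn a hive_conf mapping into the SET statements sent before the query."""
    return ["SET {}={}".format(k, v) for k, v in sorted((hive_conf or {}).items())]


class HiveServer2HookSketch:
    """Illustrative stand-in for HiveServer2Hook, not the Airflow implementation."""

    def get_results(self, sql, hive_conf=None):
        # Stand-in for the real execution path; records what would be sent.
        return _hive_conf_to_statements(hive_conf) + [sql]

    def get_pandas_df(self, sql, hive_conf=None):
        # The proposed change: accept hive_conf here and forward it,
        # instead of silently dropping it.
        return self.get_results(sql, hive_conf=hive_conf)
```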
[jira] [Created] (AIRFLOW-6999) Use check_output to capture the stderr in celery executor
Ping Zhang created AIRFLOW-6999: --- Summary: Use check_output to capture the stderr in celery executor Key: AIRFLOW-6999 URL: https://issues.apache.org/jira/browse/AIRFLOW-6999 Project: Apache Airflow Issue Type: Improvement Components: worker Affects Versions: 1.10.9 Reporter: Ping Zhang so that airflow celery worker can capture the error output -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6860) Default ignore_first_depends_on_past to True
[ https://issues.apache.org/jira/browse/AIRFLOW-6860?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang updated AIRFLOW-6860: Description: to avoid a "BackfillJob is deadlocked" error: some of the deadlocked tasks were unable to run because of "depends_on_past" relationships. > Default ignore_first_depends_on_past to True > > > Key: AIRFLOW-6860 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6860 > Project: Apache Airflow > Issue Type: Improvement > Components: cli > Affects Versions: 1.10.9 > Reporter: Ping Zhang > Assignee: Ping Zhang > Priority: Minor > > to avoid a "BackfillJob is deadlocked" error: some of the deadlocked tasks were unable to run because of "depends_on_past" relationships. -- This message was sent by Atlassian Jira (v8.3.4#803005)
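The rationale can be expressed as a tiny predicate (an illustrative sketch, not Airflow's implementation): the first run in a backfill range has no earlier run to depend on, so `depends_on_past` should be ignored for it by default.

```python
from datetime import date


def should_ignore_depends_on_past(execution_date, backfill_start_date,
                                  ignore_first_depends_on_past=True):
    """True when depends_on_past should be skipped: only for the range's first run."""
    return ignore_first_depends_on_past and execution_date == backfill_start_date
```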
[jira] [Assigned] (AIRFLOW-6860) Default ignore_first_depends_on_past to True
[ https://issues.apache.org/jira/browse/AIRFLOW-6860?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang reassigned AIRFLOW-6860: --- Assignee: Ping Zhang > Default ignore_first_depends_on_past to True > > > Key: AIRFLOW-6860 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6860 > Project: Apache Airflow > Issue Type: Improvement > Components: cli >Affects Versions: 1.10.9 >Reporter: Ping Zhang >Assignee: Ping Zhang >Priority: Minor > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6860) Default ignore_first_depends_on_past to True
Ping Zhang created AIRFLOW-6860: --- Summary: Default ignore_first_depends_on_past to True Key: AIRFLOW-6860 URL: https://issues.apache.org/jira/browse/AIRFLOW-6860 Project: Apache Airflow Issue Type: Improvement Components: cli Affects Versions: 1.10.9 Reporter: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (AIRFLOW-6837) limit description of dag in home page
[ https://issues.apache.org/jira/browse/AIRFLOW-6837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang reassigned AIRFLOW-6837: --- Assignee: Ping Zhang > limit description of dag in home page > - > > Key: AIRFLOW-6837 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6837 > Project: Apache Airflow > Issue Type: Improvement > Components: webserver >Affects Versions: 1.10.9 >Reporter: Ping Zhang >Assignee: Ping Zhang >Priority: Major > > so that the tooltip will not cover the dag link -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6837) limit description of dag in home page
Ping Zhang created AIRFLOW-6837: --- Summary: limit description of dag in home page Key: AIRFLOW-6837 URL: https://issues.apache.org/jira/browse/AIRFLOW-6837 Project: Apache Airflow Issue Type: Improvement Components: webserver Affects Versions: 1.10.9 Reporter: Ping Zhang so that the tooltip will not cover the dag link -- This message was sent by Atlassian Jira (v8.3.4#803005)
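A possible shape of the fix (hypothetical helper and limit, purely illustrative): cap the description before rendering so a long tooltip cannot cover the DAG link.

```python
MAX_DESC_LEN = 80  # assumed cap, for illustration only


def truncated_description(description, limit=MAX_DESC_LEN):
    """Return the description capped at `limit` characters, with an ellipsis."""
    if description is None or len(description) <= limit:
        return description
    return description[:limit].rstrip() + "..."
```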
[jira] [Assigned] (AIRFLOW-6588) json_format and write_stdout are boolean
[ https://issues.apache.org/jira/browse/AIRFLOW-6588?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang reassigned AIRFLOW-6588: --- Assignee: Ping Zhang > json_format and write_stdout are boolean > > > Key: AIRFLOW-6588 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6588 > Project: Apache Airflow > Issue Type: Bug > Components: logging >Affects Versions: master >Reporter: Ping Zhang >Assignee: Ping Zhang >Priority: Minor > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6588) json_format and write_stdout are boolean
Ping Zhang created AIRFLOW-6588: --- Summary: json_format and write_stdout are boolean Key: AIRFLOW-6588 URL: https://issues.apache.org/jira/browse/AIRFLOW-6588 Project: Apache Airflow Issue Type: Bug Components: logging Affects Versions: master Reporter: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
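The point of the bug report, illustrated with the stdlib `configparser` (analogous to Airflow's `conf.getboolean`; the section name is taken from where these logging options live): reading a boolean option with plain `get` yields the string `"False"`, which is truthy.

```python
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("""
[elasticsearch]
json_format = False
write_stdout = False
""")

as_string = cfg.get("elasticsearch", "json_format")       # "False" -- a truthy string
as_bool = cfg.getboolean("elasticsearch", "json_format")  # False -- what the code needs
```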
[jira] [Assigned] (AIRFLOW-6167) Escape col name in MysqlToHive operator
[ https://issues.apache.org/jira/browse/AIRFLOW-6167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang reassigned AIRFLOW-6167: --- Assignee: Ping Zhang > Escape col name in MysqlToHive operator > --- > > Key: AIRFLOW-6167 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6167 > Project: Apache Airflow > Issue Type: Bug > Components: operators >Affects Versions: 1.10.4 >Reporter: Ping Zhang >Assignee: Ping Zhang >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6167) Escape col name in MysqlToHive operator
Ping Zhang created AIRFLOW-6167: --- Summary: Escape col name in MysqlToHive operator Key: AIRFLOW-6167 URL: https://issues.apache.org/jira/browse/AIRFLOW-6167 Project: Apache Airflow Issue Type: Bug Components: operators Affects Versions: 1.10.4 Reporter: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
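A sketch of the escaping the issue asks for (hypothetical helper, not the operator's actual code): wrap each column name in backticks when generating Hive DDL so reserved words such as `date` don't break the statement.

```python
def hive_column_ddl(field_types):
    """Build the column list of a Hive CREATE TABLE from {name: type}, backtick-escaped."""
    return ", ".join("`{}` {}".format(name, col_type)
                     for name, col_type in field_types.items())
```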
[jira] [Created] (AIRFLOW-6049) airflow test double prints when the task log handler is already StreamHandler
Ping Zhang created AIRFLOW-6049: --- Summary: airflow test double prints when the task log handler is already StreamHandler Key: AIRFLOW-6049 URL: https://issues.apache.org/jira/browse/AIRFLOW-6049 Project: Apache Airflow Issue Type: Bug Components: cli Affects Versions: 1.10.4 Reporter: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
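The double printing happens when a second StreamHandler ends up on the task logger; a guard of roughly this shape (illustrative, not the actual fix) avoids attaching a duplicate:

```python
import io
import logging


def add_stream_handler_once(logger, stream):
    """Attach a StreamHandler only if the logger does not already have one."""
    if not any(isinstance(h, logging.StreamHandler) for h in logger.handlers):
        logger.addHandler(logging.StreamHandler(stream))
    return logger
```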
[jira] [Created] (AIRFLOW-5868) Should not deepcopy shallow_copy attributes
Ping Zhang created AIRFLOW-5868: --- Summary: Should not deepcopy shallow_copy attributes Key: AIRFLOW-5868 URL: https://issues.apache.org/jira/browse/AIRFLOW-5868 Project: Apache Airflow Issue Type: Bug Components: operators Affects Versions: 1.10.4 Reporter: Ping Zhang Assignee: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
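What the issue describes, as a self-contained sketch (not the Airflow source): a class that declares `shallow_copy_attrs` should copy those attributes by reference in `__deepcopy__` instead of deep-copying them.

```python
import copy


class Op:
    # attributes listed here are shared, not deep-copied (the reported expectation)
    shallow_copy_attrs = ("heavy_resource",)

    def __init__(self, heavy_resource, params):
        self.heavy_resource = heavy_resource
        self.params = params

    def __deepcopy__(self, memo):
        cls = self.__class__
        result = cls.__new__(cls)
        memo[id(self)] = result
        for k, v in self.__dict__.items():
            if k in self.shallow_copy_attrs:
                setattr(result, k, v)                     # shallow: share the object
            else:
                setattr(result, k, copy.deepcopy(v, memo))
        return result
```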
[jira] [Created] (AIRFLOW-5695) Cannot run a task from UI if its state is None
Ping Zhang created AIRFLOW-5695: --- Summary: Cannot run a task from UI if its state is None Key: AIRFLOW-5695 URL: https://issues.apache.org/jira/browse/AIRFLOW-5695 Project: Apache Airflow Issue Type: Bug Components: ui Affects Versions: 1.10.4 Reporter: Ping Zhang -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-5556) Should not use dagbag_import_timeout to time out DagFileProcessor
Ping Zhang created AIRFLOW-5556: --- Summary: Should not use dagbag_import_timeout to time out DagFileProcessor Key: AIRFLOW-5556 URL: https://issues.apache.org/jira/browse/AIRFLOW-5556 Project: Apache Airflow Issue Type: Bug Components: scheduler Affects Versions: 1.10.5, 1.10.4 Reporter: Ping Zhang Assignee: Ping Zhang since `dagbag_import_timeout` is meant to control the timeout of the `load_source` import of an Airflow DAG file, not the whole DagFileProcessor. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-5543) Tooltip in tree view and graph view disappears after page scrolls up
Ping Zhang created AIRFLOW-5543: --- Summary: Tooltip in tree view and graph view disappears after page scrolls up Key: AIRFLOW-5543 URL: https://issues.apache.org/jira/browse/AIRFLOW-5543 Project: Apache Airflow Issue Type: Bug Components: webserver Affects Versions: 1.10.5 Reporter: Ping Zhang Assignee: Ping Zhang Attachments: image-2019-09-23-12-18-51-668.png !image-2019-09-23-12-18-51-668.png! -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (AIRFLOW-5528) end of log mark is broken in es_task_handler
[ https://issues.apache.org/jira/browse/AIRFLOW-5528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang reassigned AIRFLOW-5528: --- Assignee: Ping Zhang > end of log mark is broken in es_task_handler > > > Key: AIRFLOW-5528 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5528 > Project: Apache Airflow > Issue Type: Improvement > Components: logging > Affects Versions: 1.10.5 > Reporter: Ping Zhang > Assignee: Ping Zhang > Priority: Critical > > introduced in: [https://github.com/apache/airflow/pull/5048/files] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-5528) end of log mark is broken in es_task_handler
Ping Zhang created AIRFLOW-5528: --- Summary: end of log mark is broken in es_task_handler Key: AIRFLOW-5528 URL: https://issues.apache.org/jira/browse/AIRFLOW-5528 Project: Apache Airflow Issue Type: Improvement Components: logging Affects Versions: 1.10.5 Reporter: Ping Zhang introduced in: [https://github.com/apache/airflow/pull/5048/files] -- This message was sent by Atlassian Jira (v8.3.4#803005)
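How the end-of-log mark works in the Elasticsearch task handler, in outline (the marker value here is an assumption for illustration; the real one is configurable): the writer appends a sentinel line and the reader treats the log as complete only when it sees it, which is why a broken mark leaves readers polling forever.

```python
END_OF_LOG_MARK = "end_of_log"  # assumed sentinel, for illustration


def is_log_complete(lines):
    """A log is complete once its last line is the end-of-log sentinel."""
    return bool(lines) and lines[-1].strip() == END_OF_LOG_MARK
```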
[jira] [Created] (AIRFLOW-5087) Display task/dag run stats on UI so that users can debug more easily
Ping Zhang created AIRFLOW-5087: --- Summary: Display task/dag run stats on UI so that users can debug more easily Key: AIRFLOW-5087 URL: https://issues.apache.org/jira/browse/AIRFLOW-5087 Project: Apache Airflow Issue Type: Improvement Components: ui Affects Versions: 1.10.0 Reporter: Ping Zhang Assignee: Ping Zhang
# display the current running dag runs count / max dag runs limit (max_active_runs_per_dag)
# display the current running tasks count / max running task limit (dag_concurrency)
# hover a task to also display `depends_on_past` and `wait_for_downstream` params for not scheduled tasks
-- This message was sent by Atlassian JIRA (v7.6.14#76016)
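The counters above could be gathered into one payload for the page to render (hypothetical helper; only the config-key names come from the issue):

```python
def dag_stats(running_runs, max_active_runs_per_dag, running_tasks, dag_concurrency):
    """Format 'current / limit' pairs for the two counters the issue lists."""
    return {
        "dag_runs": "{}/{}".format(running_runs, max_active_runs_per_dag),
        "tasks": "{}/{}".format(running_tasks, dag_concurrency),
    }
```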
[jira] [Updated] (AIRFLOW-5023) airflow scheduler `RuntimeError: maximum recursion depth exceeded while calling a Python object`
[ https://issues.apache.org/jira/browse/AIRFLOW-5023?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang updated AIRFLOW-5023: Description: I created a new conda env
{code:java}
conda create -n airflow-kb8 python=2.7.13
conda activate airflow-kb8
pip install apache-airflow
pip install Flask==1.0.4  # to fix an issue with werkzeug.wrappers.json missing json
pip install cython
pip install 'apache-airflow[all_dbs]'
# change to use LocalExecutor OR CeleryExecutor
{code}
After that, I tried to run airflow scheduler and got this error:
{code:java}
[2019-07-22 16:43:31,270] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1008
[2019-07-22 16:43:32,387] {__init__.py:51} INFO - Using executor KubernetesExecutor
[Airflow ASCII-art banner]
[2019-07-22 16:43:32,708] {jobs.py:1501} INFO - Starting the scheduler
[2019-07-22 16:43:32,708] {jobs.py:1509} INFO - Running execute loop for -1 seconds
[2019-07-22 16:43:32,709] {jobs.py:1510} INFO - Processing each file at most -1 times
[2019-07-22 16:43:32,709] {jobs.py:1513} INFO - Searching for files in /Users/x/airflow/dags
[2019-07-22 16:43:32,717] {jobs.py:1515} INFO - There are 21 files in /Users/x/airflow/dags
[2019-07-22 16:43:32,722] {kubernetes_executor.py:690} INFO - Start Kubernetes executor
[2019-07-22 16:43:32,776] {kubernetes_executor.py:627} INFO - When executor started up, found 0 queued task instances
[2019-07-22 16:43:32,777] {jobs.py:1560} INFO - Resetting orphaned tasks for active dag runs
[2019-07-22 16:43:32,785] {dag_processing.py:515} INFO - Launched DagFileProcessorManager with pid: 1459
[2019-07-22 16:43:32,798] {settings.py:53} INFO - Configured default timezone
[2019-07-22 16:43:32,811] {kubernetes_executor.py:304} INFO - Event: and now my watch begins starting at resource_version: 0
[2019-07-22 16:43:32,819] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1459
Process DagFileProcessor22-Process:
Traceback (most recent call last):
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/jobs.py", line 410, in helper
    print(e)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/logging_mixin.py", line 95, in write
    self.logger.log(self.level, self._buffer.rstrip())
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1231, in log
    self._log(level, msg, args, **kwargs)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1286, in _log
    self.handle(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1296, in handle
    self.callHandlers(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1336, in callHandlers
    hdlr.handle(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 759, in handle
    self.emit(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/file_processor_handler.py", line 76, in emit
    self.handler.emit(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 957, in emit
    StreamHandler.emit(self, record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 889, in emit
    self.handleError(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 812, in handleError
    None, sys.stderr)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/traceback.py", line 124, in print_exception
    _print(file, 'Traceback (most recent call last):')
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/traceback.py", line 13, in _print
    file.write(str+terminator)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/logging_mixin.py", line 95, in write
    self.logger.log(self.level, self._buffer.rstrip())
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1231, in log
    self._log(level, msg, args, **kwargs)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1286, in _log
    self.handle(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1296, in handle
    self.callHandlers(record)
  File
{code}
[jira] [Created] (AIRFLOW-5023) airflow scheduler `RuntimeError: maximum recursion depth exceeded while calling a Python object`
Ping Zhang created AIRFLOW-5023: --- Summary: airflow scheduler `RuntimeError: maximum recursion depth exceeded while calling a Python object` Key: AIRFLOW-5023 URL: https://issues.apache.org/jira/browse/AIRFLOW-5023 Project: Apache Airflow Issue Type: Bug Components: scheduler Affects Versions: 1.10.3 Reporter: Ping Zhang I created a new conda env
```
conda create -n airflow-kb8 python=2.7.13
conda activate airflow-kb8
pip install apache-airflow
pip install Flask==1.0.4  # to fix an issue with werkzeug.wrappers.json missing json
pip install cython
pip install 'apache-airflow[all_dbs]'
```
After that, I tried to run airflow scheduler and got this error:
```
[2019-07-22 16:43:31,270] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1008
[2019-07-22 16:43:32,387] {__init__.py:51} INFO - Using executor KubernetesExecutor
[Airflow ASCII-art banner]
[2019-07-22 16:43:32,708] {jobs.py:1501} INFO - Starting the scheduler
[2019-07-22 16:43:32,708] {jobs.py:1509} INFO - Running execute loop for -1 seconds
[2019-07-22 16:43:32,709] {jobs.py:1510} INFO - Processing each file at most -1 times
[2019-07-22 16:43:32,709] {jobs.py:1513} INFO - Searching for files in /Users/x/airflow/dags
[2019-07-22 16:43:32,717] {jobs.py:1515} INFO - There are 21 files in /Users/x/airflow/dags
[2019-07-22 16:43:32,722] {kubernetes_executor.py:690} INFO - Start Kubernetes executor
[2019-07-22 16:43:32,776] {kubernetes_executor.py:627} INFO - When executor started up, found 0 queued task instances
[2019-07-22 16:43:32,777] {jobs.py:1560} INFO - Resetting orphaned tasks for active dag runs
[2019-07-22 16:43:32,785] {dag_processing.py:515} INFO - Launched DagFileProcessorManager with pid: 1459
[2019-07-22 16:43:32,798] {settings.py:53} INFO - Configured default timezone
[2019-07-22 16:43:32,811] {kubernetes_executor.py:304} INFO - Event: and now my watch begins starting at resource_version: 0
[2019-07-22 16:43:32,819] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1459
Process DagFileProcessor22-Process:
Traceback (most recent call last):
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/jobs.py", line 410, in helper
    print(e)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/logging_mixin.py", line 95, in write
    self.logger.log(self.level, self._buffer.rstrip())
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1231, in log
    self._log(level, msg, args, **kwargs)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1286, in _log
    self.handle(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1296, in handle
    self.callHandlers(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1336, in callHandlers
    hdlr.handle(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 759, in handle
    self.emit(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/file_processor_handler.py", line 76, in emit
    self.handler.emit(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 957, in emit
    StreamHandler.emit(self, record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 889, in emit
    self.handleError(record)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 812, in handleError
    None, sys.stderr)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/traceback.py", line 124, in print_exception
    _print(file, 'Traceback (most recent call last):')
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/traceback.py", line 13, in _print
    file.write(str+terminator)
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/site-packages/airflow/utils/log/logging_mixin.py", line 95, in write
    self.logger.log(self.level, self._buffer.rstrip())
  File "/Users/x/anaconda2/envs/airflow-kb8/lib/python2.7/logging/__init__.py", line 1231, in log
    self._log(level, msg, args, **kwargs)
  File
```
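The traceback above is a feedback loop: `sys.stderr` is redirected into a logger, and when a logging handler hits an error it reports it to `sys.stderr`, which feeds straight back into logging. A minimal self-contained reproduction of that loop (no Airflow code; the counter guard just stops the sketch before the interpreter's real recursion limit):

```python
import sys


class StreamToLogger:
    """Stand-in for the redirection in logging_mixin: writes go to a callback."""

    def __init__(self, emit):
        self.emit = emit

    def write(self, msg):
        self.emit(msg)


def demonstrate_cycle(max_depth=50):
    calls = {"n": 0}

    def handler_emit(msg):
        # Like handleError: report the problem to sys.stderr -- but sys.stderr
        # is the redirected stream, so this calls back into itself.
        calls["n"] += 1
        if calls["n"] > max_depth:
            raise RecursionError("maximum recursion depth exceeded (simulated)")
        sys.stderr.write(msg)

    original = sys.stderr
    sys.stderr = StreamToLogger(handler_emit)
    try:
        sys.stderr.write("boom")
        return None
    except RecursionError as exc:
        return str(exc)
    finally:
        sys.stderr = original
```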
[jira] [Resolved] (AIRFLOW-4084) Fails to download the entire log when file is large and stored in the ElasticSearch
[ https://issues.apache.org/jira/browse/AIRFLOW-4084?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ping Zhang resolved AIRFLOW-4084. - Resolution: Fixed Thanks Cong > Fails to download the entire log when file is large and stored in the > ElasticSearch > --- > > Key: AIRFLOW-4084 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4084 > Project: Apache Airflow > Issue Type: Bug > Components: webserver >Affects Versions: 1.10.0 >Reporter: Ping Zhang >Assignee: Cong Zhu >Priority: Major > > The task log download by attempts does not take metadata into account. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-4447) Display task duration as human friendly format in Tree View
Ping Zhang created AIRFLOW-4447: --- Summary: Display task duration as human friendly format in Tree View Key: AIRFLOW-4447 URL: https://issues.apache.org/jira/browse/AIRFLOW-4447 Project: Apache Airflow Issue Type: Improvement Components: webserver Reporter: Ping Zhang Assignee: Ping Zhang -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-4084) Fails to download the entire log when file is large and stored in the ElasticSearch
Ping Zhang created AIRFLOW-4084: --- Summary: Fails to download the entire log when file is large and stored in the ElasticSearch Key: AIRFLOW-4084 URL: https://issues.apache.org/jira/browse/AIRFLOW-4084 Project: Apache Airflow Issue Type: Bug Components: webserver Affects Versions: 1.10.0 Reporter: Ping Zhang Assignee: Ping Zhang The task log download by attempts does not take metadata into account. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
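The fix implied by the description, in outline (the `read_page` interface and metadata keys are assumptions for illustration): keep requesting pages and feed each response's metadata back into the next request until the handler signals the end of the log, instead of doing a single fetch.

```python
def download_full_log(read_page):
    """Accumulate log chunks; read_page(metadata) -> (chunk, metadata)."""
    chunks = []
    metadata = {"offset": 0, "end_of_log": False}
    while not metadata["end_of_log"]:
        chunk, metadata = read_page(metadata)  # carry the metadata forward
        chunks.append(chunk)
    return "".join(chunks)
```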
[jira] [Commented] (AIRFLOW-3636) Fix a test introduced in PR #4425
[ https://issues.apache.org/jira/browse/AIRFLOW-3636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16735076#comment-16735076 ] Ping Zhang commented on AIRFLOW-3636: - PR: [https://github.com/apache/airflow/pull/4446] > Fix a test introduced in PR #4425 > - > > Key: AIRFLOW-3636 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3636 > Project: Apache Airflow > Issue Type: Bug >Reporter: Ping Zhang >Priority: Major > -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-3636) Fix a test introduced in PR #4425
Ping Zhang created AIRFLOW-3636: --- Summary: Fix a test introduced in PR #4425 Key: AIRFLOW-3636 URL: https://issues.apache.org/jira/browse/AIRFLOW-3636 Project: Apache Airflow Issue Type: Bug Reporter: Ping Zhang -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-3623) Support download log file from UI
Ping Zhang created AIRFLOW-3623: --- Summary: Support download log file from UI Key: AIRFLOW-3623 URL: https://issues.apache.org/jira/browse/AIRFLOW-3623 Project: Apache Airflow Issue Type: Improvement Components: ui Reporter: Ping Zhang Assignee: Ping Zhang For some large log files, it is not a good idea to fetch and render them in the UI. Add the ability to let users download the log by try_number in the dag modal. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
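One way the download could be served, sketched as plain response headers (hypothetical helper; the real change would live in the webserver views): a `Content-Disposition: attachment` header keyed by dag, task, and try_number makes the browser save the log rather than render it.

```python
def log_download_headers(dag_id, task_id, try_number):
    """Headers that make the browser save the log instead of displaying it."""
    filename = "{}_{}_try{}.log".format(dag_id, task_id, try_number)
    return {
        "Content-Disposition": "attachment; filename={}".format(filename),
        "Content-Type": "text/plain",
    }
```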