[
https://issues.apache.org/jira/browse/AIRFLOW-3279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Paul Velthuis updated AIRFLOW-3279:
-----------------------------------
Description:
The documentation on how to set up logging to a Google Cloud Storage bucket is unclear.
I am following the tutorial on the Airflow page:
[https://airflow.apache.org/howto/write-logs.html]
It is unclear to me which part of the 'loggers' section I have to adjust in
{{airflow/config_templates/airflow_local_settings.py}}.
The adjustment states:
# Update the airflow.task and airflow.task_runner blocks to be 'gcs.task'
instead of 'file.task'.
    'loggers': { 'airflow.task': ... }
However, what I find in the template is:
    'loggers': {
        'airflow.processor': {
            'handlers': ['processor'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'airflow.task': {
            'handlers': ['task'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'flask_appbuilder': {
            'handler': ['console'],
            'level': FAB_LOG_LEVEL,
            'propagate': True,
        }
    },
Since it is very important for me to get this right the first time, I hope some
clarity can be provided on what has to be adjusted in the loggers. Is it only
'airflow.task', or more?
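For what it is worth, this is how I currently read the instruction. It is a sketch only, assuming a 'gcs.task' handler has already been defined in the 'handlers' block of the same file:

    'loggers': {
        'airflow.task': {
            'handlers': ['gcs.task'],  # changed from ['task'], as the how-to says
            'level': LOG_LEVEL,
            'propagate': False,
        },
        # 'airflow.processor' and 'flask_appbuilder' left untouched (is that correct?)
    },

The how-to also mentions an airflow.task_runner block, but the template above does not contain one, so an example in the documentation of what that block should look like would help.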
Furthermore, at step 6 it is a little unclear what remote_log_conn_id means. I
would propose adding a little more information to make this clearer.
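To make that concrete, my current understanding, which I would like the documentation to confirm, is that remote_log_conn_id is simply the ID of an Airflow connection (of type Google Cloud Platform) whose credentials the GCS handler uses, and that it is set in airflow.cfg under [core] and read back through the configuration module. A small sketch with made-up names:

    # Assumed airflow.cfg entries (hypothetical bucket and connection ID):
    #   [core]
    #   remote_logging = True
    #   remote_base_log_folder = gs://my-log-bucket/logs
    #   remote_log_conn_id = my_gcp_conn
    from airflow import configuration as conf

    # The GCS log handler is expected to look up credentials via this connection ID:
    conn_id = conf.get('core', 'remote_log_conn_id')

Is that reading correct? If so, a sentence along those lines at step 6 would already remove most of the confusion.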
When using the config file at:
https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py
I receive the error `ImportError: Unable to load custom logging from
log_config.LOGGING_CONFIG due to section/key
[core/dag_processor_manager_log_location] not found in config`. Should I adjust
anything here?
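From the error message, what I think is happening is that the template on master looks up a config key that my (older) airflow.cfg does not define yet, roughly like this:

    # Sketch of my reading of the error, not a copy of the actual template code:
    from airflow import configuration as conf

    # Fails with the "section/key ... not found in config" message when the key is
    # missing from airflow.cfg; adding it under [core], for example
    #   dag_processor_manager_log_location = /path/to/logs/dag_processor_manager/dag_processor_manager.log
    # seems to be what the lookup expects.
    DAG_PROCESSOR_MANAGER_LOG_LOCATION = conf.get('core', 'DAG_PROCESSOR_MANAGER_LOG_LOCATION')

If that is right, a note in the how-to that the template from master may require config keys that older airflow.cfg files do not have would avoid this surprise.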
was:
The documentation on how to set up logging to a Google Cloud Storage bucket is unclear.
I am following the tutorial on the Airflow page:
[https://airflow.apache.org/howto/write-logs.html]
It is unclear to me which part of the 'loggers' section I have to adjust in
{{airflow/config_templates/airflow_local_settings.py}}.
The adjustment states:
# Update the airflow.task and airflow.task_runner blocks to be 'gcs.task'
instead of 'file.task'.
    'loggers': { 'airflow.task': ... }
However, what I find in the template is:
    'loggers': {
        'airflow.processor': {
            'handlers': ['processor'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'airflow.task': {
            'handlers': ['task'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'flask_appbuilder': {
            'handler': ['console'],
            'level': FAB_LOG_LEVEL,
            'propagate': True,
        }
    },
Since it is very important for me to get this right the first time, I hope some
clarity can be provided on what has to be adjusted in the loggers. Is it only
'airflow.task', or more?
Furthermore, at step 6 it is a little unclear what remote_log_conn_id means. I
would propose adding a little more information to make this clearer.
> Documentation for Google Logging unclear
> ----------------------------------------
>
> Key: AIRFLOW-3279
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3279
> Project: Apache Airflow
> Issue Type: Bug
> Components: configuration, Documentation, logging
> Reporter: Paul Velthuis
> Assignee: Fokko Driesprong
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)