[https://issues.apache.org/jira/browse/AIRFLOW-3279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel]

Paul Velthuis updated AIRFLOW-3279:
-----------------------------------
    Description: 
The documentation of how to install logging to a Google Cloud bucket is unclear.

I am now following the tutorial on the airflow page:

[https://airflow.apache.org/howto/write-logs.html]

Here I find it unclear what part of the 'logger' I have to adjust in
`airflow/config_templates/airflow_local_settings.py`.

 

The adjustment states:

    # Update the airflow.task and airflow.task_runner blocks to be 'gcs.task'
    # instead of 'file.task'.
    'loggers': {
        'airflow.task': ...,
    }

 

However, what I find in the template is:

    'loggers': {
        'airflow.processor': {
            'handlers': ['processor'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'airflow.task': {
            'handlers': ['task'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'flask_appbuilder': {
            'handler': ['console'],
            'level': FAB_LOG_LEVEL,
            'propagate': True,
        },
    },
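If I understand the quoted comment correctly, only the 'airflow.task' logger would switch to a new 'gcs.task' handler. A minimal sketch of what I assume is intended (the handler class path, folders, and bucket name here are my own guesses, not taken from the docs):

```python
# Sketch of the edited fragment of LOGGING_CONFIG.
# Assumption: only 'airflow.task' moves to a new 'gcs.task' handler;
# 'airflow.processor' and 'flask_appbuilder' stay unchanged.
LOG_LEVEL = 'INFO'

EDITED_FRAGMENT = {
    'handlers': {
        # New handler entry; class and folder values are illustrative only.
        'gcs.task': {
            'class': 'airflow.utils.log.gcs_task_handler.GCSTaskHandler',
            'base_log_folder': '/tmp/airflow/logs',           # hypothetical local folder
            'gcs_log_folder': 'gs://my-bucket/airflow/logs',  # hypothetical bucket
        },
    },
    'loggers': {
        'airflow.task': {
            'handlers': ['gcs.task'],  # was ['task']
            'level': LOG_LEVEL,
            'propagate': False,
        },
    },
}
```

If that is right, spelling it out like this in the how-to would remove the ambiguity.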

 

Since it is important for me to get this right the first time, I hope some
clarity can be provided about what has to be adjusted in the logger
configuration. Is it only the 'airflow.task' block, or more?

Furthermore, at step 6 it is unclear what `remote_log_conn_id` means. I would
propose adding a little more information to make this clearer.
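My current understanding (an assumption, not something the docs confirm) is that `remote_log_conn_id` names an Airflow connection with write access to the bucket, set in `airflow.cfg` roughly like:

```ini
[core]
# 'google_cloud_default' is a hypothetical connection ID defined under
# Admin -> Connections; it must have write access to the bucket.
remote_log_conn_id = google_cloud_default
remote_base_log_folder = gs://my-bucket/airflow/logs
```

A sentence like that in step 6 would already help a lot.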

 

When trying to deploy the new log_config.py following the instructions, I
receive the following error:

    python3.6/site-packages/airflow/logging_config.py", line 60, in configure_logging
        'Unable to load custom logging from {}'.format(logging_class_path)
    ImportError: Unable to load custom logging from log_config.LOGGING_CONFIG

If I set `logging_class_path` instead of `logging_config_class`, this error is
resolved.
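If `logging_config_class` is indeed the intended key (an assumption based on the config template), perhaps the docs should stress that `log_config.py` must also be importable, e.g. placed in `$AIRFLOW_HOME/config` with that directory on PYTHONPATH:

```ini
[core]
# Assumption: log_config.py must live on PYTHONPATH (e.g. $AIRFLOW_HOME/config)
# so that 'log_config.LOGGING_CONFIG' can be imported.
logging_config_class = log_config.LOGGING_CONFIG
```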

 

  was:
The documentation of how to install logging to a Google Cloud bucket is unclear.

I am now following the tutorial on the airflow page:

[https://airflow.apache.org/howto/write-logs.html]

Here I find it unclear what part of the 'logger' I have to adjust in the
`airflow/config_templates/airflow_local_settings.py`.

 

The adjustment states:

    # Update the airflow.task and airflow.task_runner blocks to be 'gcs.task'
    # instead of 'file.task'.
    'loggers': {
        'airflow.task': ...,
    }

 

However, what I find in the template is:

    'loggers': {
        'airflow.processor': {
            'handlers': ['processor'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'airflow.task': {
            'handlers': ['task'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'flask_appbuilder': {
            'handler': ['console'],
            'level': FAB_LOG_LEVEL,
            'propagate': True,
        },
    },

 

Since it is important for me to get this right the first time, I hope some
clarity can be provided about what has to be adjusted in the logger
configuration. Is it only the 'airflow.task' block, or more?

Furthermore, at step 6 it is unclear what `remote_log_conn_id` means. I would
propose adding a little more information to make this clearer.

 

When trying to deploy the new log_config.py following the instructions, I
receive the following error:

    python3.6/site-packages/airflow/logging_config.py", line 60, in configure_logging
        'Unable to load custom logging from {}'.format(logging_class_path)
    ImportError: Unable to load custom logging from log_config.LOGGING_CONFIG

If I set `logging_class_path` instead of `logging_config_class`, this error is
resolved.

After this I get the following error:

    ValueError: Unable to configure handler 'file.processor': [Errno 13] Permission denied: '/usr/local/airflow'

I am running in a virtualenv.

 


> Documentation for Google Logging unclear
> ----------------------------------------
>
>                 Key: AIRFLOW-3279
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3279
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: configuration, Documentation, logging
>            Reporter: Paul Velthuis
>            Assignee: Fokko Driesprong
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
