Hi Andy,

I've pushed a fix:
https://github.com/apache/incubator-airflow/pull/2721/files

I've tested locally, and this will automatically add ${AIRFLOW_HOME}/config
to the pythonpath. Setting:
logging_config_class = airflow_logging_settings.LOGGING_CONFIG

and creating ${AIRFLOW_HOME}/config/airflow_logging_settings.py with a config
as in the previous mail should work. I've tested it locally within a Python 3
docker container, and this fixed it. I also made some small changes to the
config, based on your suggestions.
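For reference, a minimal sketch of what such a module could look like. This is only illustrative (the format string and console handler here are my own simplification, not the shipped template); the real starting point is airflow/config_templates/airflow_local_settings.py, which also defines the file and remote (GCS/S3) task handlers:

```python
# airflow_logging_settings.py - minimal illustrative sketch of a custom
# logging config module. Airflow imports the LOGGING_CONFIG dict named in
# airflow.cfg (logging_config_class) and passes it to logging.config.dictConfig.
import os

# The real template reads this from airflow.cfg and uses it for the
# per-task file handler; shown here only as a placeholder.
BASE_LOG_FOLDER = os.path.expanduser('~/airflow/logs')

LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'airflow': {
            'format': '[%(asctime)s] {%(filename)s:%(lineno)d} '
                      '%(levelname)s - %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'airflow',
        },
        # In the shipped template, the GCS task handler is defined here
        # (commented out); uncomment it and point it at your GCS_LOG_FOLDER
        # for remote logging.
    },
    'loggers': {
        'airflow': {
            'handlers': ['console'],
            'level': 'INFO',
        },
    },
}
```

Since ${AIRFLOW_HOME}/config is now on the pythonpath, this module resolves when airflow.cfg refers to airflow_logging_settings.LOGGING_CONFIG.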

Cheers, Fokko




2017-10-18 16:42 GMT+02:00 Andrew Maguire <[email protected]>:

> Cool - i'll help test once something ready,
>
> So is the expected approach (easiest for a normal user wanting to just log
> to GCS) here for me to:
>
> 1. copy the "airflow_local_settings.py" from here
> <https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py>
> into {AIRFLOW_HOME}/plugins
> 2. uncomment the GCS stuff, add a line to pick up GCS_LOG_FOLDER from
> {AIRFLOW_HOME}/airflow.cfg
> 3. add an __init__.py to {AIRFLOW_HOME}/plugins
> 4. update the {AIRFLOW_HOME}/airflow.cfg accordingly
>
> If so, it seems like i'm really not doing anything of substance in steps
> 1-3 - they're just potential places for less technical users (like me :) )
> to mess up.
>
> When i first set up airflow i was really impressed with how easy it was to
> just figure out airflow.cfg and get things up and running.
>
> Maybe the plan is to cut down some of these steps once the new logging is
> more final but just wanted to give a little bit of feedback - being able to
> just set everything up via airflow.cfg is great.
>
> Cheers,
> Andy
>
> On Wed, Oct 18, 2017 at 2:51 PM Driesprong, Fokko <[email protected]>
> wrote:
>
> > Hi all,
> >
> > I'll push a fix today. I've created a Jira ticket:
> > https://issues.apache.org/jira/browse/AIRFLOW-1731
> >
> > Cheers, Fokko
> >
> > 2017-10-17 19:27 GMT+02:00 Driesprong, Fokko <[email protected]>:
> >
> > > Hi Andy,
> > >
> > > I see something weird. While debugging the code, the paths documented
> > > in the updating.md are not on the pythonpath:
> > > ['/usr/local/bin', '/usr/local/lib/python36.zip',
> > > '/usr/local/lib/python3.6', '/usr/local/lib/python3.6/lib-dynload',
> > > '/usr/local/lib/python3.6/site-packages']
> > >
> > > This is an installed instance of 1.9-alpha using the .tar.gz of Chris. I
> > > would also expect more paths appended to the PYTHONPATH. So when I try to
> > > run an example with the config provided by Andrew:
> > > logging_config_class = airflow_logging_settings.LOGGING_CONFIG
> > >
> > > I end up with the same error as Andrew:
> > > root@9e3cf03c0544:~# airflow test example_bash_operator runme_0 2017-01-01
> > > Current path:
> > > ['/usr/local/bin', '/usr/local/lib/python36.zip',
> > > '/usr/local/lib/python3.6', '/usr/local/lib/python3.6/lib-dynload',
> > > '/usr/local/lib/python3.6/site-packages']
> > > Traceback (most recent call last):
> > >   File "/usr/local/lib/python3.6/site-packages/airflow/logging_config.py",
> > > line 40, in configure_logging
> > >     logging_config = import_string(logging_class_path)
> > >   File "/usr/local/lib/python3.6/site-packages/airflow/utils/module_loading.py",
> > > line 28, in import_string
> > >     module = import_module(module_path)
> > >   File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in
> > > import_module
> > >     return _bootstrap._gcd_import(name[level:], package, level)
> > >   File "<frozen importlib._bootstrap>", line 978, in _gcd_import
> > >   File "<frozen importlib._bootstrap>", line 961, in _find_and_load
> > >   File "<frozen importlib._bootstrap>", line 948, in
> > > _find_and_load_unlocked
> > > ModuleNotFoundError: No module named 'airflow_logging_settings'
> > >
> > > During handling of the above exception, another exception occurred:
> > >
> > > Traceback (most recent call last):
> > >   File "/usr/local/bin/airflow", line 16, in <module>
> > >     from airflow import configuration
> > >   File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line
> > > 31, in <module>
> > >     from airflow import settings
> > >   File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line
> > > 148, in <module>
> > >     configure_logging()
> > >   File "/usr/local/lib/python3.6/site-packages/airflow/logging_config.py",
> > > line 52, in configure_logging
> > >     'Unable to load custom logging from {}'.format(logging_class_path)
> > > ImportError: Unable to load custom logging from
> > > airflow_logging_settings.LOGGING_CONFIG
> > >
> > >
> > > When explicitly passing the path when starting the test:
> > > root@9e3cf03c0544:~# PYTHONPATH=/root/airflow/plugins/ airflow test example_bash_operator runme_0 2017-01-01
> > > Current path:
> > > ['/usr/local/bin', '/root/airflow/plugins', '/usr/local/lib/python36.zip',
> > > '/usr/local/lib/python3.6', '/usr/local/lib/python3.6/lib-dynload',
> > > '/usr/local/lib/python3.6/site-packages']
> > > [2017-10-17 17:21:19,793] {__init__.py:45} INFO - Using executor
> > > SequentialExecutor
> > > [2017-10-17 17:21:19,835] {models.py:186} INFO - Filling up the DagBag
> > > from /root/airflow/dags
> > > [2017-10-17 17:21:19,880] {dag.py:31} WARNING - test warn
> > > [2017-10-17 17:21:19,880] {dag.py:32} INFO - test info
> > > [2017-10-17 17:21:19,902] {models.py:1165} INFO - Dependencies all met for
> > > <TaskInstance: example_bash_operator.runme_0 2017-01-01 00:00:00 [None]>
> > > [2017-10-17 17:21:19,904] {models.py:1165} INFO - Dependencies all met for
> > > <TaskInstance: example_bash_operator.runme_0 2017-01-01 00:00:00 [None]>
> > > [2017-10-17 17:21:19,905] {models.py:1375} INFO -
> > > --------------------------------------------------------------------------------
> > > Starting attempt 1 of 1
> > > --------------------------------------------------------------------------------
> > >
> > > [2017-10-17 17:21:19,905] {models.py:1396} INFO - Executing
> > > <Task(BashOperator): runme_0> on 2017-01-01 00:00:00
> > > [2017-10-17 17:21:19,916] {bash_operator.py:70} INFO - Tmp dir root
> > > location: /tmp
> > > [2017-10-17 17:21:19,919] {bash_operator.py:80} INFO - Temporary script
> > > location: /tmp/airflowtmpmco2abof//tmp/airflowtmpmco2abof/runme_0w53ja72x
> > > [2017-10-17 17:21:19,919] {bash_operator.py:82} INFO - Running command:
> > > echo "example_bash_operator__runme_0__20170101" && sleep 1
> > > [2017-10-17 17:21:19,924] {bash_operator.py:91} INFO - Output:
> > > [2017-10-17 17:21:19,925] {bash_operator.py:95} INFO -
> > > example_bash_operator__runme_0__20170101
> > > [2017-10-17 17:21:20,928] {bash_operator.py:99} INFO - Command exited with
> > > return code 0
> > >
> > > This will require a patch for 1.9 because I expected more paths to be
> > > loaded by default. I checked this before, and the paths mentioned in the
> > > updating.md were present. Does anyone have any idea?
> > >
> > > Cheers, Fokko
> > >
> > >
> > >
> > > 2017-10-17 13:04 GMT+02:00 Andrew Maguire <[email protected]>:
> > >
> > >> Hey,
> > >>
> > >> My airflow_logging_settings.py is attached. (It is the same as the
> > >> airflow_local_settings.py you mention - i think i saw both
> > >> 'airflow_local_settings.py' and 'airflow_logging_settings.py' referenced
> > >> in two separate places, although i think it's pretty much the same thing,
> > >> just slightly different filenames. I think one is mentioned in the
> > >> updating.md file and the other is what the file is actually called in
> > >> the github repo. Am guessing it does not matter so long as i reference
> > >> the correct file in airflow.cfg.)
> > >>
> > >> I had not renamed DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG in my
> > >> airflow_logging_settings.py. I renamed it, uncommented the class line in
> > >> my airflow.cfg, and still got the below error:
> > >>
> > >> andrew_maguire@airflow-server:~$ airflow list_dags
> > >> Traceback (most recent call last):
> > >>   File "/usr/local/bin/airflow", line 16, in <module>
> > >>     from airflow import configuration
> > >>   File "/usr/local/lib/python2.7/dist-packages/airflow/__init__.py", line 31, in <module>
> > >>     from airflow import settings
> > >>   File "/usr/local/lib/python2.7/dist-packages/airflow/settings.py", line 148, in <module>
> > >>     configure_logging()
> > >>   File "/usr/local/lib/python2.7/dist-packages/airflow/logging_config.py", line 47, in configure_logging
> > >>     'Unable to load custom logging from {}'.format(logging_class_path)
> > >> ImportError: Unable to load custom logging from plugins.airflow_logging_settings.LOGGING_CONFIG
> > >>
> > >> So i don't think my error even got that far.
> > >>
> > >> Happy to help debug or test or be of any use.
> > >>
> > >> Cheers,
> > >> Andy
> > >>
> > >>
> > >> On Tue, Oct 17, 2017 at 11:43 AM Driesprong, Fokko <[email protected]>
> > >> wrote:
> > >>
> > >>> Hi Andy,
> > >>>
> > >>> Thanks for reaching out. We are debugging the new logging, and input
> > from
> > >>> the community is highly appreciated.
> > >>>
> > >>> If you are using Python 2, you'll need to put an empty __init__.py in
> > >>> each of the directories, so ~/airflow/plugins/__init__.py; this file
> > >>> needs to be empty.
> > >>> Could you share your airflow_local_settings.py? If there are any GCS
> > >>> credentials, please remove them. Please check that you've renamed
> > >>> the DEFAULT_LOGGING_CONFIG variable to LOGGING_CONFIG; this might not be
> > >>> evident from the updating.md.
> > >>>
> > >>> Cheers, Fokko
> > >>>
> > >>> 2017-10-17 12:00 GMT+02:00 Andrew Maguire <[email protected]>:
> > >>>
> > >>> > Hi,
> > >>> >
> > >>> > I've updated to 1.9 but am having trouble setting the
> > >>> > logging_config_class class path in the airflow.cfg file.
> > >>> >
> > >>> > Currently i have below files in {AIRFLOW_HOME}/plugins
> > >>> >
> > >>> > [image: image.png]
> > >>> > Where airflow_logging_settings.py is just a copy of this file
> > >>> > <https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py>
> > >>> > but with the GCS stuff uncommented and a line added for GCS_LOG_FOLDER
> > >>> > to be pulled from airflow.cfg just like BASE_LOG_FOLDER
> > >>> >
> > >>> > then in my {AIRFLOW_HOME}/airflow.cfg file i have the following lines
> > >>> > to set up the log stuff:
> > >>> >
> > >>> > # The folder where airflow should store its log files
> > >>> > # This path must be absolute
> > >>> > base_log_folder = {AIRFLOW_HOME}/logs
> > >>> >
> > >>> > gcs_log_folder = gs://pmc-airflow/logs
> > >>> >
> > >>> > # Airflow can store logs remotely in AWS S3 or Google Cloud Storage.
> > >>> > # Users must supply an Airflow connection id that provides access to
> > >>> > # the storage location.
> > >>> > remote_log_conn_id = my_gcp_connection
> > >>> > encrypt_s3_logs = False
> > >>> >
> > >>> > # Logging level
> > >>> > logging_level = INFO
> > >>> >
> > >>> > # Logging class
> > >>> > # Specify the class that will specify the logging configuration
> > >>> > # This class has to be on the python classpath
> > >>> > logging_config_class = plugins.airflow_logging_settings.LOGGING_CONFIG
> > >>> >
> > >>> > # Log format
> > >>> > log_format = [%%(asctime)s] {{%%(filename)s:%%(lineno)d}} %%(levelname)s - %%(message)s
> > >>> > simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
> > >>> >
> > >>> >
> > >>> > However if i run "airflow list_dags" i now get:
> > >>> >
> > >>> > andrew_maguire@airflow-server:~/airflow$ airflow list_dags
> > >>> > Traceback (most recent call last):
> > >>> >   File "/usr/local/bin/airflow", line 16, in <module>
> > >>> >     from airflow import configuration
> > >>> >   File "/usr/local/lib/python2.7/dist-packages/airflow/__init__.py", line 31, in <module>
> > >>> >     from airflow import settings
> > >>> >   File "/usr/local/lib/python2.7/dist-packages/airflow/settings.py", line 148, in <module>
> > >>> >     configure_logging()
> > >>> >   File "/usr/local/lib/python2.7/dist-packages/airflow/logging_config.py", line 47, in configure_logging
> > >>> >     'Unable to load custom logging from {}'.format(logging_class_path)
> > >>> > ImportError: Unable to load custom logging from plugins.airflow_logging_settings.LOGGING_CONFIG
> > >>> >
> > >>> > If i go back into my airflow.cfg and comment out the line:
> > >>> >
> > >>> > logging_config_class = plugins.airflow_logging_settings.LOGGING_CONFIG
> > >>> >
> > >>> > Things work again but i'm only doing local logging.
> > >>> >
> > >>> > So i am sure i'm doing something wrong here in how i'm setting that
> > >>> > line in the airflow.cfg file.
> > >>> >
> > >>> > So what i did was:
> > >>> >
> > >>> > 1. create a folder {AIRFLOW_HOME}/plugins - i did this as updating.md
> > >>> > <https://github.com/apache/incubator-airflow/blob/master/UPDATING.md>
> > >>> > mentioned that "The logging configuration file that contains the
> > >>> > configuration needs to be on the PYTHONPATH, for example in ~/airflow/dags
> > >>> > or ~/airflow/plugins. These directories are loaded by default".
> > >>> >
> > >>> > 2. copy this file
> > >>> > <https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py>
> > >>> > into {AIRFLOW_HOME}/plugins with the GCS changes i mentioned.
> > >>> >
> > >>> > 3. create an __init__.py file in {AIRFLOW_HOME}/plugins - do i need to
> > >>> > put anything in particular in here?
> > >>> >
> > >>> > 4. update {AIRFLOW_HOME}/airflow.cfg as above.
> > >>> >
> > >>> > Can someone help me figure out where i went wrong?
> > >>> >
> > >>> > I'm hesitant to change or add anything to the pythonpath as i'm not
> > >>> > 100% sure what i'm doing. So i was hoping to just drop the logging config
> > >>> > file somewhere it would automatically be picked up. And i'm also not really
> > >>> > sure about python packages and class paths etc, so i'm kinda feeling my way
> > >>> > through it but not confident.
> > >>> >
> > >>> > Cheers
> > >>> >
> > >>> > Andy
> > >>> >
> > >>> >
> > >>> >
> > >>>
> > >>
> > >
> >
>
