import logging

logging.info()

is really bad practice (why oh why does Python allow that?)

Always do something like:

1)
from airflow.utils.log.logging_mixin import LoggingMixin

log = LoggingMixin().log

or

2)

import logging

log = logging.getLogger(__name__)
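
For example, a minimal sketch of both patterns in a custom operator (the
operator here is made up; assumes Airflow 1.9, where BaseOperator already
inherits LoggingMixin and therefore provides self.log):

import logging

from airflow.models import BaseOperator

# pattern 2: a module-level logger named after the module
log = logging.getLogger(__name__)


class MyOperator(BaseOperator):  # hypothetical example operator

    def execute(self, context):
        # self.log comes from LoggingMixin via BaseOperator and is
        # routed through the configured task log handlers
        self.log.info('running task %s', self.task_id)
        # the module-level logger works as well, under its own name
        log.info('a module-level message')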

The merge I made should keep everything the same as before, except that you
can reconfigure it now. However, it could be that we need to redirect (in
code) stdout and stderr to the standard handler. Please report back if that
is the case.
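
If it does, a rough sketch of what such a redirect could look like (the
LogWriter wrapper below is hypothetical, not something Airflow ships):

import logging
import sys


class LogWriter(object):
    # hypothetical file-like object that forwards writes to a logger

    def __init__(self, logger, level):
        self.logger = logger
        self.level = level

    def write(self, message):
        message = message.strip()
        if message:
            self.logger.log(self.level, message)

    def flush(self):
        # the logging handlers take care of flushing
        pass


log = logging.getLogger(__name__)
sys.stdout = LogWriter(log, logging.INFO)
sys.stderr = LogWriter(log, logging.ERROR)

print('this now goes through the standard handlers')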

Bolke


> On 31 Oct 2017, at 21:12, Chris Riccomini <[email protected]> wrote:
> 
> Can I just do the normal import logging, logging.info() call, or do I have
> to mess with handlers? I saw you recently merged a logging change to
> migrate the dag processor over to the new logging infrastructure. If I have
> to mess with handlers, some guidance/examples/docs on that would be good.
> 
> On Tue, Oct 31, 2017 at 1:11 PM, Chris Riccomini <[email protected]>
> wrote:
> 
>> @Bolke, I want them to end up in the DAG processor log.
>> 
>> On Tue, Oct 31, 2017 at 1:03 PM, Bolke de Bruin <[email protected]> wrote:
>> 
>>> If the print from
>>> 
>>> https://github.com/trbs/airflow-examples/blob/master/dags/example_python_operator.py
>>> 
>>> does not get into the logs anymore (we might need to update the config to
>>> redirect stdout), you can always pass a reference to the logger via
>>> op_kwargs.
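>>> 
>>> A rough sketch of that op_kwargs route (the callable and task names are
>>> made up; assumes the 1.9 PythonOperator):
>>> 
>>> import logging
>>> 
>>> from airflow.operators.python_operator import PythonOperator
>>> 
>>> log = logging.getLogger(__name__)
>>> 
>>> def my_callable(log, **kwargs):
>>>     # the logger arrives as an ordinary keyword argument
>>>     log.info('hello from inside the callable')
>>> 
>>> task = PythonOperator(
>>>     task_id='log_example',
>>>     python_callable=my_callable,
>>>     op_kwargs={'log': log},
>>>     dag=dag,  # assumes a dag object defined elsewhere
>>> )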
>>> 
>>> Bolke
>>> 
>>> 
>>> 
>>>> On 31 Oct 2017, at 20:59, Niels Zeilemaker <[email protected]> wrote:
>>>> 
>>>> How would I access the logging from within a PythonOperator python
>>>> callable?
>>>> 
>>>> That's a method that's defined in your dag, but doesn't have a
>>>> reference to the operator.
>>>> 
>>>> Niels
>>>> 
>>>> On 31 Oct 2017 at 20:56, "Bolke de Bruin" <[email protected]> wrote:
>>>> 
>>>>> Where do you want those to end up? As they are (probably) evaluated
>>>>> during parsing, they will end up in the log of the parsing process. So
>>>>> the dag processor log file or the executor (celery worker).
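>>>>> 
>>>>> For example, a top-level statement like this in a DAG file (the dag id
>>>>> is made up) runs at parse time, so its output lands in the dag
>>>>> processor log rather than in any task log:
>>>>> 
>>>>> import logging
>>>>> 
>>>>> from airflow import DAG
>>>>> 
>>>>> log = logging.getLogger(__name__)
>>>>> 
>>>>> dag = DAG('parse_time_logging_example')
>>>>> 
>>>>> # executed every time the file is parsed, not when tasks run
>>>>> log.info('parsed %s', dag.dag_id)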
>>>>> 
>>>>> Bolke
>>>>> 
>>>>>> On 31 Oct 2017, at 20:31, Chris Riccomini <[email protected]>
>>>>>> wrote:
>>>>>> 
>>>>>> How does this work for DAG logging (as opposed to task logging)? DAG
>>>>>> logging can't easily use LoggingMixin. Is there some example code
>>>>>> somewhere about what to do in DAGs?
>>>>>> 
>>>>>> On Tue, Oct 31, 2017 at 11:22 AM, Boris Tyukin <[email protected]>
>>>>>> wrote:
>>>>>> 
>>>>>>> Chris,
>>>>>>> 
>>>>>>> see my post "new logging" - apparently we cannot use logging any more
>>>>>>> and have to init a log handler.
>>>>>>> 
>>>>>>> On Tue, Oct 31, 2017 at 1:54 PM, Chris Riccomini <[email protected]>
>>>>>>> wrote:
>>>>>>> 
>>>>>>>> Correction:
>>>>>>>> 
>>>>>>>> import logging
>>>>>>>> 
>>>>>>>> class DqRowCheckOperator(BaseOperator):
>>>>>>>>     ...
>>>>>>>>     def execute(self, context):
>>>>>>>>         logging.info('foo')
>>>>>>>>     ...
>>>>>>>> 
>>>>>>>> It's an operator that we're using. The 'foo' doesn't show up in the
>>>>>>>> logs in the UI or the log file.
>>>>>>>> 
>>>>>>>> On Tue, Oct 31, 2017 at 10:47 AM, Chris Riccomini <[email protected]>
>>>>>>>> wrote:
>>>>>>>> 
>>>>>>>>> Hey all,
>>>>>>>>> 
>>>>>>>>> Just noticed when we upgraded to 1.9.0 that logging from our custom
>>>>>>>>> operators is no longer visible in the log file. Assuming this is due
>>>>>>>>> to all the log changes that were made in 1.9.0.
>>>>>>>>> 
>>>>>>>>> Our custom operators just have:
>>>>>>>>> 
>>>>>>>>> import logging
>>>>>>>>> 
>>>>>>>>> class DbDagBuilder(object):
>>>>>>>>>     ...
>>>>>>>>>     logging.info('foo')
>>>>>>>>>     ...
>>>>>>>>> 
>>>>>>>>> This was working fine in 1.8.2. What is the suggested way to make
>>>>>>>>> this work?
>>>>>>>>> 
>>>>>>>>> Cheers,
>>>>>>>>> Chris
>>>>>>>>> 
>>>>>>>> 
>>>>>>> 
>>>>> 
>>>>> 
>>> 
>>> 
>> 
