Can I just do a normal import logging / logging.info() call, or do I have
to mess with handlers? I saw you recently merged a logging change that
migrates the DAG processor over to the new logging infrastructure. If I do
have to mess with handlers, some guidance/examples/docs on that would be good.
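
To make the question concrete, this is roughly what I mean at the DAG
definition level (just a sketch; the logger name and message are made up,
and I'm assuming plain stdlib logging plus whatever handlers the new config
wires up for the DAG processor):

    import logging

    # Module-level logger in a DAG definition file. Anything logged here
    # runs at parse time, so it should end up wherever the DAG processor's
    # logging config sends it, rather than in a per-task log file.
    log = logging.getLogger(__name__)

    log.info('building DAGs')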

On Tue, Oct 31, 2017 at 1:11 PM, Chris Riccomini <criccom...@apache.org>
wrote:

> @Bolke, I want them to end up in the DAG processor log.
>
> On Tue, Oct 31, 2017 at 1:03 PM, Bolke de Bruin <bdbr...@gmail.com> wrote:
>
>> If the print from
>>
>> https://github.com/trbs/airflow-examples/blob/master/dags/example_python_operator.py
>>
>> does not get into the logs anymore (we might need to update the config to
>> redirect stdout), you can always pass a reference to it via op_kwargs.
>>
>> Bolke
>>
>>
>>
>> > On 31 Oct 2017, at 20:59, Niels Zeilemaker <ni...@zeilemaker.nl> wrote:
>> >
>> > How would I access the logging from within a PythonOperator python
>> > callable?
>> >
>> > That's a method that's defined in your DAG, but doesn't have a
>> > reference to the operator.
>> >
>> > Niels
>> >
>> > On 31 Oct 2017 at 20:56, "Bolke de Bruin" <bdbr...@gmail.com> wrote:
>> >
>> >> Where do you want those to end up? As they are (probably) evaluated
>> >> during parsing, they will end up in the log of the parsing process, so
>> >> either the DAG processor log file or the executor (Celery worker) log.
>> >>
>> >> Bolke
>> >>
>> >>> On 31 Oct 2017, at 20:31, Chris Riccomini <criccom...@apache.org> wrote:
>> >>>
>> >>> How does this work for DAG logging (as opposed to task logging)? DAG
>> >>> logging can't easily use LoggingMixin. Is there some example code
>> >>> somewhere about what to do on DAGs?
>> >>>
>> >>> On Tue, Oct 31, 2017 at 11:22 AM, Boris Tyukin <bo...@boristyukin.com
>> >
>> >>> wrote:
>> >>>
>> >>>> and have to init a log handler.
>> >>>>
>> >>>> On Tue, Oct 31, 2017 at 1:54 PM, Chris Riccomini <criccom...@apache.org> wrote:
>> >>>>
>> >>>>> Correction:
>> >>>>>
>> >>>>> import logging
>> >>>>>
>> >>>>> class DqRowCheckOperator(BaseOperator):
>> >>>>>     ...
>> >>>>>     def execute(...):
>> >>>>>         logging.info('foo')
>> >>>>>     ...
>> >>>>>
>> >>>>> It's an operator that we're using. The 'foo' doesn't show up in the
>> >>>>> logs in the UI or in the log file.
>> >>>>>
>> >>>>> On Tue, Oct 31, 2017 at 10:47 AM, Chris Riccomini <criccom...@apache.org> wrote:
>> >>>>>
>> >>>>>> Hey all,
>> >>>>>>
>> >>>>>> Just noticed when we upgraded to 1.9.0 that logging from our custom
>> >>>>>> operators is no longer visible in the log file. Assuming this is due
>> >>>>>> to all the log changes that were made in 1.9.0.
>> >>>>>>
>> >>>>>> Our custom operators just have:
>> >>>>>>
>> >>>>>> import logging
>> >>>>>>
>> >>>>>> class DbDagBuilder(object):
>> >>>>>>     ...
>> >>>>>>     logging.info('foo')
>> >>>>>>     ...
>> >>>>>>
>> >>>>>> This was working fine in 1.8.2. What is the suggested way to make
>> >>>>>> this work?
>> >>>>>>
>> >>>>>> Cheers,
>> >>>>>> Chris
>> >>>>>>
>> >>>>>
>> >>>>
>> >>
>> >>
>>
>>
>
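
If I understand the 1.9.0 changes correctly, the intended pattern for the
custom operator case discussed above is to use the logger that LoggingMixin
provides rather than the module-level logging call. A rough sketch, with
the constructor and the rest of the operator omitted:

    from airflow.models import BaseOperator

    class DqRowCheckOperator(BaseOperator):

        def execute(self, context):
            # self.log comes from LoggingMixin, so it should be picked up by
            # the task log handlers and show up in the per-task log file the
            # UI reads.
            self.log.info('foo')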
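
For logging from inside a PythonOperator's python_callable (the op_kwargs
idea Bolke mentions above), a rough sketch. The names are made up, and
'airflow.task' is, as far as I can tell, the logger the default 1.9.0
config attaches the per-task file handler to:

    import logging

    from airflow.operators.python_operator import PythonOperator

    def my_callable(log, **kwargs):
        # The callable has no reference to the operator, so the logger is
        # handed in explicitly via op_kwargs.
        log.info('hello from the python_callable')

    run_this = PythonOperator(
        task_id='log_from_callable',
        python_callable=my_callable,
        op_kwargs={'log': logging.getLogger('airflow.task')},
        dag=dag,  # assumes a DAG object named dag already exists
    )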
