Re: Rerunning task without cleaning DB?

2018-02-07 Thread David Capwell
Ananth, I am not familiar with that and couldn't find any reference in the
code, can you say more?

On Feb 7, 2018 3:02 PM, "Trent Robbins"  wrote:

> If you want to keep the rest of your history you can:
>
> 1. turn the DAG off
> 2. delete its bad tasks, delete the bad DAG run
> 3. turn the DAG on
> 4. let it backfill or hit the play button manually depending on your needs
>
> Unfortunately this does not keep the history of the task you are working
> with, but it's far better than dropping the whole database.
>
> On Wed, Feb 7, 2018 at 2:57 PM, Ananth Durai  wrote:
>
> > We can't do that, unfortunately. Airflow schedules tasks based on the
> > current state in the DB. If you would like to preserve the history, one
> > option would be to add instrumentation in airflow_local_settings.py.
> >
> > Regards,
> > Ananth.P,
> > On 5 February 2018 at 13:09, David Capwell  wrote:
> >
> > > When a production issue happens, it's common that we clear the history
> > > to get Airflow to run the task again. This is problematic since it
> > > throws away the history, making it harder to find out what really
> > > happened.
> > >
> > > Is there any way to rerun a task without deleting from the DB?
> > >
> >
>


Re: Rerunning task without cleaning DB?

2018-02-07 Thread Trent Robbins
If you want to keep the rest of your history you can:

1. turn the DAG off
2. delete its bad tasks, delete the bad DAG run
3. turn the DAG on
4. let it backfill or hit the play button manually depending on your needs

Unfortunately this does not keep the history of the task you are working
with, but it's far better than dropping the whole database.

Best,
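The manual cleanup in step 2 above amounts to deleting the offending rows from Airflow's metadata database, specifically its task_instance and dag_run tables. A rough sketch using an in-memory SQLite stand-in; the schemas below are drastically simplified and only loosely mirror Airflow's real ones, and the dag_id/task_id values are made up for illustration:

```python
import sqlite3

# Simplified stand-ins for Airflow's task_instance and dag_run metadata
# tables (the real schemas have many more columns).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE task_instance (task_id TEXT, dag_id TEXT, execution_date TEXT, state TEXT);
    CREATE TABLE dag_run (dag_id TEXT, execution_date TEXT, state TEXT);
""")
conn.executemany(
    "INSERT INTO task_instance VALUES (?, ?, ?, ?)",
    [("load", "etl", "2018-02-05", "failed"),
     ("load", "etl", "2018-02-04", "success")],
)
conn.executemany(
    "INSERT INTO dag_run VALUES (?, ?, ?)",
    [("etl", "2018-02-05", "failed"),
     ("etl", "2018-02-04", "success")],
)

# Step 2: delete only the bad task instances and the bad DAG run,
# scoped to one dag_id and execution_date, leaving the rest of the
# history untouched.
bad = ("etl", "2018-02-05")
conn.execute("DELETE FROM task_instance WHERE dag_id = ? AND execution_date = ?", bad)
conn.execute("DELETE FROM dag_run WHERE dag_id = ? AND execution_date = ?", bad)
conn.commit()

remaining = [r[0] for r in conn.execute("SELECT execution_date FROM task_instance")]
print(remaining)  # only the 2018-02-04 history survives
```

In practice you would run the equivalent deletes (or the UI's delete actions) against the real metadata DB, with the DAG paused first, as in step 1.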

Trent Robbins
Strategic Consultant for Open Source Software
Tau Informatics LLC
desk: 415-404-9452
cell: 513-233-5651
tr...@tauinformatics.com
https://www.linkedin.com/in/trentrobbins

On Wed, Feb 7, 2018 at 2:57 PM, Ananth Durai  wrote:

> We can't do that, unfortunately. Airflow schedules tasks based on the
> current state in the DB. If you would like to preserve the history, one
> option would be to add instrumentation in airflow_local_settings.py.
>
> Regards,
> Ananth.P,
> On 5 February 2018 at 13:09, David Capwell  wrote:
>
> > When a production issue happens, it's common that we clear the history
> > to get Airflow to run the task again. This is problematic since it
> > throws away the history, making it harder to find out what really
> > happened.
> >
> > Is there any way to rerun a task without deleting from the DB?
> >
>


Re: Rerunning task without cleaning DB?

2018-02-07 Thread Ananth Durai
We can't do that, unfortunately. Airflow schedules tasks based on the
current state in the DB. If you would like to preserve the history, one
option would be to add instrumentation in airflow_local_settings.py.
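One reading of the instrumentation Ananth suggests: Airflow calls a policy(task) hook from airflow_local_settings.py for each task as DAGs are parsed, which can be used to attach a failure callback that records what happened before anyone clears it from the DB. A minimal self-contained sketch; the audit-log destination, the callback's context fields, and the FakeTask stand-in are illustrative assumptions, not Airflow's API verbatim:

```python
audit_log = []  # stand-in for an external store (file, DB, metrics system)

def record_failure(context):
    """Failure callback: persist what happened before history is cleared."""
    audit_log.append({
        "task_id": context["task_id"],
        "execution_date": context["execution_date"],
        "state": "failed",
    })

def policy(task):
    """Hook Airflow invokes for every task at DAG parse time."""
    task.on_failure_callback = record_failure

# Exercise the hook with a stand-in task object; in a real deployment
# Airflow passes in the actual operator instance.
class FakeTask:
    on_failure_callback = None

t = FakeTask()
policy(t)
t.on_failure_callback({"task_id": "load", "execution_date": "2018-02-05"})
print(audit_log)
```

With something like this in place, the audit trail survives even after the task instance rows are cleared to force a rerun.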

Regards,
Ananth.P,

On 5 February 2018 at 13:09, David Capwell  wrote:

> When a production issue happens, it's common that we clear the history to
> get Airflow to run the task again. This is problematic since it throws
> away the history, making it harder to find out what really happened.
>
> Is there any way to rerun a task without deleting from the DB?
>


Re: Rerunning task without cleaning DB?

2018-02-07 Thread Sean Fern
Would rerunning a task be different from manually triggering that task?

On Mon, Feb 5, 2018 at 4:09 PM, David Capwell  wrote:

> When a production issue happens, it's common that we clear the history to
> get Airflow to run the task again. This is problematic since it throws
> away the history, making it harder to find out what really happened.
>
> Is there any way to rerun a task without deleting from the DB?
>