At Netflix we've put our plugin inside the DAGs folder and pointed the
config to it there so we can both import directly in DAGs AND update the
plugin as we go. This makes it easy to test changes to operators needed for
ongoing DAG development in the same PR.
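As a sketch of that layout (paths are hypothetical, not Netflix's actual config), the change amounts to pointing `plugins_folder` inside the DAGs folder in airflow.cfg:

```ini
# airflow.cfg sketch — hypothetical paths; plugins live inside the DAGs
# folder so one PR can change both an operator and the DAGs that use it
[core]
dags_folder = /etc/airflow/dags
plugins_folder = /etc/airflow/dags/plugins
```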
The two plugin features I've used whic
Any discoveries from this thread should probably be added to
https://airflow.apache.org/concepts.html#core-ideas
FWIW, I'm not sure if it's still this way on the master branch, but at one
point a couple useful features (e.g. was it accessing task logs from the UI
and maybe running from the UI?) we
Bravo!!! Well done!
On Fri, Apr 13, 2018 at 3:54 PM Joy Gao wrote:
> 🙌👍
>
> On Fri, Apr 13, 2018 at 11:47 AM, Naik Kaxil wrote:
>
> > Couldn't agree more. Thanks Fokko
> >
> > On 13/04/2018, 17:56, "Maxime Beauchemin"
> > wrote:
> >
> > Hey all,
> >
> > I wanted to point out the amaz
It would be a good webui update to add a multiselect option to clear by
task state. Or maybe clear anything but running/success by default and add
an "include success" option.
On Fri, Apr 27, 2018 at 06:47 Maxime Beauchemin
wrote:
> https://airflow.apache.org/cli.html#clear
>
> `airflow clear myd
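For context, a hedged sketch of the 1.x-era `clear` CLI this refers to (the DAG name is made up; check `airflow clear --help` on your version for the exact flags):

```shell
# Clear only failed task instances of a DAG over a date range,
# skipping the interactive confirmation prompt.
airflow clear my_dag \
    --only_failed \
    --start_date 2018-04-01 \
    --end_date 2018-04-27 \
    --no_confirm
```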
Bravo!
On Tue, May 1, 2018, 12:57 Driesprong, Fokko wrote:
> Awesome! Looking forward to giving it a spin! Great job guys!
>
> Cheers!
>
> 2018-05-01 21:26 GMT+02:00 Arthur Wiedmer :
>
> > Feng,
> >
> > We are really grateful for the work Googlers have put into the
> > project, including impr
^this
On Sat, May 5, 2018, 15:37 Marcin Szymański wrote:
> Hi Bolke
>
> Great stuff. Pieces of this remind me of work I have done for one
> organization. However, in that case, instead of defining base classes like
> Dataset from scratch, I extended objects from SQLAlchemy, such as Metadata,
> Tab
Could we adopt some sort of merge-blocking hook that prohibits merge of PRs
failing unit tests? My team has such an approach at work and it reduces the
volume of breakage quite a bit. The only time we experience problems now is
where our unit test coverage is poor, but we improve the coverage every
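One local way to enforce this (a sketch of the general technique, not the project's actual setup) is a git pre-push hook that refuses to push when the unit tests fail; the server-side equivalent is a branch-protection rule requiring the CI test job to pass before merge:

```shell
#!/bin/sh
# .git/hooks/pre-push (hypothetical hook; paths and test runner assumed)
python -m pytest tests/ || {
    echo "Unit tests failed; aborting push." >&2
    exit 1
}
```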
Don't you need to preserve the task objects? Your implementation overwrites
each task with its successor, so only the last one survives, despite your
print statements. Try building a list (or dict) of tasks, like:
tasks = []  # only at the top, outside the loop
for file in glob('dags/snowsql/create/udf/*.sql'):
    print("FILE: %s" % file)
    tasks.append(...)  # build the task for this file and append it here
Re: [Brian Greene] "How does filename matter? Frankly I wish the filename
was REQUIRED to be the dag name so people would quit confusing themselves
by mismatching them !"
FWIW in the Facebook predecessor to airflow, the file path/name WAS the dag
name. E.g. if your dag resided in best_team/new_pr
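That convention is easy to reproduce in plain Python (a sketch; the helper name is made up, and real Airflow does not enforce this):

```python
import os

def dag_id_from_path(path):
    # Hypothetical helper: use the DAG file's basename (minus extension) as
    # the dag_id, so the filename and DAG name can never drift apart.
    return os.path.splitext(os.path.basename(path))[0]

print(dag_id_from_path("dags/example_team/daily_report.py"))  # → daily_report
```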
Exactly that Ash! This awesome community never fails to impress. Thanks
folks!
On Tue, Sep 25, 2018 at 3:29 AM Ash Berlin-Taylor wrote:
>
> > On 24 Sep 2018, at 23:12, Alex Tronchin-James wrote:
> >
> > Re: [Brian Greene] "H
"It's running on my phone"
On Sat, Oct 6, 2018, 10:59 Nehil Jain wrote:
> 😂
>
> On Sat, Oct 6, 2018 at 6:42 AM Jarek Potiuk
> wrote:
>
> > Hey everyone,
> >
> > In case you get asked to explain what Airflow is all about - as of a few
> > days ago, it's super simple to explain.
> >
> > It's simply