Question for the community. Did some hunting around and didn't see any
compelling answers. SO link:
https://stackoverflow.com/questions/49290546/how-to-set-up-a-dag-when-downstream-task-definitions-depend-on-upstream-outcomes
--
*Aaron Polhamus*
*Chief Technology Officer *
Cel (México): +52 (55) 1951-5612
Cell (USA): +1 (206) 380-3948
It doesn't seem like there should be a problem. Previously I was using the
basic pip install before switching to the apache-airflow repo.
Here's a screenshot of the Web UI. Everything seems to be working alright,
except the top and bottom DAGs, which have their schedule interval set to
@once and aren't running. SO link:
https://stackoverflow.com/questions/48692703/a-specific-dag-stops-running-in-airflow-even-when-the-scheduler-is-running-fine
Happy, of course, to just get an answer here as well. Many thanks in
advance.
Cheers,
Aaron
at the top of the graph isn't
updated by what happens inside the worker.
Has anyone else worked with this? Is this something that can be solved with
XCom?
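For what it's worth, XCom is designed for exactly this kind of small hand-off: an upstream task pushes a serializable value into the metadata database, and a downstream task pulls it at run time. Here is a stdlib-only sketch of that contract; the dict stands in for Airflow's metadata DB, and the function and task names (including `get_id_creds`) are illustrative rather than the real Airflow API:

```python
# Sketch of the XCom hand-off pattern. XCOM_STORE stands in for Airflow's
# metadata database; these helpers are illustrative, not the Airflow API.
XCOM_STORE = {}

def xcom_push(task_id, key, value):
    # Upstream task records a small, serializable value.
    XCOM_STORE[(task_id, key)] = value

def xcom_pull(task_id, key):
    # Downstream task reads that value at run time, not at DAG-parse time.
    return XCOM_STORE[(task_id, key)]

def get_id_creds():
    # Hypothetical upstream task: push the ID list instead of (or alongside)
    # writing it to local disk, so the state survives across workers.
    ids = ["id_001", "id_002", "id_003"]
    xcom_push("get_id_creds", "ids", ids)

def run_profile_job():
    # Hypothetical downstream task: pull the IDs the upstream task pushed.
    return [f"profiled:{i}" for i in xcom_pull("get_id_creds", "ids")]

get_id_creds()
print(run_profile_job())  # ['profiled:id_001', 'profiled:id_002', 'profiled:id_003']
```

In Airflow itself the equivalent calls are `ti.xcom_push(...)` and `ti.xcom_pull(task_ids=...)` on the task instance. One caveat: XCom moves data between task *runs*, so it helps with values the worker computes, but it can't by itself change the shape of the DAG at parse time.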
(get_id_creds)
Thanks in advance!
If I can get this working, I can run multiple data profile generation
jobs in parallel on the same machine.
I have two questions here:
1. Where is this error coming from and how can I get this script working?
2. Task 1 writes a list of IDs to hard disk, and those are then read back
into the DAG.
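One wrinkle worth flagging on (2): the scheduler evaluates the DAG-definition file at parse time, before task 1 has ever run, so any code that builds tasks from the IDs file has to tolerate the file not existing yet. A minimal stdlib-only sketch of that read-back step, with a made-up file path and task-naming scheme for illustration:

```python
import json
from pathlib import Path

# Hypothetical location where task 1 drops the ID list.
IDS_FILE = Path("/tmp/profile_ids.json")

def load_ids(path=IDS_FILE):
    # The DAG file is parsed before task 1 has run, so tolerate a missing
    # file instead of raising at import time.
    if not path.exists():
        return []
    return json.loads(path.read_text())

def build_profile_task_ids(ids):
    # Stand-in for per-ID operator creation: one data-profile job per ID,
    # which is what allows the jobs to run in parallel on one machine.
    return [f"profile_{i}" for i in ids]

# At parse time this may be an empty list; the per-ID tasks appear on the
# next scheduler parse after task 1 has written the file.
task_ids = build_profile_task_ids(load_ids())
```

If the script is erroring at import, an unguarded read of a not-yet-written file at parse time is a common culprit with this pattern, though I can't tell from the fragment whether that's the error here. The trade-off of the guard is that dynamically generated tasks lag one scheduler parse behind the run that produces the file.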