Can you please tell me more about your environment? In particular, how do
you sync your DAGs / logs from the Celery workers? I know one setup where
I've seen I/O errors when writing to a log...
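
To make the question concrete: for logs that is usually either Airflow's
remote logging, roughly like the airflow.cfg sketch below (the bucket name
and connection id are just examples, not from your setup), or some external
rsync / shared-volume mechanism; for DAGs, e.g. a git pull cron job or an
NFS mount on every worker:

    [core]
    remote_logging = True
    remote_base_log_folder = s3://example-airflow-logs
    remote_log_conn_id = aws_default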

T.


On Wed, Feb 19, 2020 at 10:56 AM Mehmet Ersoy
<[email protected]> wrote:
>
> Hello Friends,
>
> I'm new to Airflow and I'm using the Airflow Celery executor with a Postgres 
> backend and Redis as the message queue service. For now, there are 4 workers, 
> 1 scheduler and 1 web server.
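> Roughly, the relevant parts of my airflow.cfg look like this (the host names 
> and credentials here are placeholders, not the real ones):
>
>     [core]
>     executor = CeleryExecutor
>     sql_alchemy_conn = postgresql+psycopg2://airflow:***@pg-host:5432/airflow
>
>     [celery]
>     broker_url = redis://redis-host:6379/0
>     result_backend = db+postgresql://airflow:***@pg-host:5432/airflow
>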
> I have been preparing parallel Sqoop jobs in my daily DAGs.
> When I schedule a daily DAG, some task instances often turn to failed right 
> after starting, without ever reaching the running state. Then I can't see 
> their logs; the log is blank. And when I run such a task manually, it runs 
> without any problem.
> I don't really understand whether there is something inconsistent in the way 
> I have written my DAGs.
> I have attached one of my DAGs.
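>
> In case the attachment gets stripped by the list, the DAG is essentially the 
> pattern below. This is a cut-down illustration; the table names, connection 
> string and paths are made up, not the real ones:
>
>     from datetime import datetime, timedelta
>
>     from airflow import DAG
>     from airflow.operators.bash_operator import BashOperator
>
>     default_args = {
>         'owner': 'etl',
>         'retries': 1,
>         'retry_delay': timedelta(minutes=5),
>     }
>
>     with DAG(dag_id='daily_sqoop_import',
>              default_args=default_args,
>              start_date=datetime(2020, 1, 1),
>              schedule_interval='@daily',
>              catchup=False) as dag:
>         # One Sqoop import per table; the tasks have no dependencies
>         # between them, so the Celery workers pick them up in parallel.
>         for table in ['orders', 'customers', 'payments']:
>             BashOperator(
>                 task_id='sqoop_import_' + table,
>                 bash_command=('sqoop import '
>                               '--connect jdbc:mysql://db-host/shop '
>                               '--table ' + table + ' '
>                               '--target-dir /data/raw/' + table),
>             )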
>
> Thank you in advance,
> Best Regards.
> Mehmet.
