I'm using Airflow version 1.10.6.
Yes, this problem occurs in all of my parallel DAGs, and I attached one of
my DAGs in my first mail. In addition, the graph view of my DAG is as follows:

[image: image.png]
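
In case the inline image doesn't come through on the list, here is a rough,
simplified sketch of the structure (one dummy start task fanning out to
parallel Sqoop import tasks). The DAG name, connection string and table list
below are just placeholders, not the real values from my DAG:

# Rough, simplified sketch only -- connection string, table list and schedule
# are placeholders, not the real values from my DAG.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_sqoop_import",          # placeholder name
    default_args=default_args,
    start_date=datetime(2020, 2, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    start = DummyOperator(task_id="start")

    # One Sqoop import per table; all of them fan out from "start" and run
    # in parallel on the Celery workers.
    for table in ["table_a", "table_b", "table_c"]:    # placeholder table list
        cmd = (
            "sqoop import --connect jdbc:mysql://db-host/source_db "
            "--table {tbl} --target-dir /data/raw/{{{{ ds }}}}/{tbl} -m 4"
        ).format(tbl=table)                             # {{ ds }} is left for Jinja

        sqoop_task = BashOperator(
            task_id="sqoop_import_{}".format(table),
            bash_command=cmd,
        )
        start >> sqoop_task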

Thanks.


Tomasz Urbaszek <[email protected]> wrote on Wed, 19 Feb 2020 at 15:09:

> What version of Airflow do you use?
>
> The mentioned DAG is missing, but I'm curious about the "parallel" jobs you
> are running :) Does this problem occur with only one DAG?
>
> T.
>
> On Wed, Feb 19, 2020 at 12:58 PM Mehmet Ersoy <[email protected]>
> wrote:
>
> > Hi Tomasz,
> > For now, I'm syncing the DAGs manually by copying them across the Airflow
> > hosts, so there is no git repository etc.
> > In addition, my configs related to parallelism are as follows:
> >
> > # How many processes CeleryExecutor uses to sync task state.
> > # 0 means to use max(1, number of cores - 1) processes.
> > sync_parallelism = 0
> >
> > # The amount of parallelism as a setting to the executor. This defines
> > # the max number of task instances that should run simultaneously
> > # on this airflow installation
> > parallelism = 128
> >
> > # The number of task instances allowed to run concurrently by the scheduler
> > dag_concurrency = 128
> >
> > # The concurrency that will be used when starting workers with the
> > # "airflow worker" command. This defines the number of task instances that
> > # a worker will take, so size up your workers based on the resources on
> > # your worker box and the nature of your tasks
> > worker_concurrency = 32
> >
> > # The maximum number of active DAG runs per DAG
> > max_active_runs_per_dag = 32
> >
> > Thank you,
> >
> > Best regards.
> >
> > Tomasz Urbaszek <[email protected]> wrote on Wed, 19 Feb 2020 at 13:12:
> >
> > > Can you please tell me more about your environment? Especially, how do
> > > you sync your DAGs / logs from the Celery workers? I know one setup where
> > > I've seen an I/O error when writing to a log...
> > >
> > > T.
> > >
> > >
> > > On Wed, Feb 19, 2020 at 10:56 AM Mehmet Ersoy
> > > <[email protected]> wrote:
> > > >
> > > > Hello Friends,
> > > >
> > > > I'm new to Airflow and I'm using the Airflow Celery executor with a
> > > > Postgres backend and Redis as the message queue service. For now, there
> > > > are 4 workers, 1 scheduler and 1 web server.
> > > > I have been preparing parallel Sqoop jobs in my daily DAGs.
> > > > When I schedule a daily DAG, some task instances often turn to the
> > > > failed state right after the started state, without ever entering the
> > > > running state. Then I can't see their logs; they are blank. And when I
> > > > run such a task manually, it runs without any problem.
> > > > I don't really understand whether there is an inconsistency in the way
> > > > I wrote my DAG.
> > > > I have attached one of my DAGs.
> > > >
> > > > Thank you in advance,
> > > > Best Regards.
> > > > Mehmet.
> > >
> >
> >
> > --
> > Mehmet ERSOY
> >
>
>
> --
>
> Tomasz Urbaszek
> Polidea <https://www.polidea.com/> | Software Engineer
>
> M: +48 505 628 493
> E: [email protected]
>
> Unique Tech
> Check out our projects! <https://www.polidea.com/our-work>
>


-- 
Mehmet ERSOY
