This seems to work, with a tiny caveat: the command to start the Celery
worker is `airflow celery worker` rather than `airflow worker` (a sketch of
the full sequence follows the steps below):

   1. Launch Breeze with `--integration rabbitmq` or `--integration redis`
   2. Set the executor to Celery with `export
   AIRFLOW__CORE__EXECUTOR=CeleryExecutor`
   3. Start a Celery worker with `airflow celery worker -D`
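
A minimal sketch of that sequence inside the Breeze shell (assuming the
integration was enabled at launch; `-D` daemonizes the process, and the
flower step is the optional monitoring tool Jarek mentioned):

    # Breeze launched with --integration rabbitmq (or --integration redis)
    export AIRFLOW__CORE__EXECUTOR=CeleryExecutor
    airflow celery worker -D    # note: "airflow celery worker", not "airflow worker"
    airflow celery flower -D    # optional: Celery monitoring UI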

Would I be crazy to ask if Breeze should be rewritten in Python? Arguments
in support, I think, might be (a rough sketch of what a Python entry point
could look like follows this list)...

   1. Easier modularization/extensibility
   2. I'd guess that more people are comfortable in Python than Bash
   3. At least one of the macOS issues could be resolved
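
To make the first point concrete, here is a purely hypothetical sketch of
what a Python entry point could look like, reusing Jarek's proposed
--use-celery-executor switch. The use of click and everything else in the
sketch is my own assumption, not an actual design:

    # Hypothetical sketch only -- not an actual Breeze design.
    import click

    @click.group()
    def breeze():
        """Breeze development environment CLI."""

    @breeze.command("start-airflow")
    @click.option("--use-celery-executor", is_flag=True,
                  help="Also start the rabbitmq/redis integration and a Celery worker.")
    def start_airflow(use_celery_executor: bool):
        # The bash logic (docker-compose calls, tmux setup, etc.) would live
        # here as ordinary Python functions -- the modularity argument above.
        click.echo(f"Celery executor enabled: {use_celery_executor}")

    if __name__ == "__main__":
        breeze()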

Thoughts?


On Mon, Mar 22, 2021 at 5:51 PM Jarek Potiuk <[email protected]> wrote:

> Actually it is not that straightforward, but possibly we can make it work
> much more easily.
>
> In order to make the Celery Executor work you need to do a bit more (but it
> should be easy to add as an option to Breeze):
>
> * you need to start rabbitmq or redis as an integration (`--integration
> rabbitmq` or `--integration redis`)
> * you need to start worker(s) (`airflow worker` in the background)
> * you might optionally want to start flower (the Celery monitoring tool)
>
> So maybe we could add an extra switch to the start-airflow command
> (--use-celery-executor) that could set those integrations and start the
> worker/flower in addition to the webserver/scheduler already running in tmux?
>
> WDYT? Maybe Ryan you can check if my recipe works? I could then add it as
> an option.
>
> BTW. We already have a number of CeleryExecutor tests that use the
> integrations, so Breeze has all that's needed:
>
>
> https://github.com/apache/airflow/blob/master/tests/executors/test_celery_executor.py#L109
>
>  J.
>
> On Mon, Mar 22, 2021 at 2:24 PM Ryan Hatter <[email protected]> wrote:
>
>> Hmm, maybe I was just getting twisted around with docker then. I’ll have
>> a look at what you shared.
>>
>> Thanks Bin :)
>>
>> On Mar 22, 2021, at 01:52, Xinbin Huang <[email protected]> wrote:
>>
>>
>> Hi Ryan,
>>
>> I believe breeze already provides tools for you to do that:
>>
>> 1. Would it make sense to allow developers to choose the executor for
>> their Breeze environment?
>>
>> You can set up environment variables and any other custom setup you want
>> in the file /files/airflow-breeze-config/variables.env. To set up the
>> CeleryExecutor, you just need to put `export
>> AIRFLOW__CORE__EXECUTOR=CeleryExecutor` in the file.
>>
>> 2. Developing using Docker
>> <https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html>
>> would require an update to the image each time you want to make a change
>> to the codebase
>>
>> Breeze automatically mounts the local sources to the container unless you
>> explicitly skip it with the flag `--skip-mounting-local-sources`. You can
>> find more details here:
>> https://github.com/apache/airflow/blob/master/BREEZE.rst#mounting-local-sources-to-breeze
>>
>> Best
>> Bin
>>
>>
>> On Sun, Mar 21, 2021 at 5:32 PM Ryan Hatter <[email protected]>
>> wrote:
>>
>>> I recently had some trouble trying to fix a bug in the CeleryExecutor
>>> <https://github.com/apache/airflow/pull/14883>. The code change was
>>> small, but it was really difficult to set up a development environment
>>> using the CeleryExecutor. I ultimately had to muck around with the test
>>> case that covers this situation, default_airflow.cfg, and
>>> default_celery.py. Developing using Docker
>>> <https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html>
>>> would require an update to the image each time you want to make a change
>>> to the codebase (or maybe `exec`ing into the relevant container?), which
>>> is a pain.
>>>
>>> This led me to two questions:
>>>
>>>    1. Would it make sense to allow developers to choose the executor
>>>    for their Breeze environment?
>>>    2. If not, how do folks test out changes they make to the
>>>    CeleryExecutor or KubernetesExecutor?
>>>
>>> Thanks!
>>> Ryan
>>>
>>>
>>>
>
> --
> +48 660 796 129
>
