Repository: incubator-airflow
Updated Branches:
  refs/heads/master be54f0485 -> 815270bb5
[AIRFLOW-1911] Rename celeryd_concurrency

There are still celeryd_concurrency occurrences left in the code; these need
to be renamed to worker_concurrency to make the config consistent with Celery.

Closes #2870 from Fokko/AIRFLOW-1911-update-airflow-config


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/815270bb
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/815270bb
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/815270bb

Branch: refs/heads/master
Commit: 815270bb56255e0e0653f7bbfeb7d34d2e8c780b
Parents: be54f04
Author: Fokko Driesprong <[email protected]>
Authored: Tue Dec 12 13:47:55 2017 +0100
Committer: Fokko Driesprong <[email protected]>
Committed: Tue Dec 12 13:47:55 2017 +0100

----------------------------------------------------------------------
 airflow/bin/cli.py                        | 2 +-
 airflow/config_templates/default_test.cfg | 4 ++--
 docs/configuration.rst                    | 4 ++--
 scripts/ci/airflow_travis.cfg             | 2 +-
 4 files changed, 6 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/815270bb/airflow/bin/cli.py
----------------------------------------------------------------------
diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 812977b..3e954dc 100755
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1394,7 +1394,7 @@ class CLIFactory(object):
             ("-c", "--concurrency"),
             type=int,
             help="The number of worker processes",
-            default=conf.get('celery', 'celeryd_concurrency')),
+            default=conf.get('celery', 'worker_concurrency')),
         'celery_hostname': Arg(
             ("-cn", "--celery_hostname"),
             help=("Set the hostname of celery worker "

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/815270bb/airflow/config_templates/default_test.cfg
----------------------------------------------------------------------
diff --git a/airflow/config_templates/default_test.cfg b/airflow/config_templates/default_test.cfg
index b065313..1e8a7df 100644
--- a/airflow/config_templates/default_test.cfg
+++ b/airflow/config_templates/default_test.cfg
@@ -72,10 +72,10 @@ smtp_mail_from = [email protected]
 
 [celery]
 celery_app_name = airflow.executors.celery_executor
-celeryd_concurrency = 16
+worker_concurrency = 16
 worker_log_server_port = 8793
 broker_url = sqla+mysql://airflow:airflow@localhost:3306/airflow
-celery_result_backend = db+mysql://airflow:airflow@localhost:3306/airflow
+result_backend = db+mysql://airflow:airflow@localhost:3306/airflow
 flower_host = 0.0.0.0
 flower_port = 5555
 default_queue = default
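The cli.py and default_test.cfg hunks above both switch from celeryd_concurrency to the
worker_concurrency key. As a minimal, standalone sketch (not part of this commit) of how
the renamed key can be read, using only Python's standard configparser and an assumed
airflow.cfg path rather than Airflow's own configuration module:

    # Illustrative only: read the renamed [celery] key with configparser.
    # The config path, fallback chain and default of 16 are assumptions,
    # not Airflow's actual loading logic.
    import configparser

    parser = configparser.ConfigParser()
    parser.read("airflow.cfg")  # hypothetical path

    # Prefer the new key; fall back to the old name for configs written
    # before the rename, then to a default.
    concurrency = parser.getint(
        "celery", "worker_concurrency",
        fallback=parser.getint("celery", "celeryd_concurrency", fallback=16))
    print(concurrency)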
http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/815270bb/docs/configuration.rst
----------------------------------------------------------------------
diff --git a/docs/configuration.rst b/docs/configuration.rst
index 51984e0..61f5511 100644
--- a/docs/configuration.rst
+++ b/docs/configuration.rst
@@ -35,7 +35,7 @@ You can also derive the connection string at run time by appending ``_cmd`` to t
     [core]
     sql_alchemy_conn_cmd = bash_command_to_run
 
-But only three such configuration elements namely sql_alchemy_conn, broker_url and celery_result_backend can be fetched as a command. The idea behind this is to not store passwords on boxes in plain text files. The order of precedence is as follows -
+But only three such configuration elements namely sql_alchemy_conn, broker_url and result_backend can be fetched as a command. The idea behind this is to not store passwords on boxes in plain text files. The order of precedence is as follows -
 
 1. environment variable
 2. configuration in airflow.cfg
@@ -159,7 +159,7 @@ Some caveats:
 
 - Make sure to use a database backed result backend
 - Make sure to set a visibility timeout in [celery_broker_transport_options] that exceeds the ETA of your longest running task
-- Tasks can and consume resources, make sure your worker as enough resources to run `celeryd_concurrency` tasks
+- Tasks can and consume resources, make sure your worker as enough resources to run `worker_concurrency` tasks
 
 Scaling Out with Dask
 '''''''''''''''''''''

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/815270bb/scripts/ci/airflow_travis.cfg
----------------------------------------------------------------------
diff --git a/scripts/ci/airflow_travis.cfg b/scripts/ci/airflow_travis.cfg
index ee29148..c1ced74 100644
--- a/scripts/ci/airflow_travis.cfg
+++ b/scripts/ci/airflow_travis.cfg
@@ -43,7 +43,7 @@ smtp_mail_from = [email protected]
 
 [celery]
 celery_app_name = airflow.executors.celery_executor
-celeryd_concurrency = 16
+worker_concurrency = 16
 worker_log_server_port = 8793
 broker_url = amqp://guest:guest@localhost:5672/
 result_backend = db+mysql://root@localhost/airflow
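For reference, the docs/configuration.rst hunk above also touches the paragraph describing
the _cmd mechanism, by which sql_alchemy_conn, broker_url and result_backend can be fetched
by running a command instead of being stored in plain text. A rough, standalone sketch of
that idea follows; the section, key names and shell command are illustrative assumptions,
and this is not Airflow's implementation:

    # Sketch of the "_cmd" idea from docs/configuration.rst: if a key such as
    # broker_url_cmd is set, run it and use its output; otherwise fall back to
    # the plain broker_url value.
    import configparser
    import subprocess

    def resolve(parser, section, key):
        cmd = parser.get(section, key + "_cmd", fallback=None)
        if cmd:
            # Fetch the value by running the configured shell command.
            return subprocess.check_output(cmd, shell=True).decode().strip()
        return parser.get(section, key, fallback=None)

    parser = configparser.ConfigParser()
    parser.read_string("""
    [celery]
    broker_url_cmd = echo amqp://guest:guest@localhost:5672/
    """)
    print(resolve(parser, "celery", "broker_url"))

Run directly, the sketch prints the broker URL produced by the configured command.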
