potiuk commented on code in PR #32669:
URL: https://github.com/apache/airflow/pull/32669#discussion_r1267143506
##########
airflow/config_templates/default_airflow.cfg:
##########
@@ -16,1507 +15,19 @@
# specific language governing permissions and limitations
# under the License.
-# This is the template for Airflow's default configuration. When Airflow is
-# imported, it looks for a configuration file at $AIRFLOW_HOME/airflow.cfg. If
-# it doesn't exist, Airflow uses this template to generate it by replacing
-# variables in curly braces with their global values from configuration.py.
-
-# Users should not modify this file; they should customize the generated
-# airflow.cfg instead.
-
-
-# ----------------------- TEMPLATE BEGINS HERE -----------------------
-
-[core]
-# The folder where your airflow pipelines live, most likely a
-# subfolder in a code repository. This path must be absolute.
-dags_folder = {AIRFLOW_HOME}/dags
-
-# Hostname by providing a path to a callable, which will resolve the hostname.
-# The format is "package.function".
#
-# For example, default value "airflow.utils.net.getfqdn" means that result from patched
-# version of socket.getfqdn() - see https://github.com/python/cpython/issues/49254.
+# NOTE IF YOU ARE LOOKING FOR DEFAULT CONFIGURATION FILE HERE !!! LOOK NO MORE. READ NOTE BELOW!
#
-# No argument should be required in the function specified.
-# If using IP address as hostname is preferred, use value ``airflow.utils.net.get_host_ip_address``
-hostname_callable = airflow.utils.net.getfqdn
-
-# A callable to check if a python file has airflow dags defined or not
-# with argument as: `(file_path: str, zip_file: zipfile.ZipFile | None = None)`
-# return True if it has dags otherwise False
-# If this is not provided, Airflow uses its own heuristic rules.
-might_contain_dag_callable = airflow.utils.file.might_contain_dag_via_default_heuristic
-
-# Default timezone in case supplied date times are naive
-# can be utc (default), system, or any IANA timezone string (e.g. Europe/Amsterdam)
-default_timezone = utc
-
-# The executor class that airflow should use. Choices include
-# ``SequentialExecutor``, ``LocalExecutor``, ``CeleryExecutor``, ``DaskExecutor``,
-# ``KubernetesExecutor``, ``CeleryKubernetesExecutor`` or the
-# full import path to the class when using a custom executor.
-executor = SequentialExecutor
-
-# The auth manager class that airflow should use. Full import path to the auth manager class.
-auth_manager = airflow.auth.managers.fab.fab_auth_manager.FabAuthManager
-
-# This defines the maximum number of task instances that can run concurrently per scheduler in
-# Airflow, regardless of the worker count. Generally this value, multiplied by the number of
-# schedulers in your cluster, is the maximum number of task instances with the running
-# state in the metadata database.
-parallelism = 32
-
-# The maximum number of task instances allowed to run concurrently in each DAG. To calculate
-# the number of tasks that is running concurrently for a DAG, add up the number of running
-# tasks for all DAG runs of the DAG. This is configurable at the DAG level with ``max_active_tasks``,
-# which is defaulted as ``max_active_tasks_per_dag``.
+# This file used to have something that was similar to the default Airflow configuration but it was
+# really just a template. It was used to generate the final configuration and it was confusing
+# if you copied it to your configuration and some of the values were wrong.
Review Comment:
reworded slightly.