GitHub user martintalero added a comment to the discussion: Unable to login because of timezone error

Same issue here. 

____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
HTTP Request: GET https://apacheairflow.gateway.scarf.sh/scheduler?version=2.10.2&python_version=3.11&platform=Linux&arch=x86_64&database=postgresql&db_version=14.4&executor=CeleryExecutor "HTTP/1.1 200 OK"
/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/module_loading.py:42 DeprecationWarning: The `airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG` class is deprecated. Please use `'airflow.providers.celery.executors.default_celery.DEFAULT_CELERY_CONFIG'`. The `celery` provider must be >= 3.3.0 for that.
/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/providers_configuration_loader.py:55 AirflowProviderDeprecationWarning: The celery.CELERY_APP_NAME configuration uses deprecated package name: 'airflow.executors.celery_executor'. Change it to `airflow.providers.celery.executors.celery_executor`, and update the `-app` flag in your Celery Health Checks to use `airflow.providers.celery.executors.celery_executor.app`.
Loaded executor: CeleryExecutor
Starting the scheduler
Processing each file at most -1 times
Launched DagFileProcessorManager with pid: 31
Adopting or resetting orphaned tasks for active dag runs
Configured default timezone UTC
--- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.11/logging/handlers.py", line 73, in emit
    if self.shouldRollover(record):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/logging/handlers.py", line 196, in shouldRollover
    msg = "%s\n" % self.format(record)
                   ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/logging/__init__.py", line 953, in format
    return fmt.format(record)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/logging/__init__.py", line 689, in format
    record.asctime = self.formatTime(record, self.datefmt)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/log/timezone_aware.py", line 44, in formatTime
    dt = timezone.from_timestamp(record.created, tz="local")
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/timezone.py", line 319, in from_timestamp
    tz = local_timezone()
         ^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/timezone.py", line 301, in local_timezone
    return pendulum.tz.local_timezone()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/pendulum/tz/__init__.py", line 51, in local_timezone
    return get_local_timezone()
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/pendulum/tz/local_timezone.py", line 33, in get_local_timezone
    tz = _get_system_timezone()
         ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/pendulum/tz/local_timezone.py", line 61, in _get_system_timezone
    return _get_unix_timezone()
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/pendulum/tz/local_timezone.py", line 179, in _get_unix_timezone
    return Timezone(etctz.replace(" ", "_"))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/airflow/venv/lib/python3.11/site-packages/pendulum/tz/timezone.py", line 65, in __new__
    return super().__new__(cls, key)  # type: ignore[call-arg]
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/zoneinfo/_tzpath.py", line 67, in find_tzfile
    _validate_tzfile_path(key)
  File "/usr/lib/python3.11/zoneinfo/_tzpath.py", line 81, in _validate_tzfile_path
    raise ValueError(
ValueError: ZoneInfo keys may not be absolute paths, got: /UTC
Call stack:
  File "/opt/airflow/venv/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/__main__.py", line 62, in main
    args.func(args)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/cli/cli_config.py", line 49, in command
    return func(*args, **kwargs)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/cli.py", line 115, in wrapper
    return f(*args, **kwargs)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/providers_configuration_loader.py", line 55, in wrapped_function
    return func(*args, **kwargs)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 59, in scheduler
    run_command_with_daemon_option(
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/cli/commands/daemon_utils.py", line 86, in run_command_with_daemon_option
    callback()
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 62, in <lambda>
    callback=lambda: _run_scheduler_job(args),
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 48, in _run_scheduler_job
    run_job(job=job_runner.job, execute_callable=job_runner._execute)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/utils/session.py", line 97, in wrapper
    return func(*args, session=session, **kwargs)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/jobs/job.py", line 421, in run_job
    return execute_job(job, execute_callable=execute_callable)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/jobs/job.py", line 450, in execute_job
    ret = execute_callable()
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/jobs/scheduler_job_runner.py", line 980, in _execute
    self.processor_agent.start()
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/dag_processing/manager.py", line 172, in start
    process.start()
  File "/usr/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/usr/lib/python3.11/multiprocessing/context.py", line 281, in _Popen
    return Popen(process_obj)
  File "/usr/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/lib/python3.11/multiprocessing/popen_fork.py", line 71, in _launch
    code = process_obj._bootstrap(parent_sentinel=child_r)
  File "/usr/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/dag_processing/manager.py", line 247, in _run_processor_manager
    processor_manager.start()
  File "/opt/airflow/venv/lib/python3.11/site-packages/airflow/dag_processing/manager.py", line 483, in start
    self.log.info("Processing files using up to %s processes at a time ", self._parallelism)
Message: 'Processing files using up to %s processes at a time '
Arguments: (2,)
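Not a fix, but for anyone triaging: the bottom of the traceback is CPython's own zoneinfo key validation rejecting a timezone name with a leading slash. It reproduces outside Airflow entirely (minimal sketch, stdlib only, Python 3.9+):

```python
from zoneinfo import ZoneInfo

# pendulum derives the local timezone key from the container environment
# (e.g. /etc/timezone or the TZ variable); a stray leading slash such as
# "/UTC" makes the key look like an absolute path, which zoneinfo rejects.
try:
    ZoneInfo("/UTC")
except ValueError as exc:
    err = exc
    print(err)  # ZoneInfo keys may not be absolute paths, got: /UTC
```

So my first guess would be to check whatever supplies the local timezone name inside the container (the TZ environment variable, /etc/timezone, or the /etc/localtime link) for a leading slash, e.g. "/UTC" where plain "UTC" is expected — that is an assumption from the traceback, not a confirmed root cause.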

GitHub link: https://github.com/apache/airflow/discussions/44407#discussioncomment-11389851
