ecerulm opened a new issue #16551:
URL: https://github.com/apache/airflow/issues/16551


   
   **Apache Airflow version**: 2.0.2
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   A DAG whose `start_date` uses `datetime(2021, 5, 31, tzinfo=timezone.utc)` makes the scheduler raise `AttributeError: 'datetime.timezone' object has no attribute 'name'`.
   
   Airflow apparently relies on the `tzinfo` object having a `.name` attribute (as pendulum timezones do), so the standard library's "canonical" `datetime.timezone.utc` does not meet that requirement.
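   The mismatch is easy to confirm with the standard library alone: `datetime.timezone` exposes a `tzname()` method but no `name` attribute.
   
   ```python
   from datetime import timezone
   
   # The stdlib UTC tzinfo has no .name attribute (the access that
   # fails in dag.py's following_schedule()), only the tzname() method.
   print(hasattr(timezone.utc, "name"))  # False
   print(timezone.utc.tzname(None))      # UTC
   ```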
   
   
   
   ```
   AttributeError: 'datetime.timezone' object has no attribute 'name'
   Process DagFileProcessor302-Process:
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
       self.run()
     File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
       self._target(*self._args, **self._kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 184, in _run_file_processor
       result: Tuple[int, int] = dag_file_processor.process_file(
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 648, in process_file
       dagbag.sync_to_db()
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 556, in sync_to_db
       for attempt in run_with_db_retries(logger=self.log):
     File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 390, in __iter__
       do = self.iter(retry_state=retry_state)
     File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 356, in iter
       return fut.result()
     File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 437, in result
       return self.__get_result()
     File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
       raise self._exception
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 570, in sync_to_db
       DAG.bulk_write_to_db(self.dags.values(), session=session)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 67, in wrapper
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 1892, in bulk_write_to_db
       orm_dag.calculate_dagrun_date_fields(
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 2268, in calculate_dagrun_date_fields
       self.next_dagrun, self.next_dagrun_create_after = dag.next_dagrun_info(most_recent_dag_run)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 536, in next_dagrun_info
       next_execution_date = self.next_dagrun_after_date(date_last_automated_dagrun)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 571, in next_dagrun_after_date
       next_start = self.following_schedule(now)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 485, in following_schedule
       tz = pendulum.timezone(self.timezone.name)
   AttributeError: 'datetime.timezone' object has no attribute 'name'
   ```
   **What you expected to happen**:
   
   If `start_date` (or any other input parameter) requires a `tzinfo` with a `name` attribute, the DAG object should check for that and produce a more specific error message than a bare `AttributeError`.
   
   This requirement should also be mentioned explicitly in https://airflow.apache.org/docs/apache-airflow/stable/timezone.html, with a note like
   ```
     you can't use datetime.timezone.utc because it does not have a name attribute
   ```
   
   Better still, Airflow could avoid relying on the presence of a `name` attribute on the `tzinfo` at all.
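   One way to drop that reliance (a hypothetical sketch, not Airflow's actual code; the helper name `tz_display_name` is made up for illustration) is to fall back to the `tzname()` method that every `datetime.tzinfo` must implement when the pendulum-style `.name` attribute is absent:
   
   ```python
   from datetime import timezone, tzinfo
   from typing import Optional
   
   def tz_display_name(tz: tzinfo) -> Optional[str]:
       """Prefer a pendulum-style .name attribute, falling back to the
       standard tzname() method (hypothetical helper)."""
       name = getattr(tz, "name", None)
       if name is None:
           name = tz.tzname(None)
       return name
   
   print(tz_display_name(timezone.utc))  # UTC
   ```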
   
   
   **How to reproduce it**:
   
   ```python
   from datetime import timedelta, datetime, timezone
   
   from airflow import DAG
   
   args = {
       "owner": "airflow",
       "retries": 3,
   }
   dag = DAG(
       dag_id="xxxx",
       default_args=args,
       start_date=datetime(2021, 5, 31, tzinfo=timezone.utc),
       schedule_interval="0 8 * * *",
       max_active_runs=1,
       dagrun_timeout=timedelta(minutes=60),
       catchup=False,
       description="xxxxx",
   )
   ```
   
   **Anything else we need to know**:
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
