duc-luong-tnsl opened a new issue #17568:
URL: https://github.com/apache/airflow/issues/17568
**Apache Airflow docker image version**: apache/airflow:2.1.1-python3.8
**Environment**: Kubernetes
**What happened**:
I set up a DAG with a task group named `Create Grafana alert`, and inside the group a for loop that creates tasks.
All schedulers then crashed with the error below. Kubernetes kept restarting them, but each scheduler hit the same error again, even after I fixed the group name in the DAG file.
Only after I deleted the DAG in the UI did the scheduler pick up a fresh copy and start successfully.
I understand the task id is invalid, but a single bad DAG should not be able to crash every scheduler and keep them all down until the DAG is deleted.
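For context, the crash comes from Airflow's `validate_key` helper (`airflow/utils/helpers.py`), which only accepts alphanumerics, dashes, dots and underscores. Because the group id is prefixed onto each task id when the DAG is serialized, a space in the group name makes every task id in the group invalid at deserialization time. A minimal stand-alone reproduction of that check (the regex mirrors Airflow's `KEY_REGEX`; the failing key is taken from the traceback below):

```python
import re

# Mirrors Airflow's KEY_REGEX (airflow/utils/helpers.py, 2.1.x):
# a key may contain only alphanumerics, dashes, dots and underscores.
KEY_REGEX = re.compile(r"^[\w.-]+$")

def validate_key(key: str) -> None:
    """Raise if the key contains a disallowed character, e.g. a space."""
    if not KEY_REGEX.match(key):
        raise ValueError(
            f"The key ({key}) has to be made of alphanumeric characters, "
            "dashes, dots and underscores exclusively"
        )

# The group id is prefixed onto the task id, so the space in
# "Create grafana alert" invalidates the whole serialized task id:
try:
    validate_key("Create grafana alert.alert_ESCALATION_AGING_FORWARD")
except ValueError as exc:
    print(exc)

# The same id with an underscore-only group prefix passes:
validate_key("create_grafana_alert.alert_ESCALATION_AGING_FORWARD")
```

Note that the DAG parses and serializes fine; the check only fires when the scheduler deserializes the DAG from the database, which is why the crash loops until the serialized row is removed.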
```
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/scheduler_command.py", line 64, in scheduler
    job.run()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 237, in run
    self._execute()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1303, in _execute
    self._run_scheduler_loop()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1396, in _run_scheduler_loop
    num_queued_tis = self._do_scheduling(session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1535, in _do_scheduling
    self._schedule_dag_run(dag_run, active_runs_by_dag_id.get(dag_run.dag_id, set()), session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1706, in _schedule_dag_run
    dag = dag_run.dag = self.dagbag.get_dag(dag_run.dag_id, session=session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 67, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 186, in get_dag
    self._add_dag_from_db(dag_id=dag_id, session=session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 261, in _add_dag_from_db
    dag = row.dag
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/serialized_dag.py", line 175, in dag
    dag = SerializedDAG.from_dict(self.data)  # type: Any
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/serialization/serialized_objects.py", line 792, in from_dict
    return cls.deserialize_dag(serialized_obj['dag'])
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/serialization/serialized_objects.py", line 716, in deserialize_dag
    v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/serialization/serialized_objects.py", line 716, in <dictcomp>
    v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/serialization/serialized_objects.py", line 446, in deserialize_operator
    op = SerializedBaseOperator(task_id=encoded_op['task_id'])
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 185, in apply_defaults
    result = func(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/serialization/serialized_objects.py", line 381, in __init__
    super().__init__(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 185, in apply_defaults
    result = func(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 527, in __init__
    validate_key(task_id)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/helpers.py", line 44, in validate_key
    raise AirflowException(
airflow.exceptions.AirflowException: The key (Create grafana alert.alert_ESCALATION_AGING_FORWARD) has to be made of alphanumeric characters, dashes, dots and underscores exclusively
```
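As a workaround until the group is renamed in the DAG file (this is my assumption about the fix, not a confirmed one), the group id just needs to contain no characters outside Airflow's allowed set. A small hypothetical helper for deriving a valid group id from a human-readable name (`sanitize_group_id` is not part of Airflow's API):

```python
import re

def sanitize_group_id(name: str) -> str:
    """Replace characters Airflow's key regex disallows with underscores,
    and lowercase the result for a conventional snake_case group id."""
    return re.sub(r"[^\w.-]", "_", name).lower()

print(sanitize_group_id("Create Grafana alert"))  # -> create_grafana_alert
```

Note that even with a sanitized name, the already-serialized DAG row in the database still holds the old invalid task ids, which matches the observation that deleting the DAG in the UI was required before the scheduler recovered.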