erdos2n commented on issue #23833:
URL: https://github.com/apache/airflow/issues/23833#issuecomment-1133936446

   @snjypl here is the code and the error message it produces when the DAG runs in Airflow. The UI error does disappear, but the task still isn't working.
   
   ```python
   import logging
   from airflow.decorators import dag, task
   from airflow.operators.python import PythonOperator
   
   from airflow.utils.dates import datetime
   
   def log_strings(string):
       logging.info(f"here is the string = {string}")
   
   @dag(
       dag_id='dynamic_dag_test',
       schedule_interval=None,
       start_date=datetime(2021, 1, 1),
       catchup=False,
       tags=['example', 'dynamic_tasks']
   )
   def tutorial_taskflow_api_etl():
       # Mapped PythonOperator; this task fails at runtime with the error below.
       op2 = (PythonOperator
              .partial(task_id="logging_with_operator_task",
                       python_callable=log_strings)
              .expand(op_kwargs={"string": ["a", "b", "c"]}))

       return op2
   
   
   tutorial_etl_dag = tutorial_taskflow_api_etl()
   ```
   
   **Error message:**
   ```python
   [2022-05-22, 17:06:49 UTC] {taskinstance.py:1376} INFO - Executing <Mapped(PythonOperator): logging_with_operator_task> on 2022-05-22 17:06:48.858540+00:00
   [2022-05-22, 17:06:49 UTC] {standard_task_runner.py:52} INFO - Started process 290 to run task
   [2022-05-22, 17:06:49 UTC] {standard_task_runner.py:79} INFO - Running: ['airflow', 'tasks', 'run', 'dynamic_dag_test', 'logging_with_operator_task', 'manual__2022-05-22T17:06:48.858540+00:00', '--job-id', '41', '--raw', '--subdir', 'DAGS_FOLDER/dynamic_dag_test.py', '--cfg-path', '/tmp/tmp1yeh1bff', '--map-index', '0', '--error-file', '/tmp/tmp2mzffl7i']
   [2022-05-22, 17:06:49 UTC] {standard_task_runner.py:80} INFO - Job 41: Subtask logging_with_operator_task
   [2022-05-22, 17:06:49 UTC] {task_command.py:369} INFO - Running <TaskInstance: dynamic_dag_test.logging_with_operator_task manual__2022-05-22T17:06:48.858540+00:00 map_index=0 [running]> on host 5b49114612fc
   [2022-05-22, 17:06:49 UTC] {taskinstance.py:1568} INFO - Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=dynamic_dag_test
   AIRFLOW_CTX_TASK_ID=logging_with_operator_task
   AIRFLOW_CTX_EXECUTION_DATE=2022-05-22T17:06:48.858540+00:00
   AIRFLOW_CTX_TRY_NUMBER=1
   AIRFLOW_CTX_DAG_RUN_ID=manual__2022-05-22T17:06:48.858540+00:00
   [2022-05-22, 17:06:49 UTC] {taskinstance.py:1888} ERROR - Task failed with exception
   Traceback (most recent call last):
     File "/usr/local/lib/python3.9/site-packages/airflow/operators/python.py", line 168, in execute
       context_merge(context, self.op_kwargs, templates_dict=self.templates_dict)
     File "/usr/local/lib/python3.9/site-packages/airflow/utils/context.py", line 256, in context_merge
       context.update(*args, **kwargs)
     File "/usr/local/lib/python3.9/_collections_abc.py", line 946, in update
       for key, value in other:
   ValueError: too many values to unpack (expected 2)
   [2022-05-22, 17:06:50 UTC] {taskinstance.py:1394} INFO - Marking task as FAILED. dag_id=dynamic_dag_test, task_id=logging_with_operator_task, map_index=0, execution_date=20220522T170648, start_date=20220522T170649, end_date=20220522T170650
   [2022-05-22, 17:06:50 UTC] {standard_task_runner.py:92} ERROR - Failed to execute job 41 for task logging_with_operator_task (too many values to unpack (expected 2); 290)
   [2022-05-22, 17:06:50 UTC] {local_task_job.py:156} INFO - Task exited with return code 1
   [2022-05-22, 17:06:50 UTC] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
   ```
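
   For reference, a variant that passes `op_kwargs` as a list of dicts (one dict per mapped task instance), which is how the dynamic task mapping docs describe expanding keyword arguments for classic operators. This is only a sketch and hasn't been verified against the setup above:

   ```python
   # Sketch only (not verified here): expand op_kwargs over a list of dicts,
   # one dict per mapped task instance.
   op2 = (PythonOperator
          .partial(task_id="logging_with_operator_task",
                   python_callable=log_strings)
          .expand(op_kwargs=[{"string": "a"}, {"string": "b"}, {"string": "c"}]))
   ```

   With that form, each dict becomes the `op_kwargs` of one mapped task instance, so `log_strings` would be called three times with `"a"`, `"b"`, and `"c"`.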

