benbole commented on issue #34688:
URL: https://github.com/apache/airflow/issues/34688#issuecomment-1743306845

   I was able to confirm the above import worked. However, when I modified the `pip install` statement from `pip install apache-airflow[postgres]==2.7.1` to `pip install apache-airflow==2.7.1`, it resulted in the same error seen previously. The DAG:
   
   ```
   from __future__ import annotations
   import os
   import yaml
   from datetime import datetime
   from airflow.models import DAG
   from airflow.operators.bash import BashOperator
   from airflow.decorators import dag, task

   @dag(
       dag_id='new_example_break',
       schedule=None,
       start_date=datetime(2021, 1, 1),
   )
   def example_break():
       setup_external_python = BashOperator(
           task_id="setup_external_python",
           bash_command="python -m venv /opt/airflow/python_venvs/operators_venv"
       )
       install_packages = BashOperator(
           task_id="install_packages",
           bash_command="/opt/airflow/python_venvs/operators_venv/bin/pip install apache-airflow==2.7.1"
       )

       @task.external_python(python="/opt/airflow/python_venvs/operators_venv/bin/python", expect_airflow=True)
       def this_should_break():
           from airflow.providers.postgres.hooks.postgres import PostgresHook
           print(f"I get some postgress hook object here: {PostgresHook()}")
           return "yes"

       cleanup_python = BashOperator(
           task_id="cleanup_python",
           bash_command="rm -Rvf /opt/airflow/python_venvs/operators_venv"
       )

       setup_external_python.as_setup() >> install_packages.as_setup() >> this_should_break() >> cleanup_python.as_teardown()

   example_break()
   ```
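   For comparison, the variant that worked for me only differed in the install step: the venv got the `postgres` extra. A minimal sketch of that setup (same venv path as in the DAG above; the import check is just a sanity probe, not part of the DAG):
   ```
   # Working variant: install airflow WITH the postgres extra into the venv
   /opt/airflow/python_venvs/operators_venv/bin/pip install "apache-airflow[postgres]==2.7.1"

   # Sanity probe: the provider import should now succeed in that interpreter
   /opt/airflow/python_venvs/operators_venv/bin/python -c "from airflow.providers.postgres.hooks.postgres import PostgresHook"
   ```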
   
   Running it resulted in the following exception:
   ```
   bab250711cb1
   *** Found local files:
   ***   * /opt/airflow/logs/dag_id=new_example_break/run_id=manual__2023-10-02T16:05:43.285961+00:00/task_id=this_should_break/attempt=1.log
   [2023-10-02T16:06:38.136+0000] {taskinstance.py:1157} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: new_example_break.this_should_break manual__2023-10-02T16:05:43.285961+00:00 [queued]>
   [2023-10-02T16:06:38.150+0000] {taskinstance.py:1157} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: new_example_break.this_should_break manual__2023-10-02T16:05:43.285961+00:00 [queued]>
   [2023-10-02T16:06:38.152+0000] {taskinstance.py:1359} INFO - Starting attempt 1 of 1
   [2023-10-02T16:06:38.167+0000] {taskinstance.py:1380} INFO - Executing <Task(_PythonExternalDecoratedOperator): this_should_break> on 2023-10-02 16:05:43.285961+00:00
   [2023-10-02T16:06:38.176+0000] {standard_task_runner.py:57} INFO - Started process 488 to run task
   [2023-10-02T16:06:38.181+0000] {standard_task_runner.py:84} INFO - Running: ['***', 'tasks', 'run', 'new_example_break', 'this_should_break', 'manual__2023-10-02T16:05:43.285961+00:00', '--job-id', '1472', '--raw', '--subdir', 'DAGS_FOLDER/extraction_pipeline.py', '--cfg-path', '/tmp/tmpmmcj3jul']
   [2023-10-02T16:06:38.186+0000] {standard_task_runner.py:85} INFO - Job 1472: Subtask this_should_break
   [2023-10-02T16:06:38.252+0000] {task_command.py:415} INFO - Running <TaskInstance: new_example_break.this_should_break manual__2023-10-02T16:05:43.285961+00:00 [running]> on host bab250711cb1
   [2023-10-02T16:06:38.348+0000] {taskinstance.py:1660} INFO - Exporting env vars: AIRFLOW_CTX_DAG_OWNER='***' AIRFLOW_CTX_DAG_ID='new_example_break' AIRFLOW_CTX_TASK_ID='this_should_break' AIRFLOW_CTX_EXECUTION_DATE='2023-10-02T16:05:43.285961+00:00' AIRFLOW_CTX_TRY_NUMBER='1' AIRFLOW_CTX_DAG_RUN_ID='manual__2023-10-02T16:05:43.285961+00:00'
   [2023-10-02T16:06:39.150+0000] {process_utils.py:182} INFO - Executing cmd: /opt/***/python_venvs/operators_venv/bin/python /tmp/tmdb2r5pv98/script.py /tmp/tmdb2r5pv98/script.in /tmp/tmdb2r5pv98/script.out /tmp/tmdb2r5pv98/string_args.txt /tmp/tmdb2r5pv98/termination.log
   [2023-10-02T16:06:39.169+0000] {process_utils.py:186} INFO - Output:
   [2023-10-02T16:06:39.943+0000] {process_utils.py:190} INFO - Traceback (most recent call last):
   [2023-10-02T16:06:39.944+0000] {process_utils.py:190} INFO -   File "/tmp/tmdb2r5pv98/script.py", line 31, in <module>
   [2023-10-02T16:06:39.944+0000] {process_utils.py:190} INFO -     res = this_should_break(*arg_dict["args"], **arg_dict["kwargs"])
   [2023-10-02T16:06:39.945+0000] {process_utils.py:190} INFO -   File "/tmp/tmdb2r5pv98/script.py", line 26, in this_should_break
   [2023-10-02T16:06:39.946+0000] {process_utils.py:190} INFO -     from ***.providers.postgres.hooks.postgres import PostgresHook
   [2023-10-02T16:06:39.948+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
   [2023-10-02T16:06:39.949+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap>", line 971, in _find_and_load_unlocked
   [2023-10-02T16:06:39.950+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap>", line 914, in _find_spec
   [2023-10-02T16:06:39.951+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap_external>", line 1407, in find_spec
   [2023-10-02T16:06:39.952+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap_external>", line 1373, in _get_spec
   [2023-10-02T16:06:39.953+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap_external>", line 1239, in __iter__
   [2023-10-02T16:06:39.954+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap_external>", line 1227, in _recalculate
   [2023-10-02T16:06:39.954+0000] {process_utils.py:190} INFO -   File "<frozen importlib._bootstrap_external>", line 1223, in _get_parent_path
   [2023-10-02T16:06:39.955+0000] {process_utils.py:190} INFO - KeyError: '***'
   [2023-10-02T16:06:40.022+0000] {taskinstance.py:1935} ERROR - Task failed with exception
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 221, in execute
       return_value = super().execute(context)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 395, in execute
       return super().execute(context=serializable_context)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 192, in execute
       return_value = self.execute_callable()
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 720, in execute_callable
       return self._execute_python_callable_in_subprocess(python_path, tmp_path)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 463, in _execute_python_callable_in_subprocess
       raise AirflowException(error_msg) from None
   airflow.exceptions.AirflowException: Process returned non-zero exit status 1.
   '***'
   [2023-10-02T16:06:40.029+0000] {taskinstance.py:1398} INFO - Marking task as FAILED. dag_id=new_example_break, task_id=this_should_break, execution_date=20231002T160543, start_date=20231002T160638, end_date=20231002T160640
   [2023-10-02T16:06:40.045+0000] {standard_task_runner.py:104} ERROR - Failed to execute job 1472 for task this_should_break (Process returned non-zero exit status 1.
   '***'; 488)
   [2023-10-02T16:06:40.087+0000] {local_task_job_runner.py:228} INFO - Task exited with return code 1
   [2023-10-02T16:06:40.111+0000] {taskinstance.py:2776} INFO - 1 downstream tasks scheduled from follow-on schedule check
   ```
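
   For what it's worth, the traceback shows importlib blowing up mid-resolution of the dotted import rather than raising a plain `ModuleNotFoundError`. Independent of the root cause, a guard like the following can turn that confusing failure into an explicit check inside the callable; `module_resolvable` is a hypothetical helper I wrote for illustration, not Airflow API:
   ```
   import importlib.util

   def module_resolvable(dotted_name: str) -> bool:
       """Return True if `dotted_name` can be resolved by this interpreter.

       find_spec raises ModuleNotFoundError when a *parent* package is
       missing, so catch that and report False instead of crashing.
       """
       try:
           return importlib.util.find_spec(dotted_name) is not None
       except ModuleNotFoundError:
           return False

   # A stdlib submodule resolves; a made-up dotted path does not.
   print(module_resolvable("json.decoder"))         # → True
   print(module_resolvable("no_such_pkg.sub.mod"))  # → False
   ```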
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
