This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 5cd185a95b1 Add caution on using Airflow packages in virtualenv operator (#57599)
5cd185a95b1 is described below

commit 5cd185a95b1fcc8b4df6e2dca19ae557ec177dae
Author: Ariel Mordoch <[email protected]>
AuthorDate: Sat Nov 1 17:30:42 2025 +0100

    Add caution on using Airflow packages in virtualenv operator (#57599)
    
    * Add note to using constraints file in venv operator
    
    * Fix line endings
---
 providers/standard/docs/operators/python.rst | 13 ++++++++-----
 1 file changed, 8 insertions(+), 5 deletions(-)

diff --git a/providers/standard/docs/operators/python.rst b/providers/standard/docs/operators/python.rst
index bc8069e0a28..2e5e63ea437 100644
--- a/providers/standard/docs/operators/python.rst
+++ b/providers/standard/docs/operators/python.rst
@@ -157,16 +157,19 @@ Passing in arguments
 
 Pass extra arguments to the ``@task.virtualenv`` decorated function as you would with a normal Python function.
 Unfortunately, Airflow does not support serializing ``var``, ``ti`` and ``task_instance`` due to incompatibilities
-with the underlying library. For Airflow context variables make sure that you either have access to Airflow through
-setting ``system_site_packages`` to ``True`` or add ``apache-airflow`` to the ``requirements`` argument.
-Otherwise you won't have access to the most context variables of Airflow in ``op_kwargs``.
-If you want the context related to datetime objects like ``data_interval_start`` you can add ``pendulum`` and
+with the underlying library. For Airflow context variables, make sure that you have access to Airflow by
+setting ``system_site_packages`` to ``True`` or you won't have access to most context variables in ``op_kwargs``.
+If you want the context related to datetime objects like ``data_interval_start``, you can add ``pendulum`` and
 ``lazy_object_proxy``.
 
+.. important::
+
+    When Airflow or provider packages are required, you must specify the Airflow :ref:`apache-airflow:installation:constraints`
+    using ``pip_install_options`` to avoid dependency conflicts.
 
 .. important::
     The Python function body defined to be executed is cut out of the Dag into a temporary file w/o surrounding code.
-    As in the examples you need to add all imports again and you can not rely on variables from the global Python context.
+    As in the examples you need to add all imports again and you cannot rely on variables from the global Python context.
 
     If you want to pass variables into the classic :class:`~airflow.providers.standard.operators.python.PythonVirtualenvOperator` use
     ``op_args`` and ``op_kwargs``.
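
For reference, a minimal sketch (not part of the commit; the DAG and task names
are illustrative) of the pattern the updated paragraph describes: adding
``pendulum`` and ``lazy_object_proxy`` to ``requirements`` so that datetime
context variables such as ``data_interval_start`` can deserialize inside the
isolated virtualenv:

    import pendulum

    from airflow.decorators import dag, task


    @dag(schedule=None, start_date=pendulum.datetime(2025, 1, 1), catchup=False)
    def virtualenv_context_sketch():

        @task.virtualenv(
            requirements=["pendulum", "lazy_object_proxy"],
            system_site_packages=False,
        )
        def show_interval(data_interval_start=None):
            # The serialized context value arrives via op_kwargs; pendulum must
            # be importable in the venv for it to deserialize as a DateTime.
            print(f"data interval starts at {data_interval_start}")

        show_interval()


    virtualenv_context_sketch()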
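Likewise, a hypothetical sketch of the constraint pinning that the new
``.. important::`` note asks for, passing the Airflow constraints file through
``pip_install_options``; the versions below are placeholders that must match
the deployed Airflow and the virtualenv's Python:

    from airflow.decorators import task

    AIRFLOW_VERSION = "3.1.0"  # placeholder: match your deployed Airflow version
    PYTHON_VERSION = "3.12"    # placeholder: match the virtualenv's Python version
    CONSTRAINTS_URL = (
        "https://raw.githubusercontent.com/apache/airflow/"
        f"constraints-{AIRFLOW_VERSION}/constraints-{PYTHON_VERSION}.txt"
    )


    @task.virtualenv(
        requirements=[f"apache-airflow=={AIRFLOW_VERSION}"],
        pip_install_options=["--constraint", CONSTRAINTS_URL],
    )
    def uses_airflow_in_venv():
        # With constraints applied, pip resolves apache-airflow and its
        # transitive dependencies to the combination tested for the release.
        from airflow import __version__

        print(f"Airflow {__version__} is importable inside the virtualenv")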
