jaketf commented on a change in pull request #6590: [AIRFLOW-5520] Add options
to run Dataflow in a virtual environment
URL: https://github.com/apache/airflow/pull/6590#discussion_r347019690
##########
File path: airflow/gcp/hooks/dataflow.py
##########
@@ -515,8 +530,20 @@ def label_formatter(labels_dict):
             return ['--labels={}={}'.format(key, value)
                     for key, value in labels_dict.items()]
-        self._start_dataflow(variables, name, [py_interpreter] + py_options + [dataflow],
-                             label_formatter, project_id)
+        if py_requirements is not None:
+            with TemporaryDirectory(prefix='dataflow-venv') as tmp_dir:
+                py_interpreter = prepare_virtualenv(
Review comment:
 1) Would we be able to specify a pip.conf (in case we want to install from a
 private PyPI server)? See the first sketch below.
 2) Re-creating the virtualenv and running `pip install` on every task run adds
 potentially significant latency before pipeline submission (and a lot of pip
 install logging spam) when installing large or many packages.
 It would be good if we could reuse these virtualenvs, effectively caching
 dependencies rather than re-installing everything from pip into a bare
 virtualenv each time. However, there would have to be a mechanism to check
 whether this worker already has a copy of the required virtualenv. See the
 second sketch below.
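 For (1), a minimal sketch of what I have in mind. `pip_config_file` is a
 hypothetical parameter, not something `prepare_virtualenv` currently accepts;
 pip already honors the `PIP_CONFIG_FILE` environment variable, so the hook
 could just set it for the install step:
 ```python
 import os
 import subprocess
 from typing import List, Optional


 def prepare_virtualenv_with_pip_conf(
     venv_directory: str,
     python_bin: str,
     requirements: List[str],
     pip_config_file: Optional[str] = None,  # hypothetical new parameter
 ) -> str:
     """Create a venv and install requirements, optionally pointing pip at a
     custom pip.conf (e.g. one that resolves against a private PyPI index)."""
     subprocess.check_call([python_bin, '-m', 'venv', venv_directory])
     env = os.environ.copy()
     if pip_config_file:
         # PIP_CONFIG_FILE is pip's documented override for its config location
         env['PIP_CONFIG_FILE'] = pip_config_file
     pip_bin = os.path.join(venv_directory, 'bin', 'pip')
     subprocess.check_call([pip_bin, 'install'] + requirements, env=env)
     return os.path.join(venv_directory, 'bin', 'python')
 ```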
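 For (2), one possible shape for worker-local caching: key the venv directory
 on a hash of the interpreter plus the sorted requirements. The cache root and
 marker-file scheme here are illustrative only, and a real version would need
 locking so concurrent tasks don't race on the same build:
 ```python
 import hashlib
 import os
 import subprocess
 from typing import List

 # illustrative cache location; a real implementation would make this configurable
 CACHE_ROOT = os.path.expanduser('~/.cache/dataflow-venvs')


 def cached_virtualenv(python_bin: str, requirements: List[str]) -> str:
     """Return the python binary of a venv matching (interpreter, requirements),
     building it only if this worker hasn't built an identical one before."""
     key = hashlib.sha256(
         '\n'.join([python_bin] + sorted(requirements)).encode()
     ).hexdigest()[:16]
     venv_dir = os.path.join(CACHE_ROOT, key)
     marker = os.path.join(venv_dir, '.ready')  # written only after a full install
     if not os.path.exists(marker):
         os.makedirs(CACHE_ROOT, exist_ok=True)
         subprocess.check_call([python_bin, '-m', 'venv', venv_dir])
         pip_bin = os.path.join(venv_dir, 'bin', 'pip')
         subprocess.check_call([pip_bin, 'install'] + requirements)
         open(marker, 'w').close()
     return os.path.join(venv_dir, 'bin', 'python')
 ```
 That would also cut the pip logging noise after the first run, since
 subsequent runs with the same requirements skip the install entirely.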
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services