I have a local Python package (containing my xyz module) exported to a
.tar.gz file, and I want to use it in my Dataflow-deployed pipeline.
Unfortunately, the extra_packages parameter does not seem to work. The
logs show that the package is indeed uploaded to GCS, but then during
execution on Dataflow I get:
ImportError: No module named xyz
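
For reference, this is roughly how I build and pass the package; the
project ID, bucket, and file names below are placeholders, not my real
values:

    # package built beforehand with: python setup.py sdist
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        PipelineOptions,
        SetupOptions,
    )

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                # placeholder
        temp_location='gs://my-bucket/tmp',  # placeholder
    )
    # point extra_packages at the local sdist tarball
    options.view_as(SetupOptions).extra_packages = ['./dist/xyz-0.1.tar.gz']

    with beam.Pipeline(options=options) as p:
        # minimal pipeline body; the real one imports from xyz
        p | beam.Create([1]) | beam.Map(lambda x: x)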

How do I add local Python packages as dependencies to Dataflow?

I've posted a similar question on SO:
https://stackoverflow.com/questions/46604870/apache-beam-local-python-dependencies
Thanks,
Marcin Zabłocki
Data Engineer
Twitter: @marrrcin
