dianfu commented on a change in pull request #10017: [FLINK-14019][python] add support for managing environment and dependencies of Python UDF in Flink Python API
URL: https://github.com/apache/flink/pull/10017#discussion_r348507704
 
 

 ##########
 File path: flink-python/pyflink/table/table_environment.py
 ##########
 @@ -715,6 +718,111 @@ def create_temporary_view(self, view_path, table):
         """
         self._j_tenv.createTemporaryView(view_path, table._j_table)
 
+    def add_python_file(self, file_path):
+        """
+        Uploads a single Python file, Python package or local directory to the cluster.
+        These files and directories will be appended to the PYTHONPATH of the Python UDF
+        worker. Please make sure they can be imported directly.
+
+        If the Python UDFs used in the job depend on additional files, this method should
+        be called to upload them to the cluster.
+
+        :param file_path: The path of the Python file, Python package or local directory.
+        :type file_path: str
+        """
+        self._dependency_manager.add_python_file(file_path)
+
+    def set_python_requirements(self, requirements_file_path, requirements_cache_dir=None):
+        """
+        Uploads a "requirements.txt" file that specifies the third-party libraries used by
+        the Python UDFs. These libraries will be installed into a temporary directory and
+        made available to the Python UDF worker.
+
+        If the Python UDFs used in the job depend on third-party libraries, this method
+        should be called to transmit them to the cluster.
+
+        If the cluster runs on a private network that cannot access the central PyPI
+        repository, a local directory containing all the packages listed in the
+        "requirements.txt" file should be specified via the "requirements_cache_dir"
+        parameter. It will be uploaded to the cluster to support offline installation.
+
+        Example:
+        ::
+
+            # commands executed in shell
+            $ echo numpy==1.16.5 > requirements.txt
+            $ pip download -d cached_dir -r requirements.txt --no-binary :all:
+
+            # python code
+            >>> table_env.set_python_requirements("requirements.txt", "cached_dir")
+
+        .. note::
+
+            Please make sure the downloaded packages match the platform and Python version
+            of the cluster. These packages will be installed using pip.
+
+        :param requirements_file_path: The path of the "requirements.txt" file.
+        :type requirements_file_path: str
+        :param requirements_cache_dir: The path of the local directory that contains all
+                                       the packages listed in the "requirements.txt" file.
+        :type requirements_cache_dir: str
+        """
+        self._dependency_manager.set_python_requirements(requirements_file_path,
+                                                         requirements_cache_dir)
+
+    def add_python_archive(self, archive_path, target_dir):
+        """
+        Upload a packaged file to cluster. The file will be extracted to the working
+        directory of python UDF worker.
+
+        If the "target_dir" is specified. The file will be extracted to the directory with the
 
 Review comment:
   "If parameter "target_dir" is specified, the archive file will be extracted to a directory named ${target_dir}. Otherwise, the archive file will be extracted to a directory with the same name as the archive file."
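
The semantics documented above for `add_python_file` (uploaded files are appended to the PYTHONPATH of the UDF worker so they can be imported directly) can be sketched with a toy dependency manager. This is a hypothetical illustration of the behavior, not PyFlink's actual `DependencyManager` implementation:

```python
import os
import sys
import tempfile


class ToyDependencyManager:
    """Hypothetical sketch: collects file/directory paths and exposes them on
    sys.path, mimicking how uploaded files end up on the UDF worker's PYTHONPATH."""

    def __init__(self):
        self._python_files = []

    def add_python_file(self, file_path):
        # Record the file or directory; a real manager would also ship it to the cluster.
        self._python_files.append(os.path.abspath(file_path))

    def activate(self):
        # Append every registered path to sys.path so its modules import directly.
        for path in self._python_files:
            if path not in sys.path:
                sys.path.append(path)


# Usage: create a module in a temporary directory, register it, then import it.
tmp_dir = tempfile.mkdtemp()
with open(os.path.join(tmp_dir, "my_udf_helper.py"), "w") as f:
    f.write("ANSWER = 42\n")

manager = ToyDependencyManager()
manager.add_python_file(tmp_dir)
manager.activate()

import my_udf_helper
print(my_udf_helper.ANSWER)  # prints 42
```

The point of the sketch is the contract stated in the docstring: whatever is registered must be directly importable once the worker's PYTHONPATH is extended.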

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
