[ https://issues.apache.org/jira/browse/FLINK-16943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17073559#comment-17073559 ]

sunjincheng edited comment on FLINK-16943 at 4/2/20, 9:48 AM:
--------------------------------------------------------------

I think the best situation would be for Java to also support job submission with 
multiple jars, but in any case PyFlink needs this capability, so I think it is 
reasonable to add an `add_jars` interface in PyFlink (it does not affect any 
other Java modules). The implementation of `add_jars` is transparent to users, 
so we can align with Java at any time in the future if necessary, i.e. we want 
to ensure the compatibility of follow-up versions. The JIRA issue and the 
discussion on Java support for multiple jars can be found in [1] & [2].

[1] https://issues.apache.org/jira/browse/FLINK-14319
[2] 
http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/DISCUSS-Register-user-jar-files-in-Stream-ExecutionEnvironment-td35801.html


was (Author: sunjincheng121):
I think the best situation is that Java also supports job submission with 
multiple jars, but no matter what, PyFlink must need this function, so I think 
it's reasonable to add 'add_jars' interface in PyFlink (Does not affect any 
other Java modules).  The implementation of 'add_jars' is transparent to users. 
We can align with Java at any time in the future, if necessary. i.e. we want to 
ensure the compatibility of follow up versions.

> Support adding jars in PyFlink
> ------------------------------
>
>                 Key: FLINK-16943
>                 URL: https://issues.apache.org/jira/browse/FLINK-16943
>             Project: Flink
>          Issue Type: Improvement
>          Components: API / Python
>            Reporter: Wei Zhong
>            Priority: Major
>
> Since flink-1.10.0 released, many users have complained that PyFlink is 
> inconvenient when loading external jar packages. For local execution, users 
> need to copy the jar files to the lib folder under the installation directory 
> of PyFlink, which is hard to locate. For job submission, users need to merge 
> their jars into one, as `flink run` accepts only one jar file. That may be easy 
> for Java users but is difficult for Python users who have never touched Java.
> We intend to add an `add_jars` interface on the PyFlink TableEnvironment to 
> solve this problem. It will add the jars to the context classloader of the 
> Py4j gateway server and add them to the `PipelineOptions.JARS` entry of the 
> configuration of the StreamExecutionEnvironment/ExecutionEnvironment.
> Via this interface, users can add jars in their Python job. The jars will 
> be loaded immediately, so users can use them even on the very next line of 
> the Python code. Submitting a job with multiple external jars won't be a 
> problem anymore, because all the jars in `PipelineOptions.JARS` will be 
> added to the JobGraph and uploaded to the cluster.
> As it is not a big change I'm not sure whether it is necessary to create a 
> FLIP to discuss this. So I created a JIRA first for flexibility. What do you 
> think guys?
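For illustration only, here is a minimal plain-Python sketch of the bookkeeping such an `add_jars` interface implies. The class and method bodies are hypothetical stand-ins (this is not the actual PyFlink API, which was still under discussion in this issue): each added jar URL is appended, with de-duplication, to a `pipeline.jars`-style configuration entry, so that every jar later reaches the JobGraph at submission time.

```python
# Hypothetical sketch of the proposed add_jars bookkeeping.
# FakeTableEnvironment is an illustrative stand-in, NOT the real
# PyFlink TableEnvironment; only the mechanism is shown.

class FakeTableEnvironment:
    """Stand-in for a TableEnvironment that holds a job configuration."""

    JARS_KEY = "pipeline.jars"  # mirrors the role of PipelineOptions.JARS

    def __init__(self):
        self._config = {}

    def add_jars(self, *jar_urls):
        # Append new jar URLs to the configured list, skipping duplicates,
        # so all of them end up attached to the submitted JobGraph.
        current = self._config.get(self.JARS_KEY, "")
        jars = [j for j in current.split(";") if j]
        for url in jar_urls:
            if url not in jars:
                jars.append(url)
        self._config[self.JARS_KEY] = ";".join(jars)

    def get_configuration(self):
        return dict(self._config)


t_env = FakeTableEnvironment()
t_env.add_jars("file:///tmp/connector-kafka.jar", "file:///tmp/udf.jar")
t_env.add_jars("file:///tmp/udf.jar")  # duplicate is ignored
print(t_env.get_configuration()["pipeline.jars"])
# -> file:///tmp/connector-kafka.jar;file:///tmp/udf.jar
```

The point of the sketch is that a single jar-list entry in the job configuration is enough to carry every user jar to the cluster, which is why no change to `flink run` or other Java modules would be needed.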



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
