[ 
https://issues.apache.org/jira/browse/FLINK-2161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14600917#comment-14600917
 ] 

ASF GitHub Bot commented on FLINK-2161:
---------------------------------------

Github user nikste commented on the pull request:

    https://github.com/apache/flink/pull/805#issuecomment-115180597
  
    Unfortunately this did not work: the class was available in the test, but 
not in the shell that the test invokes. 
    However, if you append the classpath of the external jar to 
```settings.classpath.value``` of the Scala shell before starting it, it seems 
to work.
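    For reference, a minimal sketch of that approach (assuming the standard Scala 2.x ```Settings```/```ILoop``` REPL API; the jar path is hypothetical and would in practice depend on the Flink version):

```scala
import java.io.File
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

val settings = new Settings()
settings.usejavacp.value = true

// Hypothetical jar location; the real file name changes with the Flink version.
val externalJar = "/path/to/flink-ml.jar"

// Append the external jar to the classpath the shell sees, before it starts.
settings.classpath.value =
  settings.classpath.value + File.pathSeparator + externalJar

// Start the interactive shell with the extended classpath.
new ILoop().process(settings)
```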
    
    I added a test that instantiates and prints a DenseVector using the 
flink-ml jar. This should verify that the external jar is shipped to the 
cluster.
    The only remaining problem is the name of the jar, which changes whenever 
the Flink version changes. 


> Flink Scala Shell does not support external jars (e.g. Gelly, FlinkML)
> ----------------------------------------------------------------------
>
>                 Key: FLINK-2161
>                 URL: https://issues.apache.org/jira/browse/FLINK-2161
>             Project: Flink
>          Issue Type: Improvement
>            Reporter: Till Rohrmann
>            Assignee: Nikolaas Steenbergen
>
> Currently, there is no easy way to load and ship external libraries/jars with 
> Flink's Scala shell. Assume that you want to run some Gelly graph algorithms 
> from within the Scala shell, then you have to put the Gelly jar manually in 
> the lib directory and make sure that this jar is also available on your 
> cluster, because it is not shipped with the user code. 
> It would be good to have a simple mechanism how to specify additional jars 
> upon startup of the Scala shell. These jars should then also be shipped to 
> the cluster.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
