[
https://issues.apache.org/jira/browse/FLINK-2161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14585859#comment-14585859
]
ASF GitHub Bot commented on FLINK-2161:
---------------------------------------
Github user tillrohrmann commented on a diff in the pull request:
https://github.com/apache/flink/pull/805#discussion_r32414909
--- Diff:
flink-staging/flink-scala-shell/src/main/scala/org/apache/flink/api/scala/FlinkILoop.scala
---
@@ -191,5 +203,14 @@ HINT: You can use print() on a DataSet to print the
contents to this shell.
)
}
+
+ def getExternalJars(): Array[String] ={
+ var extJars : Array[String] = Array.empty[String]
+ this.externalJars match{
+ case Some(ej) => extJars = ej
+ case None =>
+ }
+ extJars
--- End diff ---
Why not simply do:
```
def getExternalJars(): Array[String] =
externalJars.getOrElse(Array.empty[String])
```
?
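For illustration, here is a minimal standalone sketch (the object and helper names are hypothetical, not part of FlinkILoop) showing why `Option.getOrElse` is equivalent to the pattern match in the diff: it returns the wrapped value for `Some` and the supplied default for `None`.

```scala
// Hypothetical standalone demo of Option.getOrElse replacing the
// Some/None pattern match from the diff above.
object GetOrElseDemo {

  // Mirrors the shape of the externalJars field: an optional array of jar paths.
  def getExternalJars(externalJars: Option[Array[String]]): Array[String] =
    externalJars.getOrElse(Array.empty[String])

  def main(args: Array[String]): Unit = {
    // Some(jars) yields the jars themselves.
    assert(getExternalJars(Some(Array("gelly.jar"))).sameElements(Array("gelly.jar")))
    // None yields an empty array instead of a null or an exception.
    assert(getExternalJars(None).isEmpty)
    println("ok")
  }
}
```

The one-liner also avoids the mutable `var` from the original diff, which is generally preferred in idiomatic Scala.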
> Flink Scala Shell does not support external jars (e.g. Gelly, FlinkML)
> ----------------------------------------------------------------------
>
> Key: FLINK-2161
> URL: https://issues.apache.org/jira/browse/FLINK-2161
> Project: Flink
> Issue Type: Improvement
> Reporter: Till Rohrmann
> Assignee: Nikolaas Steenbergen
>
> Currently, there is no easy way to load and ship external libraries/jars with
> Flink's Scala shell. If you want to run some Gelly graph algorithms
> from within the Scala shell, you have to put the Gelly jar manually into
> the lib directory and make sure that this jar is also available on your
> cluster, because it is not shipped with the user code.
> It would be good to have a simple mechanism to specify additional jars
> upon startup of the Scala shell. These jars should then also be shipped to
> the cluster.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)