[ https://issues.apache.org/jira/browse/FLINK-2161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14585866#comment-14585866 ]
ASF GitHub Bot commented on FLINK-2161:
---------------------------------------
Github user tillrohrmann commented on a diff in the pull request:
https://github.com/apache/flink/pull/805#discussion_r32415111
--- Diff: flink-staging/flink-scala-shell/src/main/scala/org.apache.flink/api/scala/FlinkShell.scala ---
@@ -51,14 +63,16 @@ object FlinkShell {
// parse arguments
parser.parse (args, Config () ) match {
case Some(config) =>
- startShell(config.host,config.port)
+ startShell(config.host,config.port,config.externalJars)
case _ => println("Could not parse program arguments")
}
}
- def startShell(userHost : String, userPort : Int): Unit ={
+ def startShell(userHost : String,
--- End diff ---
Parameters should start on a new line with 4 spaces of indentation.
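The requested style could be sketched like this; the `externalJars` parameter name is taken from the diff above, but its type (`Option[String]`) is an assumption, not the final signature:

```scala
// Sketch of the multi-line parameter style the review asks for:
// each parameter on its own line, indented by 4 spaces.
def startShell(
    userHost: String,
    userPort: Int,
    externalJars: Option[String] = None): Unit = {
  // shell startup logic goes here
}
```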
> Flink Scala Shell does not support external jars (e.g. Gelly, FlinkML)
> ----------------------------------------------------------------------
>
> Key: FLINK-2161
> URL: https://issues.apache.org/jira/browse/FLINK-2161
> Project: Flink
> Issue Type: Improvement
> Reporter: Till Rohrmann
> Assignee: Nikolaas Steenbergen
>
> Currently, there is no easy way to load and ship external libraries/jars with
> Flink's Scala shell. If, for example, you want to run some Gelly graph
> algorithms from within the Scala shell, you have to put the Gelly jar into
> the lib directory manually and make sure that this jar is also available on
> your cluster, because it is not shipped with the user code.
> It would be good to have a simple mechanism for specifying additional jars
> upon startup of the Scala shell. These jars should then also be shipped to
> the cluster.
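One way the shell could ship such jars is by forwarding them to the remote execution environment, which distributes the listed jar files to the cluster with each job. This is a hedged sketch, not the merged implementation; the `externalJars` parameter and its `Option[String]` type are assumptions based on the diff above, while `ExecutionEnvironment.createRemoteEnvironment` is the existing Flink Scala API:

```scala
import org.apache.flink.api.scala.ExecutionEnvironment

// Sketch: turn the (assumed) colon-separated jar list from the command line
// into an argument list for the remote environment, so the jars are shipped
// to the cluster together with the user code.
def createEnvironment(
    userHost: String,
    userPort: Int,
    externalJars: Option[String]): ExecutionEnvironment = {
  val jars = externalJars.map(_.split(":")).getOrElse(Array.empty[String])
  // createRemoteEnvironment takes the jar paths as varargs and ships them
  ExecutionEnvironment.createRemoteEnvironment(userHost, userPort, jars: _*)
}
```

With such a hook in place, e.g. a Gelly jar passed at startup would no longer need to be copied into the lib directory of every cluster node.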
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)