Please verify that you have set the correct main-class manifest entry in your
pom.xml when you build the user code jar. Alternatively, you can specify the
class to execute via `bin/flink run -c CLASS_TO_EXECUTE` if your user code
jar contains multiple classes.
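For reference, a minimal sketch of how the manifest entry can be set with the maven-shade-plugin (the class name is taken from your code below; treat the rest of the snippet as an illustrative configuration, not a complete pom.xml):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Writes the Main-Class entry into the jar's MANIFEST.MF -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.ixxus.playground.fmk.flink.LearnDocumentEntityRelationship</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With the `-c` route instead, the command would look like `bin/flink run -c com.ixxus.playground.fmk.flink.LearnDocumentEntityRelationship your-job.jar` (jar file name here is just a placeholder).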

Cheers,
Till

On Thu, Jun 21, 2018 at 11:28 AM Mano Swerts <mano.swe...@ixxus.com> wrote:

> Hi guys,
>
> I have a question. I have been playing around with Fink this week and
> created some basic Java jobs that work fine. Now I am trying to run one in
> Scala.
>
> Running this code in the Scala REPL prints the expected output:
>
> *env.fromElements(1, 2, 3).map(i => "==== Integer: " + i).print()*
>
> However, having it packaged in a JAR which I then deploy through the user
> interface doesn’t give me any output at all. I can start the job and it
> finishes without exceptions, but I don’t see the result of the print()
> statement in the log. The class looks like this:
>
> package com.ixxus.playground.fmk.flink
>
> import org.apache.flink.api.java.utils.ParameterTool
> import org.apache.flink.api.scala._
>
> object LearnDocumentEntityRelationship {
>     def main(args: Array[String]) {
>         val env = ExecutionEnvironment.getExecutionEnvironment
>         val params: ParameterTool = ParameterTool.fromArgs(args)
>
>         env.fromElements(1, 2, 3).map(i => "==== Integer: " + i).print()
>
>         env.execute("Scala example")
>     }
> }
>
>
> I did notice that the job name isn’t what I pass to env.execute(). It is
> named “Flink Java Job” instead:
>
> [screenshot of the job overview omitted]
>
> However, I can’t find anything online about this phenomenon. Does anyone
> have any idea?
>
> Thanks.
>
> — Mano
>
