As I said, the workaround is to use the "bin/flink" tool from the command line. I think it should be possible to add a "student" account on the cluster with access to the Flink installation?
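For example, assuming the jar is already available on the cluster, a command-line submission would look roughly like this (the jar path and the working-directory argument are placeholders; "MainClass" is the entry point shown in the stack trace below):

  ./bin/flink run -c MainClass /path/to/pipeline.jar hdfs:///path/to/working-dir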
On Tue, Sep 8, 2015 at 12:36 PM, Florian Heyl <f.h...@gmx.de> wrote:

> Ok, I see, thank you. I do not have experience with that, but does a
> possible workaround exist?
>
>
> Am 08.09.2015 um 13:13 schrieb Robert Metzger <rmetz...@apache.org>:
>
> That's the bug: https://issues.apache.org/jira/browse/FLINK-2632
>
> On Tue, Sep 8, 2015 at 1:11 PM, Robert Metzger <rmetz...@apache.org> wrote:
>
>> There is a bug in the web client which sets the wrong class loader when
>> running the user code.
>>
>> On Tue, Sep 8, 2015 at 12:05 PM, Florian Heyl <f.h...@gmx.de> wrote:
>>
>>> Locally we are using the 0.9-SNAPSHOT, but the cluster should work with
>>> the 0.10-SNAPSHOT. I have no direct control of the cluster because our
>>> prof is responsible for it.
>>> The students are using the Flink web submission client to upload their
>>> jar and run it on the cluster.
>>>
>>>
>>> Am 08.09.2015 um 12:48 schrieb Robert Metzger <rmetz...@apache.org>:
>>>
>>> Which version of Flink are you using?
>>>
>>> Have you tried submitting the job using the "./bin/flink run" tool?
>>>
>>> On Tue, Sep 8, 2015 at 11:44 AM, Florian Heyl <f.h...@gmx.de> wrote:
>>>
>>>> Dear Sir or Madam,
>>>> My colleague and I are developing a pipeline based on Scala and Java
>>>> to classify cancer stages. This pipeline should be uploaded to the
>>>> cluster (Apache Flink on HDFS).
>>>> The pipeline works fine locally, but on the cluster it crashes with the
>>>> following error (see below). The main method is simply structured: we
>>>> only pass one argument to set the working directory, and then other
>>>> methods from different Scala and Java files are called. Sorry that I
>>>> cannot give you more details, but I cannot figure out what the exact
>>>> problem is. I hope you can help me.
>>>>
>>>> Best wishes,
>>>>
>>>> Flo
>>>>
>>>>
>>>> An error occurred while invoking the program:
>>>>
>>>> The main method caused an error.
>>>>
>>>>
>>>> java.io.IOException: Class not found
>>>>   at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.a(Unknown Source)
>>>>   at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>>>>   at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$getClassReader(ClosureCleaner.scala:42)
>>>>   at org.apache.flink.api.scala.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:90)
>>>>   at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:113)
>>>>   at org.apache.flink.api.scala.DataSet.clean(DataSet.scala:120)
>>>>   at org.apache.flink.api.scala.DataSet$$anon$6.<init>(DataSet.scala:437)
>>>>   at org.apache.flink.api.scala.DataSet.filter(DataSet.scala:436)
>>>>   at AddFunctions$.splitIntoTrainingAndTest(AddFunctions.scala:42)
>>>>   at Regression2$$anonfun$mainRegression$1.apply$mcVI$sp(Regression2.scala:98)
>>>>   at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>>>   at Regression2$.mainRegression(Regression2.scala:91)
>>>>   at MainClass$.main(MainClass.scala:41)
>>>>   at MainClass.main(MainClass.scala)
>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>   at java.lang.reflect.Method.invoke(Method.java:483)
>>>>   at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:437)
>>>>   at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:353)
>>>>   at org.apache.flink.client.program.Client.getOptimizedPlan(Client.java:229)
>>>>   at org.apache.flink.client.web.JobSubmissionServlet.doGet(JobSubmissionServlet.java:186)
>>>>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
>>>>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:770)
>>>>   at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:532)
>>>>   at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:453)
>>>>   at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:227)
>>>>   at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:965)
>>>>   at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:388)
>>>>   at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:187)
>>>>   at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:901)
>>>>   at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>>>>   at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:47)
>>>>   at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:113)
>>>>   at org.eclipse.jetty.server.Server.handle(Server.java:352)
>>>>   at org.eclipse.jetty.server.HttpConnection.handleRequest(HttpConnection.java:596)
>>>>   at org.eclipse.jetty.server.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:1048)
>>>>   at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:549)
>>>>   at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:211)
>>>>   at org.eclipse.jetty.server.HttpConnection.handle(HttpConnection.java:425)
>>>>   at org.eclipse.jetty.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:489)
>>>>   at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:436)
>>>>   at java.lang.Thread.run(Thread.java:745)
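For readers who hit the same "Class not found" error: FLINK-2632 is about the web submission client running the user's main method with the wrong class loader, so reflective lookups such as the ClosureCleaner's cannot resolve the classes in the submitted jar. The Scala sketch below only illustrates the general pattern of installing the job jar's class loader before invoking user code; the names (SubmitSketch, runUserMain) are made up for illustration and this is not Flink's actual code.

import java.net.{URL, URLClassLoader}

object SubmitSketch {
  // Hypothetical illustration only, not Flink's implementation: invoke a jar's
  // main class with that jar's class loader set as the thread context class
  // loader, so reflective lookups inside user code can find user classes.
  def runUserMain(jarUrl: URL, mainClassName: String, args: Array[String]): Unit = {
    val userLoader = new URLClassLoader(Array(jarUrl), getClass.getClassLoader)
    val previous = Thread.currentThread().getContextClassLoader
    Thread.currentThread().setContextClassLoader(userLoader)
    try {
      // Load the entry class through the user-code loader and call main(String[]).
      val mainClass = Class.forName(mainClassName, true, userLoader)
      val mainMethod = mainClass.getMethod("main", classOf[Array[String]])
      mainMethod.invoke(null, args.asInstanceOf[AnyRef])
    } finally {
      // Restore the previous loader afterwards; running user code without the
      // right loader in place is the kind of bug described above.
      Thread.currentThread().setContextClassLoader(previous)
    }
  }
}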