[ 
https://issues.apache.org/jira/browse/FLINK-15216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chesnay Schepler closed FLINK-15216.
------------------------------------
    Resolution: Duplicate

> Can't use rocksdb with hdfs filesystem with flink-s3-fs-hadoop
> --------------------------------------------------------------
>
>                 Key: FLINK-15216
>                 URL: https://issues.apache.org/jira/browse/FLINK-15216
>             Project: Flink
>          Issue Type: Bug
>          Components: FileSystems
>    Affects Versions: 1.9.0
>            Reporter: Tank
>            Priority: Major
>
> Setup:
> Flink 1.9.0 packaged with EMR 5.28 (uses Hadoop 2.8.5).
> After adding flink-s3-fs-hadoop to the plugins or lib directory, I was no longer
> able to use the RocksDB state backend with an HDFS checkpoint filesystem.
>  
> To replicate:
>  
> Add the RocksDB checkpoint & savepoint configuration to
> /etc/flink/conf/flink-conf.yaml:
> state.backend: rocksdb
> state.backend.incremental: true
> state.savepoints.dir: hdfs://<path>/savepoints/
> state.checkpoints.dir: hdfs://<path>/checkpoints/
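> For reference, the same backend can also be set programmatically in the job. This is only a minimal sketch against the Flink 1.9 API; the class name and the hdfs:// URI are placeholders, not the real cluster path:
> {code:java}
> import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
>
> public class RocksDbBackendSketch {
>     public static void main(String[] args) throws Exception {
>         StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
>         // Placeholder checkpoint URI; the report uses hdfs://<path>/checkpoints/.
>         // The boolean flag enables incremental checkpoints, matching state.backend.incremental: true.
>         env.setStateBackend(new RocksDBStateBackend("hdfs://namenode:8020/flink/checkpoints", true));
>         env.enableCheckpointing(60_000); // checkpoint every 60 seconds
>         // ... build and execute the streaming job here ...
>     }
> }
> {code}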
>  
> Copy flink-s3-fs-hadoop to the plugins directory:
>  
> {{cp ./opt/flink-s3-fs-hadoop-1.9.0.jar ./plugins/s3-fs-hadoop/}}
>  
> Start any job:
>  
> {{flink run -m yarn-cluster -yid ${app_id} -yn 1 -p 1 /usr/lib/flink/examples/streaming/WordCount.jar}}
>  
> It looks like the Hadoop classes packaged in flink-s3-fs-hadoop are interfering
> with the Flink runtime's loading of the HDFS filesystem:
> {code}
> org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: b2d8b907839d82dda5e7407248941d4d)
>     at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:255)
>     at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
>     at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:60)
>     at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
>     at org.apache.flink.streaming.examples.wordcount.WordCount.main(WordCount.java:89)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
>     at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:438)
>     at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
>     at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
>     at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
>     at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1010)
>     at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1083)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
>     at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1083)
> Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
>     at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:382)
>     at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
>     at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
>     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
>     at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
>     at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:263)
>     at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
>     at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
>     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
>     at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
>     at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
>     at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error., <Exception on server side:
> org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
>     at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$internalSubmitJob$2(Dispatcher.java:333)
>     at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:822)
>     at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:797)
>     at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
>     at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
>     at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
>     at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>     at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>     at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>     at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
>     at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
>     at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
>     ... 6 more
> Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
>     at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
>     at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:83)
>     at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:375)
>     at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
>     ... 7 more
> Caused by: org.apache.flink.util.FlinkRuntimeException: Failed to create checkpoint storage at checkpoint coordinator side.
>     at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:255)
>     at org.apache.flink.runtime.executiongraph.ExecutionGraph.enableCheckpointing(ExecutionGraph.java:594)
>     at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:340)
>     at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:106)
>     at org.apache.flink.runtime.scheduler.LegacyScheduler.createExecutionGraph(LegacyScheduler.java:207)
>     at org.apache.flink.runtime.scheduler.LegacyScheduler.createAndRestoreExecutionGraph(LegacyScheduler.java:184)
>     at org.apache.flink.runtime.scheduler.LegacyScheduler.<init>(LegacyScheduler.java:176)
>     at org.apache.flink.runtime.scheduler.LegacySchedulerFactory.createInstance(LegacySchedulerFactory.java:70)
>     at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:275)
>     at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:265)
>     at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:98)
>     at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:40)
>     at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
>     ... 10 more
> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
>     at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:447)
>     at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:359)
>     at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
>     at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(FsCheckpointStorage.java:61)
>     at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(FsStateBackend.java:490)
>     at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createCheckpointStorage(RocksDBStateBackend.java:458)
>     at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:253)
>     ... 22 more
> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Cannot support file system for 'hdfs' via Hadoop, because Hadoop is not in the classpath, or some classes are missing from the classpath.
>     at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:179)
>     at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:443)
>     ... 28 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/flink/fs/shaded/hadoop3/org/apache/hadoop/hdfs/HdfsConfiguration
>     at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:85)
>     ... 29 more
> Caused by: java.lang.ClassNotFoundException: org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.hdfs.HdfsConfiguration
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 30 more
>
> End of exception on server side>]
>     at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:389)
>     at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:373)
>     at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
>     at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
>     ... 4 more
> {code}
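>
> The filesystem resolution failure can also be triggered in isolation, independent of job submission. A rough sketch using Flink's own FileSystem API (the class name and the hdfs:// URI are placeholders):
> {code:java}
> import org.apache.flink.core.fs.FileSystem;
> import org.apache.flink.core.fs.Path;
>
> public class HdfsSchemeCheck {
>     public static void main(String[] args) throws Exception {
>         // Resolves the filesystem for the URI through Flink's FileSystem registry,
>         // the same code path as Path.getFileSystem(...) in the trace above.
>         // With the broken classpath this should throw UnsupportedFileSystemSchemeException for 'hdfs'.
>         FileSystem fs = new Path("hdfs://namenode:8020/flink/checkpoints").getFileSystem();
>         System.out.println("Resolved 'hdfs' to: " + fs.getClass().getName());
>     }
> }
> {code}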



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
