[
https://issues.apache.org/jira/browse/HBASE-24873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17176909#comment-17176909
]
Viraj Jasani commented on HBASE-24873:
--------------------------------------
The HBase open-source community has EOL'ed the 2.0 release line, and the community
does not own HDP packages. Could you please reach out to Hortonworks support for
further help, since this issue also involves Flink and Ambari on HDP 3.0.1?
Thanks
> Not able to access Flink 1.7.2 with HBase 2.0.0 included in HDP cluster 3.0.1
> -----------------------------------------------------------------------------
>
> Key: HBASE-24873
> URL: https://issues.apache.org/jira/browse/HBASE-24873
> Project: HBase
> Issue Type: Bug
> Components: Client
> Affects Versions: 2.0.0
> Environment: * I am using Ambari Server 2.7.1 with an HDP 3.0.1 cluster
> with YARN 3.1.1 and HBase 2.0.0
> * Using hbase-client-2.0.0.jar along with Flink 1.7.2
> * Using
> [flink-1.7.2-bin-hadoop27-scala_2.11.tgz|https://archive.apache.org/dist/flink/flink-1.7.2/flink-1.7.2-bin-hadoop27-scala_2.11.tgz]
> for testing.
> Reporter: Pasha Shaik
> Priority: Blocker
> Attachments: 0874A89C-597E-451A-8986-E619A0E8237B.jpeg,
> 0ECDB56A-3A76-424F-8926-A9FEB1BD96BB.jpeg,
> 4237EAEA-D3CD-4791-8322-49E2F9FA8666.png,
> 667CC1DB-CF0E-44E9-B9E8-1B44151FC00E.jpeg,
> 87939306-6571-4886-A23B-4780897B88D4.jpeg
>
>
> * I am not able to access Flink 1.7.2 with HDP 3.0.1.
> * The YARN version is 3.1.1 and HBase is 2.0.0.
> * Flink deploys successfully on YARN and shows as RUNNING.
> * However, when I actually try to test my code, it fails with the error below.
> * The .tgz I used is
> [flink-1.7.2-bin-hadoop27-scala_2.11.tgz|https://archive.apache.org/dist/flink/flink-1.7.2/flink-1.7.2-bin-hadoop27-scala_2.11.tgz].
> * The reason for the failure: on HDP 3.0.1, the associated HBase client
> ("org.apache.hbase:hbase-client:2.0.0") is not in sync with
> flink-hbase_2.11-1.7.2. The public HTable constructor has been removed in this
> HBase version, while the connector classes still call it, which leads to the
> I/O exception below (see the client-API sketch after this list and the
> classpath-check sketch after the logs).
> * Please find the logs and screenshots for more info.
>
>
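> For reference, here is a minimal sketch of the HBase 2.x client API that replaces
> the removed public HTable constructor (class, table and row-key names below are
> placeholders, not taken from this report):
> {code:java}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.TableName;
> import org.apache.hadoop.hbase.client.Connection;
> import org.apache.hadoop.hbase.client.ConnectionFactory;
> import org.apache.hadoop.hbase.client.Get;
> import org.apache.hadoop.hbase.client.Result;
> import org.apache.hadoop.hbase.client.Table;
> import org.apache.hadoop.hbase.util.Bytes;
>
> public class Hbase2ClientSketch {
>     public static void main(String[] args) throws Exception {
>         // Loads hbase-site.xml from the classpath (assumes the HDP client configs are on it).
>         Configuration conf = HBaseConfiguration.create();
>
>         // HBase 2.x way: obtain a Table from a Connection instead of the removed
>         // public HTable(conf, tableName) constructor that older connector code calls.
>         try (Connection connection = ConnectionFactory.createConnection(conf);
>              Table table = connection.getTable(TableName.valueOf("my_table"))) { // placeholder table
>             Result result = table.get(new Get(Bytes.toBytes("row-1")));          // placeholder row key
>             System.out.println("cells returned: " + result.size());
>         }
>     }
> }
> {code}
> If the flink-hbase connector on the classpath was built against a 1.x client, it
> would still expect the old constructor, which matches the mismatch described above.
>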
> --------------------------------------------------------------------------------
> HERE ARE THE LOGS BELOW:
> *org.apache.flink.runtime.client.JobExecutionException: Failed to submit job cbb64a9b4e2e3ad0167eb4ceeb53ac87 (Flink Java Job at Tue Aug 11 10:10:47 CEST 2020)*
>     at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:1325) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:447) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:38) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.Actor$class.aroundReceive(Actor.scala:502) ~[akka-actor_2.11-2.4.20.jar:?]
>     at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:122) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.actor.ActorCell.invoke(ActorCell.scala:495) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.run(Mailbox.scala:224) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.exec(Mailbox.scala:234) ~[akka-actor_2.11-2.4.20.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) ~[scala-library-2.11.11.jar:?]
>
> *Caused by: org.apache.flink.runtime.JobException: Creating the input splits caused an error: connection is closed*
>     at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:262) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:810) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:180) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:1277) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:447) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:38) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.Actor$class.aroundReceive(Actor.scala:502) ~[akka-actor_2.11-2.4.20.jar:?]
>     at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:122) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.actor.ActorCell.invoke(ActorCell.scala:495) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.run(Mailbox.scala:224) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.exec(Mailbox.scala:234) ~[akka-actor_2.11-2.4.20.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) ~[scala-library-2.11.11.jar:?]
>
> *Caused by: java.io.IOException: connection is closed*
>     at org.apache.hadoop.hbase.MetaTableAccessor.getMetaHTable(MetaTableAccessor.java:263) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:761) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.hadoop.hbase.MetaTableAccessor.scanMeta(MetaTableAccessor.java:680) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.hadoop.hbase.MetaTableAccessor.scanMetaForTableRegions(MetaTableAccessor.java:675) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.hadoop.hbase.client.HRegionLocator.listRegionLocations(HRegionLocator.java:156) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.hadoop.hbase.client.HRegionLocator.getStartEndKeys(HRegionLocator.java:122) ~[hbase-client-2.0.0.jar:2.0.0]
>     at org.apache.flink.addons.hbase.AbstractTableInputFormat.createInputSplits(AbstractTableInputFormat.java:205) ~[flink-hbase_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.addons.hbase.AbstractTableInputFormat.createInputSplits(AbstractTableInputFormat.java:44) ~[flink-hbase_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:248) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:810) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:180) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:1277) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:447) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:38) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) ~[scala-library-2.11.11.jar:?]
>     at org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.Actor$class.aroundReceive(Actor.scala:502) ~[akka-actor_2.11-2.4.20.jar:?]
>     at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:122) ~[flink-runtime_2.11-1.4.2.jar:1.4.2]
>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.actor.ActorCell.invoke(ActorCell.scala:495) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.run(Mailbox.scala:224) ~[akka-actor_2.11-2.4.20.jar:?]
>     at akka.dispatch.Mailbox.exec(Mailbox.scala:234) ~[akka-actor_2.11-2.4.20.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) ~[scala-library-2.11.11.jar:?]
>     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) ~[scala-library-2.11.11.jar:?]
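>
> As a quick sanity check (a generic sketch, not part of the original report), the
> snippet below prints which hbase-client version and jar are actually loaded at
> runtime, which helps confirm whether the HDP-provided client or one bundled in the
> Flink job jar ends up on the classpath:
> {code:java}
> import org.apache.hadoop.hbase.client.ConnectionFactory;
> import org.apache.hadoop.hbase.util.VersionInfo;
>
> public class HBaseClasspathCheck {
>     public static void main(String[] args) {
>         // Version string baked into the hbase-client/hbase-common jar that was loaded.
>         System.out.println("hbase client version: " + VersionInfo.getVersion());
>
>         // Physical location of the jar the class loader resolved; useful when both the
>         // HDP-provided jars and jars bundled in the Flink fat jar are present.
>         System.out.println("loaded from: "
>                 + ConnectionFactory.class.getProtectionDomain().getCodeSource().getLocation());
>     }
> }
> {code}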
--
This message was sent by Atlassian Jira
(v8.3.4#803005)