Hi,

I am running a 9-node Cassandra cluster (Cassandra 3.11.5, OS: CentOS 7.6). The cluster has been running for a long time, but for the past couple of days we keep getting strange errors that point to the same specific node again and again, and while the errors occur I don't see anything unusual on that node itself. Below are the errors from the application side (a sketch of a basic connectivity check against that node follows the traces):



java.util.concurrent.CompletionException: java.io.IOException: Failed to open native connection to Cassandra at {xx.xx.xx.xx}:9042
        at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
        at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1592)
        at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1582)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {xx.xx.xx.xx}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:168)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
        at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
        at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:122)
        at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:332)
        at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:352)
        at com.datastax.spark.connector.writer.ReplicaLocator$.apply(ReplicaLocator.scala:62)
        at com.datastax.spark.connector.RDDFunctions.repartitionByCassandraReplica(RDDFunctions.scala:238)
        at com.datastax.spark.connector.japi.RDDJavaFunctions.repartitionByCassandraReplica(RDDJavaFunctions.java:157)
        at jobs.GenerateContestSplitPDF.getLutvDataSet(GenerateContestSplitPDF.java:119)
        at jobs.GenerateContestSplitPDF.lambda$main$0(GenerateContestSplitPDF.java:66)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
        ... 5 more
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: cps-cassandra/xx.xx.xx.xx:9042 (com.datastax.driver.core.exceptions.TransportException: [cps-cassandra/xx.xx.xx.xx:9042] Cannot connect))
        at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:233)
        at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1483)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:399)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:161)
        ... 21 more
java.lang.Exception: java.util.concurrent.CompletionException: java.io.IOException: Failed to open native connection to Cassandra at {xx.xx.xx.xx}:9042
        at xx.xx.utils.Utils.handleException(Utils.java:123)
        at jobs.GenerateContestSplitPDF.lambda$main$3(GenerateContestSplitPDF.java:77)
        at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
        at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
        at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1595)
        at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1582)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.util.concurrent.CompletionException: java.io.IOException: Failed to open native connection to Cassandra at {xx.xx.xx.xx}:9042
        at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
        at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1592)
        ... 5 more
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {xx.xx.xx.xx}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:168)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
        at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
        at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:122)
        at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:332)
        at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:352)
        at com.datastax.spark.connector.writer.ReplicaLocator$.apply(ReplicaLocator.scala:62)
        at com.datastax.spark.connector.RDDFunctions.repartitionByCassandraReplica(RDDFunctions.scala:238)
        at com.datastax.spark.connector.japi.RDDJavaFunctions.repartitionByCassandraReplica(RDDJavaFunctions.java:157)
        at jobs.GenerateContestSplitPDF.getLutvDataSet(GenerateContestSplitPDF.java:119)
        at jobs.GenerateContestSplitPDF.lambda$main$0(GenerateContestSplitPDF.java:66)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
        ... 5 more
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: ip-xx-xx-xx-xx.ec2.internal/xx.xx.xx.xx:9042 (com.datastax.driver.core.exceptions.TransportException: [ip-xx-xx-xx-xx.ec2.internal/xx.xx.xx.xx:9042] Cannot connect))
        at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:233)
        at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1483)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:399)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:161)
        ... 21 more


2020-12-18 07:06:31 WARN  Session:378 - Error creating pool to ip-xx-xx-xx-xx.ec2.internal/xx.xx.xx.xx:9042
com.datastax.driver.core.exceptions.ConnectionException: [ip-xx-xx-xx-xx.ec2.internal/xx.xx.xx.xx:9042] Pool was closed during initialization
        at com.datastax.driver.core.HostConnectionPool$2.onSuccess(HostConnectionPool.java:148)
        at com.datastax.driver.core.HostConnectionPool$2.onSuccess(HostConnectionPool.java:134)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$6.run(Futures.java:1319)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:185)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$CombinedFuture.setOneValue(Futures.java:1764)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$CombinedFuture.access$400(Futures.java:1608)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$CombinedFuture$2.run(Futures.java:1686)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:185)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$FallbackFuture$1$1.onSuccess(Futures.java:479)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$6.run(Futures.java:1319)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$ImmediateFuture.addListener(Futures.java:106)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures.addCallback(Futures.java:1322)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$FallbackFuture$1.onFailure(Futures.java:476)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$6.run(Futures.java:1310)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:202)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$FallbackFuture$1$1.onFailure(Futures.java:487)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$6.run(Futures.java:1310)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.add(ExecutionList.java:101)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:170)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures.addCallback(Futures.java:1322)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$FallbackFuture$1.onFailure(Futures.java:476)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$6.run(Futures.java:1310)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:202)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:902)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$1$1.run(Futures.java:635)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.Futures$1.run(Futures.java:632)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:202)
        at shade.com.datastax.spark.connector.google.common.util.concurrent.SettableFuture.setException(SettableFuture.java:68)
        at com.datastax.driver.core.Connection$1.operationComplete(Connection.java:166)
        at com.datastax.driver.core.Connection$1.operationComplete(Connection.java:149)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
        at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
        at io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.fulfillConnectPromise(AbstractEpollChannel.java:659)
        at io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.finishConnect(AbstractEpollChannel.java:678)
        at io.netty.channel.epoll.AbstractEpollChannel$AbstractEpollUnsafe.epollOutReady(AbstractEpollChannel.java:552)
        at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:394)
        at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:304)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
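
For reference, here is a minimal sketch of the kind of connectivity check I can run from the application host against just that node, using the same DataStax Java driver 3.x API that the Spark connector calls. The IP is a placeholder for the failing node, and it assumes port 9042 with no SSL or authentication; adjust as needed:

// NodeConnectCheck.java - standalone sketch, assumes DataStax Java driver 3.x on the classpath.
// The contact point below is a placeholder for the problem node's IP.
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public class NodeConnectCheck {
    public static void main(String[] args) {
        // Point only at the failing node so the driver cannot fall back to other hosts.
        try (Cluster cluster = Cluster.builder()
                .addContactPoint("xx.xx.xx.xx")   // placeholder: IP of the node that keeps failing
                .withPort(9042)                   // native transport port
                .build();
             Session session = cluster.connect()) {
            // Trivial query against a system table just to prove the native connection works.
            System.out.println(session.execute("SELECT release_version FROM system.local").one());
        }
    }
}

If this simple check also fails with TransportException: Cannot connect, the problem is between the client host and that node's native transport rather than in the Spark job itself.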

Any suggestions?

-- 
regards

Vikash Rajoriya
+91-8904505536
