I am not a Cassandra expert :-)

Please consider consulting the Cassandra mailing list.
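
The stack trace further down the thread shows the Spark Cassandra connector's
DataSizeEstimates querying system.size_estimates while computing token ranges,
so the job needs SELECT on that system table in addition to the application
keyspace. A hedged sketch of a possible fix (untested; "UserIDXYZ" is a
placeholder role name, and some Cassandra versions may not permit grants on
system tables):

```cql
-- Run in cqlsh as a superuser; "UserIDXYZ" is a placeholder.
-- The connector reads system.size_estimates when estimating partition sizes.
GRANT SELECT ON TABLE system.size_estimates TO UserIDXYZ;
```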

On Thu, Nov 19, 2015 at 4:03 AM, satish chandra j <[email protected]>
wrote:

> Hi Ted,
> CASSANDRA-7894 does have the same error details, "UnauthorizedException:
> User <UserIDXYZ> has no SELECT permission on <table system.size_estimates>
> or any of its parents", but there the cause of the error is a Java runtime
> exception, whereas in my stack trace the UnauthorizedException itself is
> the cause.
>
> I have verified the permission grants for the user "UserIDXYZ", which has
> SELECT permission on both the keyspace and the table the query is running
> against.
>
> Please let me know if you have any further inputs on this.
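>
> For reference, a sketch of the checks I ran in cqlsh (role and keyspace
> names are placeholders):
>
> ```cql
> -- Show every grant the role currently holds.
> LIST ALL PERMISSIONS OF UserIDXYZ;
> -- The grant that was already in place for the application keyspace.
> GRANT SELECT ON KEYSPACE my_keyspace TO UserIDXYZ;
> ```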
>
> Regards,
> Satish Chandra
>
>
>
> On Wed, Nov 18, 2015 at 9:06 AM, Ted Yu <[email protected]> wrote:
>
>> Have you considered polling the Cassandra mailing list?
>>
>> A brief search led to CASSANDRA-7894
>>
>> FYI
>>
>> On Tue, Nov 17, 2015 at 7:24 PM, satish chandra j <
>> [email protected]> wrote:
>>
>>> Hi All,
>>> I am getting "*UnauthorizedException: User <UserIDXYZ> has no SELECT
>>> permission on <table system.size_estimates> or any of its parents*"
>>> while a Spark job is fetching data from Cassandra, but I am able to
>>> save data into Cassandra without any issues.
>>>
>>> Note: With the same user <UserIDXYZ>, I am able to access and query
>>> the table from the CQL UI, and the code used in the Spark job has been
>>> tested in the Spark shell, where it works fine.
>>>
>>> Regards,
>>> Satish Chandra
>>>
>>> On Tue, Nov 17, 2015 at 11:45 PM, satish chandra j <
>>> [email protected]> wrote:
>>>
>>>> Hi All,
>>>> I am getting "*UnauthorizedException: User <UserIDXYZ> has no SELECT
>>>> permission on <table system.size_estimates> or any of its parents*"
>>>> while a Spark job is fetching data from Cassandra, but I am able to
>>>> save data into Cassandra without any issues.
>>>>
>>>> Note: With the same user <UserIDXYZ>, I am able to access and query
>>>> the table from the CQL UI, and the code used in the Spark job has been
>>>> tested in the Spark shell, where it works fine.
>>>>
>>>> Please find the stack trace below:
>>>>
>>>> WARN  2015-11-17 07:24:23 org.apache.spark.scheduler.DAGScheduler:
>>>> Creating new stage failed due to exception - job: 0
>>>>
>>>> com.datastax.driver.core.exceptions.UnauthorizedException: User
>>>> <UserIDXYZ> has no SELECT permission on <table
>>>> system.size_estimates> or any of its parents
>>>>
>>>>         at
>>>> com.datastax.driver.core.exceptions.UnauthorizedException.copy(UnauthorizedException.java:36)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:269)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:183)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:44)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at java.lang.reflect.Method.invoke(Method.java:497)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at com.sun.proxy.$Proxy10.execute(Unknown Source) ~[na:na]
>>>>
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> ~[na:1.8.0_51]
>>>>
>>>>        at java.lang.reflect.Method.invoke(Method.java:497)
>>>> ~[na:1.8.0_51]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at com.sun.proxy.$Proxy10.execute(Unknown Source) ~[na:na]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates$$anonfun$tokenRanges$1.apply(DataSizeEstimates.scala:40)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates$$anonfun$tokenRanges$1.apply(DataSizeEstimates.scala:38)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates.tokenRanges$lzycompute(DataSizeEstimates.scala:38)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates.tokenRanges(DataSizeEstimates.scala:37)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates.dataSizeInBytes$lzycompute(DataSizeEstimates.scala:81)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.DataSizeEstimates.dataSizeInBytes(DataSizeEstimates.scala:80)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner.<init>(CassandraRDDPartitioner.scala:39)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner$.apply(CassandraRDDPartitioner.scala:176)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:144)
>>>> ~[spark-cassandra-connector_2.10-1.4.0.jar:1.4.0]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at scala.Option.getOrElse(Option.scala:120)
>>>> ~[scala-library-2.10.5.jar:na]
>>>>
>>>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at scala.Option.getOrElse(Option.scala:120)
>>>> ~[scala-library-2.10.5.jar:na]
>>>>
>>>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at scala.Option.getOrElse(Option.scala:120)
>>>> ~[scala-library-2.10.5.jar:na]
>>>>
>>>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at scala.Option.getOrElse(Option.scala:120)
>>>> ~[scala-library-2.10.5.jar:na]
>>>>
>>>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.ShuffleDependency.<init>(Dependency.scala:82)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:78)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:206)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:204)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>        at scala.Option.getOrElse(Option.scala:120)
>>>> ~[scala-library-2.10.5.jar:na]
>>>>
>>>>         at org.apache.spark.rdd.RDD.dependencies(RDD.scala:204)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:321)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:333)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:234)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:270)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:768)
>>>> ~[spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1426)
>>>> [spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
>>>> [spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>>         at
>>>> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>>>> [spark-core_2.10-1.4.1.1.jar:1.4.1.1]
>>>>
>>>> Caused by: com.datastax.driver.core.exceptions.UnauthorizedException:
>>>> User <UserIDXYZ> has no SELECT permission on <table
>>>> system.size_estimates> or any of its parents
>>>>
>>>>         at
>>>> com.datastax.driver.core.Responses$Error.asException(Responses.java:101)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:118)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:183)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.RequestHandler.access$2300(RequestHandler.java:45)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:748)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:573)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:913)
>>>> ~[cassandra-driver-core-2.1.7.1.jar:na]
>>>>
>>>>         at
>>>> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:163)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.epoll.EpollSocketChannel$EpollSocketUnsafe.epollInReady(EpollSocketChannel.java:722)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:326)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:264)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at
>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>>> ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>>>>
>>>>         at java.lang.Thread.run(Thread.java:745)
>>>>
>>>>
>>>> Please let me know if there are any solutions to fix this.
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Satish Chandra
>>>>
>>>
>>>
>>
>
