Hi, I'm using the Java API. I see the mentioned exception in the Java log.
I'll provide the full stack trace next time.
2014-11-19 1:01 GMT+03:00 Ted Yu yuzhih...@gmail.com:
The thread you mentioned was more about the Thrift API than about
TableNotFoundException.
Can you show us the stack trace of
Hi,
I am trying to log in to a secure cluster with keytabs using the methods below. It
works fine as long as the token is not expired. My process runs for a long time (a web
app on Tomcat). I keep getting the exceptions below after the token expiry time, and
the connection fails when the user tries to view data from the web.
Take a look at the patch added to
https://issues.apache.org/jira/browse/HBASE-12366
There will be a new AuthUtil.launchAuthChore() which should help in your
case.
(The doc patch is here https://issues.apache.org/jira/browse/HBASE-12528)
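For reference, a minimal sketch of how the new chore might be wired into a long-running client, assuming a build with HBASE-12366 applied. The keytab path, principal, and configuration key names here are illustrative assumptions, not taken from the thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.AuthUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class SecureClientBootstrap {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // Illustrative settings; AuthUtil reads the client keytab and
        // principal from the configuration (key names assumed here).
        conf.set("hbase.client.keytab.file", "/etc/security/keytabs/myapp.keytab");
        conf.set("hbase.client.kerberos.principal", "myapp/_HOST@EXAMPLE.COM");

        // Starts a background chore that periodically re-logs in from the
        // keytab, so a long-running process (e.g. a Tomcat web app) keeps
        // valid Kerberos credentials instead of failing at token expiry.
        AuthUtil.launchAuthChore(conf);

        // ... create HBase connections and tables as usual afterwards ...
    }
}
```

This requires the HBase client libraries on the classpath and a Kerberized cluster to actually run.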
Matteo
On Wed, Nov 19, 2014 at 11:19 AM, Bogala, Chandra
Hello:
I have also encountered this exception. Do you have a solution? Please
tell me.
Thanks.
xuge...@longshine.com
Apache HBase 0.98.8 is now available for download. Get it from an Apache
mirror [1] or Maven repository.
The list of changes in this release can be found in the release notes [2]
or following this announcement. This release contains a fix for a security
issue; please see HBASE-12536 [3] for more
Hi Arul,
It's a pure client-side exception: it means that the client has not even tried
to send the query to the server; it failed before that.
Why the client failed is another question.
I see that the pool size is 7; have you changed the default configuration?
Cheers,
Nicolas
On Tue, Nov 18, 2014 at
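For context on the pool-size question: the client RPC connection pool size is controlled by hbase.client.ipc.pool.size, whose default is 1, so a value of 7 would typically come from an override like the one below in hbase-site.xml. This fragment is a hypothetical illustration, not the poster's actual configuration:

```xml
<property>
  <name>hbase.client.ipc.pool.size</name>
  <value>7</value>
</property>
```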
Hi
I need to find whether a particular column qualifier is present in a column family,
so I wrote code like this.
As per the documentation:
public boolean containsColumn(byte[] family,
byte[] qualifier)
Checks for existence of a value for the specified column (empty or not).
Parameters: family
bq. org.freinds_rep.java.Insert_friend.search_column(Insert_friend.java:106)
Does line 106 correspond to the result.containsColumn() call?
If so, result was null.
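A defensive pattern for this kind of check might look like the sketch below. The row key, family, and qualifier are placeholders, and this is an illustration of the null-guard idea, not the poster's actual code:

```java
import java.io.IOException;

import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.client.Result;

public class ColumnCheck {
    // Returns true only when the row exists AND the qualifier has a value.
    static boolean hasQualifier(HTableInterface table, byte[] row,
                                byte[] family, byte[] qualifier) throws IOException {
        Result result = table.get(new Get(row));
        // Guard first: calling containsColumn() on a null Result throws
        // a NullPointerException like the one in the stack trace.
        if (result == null || result.isEmpty()) {
            return false;
        }
        return result.containsColumn(family, qualifier);
    }
}
```

This needs the HBase client libraries on the classpath; the guard is the relevant part.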
On Wed, Nov 19, 2014 at 9:47 AM, beeshma r beeshm...@gmail.com wrote:
Hi
i need to find whether particular column qualifier present
I am running a single-node pseudo-distributed HBase cluster on top of a pseudo-distributed Hadoop.
Hadoop is 1.2.1 and the replication factor of HDFS is 1. The HBase
version is 0.98.5.
Last night I found the region server crashed (the process is gone).
I found many logs saying
[JvmPauseMonitor] util.JvmPauseMonitor: Detected
Also, in the HDFS UI, I found Number of Under-Replicated Blocks: 497741.
It seems there are many bad blocks. Is there any method to rescue the good data?
On Thu, Nov 20, 2014 at 10:52 AM, Li Li fancye...@gmail.com wrote:
I am running a single node pseudo hbase cluster on top of a pseudo hadoop.
hadoop
Have you tried using fsck?
Cheers
On Wed, Nov 19, 2014 at 6:56 PM, Li Li fancye...@gmail.com wrote:
also in hdfs ui, I found Number of Under-Replicated Blocks : 497741
it seems there are many bad blocks. is there any method to rescue good
data?
On Thu, Nov 20, 2014 at 10:52 AM, Li Li
I have tried, and found many files' replication factor is
3 (dfs.replication is 1 in hdfs-site.xml). So I am trying to set it to 1 now.
There are so many files that it has taken more than 30 minutes and
is still not finished.
I will try fsck later.
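For reference, lowering the replication factor recursively and then re-checking can be done with standard Hadoop commands. The /hbase path below is an assumption — substitute your actual HBase root directory:

```shell
# Recursively set the replication factor to 1
# (can take a long time with ~480k files)
hadoop fs -setrep -R 1 /hbase

# Then re-run fsck to verify filesystem health and replication status
hadoop fsck /
```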
On Thu, Nov 20, 2014 at 11:25 AM, Ted Yu yuzhih...@gmail.com
hadoop fsck /
Status: HEALTHY
 Total size:                    1382743735840 B
 Total dirs:                    1127
 Total files:                   476753
 Total blocks (validated):      490085 (avg. block size 2821436 B)
 Minimally replicated blocks:   490085 (100.0 %)
 Over-replicated blocks:        0 (0.0 %)
 Under-replicated blocks: