Jignesh,

passing --config <path_to_hbase_configs> would help.

Like:
bin/hbase --config <path_to_hbase_configs> shell
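
For example, if the HBase config files lived under /usr/local/hbase/conf (a
hypothetical path), that would be:

# hypothetical conf directory; substitute your own
bin/hbase --config /usr/local/hbase/conf shell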

-Giri

On 10/12/11 4:50 PM, Matt Foley wrote:
Hi Jignesh,
Not clear what's going on with your ZK, but as a starting point, the
hsync/flush feature in 205 was implemented with an on-off switch. Make sure
you've turned it on by setting *dfs.support.append* to true in the
hdfs-site.xml config file.
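
For reference, a minimal hdfs-site.xml entry for that switch would look
roughly like this (a sketch, using only the property name above):

<!-- sketch: enable the hsync/flush support mentioned above -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>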

Also, are you installing Hadoop with security turned on or off?

I'll gather some other config info that should help.
--Matt


On Wed, Oct 12, 2011 at 1:47 PM, Jignesh Patel<[email protected]>  wrote:

When I tried to run HBase 0.90.4 with Hadoop 0.20.205.0 I got the following
error:

Jignesh-MacBookPro:hadoop-hbase hadoop-user$ bin/hbase shell
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 0.90.4, r1150278, Sun Jul 24 15:53:29 PDT 2011

hbase(main):001:0>  status

ERROR: org.apache.hadoop.hbase.ZooKeeperConnectionException: HBase is able
to connect to ZooKeeper but the connection closes immediately. This could be
a sign that the server has too many connections (30 is the default).
Consider inspecting your ZK server logs for that error and then make sure
you are reusing HBaseConfiguration as often as you can. See HTable's javadoc
for more information.
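
(As an aside, the "reusing HBaseConfiguration" advice in that message amounts
to creating a single Configuration and sharing it across HTable instances; a
rough sketch against the 0.90.x client API, with hypothetical table names:)

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class SharedConfExample {
  public static void main(String[] args) throws IOException {
    // One Configuration instance; HTables created from it share the
    // underlying connection instead of opening a new ZK connection each time.
    Configuration conf = HBaseConfiguration.create();
    HTable users  = new HTable(conf, "users");   // hypothetical table name
    HTable events = new HTable(conf, "events");  // hypothetical table name
    // ... work with the tables ...
    users.close();
    events.close();
  }
}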


And when I tried to stop HBase, I continuously see dots being printed and no
sign of it stopping. Not sure why it doesn't simply stop.

stopping
hbase..............................................................................…


On Oct 12, 2011, at 3:19 PM, Jignesh Patel wrote:

The new plugin works after deleting Eclipse and reinstalling it.
On Oct 12, 2011, at 2:39 PM, Jignesh Patel wrote:

I have installed Hadoop 0.20.205.0, but when I replace the Hadoop
0.20.204.0 Eclipse plugin with the 0.20.205.0 one, Eclipse does not
recognize it.
-Jignesh
On Oct 12, 2011, at 12:31 PM, Vinod Gupta Tankala wrote:

It's free and open source too. Basically, their releases are ahead of the
public releases of Hadoop/HBase - from what I understand, major bug fixes
and enhancements are checked in to their branch first and then eventually
make it to the public release branches.

thanks

On Wed, Oct 12, 2011 at 9:26 AM, Jignesh Patel <[email protected]> wrote:
Sorry to hear that.
Is CDH3 an open source or a paid version?

-jignesh
On Oct 12, 2011, at 11:58 AM, Vinod Gupta Tankala wrote:

For what it's worth, I was in a similar situation/dilemma a few days ago and
got frustrated figuring out what version combination of Hadoop/HBase to use
and how to build Hadoop manually to be compatible with HBase. The build
process didn't work for me either.
Eventually, I ended up using the Cloudera distribution and I think it saved
me a lot of headache and time.

thanks

On Tue, Oct 11, 2011 at 8:29 PM, jigneshmpatel <[email protected]> wrote:

Matt,
Thanks a lot. Just wanted to have some more information. If Hadoop
0.20.205.0 is voted in by the community members, will it then become a
major release? And what if it is not approved by the community members?

And as you said, I would like to use 0.90.3 if it works. If it is OK, can
you share the details of those configuration changes?

-Jignesh

--
View this message in context:
http://lucene.472066.n3.nabble.com/Hbase-with-Hadoop-tp3413950p3414658.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.




--
-Giri
