On Thu, Oct 11, 2012 at 6:03 PM, Jean-Daniel Cryans <[email protected]> wrote:
> On Tue, Oct 9, 2012 at 11:51 AM, Kevin Lyda <[email protected]> wrote:
>> In reading the docs I learned that hbck in 0.92.2 has some additional
>> -fix* options, and -fixAssignments and -fixMeta seem like they might
>> fix this. I also got the impression that one could run the 0.92.2
>> version of hbck on a 0.92.1 HBase - possibly needing to restart the
>> master server after it completes (or does the master server need to be
>> stopped, hbck run and then the master server brought up after
>> completion?).
>
> All you really need is a 0.92.2 client that runs 0.92.2, no need to
> restart anything.

Excellent, that worked great. All is better and a bunch of space was
reclaimed.
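For the archives, the whole fix amounted to something like the
following, run from a box with the 0.92.2 tarball unpacked and pointed
at the cluster's config (paths illustrative):

  # report inconsistencies first (read-only)
  ./hbase-0.92.2/bin/hbase hbck

  # then repair the double assignments and the stale meta entries
  ./hbase-0.92.2/bin/hbase hbck -fixAssignments -fixMeta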
>> It also appears that a rolling upgrade to 0.92.2 is possible - but it
>> says the hbck should be clean.
>
> Yes... if you do have multiple assignments it could make the rolling
> restart harder.

They're gone now so I'll work out time for a rolling upgrade.
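If I'm reading the reference guide right, once the 0.92.2 binaries are
in place on every node the rolling restart itself goes roughly like
this (hostnames come from conf/regionservers; flags as I understand
them, corrections welcome):

  # make sure hbck is still clean
  ./bin/hbase hbck

  # bounce the master first
  ./bin/hbase-daemon.sh stop master
  ./bin/hbase-daemon.sh start master

  # then restart each region server gracefully, moving its regions off
  # and reloading them afterwards
  for rs in `cat conf/regionservers | sort`; do
    ./bin/graceful_stop.sh --restart --reload --debug $rs
  done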
> It's a different build profile that includes code with dependencies on
> the security-related classes in Hadoop, which are not present in
> versions like 0.20, 0.20-append, 0.21, 0.22, etc
>
> 0.96.0 will be the first HBase release that requires at least Hadoop
> 1.0, which means that we won't need two tarballs.

Is it a good idea to upgrade to those? Or is the security layer still
in flux?

Kevin

--
Kevin Lyda
Dublin, Ireland
US Citizen overseas? We can vote.
Register now: http://www.votefromabroad.org/