[2/5] hbase git commit: HBASE-16701 rely on test category timeout instead of defining one on a specific test.

2016-10-09 Thread busbey
HBASE-16701 rely on test category timeout instead of defining one on a specific test.

Signed-off-by: Umesh Agashe 
Signed-off-by: Yu Li 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/acb1392b
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/acb1392b
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/acb1392b

Branch: refs/heads/branch-1
Commit: acb1392b1533b8ebedf2e45b6f133516cdbf99ee
Parents: 364a57a
Author: Sean Busbey 
Authored: Wed Oct 5 17:23:20 2016 -0500
Committer: Sean Busbey 
Committed: Mon Oct 10 00:24:24 2016 -0500

--
 .../java/org/apache/hadoop/hbase/regionserver/TestHRegion.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/acb1392b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
index 1265468..7cf76fc 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
@@ -6639,7 +6639,7 @@ public class TestHRegion {
    * HBASE-16429 Make sure no stuck if roll writer when ring buffer is filled with appends
    * @throws IOException if IO error occurred during test
    */
-  @Test(timeout = 60000)
+  @Test
   public void testWritesWhileRollWriter() throws IOException {
     int testCount = 10;
     int numRows = 1024;
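
For context, the per-test timeout being deleted is redundant because TestHRegion already carries a class-wide timeout rule derived from its test category. A minimal sketch of that pattern, assuming HBase's CategoryBasedTimeout utility as it existed in this era (treat the exact builder calls as illustrative):

import org.apache.hadoop.hbase.CategoryBasedTimeout;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TestRule;

public class CategoryTimeoutSketch {
  // One rule covers every test method in the class; the timeout value is
  // derived from the test's category (small/medium/large) instead of
  // per-@Test(timeout = ...) literals scattered through the suite.
  @Rule
  public final TestRule timeout = CategoryBasedTimeout.builder()
      .withTimeout(this.getClass())
      .withLookingForStuckThread(true)
      .build();

  @Test  // no timeout attribute needed; the class rule above applies
  public void testSomething() {
  }
}

With the rule in place, a stray @Test(timeout = ...) only overrides the category value for one method, which is exactly the inconsistency this commit removes.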



[4/5] hbase git commit: HBASE-16701 rely on test category timeout instead of defining one on a specific test.

2016-10-09 Thread busbey
HBASE-16701 rely on test category timeout instead of defining one on a specific test.

Signed-off-by: Umesh Agashe 
Signed-off-by: Yu Li 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/bd38f8db
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/bd38f8db
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/bd38f8db

Branch: refs/heads/branch-1.2
Commit: bd38f8dbfde13b862bb79dad53edf4203a28cff1
Parents: 04bd0ec
Author: Sean Busbey 
Authored: Wed Oct 5 17:23:20 2016 -0500
Committer: Sean Busbey 
Committed: Mon Oct 10 00:27:49 2016 -0500

--
 .../java/org/apache/hadoop/hbase/regionserver/TestHRegion.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/bd38f8db/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
index 8d67b98..7dbbdc4 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
@@ -6573,7 +6573,7 @@ public class TestHRegion {
    * HBASE-16429 Make sure no stuck if roll writer when ring buffer is filled with appends
    * @throws IOException if IO error occurred during test
    */
-  @Test(timeout = 60000)
+  @Test
   public void testWritesWhileRollWriter() throws IOException {
     int testCount = 10;
     int numRows = 1024;



[1/5] hbase git commit: HBASE-16701 rely on test category timeout instead of defining one on a specific test.

2016-10-09 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/branch-1 364a57a95 -> acb1392b1
  refs/heads/branch-1.1 e7ee6fa20 -> 9620dc4e7
  refs/heads/branch-1.2 04bd0ec8e -> bd38f8dbf
  refs/heads/branch-1.3 0704aed44 -> 6a1598674
  refs/heads/master f5abe17bc -> 6b6a80187


HBASE-16701 rely on test category timeout instead of defining one on a specific test.

Signed-off-by: Umesh Agashe 
Signed-off-by: Yu Li 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/6b6a8018
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/6b6a8018
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/6b6a8018

Branch: refs/heads/master
Commit: 6b6a80187693ebcecfb774af51a3e2c875223cda
Parents: f5abe17
Author: Sean Busbey 
Authored: Wed Oct 5 17:23:20 2016 -0500
Committer: Sean Busbey 
Committed: Mon Oct 10 00:14:38 2016 -0500

--
 .../java/org/apache/hadoop/hbase/regionserver/TestHRegion.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/6b6a8018/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
index a69c0ee..612d6cf 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
@@ -6731,7 +6731,7 @@ public class TestHRegion {
    * HBASE-16429 Make sure no stuck if roll writer when ring buffer is filled with appends
    * @throws IOException if IO error occurred during test
    */
-  @Test(timeout = 60000)
+  @Test
   public void testWritesWhileRollWriter() throws IOException {
     int testCount = 10;
     int numRows = 1024;



[3/5] hbase git commit: HBASE-16701 rely on test category timeout instead of defining one on a specific test.

2016-10-09 Thread busbey
HBASE-16701 rely on test category timeout instead of defining one on a specific test.

Signed-off-by: Umesh Agashe 
Signed-off-by: Yu Li 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/6a159867
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/6a159867
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/6a159867

Branch: refs/heads/branch-1.3
Commit: 6a15986743400a249c85c5d0f0251c5790d99367
Parents: 0704aed
Author: Sean Busbey 
Authored: Wed Oct 5 17:23:20 2016 -0500
Committer: Sean Busbey 
Committed: Mon Oct 10 00:25:08 2016 -0500

--
 .../java/org/apache/hadoop/hbase/regionserver/TestHRegion.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/6a159867/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
index 708af58..2b3a9b5 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
@@ -6628,7 +6628,7 @@ public class TestHRegion {
    * HBASE-16429 Make sure no stuck if roll writer when ring buffer is filled with appends
    * @throws IOException if IO error occurred during test
    */
-  @Test(timeout = 60000)
+  @Test
   public void testWritesWhileRollWriter() throws IOException {
     int testCount = 10;
     int numRows = 1024;



[5/5] hbase git commit: HBASE-16701 rely on test category timeout instead of defining one on a specific test.

2016-10-09 Thread busbey
HBASE-16701 rely on test category timeout instead of defining one on a specific test.

Signed-off-by: Umesh Agashe 
Signed-off-by: Yu Li 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/9620dc4e
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/9620dc4e
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/9620dc4e

Branch: refs/heads/branch-1.1
Commit: 9620dc4e7a5e4712007be465a64bcc07af9cfe34
Parents: e7ee6fa
Author: Sean Busbey 
Authored: Wed Oct 5 17:23:20 2016 -0500
Committer: Sean Busbey 
Committed: Mon Oct 10 00:29:22 2016 -0500

--
 .../java/org/apache/hadoop/hbase/regionserver/TestHRegion.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/9620dc4e/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
index 8947eaf..ed26a6d 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/TestHRegion.java
@@ -6490,7 +6490,7 @@ public class TestHRegion {
    * HBASE-16429 Make sure no stuck if roll writer when ring buffer is filled with appends
    * @throws IOException if IO error occurred during test
    */
-  @Test(timeout = 60000)
+  @Test
   public void testWritesWhileRollWriter() throws IOException {
     int testCount = 10;
     int numRows = 1024;



hbase git commit: HBASE-16666 Add append and remove peer namespaces cmds for replication (Guanghao Zhang)

2016-10-09 Thread tedyu
Repository: hbase
Updated Branches:
  refs/heads/master ccde43939 -> f5abe17bc


HBASE-16666 Add append and remove peer namespaces cmds for replication (Guanghao Zhang)


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/f5abe17b
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/f5abe17b
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/f5abe17b

Branch: refs/heads/master
Commit: f5abe17bc66ae9b780daec3afb6f08e69a5cf392
Parents: ccde439
Author: tedyu 
Authored: Sun Oct 9 21:22:50 2016 -0700
Committer: tedyu 
Committed: Sun Oct 9 21:22:50 2016 -0700

--
 .../src/main/ruby/hbase/replication_admin.rb| 37 
 hbase-shell/src/main/ruby/shell.rb  |  2 +
 .../shell/commands/append_peer_namespaces.rb| 44 +
 .../shell/commands/remove_peer_namespaces.rb| 41 +
 .../test/ruby/hbase/replication_admin_test.rb   | 93 +++-
 5 files changed, 214 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/f5abe17b/hbase-shell/src/main/ruby/hbase/replication_admin.rb
--
diff --git a/hbase-shell/src/main/ruby/hbase/replication_admin.rb b/hbase-shell/src/main/ruby/hbase/replication_admin.rb
index f99ccae..8aa158b 100644
--- a/hbase-shell/src/main/ruby/hbase/replication_admin.rb
+++ b/hbase-shell/src/main/ruby/hbase/replication_admin.rb
@@ -205,10 +205,47 @@ module Hbase
       end
     end
 
+    # Add some namespaces for the specified peer
+    def add_peer_namespaces(id, namespaces)
+      unless namespaces.nil?
+        rpc = get_peer_config(id)
+        unless rpc.nil?
+          ns_set = rpc.getNamespaces()
+          if ns_set.nil?
+            ns_set = java.util.HashSet.new
+          end
+          namespaces.each do |n|
+            ns_set.add(n)
+          end
+          rpc.setNamespaces(ns_set)
+          @replication_admin.updatePeerConfig(id, rpc)
+        end
+      end
+    end
+
+    # Remove some namespaces for the specified peer
+    def remove_peer_namespaces(id, namespaces)
+      unless namespaces.nil?
+        rpc = get_peer_config(id)
+        unless rpc.nil?
+          ns_set = rpc.getNamespaces()
+          unless ns_set.nil?
+            namespaces.each do |n|
+              ns_set.remove(n)
+            end
+          end
+          rpc.setNamespaces(ns_set)
+          @replication_admin.updatePeerConfig(id, rpc)
+        end
+      end
+    end
+
     # Show the current namespaces config for the specified peer
     def show_peer_namespaces(peer_config)
       namespaces = peer_config.get_namespaces
       if !namespaces.nil?
+        namespaces = java.util.ArrayList.new(namespaces)
+        java.util.Collections.sort(namespaces)
         return namespaces.join(';')
       else
         return nil

http://git-wip-us.apache.org/repos/asf/hbase/blob/f5abe17b/hbase-shell/src/main/ruby/shell.rb
--
diff --git a/hbase-shell/src/main/ruby/shell.rb b/hbase-shell/src/main/ruby/shell.rb
index ee508e9..02f8191 100644
--- a/hbase-shell/src/main/ruby/shell.rb
+++ b/hbase-shell/src/main/ruby/shell.rb
@@ -371,6 +371,8 @@ Shell.load_command_group(
     enable_peer
     disable_peer
     set_peer_namespaces
+    append_peer_namespaces
+    remove_peer_namespaces
     show_peer_tableCFs
     set_peer_tableCFs
     list_replicated_tables

http://git-wip-us.apache.org/repos/asf/hbase/blob/f5abe17b/hbase-shell/src/main/ruby/shell/commands/append_peer_namespaces.rb
--
diff --git a/hbase-shell/src/main/ruby/shell/commands/append_peer_namespaces.rb b/hbase-shell/src/main/ruby/shell/commands/append_peer_namespaces.rb
new file mode 100644
index 0000000..2585754
--- /dev/null
+++ b/hbase-shell/src/main/ruby/shell/commands/append_peer_namespaces.rb
@@ -0,0 +1,44 @@
+#
+# Copyright The Apache Software Foundation
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
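
The two new shell commands delegate to the Ruby helpers above, which drive the Java replication client. A hedged Java-side sketch of the same append operation, assuming ReplicationAdmin#getPeerConfig/updatePeerConfig and ReplicationPeerConfig#getNamespaces/setNamespaces behave as the Ruby code uses them (peer id "1" and namespace "ns1" are illustrative):

import java.util.HashSet;
import java.util.Set;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.replication.ReplicationAdmin;
import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;

public class AppendPeerNamespacesSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (ReplicationAdmin admin = new ReplicationAdmin(conf)) {
      ReplicationPeerConfig rpc = admin.getPeerConfig("1");
      // Mirror add_peer_namespaces: start from the existing set (may be
      // null), add the new namespace, then write the config back.
      Set<String> ns = rpc.getNamespaces() == null
          ? new HashSet<String>() : new HashSet<String>(rpc.getNamespaces());
      ns.add("ns1");
      rpc.setNamespaces(ns);
      admin.updatePeerConfig("1", rpc);
    }
  }
}

From the shell this presumably surfaces as append_peer_namespaces '1', ["ns1"] and remove_peer_namespaces '1', ["ns1"], matching the existing set_peer_namespaces calling convention.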

hbase git commit: HBASE-16771 VerifyReplication should increase GOODROWS counter if re-comparison passes

2016-10-09 Thread tedyu
Repository: hbase
Updated Branches:
  refs/heads/branch-1 4b75614a2 -> 364a57a95


HBASE-16771 VerifyReplication should increase GOODROWS counter if re-comparison passes


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/364a57a9
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/364a57a9
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/364a57a9

Branch: refs/heads/branch-1
Commit: 364a57a9500b942163932e2c644c05bff3e55090
Parents: 4b75614
Author: tedyu 
Authored: Sun Oct 9 20:51:23 2016 -0700
Committer: tedyu 
Committed: Sun Oct 9 20:51:23 2016 -0700

--
 .../hbase/mapreduce/replication/VerifyReplication.java  | 12 
 1 file changed, 8 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/364a57a9/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
index 8bb1592..bf320ee 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
@@ -102,6 +102,7 @@ public class VerifyReplication extends Configured implements Tool {
     private Result currentCompareRowInPeerTable;
     private Table replicatedTable;
     private int sleepMsBeforeReCompare;
+    private String delimiter = "";
     private boolean verbose = false;
 
     /**
@@ -119,6 +120,7 @@ public class VerifyReplication extends Configured implements Tool {
       if (replicatedScanner == null) {
         Configuration conf = context.getConfiguration();
         sleepMsBeforeReCompare = conf.getInt(NAME +".sleepMsBeforeReCompare", 0);
+        delimiter = conf.get(NAME + ".delimiter", "");
         verbose = conf.getBoolean(NAME +".verbose", false);
         final Scan scan = new Scan();
         scan.setBatch(batch);
@@ -179,7 +181,6 @@ public class VerifyReplication extends Configured implements Tool {
             }
           } catch (Exception e) {
             logFailRowAndIncreaseCounter(context, Counters.CONTENT_DIFFERENT_ROWS, value);
-            LOG.error("Exception while comparing row : " + e);
           }
           currentCompareRowInPeerTable = replicatedScanner.next();
           break;
@@ -203,9 +204,11 @@ public class VerifyReplication extends Configured implements Tool {
       Result sourceResult = sourceTable.get(new Get(row.getRow()));
       Result replicatedResult = replicatedTable.get(new Get(row.getRow()));
       Result.compareResults(sourceResult, replicatedResult);
-      context.getCounter(Counters.GOODROWS).increment(1);
-      if (verbose) {
-        LOG.info("Good row key: " + delimiter + Bytes.toString(row.getRow()) + delimiter);
+      if (!sourceResult.isEmpty()) {
+        context.getCounter(Counters.GOODROWS).increment(1);
+        if (verbose) {
+          LOG.info("Good row key: " + delimiter + Bytes.toString(row.getRow()) + delimiter);
+        }
       }
       return;
     } catch (Exception e) {
@@ -309,6 +312,7 @@ public class VerifyReplication extends Configured implements Tool {
     conf.setLong(NAME+".startTime", startTime);
     conf.setLong(NAME+".endTime", endTime);
     conf.setInt(NAME +".sleepMsBeforeReCompare", sleepMsBeforeReCompare);
+    conf.set(NAME + ".delimiter", delimiter);
     conf.setBoolean(NAME +".verbose", verbose);
     if (families != null) {
       conf.set(NAME+".families", families);
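
Why the new !sourceResult.isEmpty() guard matters: Result.compareResults signals a mismatch by throwing, so falling through it means the two rows agree, but two empty results also "agree". A self-contained sketch of that edge case, assuming Result.EMPTY_RESULT and the compareResults semantics of this era's client API (EMPTY_RESULT stands in for a row that vanished from both clusters between the scan and the re-compare):

import org.apache.hadoop.hbase.client.Result;

public class RecompareEdgeCaseSketch {
  public static void main(String[] args) throws Exception {
    Result gone = Result.EMPTY_RESULT;   // row no longer on either side
    Result.compareResults(gone, gone);   // does NOT throw: empty equals empty
    // Before this patch the mapper would fall through to the GOODROWS
    // increment here even though the row no longer exists; the new
    // isEmpty() guard skips the counter for vanished rows.
    System.out.println("still counts as a good row? " + !gone.isEmpty());
  }
}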



hbase git commit: HBASE-16771 VerifyReplication should increase GOODROWS counter if re-comparison passes

2016-10-09 Thread tedyu
Repository: hbase
Updated Branches:
  refs/heads/master 8a8c60889 -> ccde43939


HBASE-16771 VerifyReplication should increase GOODROWS counter if re-comparison passes


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/ccde4393
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/ccde4393
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/ccde4393

Branch: refs/heads/master
Commit: ccde4393925a6fd5f97a11068cdc96fa4e4d4ac0
Parents: 8a8c608
Author: tedyu 
Authored: Sun Oct 9 20:48:28 2016 -0700
Committer: tedyu 
Committed: Sun Oct 9 20:48:28 2016 -0700

--
 .../hbase/mapreduce/replication/VerifyReplication.java  | 12 
 1 file changed, 8 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/ccde4393/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
index 0273b91..88bf815 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/mapreduce/replication/VerifyReplication.java
@@ -107,6 +107,7 @@ public class VerifyReplication extends Configured implements Tool {
     private ResultScanner replicatedScanner;
     private Result currentCompareRowInPeerTable;
     private int sleepMsBeforeReCompare;
+    private String delimiter = "";
     private boolean verbose = false;
 
     /**
@@ -124,6 +125,7 @@ public class VerifyReplication extends Configured implements Tool {
       if (replicatedScanner == null) {
         Configuration conf = context.getConfiguration();
         sleepMsBeforeReCompare = conf.getInt(NAME +".sleepMsBeforeReCompare", 0);
+        delimiter = conf.get(NAME + ".delimiter", "");
         verbose = conf.getBoolean(NAME +".verbose", false);
         final Scan scan = new Scan();
         scan.setBatch(batch);
@@ -180,7 +182,6 @@ public class VerifyReplication extends Configured implements Tool {
             }
           } catch (Exception e) {
             logFailRowAndIncreaseCounter(context, Counters.CONTENT_DIFFERENT_ROWS, value);
-            LOG.error("Exception while comparing row : " + e);
           }
           currentCompareRowInPeerTable = replicatedScanner.next();
           break;
@@ -204,9 +205,11 @@ public class VerifyReplication extends Configured implements Tool {
       Result sourceResult = sourceTable.get(new Get(row.getRow()));
       Result replicatedResult = replicatedTable.get(new Get(row.getRow()));
       Result.compareResults(sourceResult, replicatedResult);
-      context.getCounter(Counters.GOODROWS).increment(1);
-      if (verbose) {
-        LOG.info("Good row key: " + delimiter + Bytes.toString(row.getRow()) + delimiter);
+      if (!sourceResult.isEmpty()) {
+        context.getCounter(Counters.GOODROWS).increment(1);
+        if (verbose) {
+          LOG.info("Good row key: " + delimiter + Bytes.toString(row.getRow()) + delimiter);
+        }
       }
       return;
     } catch (Exception e) {
@@ -320,6 +323,7 @@ public class VerifyReplication extends Configured implements Tool {
     conf.setLong(NAME+".startTime", startTime);
     conf.setLong(NAME+".endTime", endTime);
     conf.setInt(NAME +".sleepMsBeforeReCompare", sleepMsBeforeReCompare);
+    conf.set(NAME + ".delimiter", delimiter);
     conf.setBoolean(NAME +".verbose", verbose);
     if (families != null) {
       conf.set(NAME+".families", families);



hbase git commit: HBASE-16748 update CHANGES.txt for additional changes for RC0.

2016-10-09 Thread busbey
Repository: hbase
Updated Branches:
  refs/heads/branch-1.2 017bc3337 -> 04bd0ec8e
Updated Tags:  refs/tags/1.2.4RC0 [created] ecad21d46


HBASE-16748 update CHANGES.txt for additional changes for RC0.


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/04bd0ec8
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/04bd0ec8
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/04bd0ec8

Branch: refs/heads/branch-1.2
Commit: 04bd0ec8e4cb2f39e5f787126f9b6645dabf27df
Parents: 017bc33
Author: Sean Busbey 
Authored: Sun Oct 9 21:30:40 2016 -0500
Committer: Sean Busbey 
Committed: Sun Oct 9 21:30:40 2016 -0500

--
 CHANGES.txt | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/04bd0ec8/CHANGES.txt
--
diff --git a/CHANGES.txt b/CHANGES.txt
index 9c1dd9d..3132afe 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -1,7 +1,7 @@
 HBase Change Log
 
 
-Release Notes - HBase - Version 1.2.4 10/10/2016
+Release Notes - HBase - Version 1.2.4 10/17/2016
 
 ** Sub-task
     * [HBASE-14734] - BindException when setting up MiniKdc
@@ -26,7 +26,9 @@ Release Notes - HBase - Version 1.2.4 10/10/2016
     * [HBASE-16649] - Truncate table with splits preserved can cause both data loss and truncated data appeared again
     * [HBASE-16662] - Fix open POODLE vulnerabilities
     * [HBASE-16678] - MapReduce jobs do not update counters from ScanMetrics
+    * [HBASE-16682] - Fix Shell tests failure. NoClassDefFoundError for MiniKdc
     * [HBASE-16721] - Concurrency issue in WAL unflushed seqId tracking
+    * [HBASE-16723] - RMI registry is not destroyed after stopping JMX Connector Server
     * [HBASE-16732] - Avoid possible NPE in MetaTableLocator
 
 ** Improvement



hbase git commit: HBASE-16794 TestDispatchMergingRegionsProcedure#testMergeRegionsConcurrently is flaky

2016-10-09 Thread mbertozzi
Repository: hbase
Updated Branches:
  refs/heads/master e06c3676f -> 8a8c60889


HBASE-16794 TestDispatchMergingRegionsProcedure#testMergeRegionsConcurrently is flaky

Signed-off-by: Matteo Bertozzi 


Project: http://git-wip-us.apache.org/repos/asf/hbase/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase/commit/8a8c6088
Tree: http://git-wip-us.apache.org/repos/asf/hbase/tree/8a8c6088
Diff: http://git-wip-us.apache.org/repos/asf/hbase/diff/8a8c6088

Branch: refs/heads/master
Commit: 8a8c60889cf67b581d7adb4245e0bcc02cdfdc93
Parents: e06c367
Author: ChiaPing Tsai 
Authored: Sun Oct 9 16:52:54 2016 -0700
Committer: Matteo Bertozzi 
Committed: Sun Oct 9 16:53:29 2016 -0700

--
 .../hbase/regionserver/CompactSplitThread.java  |  5 +++
 .../TestDispatchMergingRegionsProcedure.java| 45 +---
 2 files changed, 44 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/hbase/blob/8a8c6088/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/CompactSplitThread.java
--
diff --git a/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/CompactSplitThread.java b/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/CompactSplitThread.java
index c1f82b9..a454f0e 100644
--- a/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/CompactSplitThread.java
+++ b/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/CompactSplitThread.java
@@ -724,6 +724,11 @@ public class CompactSplitThread implements CompactionRequestor, PropagatingConfigurationObserver {
   }
 
   @VisibleForTesting
+  public long getCompletedMergeTaskCount() {
+    return mergePool.getCompletedTaskCount();
+  }
+
+  @VisibleForTesting
   /**
    * Shutdown the long compaction thread pool.
    * Should only be used in unit test to prevent long compaction thread pool from stealing job

http://git-wip-us.apache.org/repos/asf/hbase/blob/8a8c6088/hbase-server/src/test/java/org/apache/hadoop/hbase/master/procedure/TestDispatchMergingRegionsProcedure.java
--
diff --git a/hbase-server/src/test/java/org/apache/hadoop/hbase/master/procedure/TestDispatchMergingRegionsProcedure.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/master/procedure/TestDispatchMergingRegionsProcedure.java
index 601f22f..a7dd4a8 100644
--- a/hbase-server/src/test/java/org/apache/hadoop/hbase/master/procedure/TestDispatchMergingRegionsProcedure.java
+++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/master/procedure/TestDispatchMergingRegionsProcedure.java
@@ -18,7 +18,9 @@
 
 package org.apache.hadoop.hbase.master.procedure;
 
+import java.io.IOException;
 import java.util.List;
+import java.util.concurrent.TimeUnit;
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
@@ -36,6 +38,7 @@ import org.apache.hadoop.hbase.shaded.protobuf.generated.MasterProcedureProtos.DispatchMergingRegionsState;
 import org.apache.hadoop.hbase.testclassification.MasterTests;
 import org.apache.hadoop.hbase.testclassification.MediumTests;
 import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.hadoop.hbase.util.JVMClusterUtil.RegionServerThread;
 import org.junit.After;
 import org.junit.AfterClass;
 import org.junit.Before;
@@ -125,20 +128,21 @@ public class TestDispatchMergingRegionsProcedure {
     regionsToMerge[0] = tableRegions.get(0);
     regionsToMerge[1] = tableRegions.get(1);
 
+    final int initCompletedTaskCount = countOfCompletedMergeTaskCount();
     long procId = procExec.submitProcedure(new DispatchMergingRegionsProcedure(
       procExec.getEnvironment(), tableName, regionsToMerge, true));
     ProcedureTestingUtility.waitProcedure(procExec, procId);
     ProcedureTestingUtility.assertProcNotFailed(procExec, procId);
 
-    assertRegionCount(tableName, 2);
+    assertRegionCount(tableName, 2, 1, initCompletedTaskCount);
   }
 
   /**
   * This tests two concurrent region merges
   */
-  @Test(timeout=90000)
+  @Test(timeout=60000)
   public void testMergeRegionsConcurrently() throws Exception {
-    final TableName tableName = TableName.valueOf("testMergeTwoRegions");
+    final TableName tableName = TableName.valueOf("testMergeRegionsConcurrently");
     final ProcedureExecutor<MasterProcedureEnv> procExec = getMasterProcedureExecutor();
 
     List<HRegionInfo> tableRegions = createTable(tableName, 4);
@@ -150,6 +154,7 @@ public class TestDispatchMergingRegionsProcedure {
     regionsToMerge2[0] = tableRegions.get(2);
     regionsToMerge2[1] = tableRegions.get(3);
 
+    final int initCompletedTaskCount = countOfCompletedMergeTaskCount();
     long procId1 = procExec.submitProcedure(new DispatchMergingRegionsProcedure(
       procExec.getEnvironment(),
[17/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/ValueFilter.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/filter/ValueFilter.html b/apidocs/src-html/org/apache/hadoop/hbase/filter/ValueFilter.html
index bd19cfa..5454754 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/ValueFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/ValueFilter.html
@@ -34,103 +34,102 @@
[mangled HTML source-page diff trimmed: the regenerated ValueFilter page replaces the org.apache.hadoop.hbase.protobuf.ProtobufUtil, org.apache.hadoop.hbase.protobuf.generated.FilterProtos and com.google.protobuf.InvalidProtocolBufferException imports with their org.apache.hadoop.hbase.shaded.* counterparts; the remainder is regenerated Javadoc markup.]
[50/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/acid-semantics.html
--
diff --git a/acid-semantics.html b/acid-semantics.html
index 4879e2d..e1a3825 100644
--- a/acid-semantics.html
+++ b/acid-semantics.html
[mangled HTML diff trimmed: the regenerated page updates its "Last Published" footer date from 2016-09-29 to 2016-10-09.]


[42/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/filter/FilterList.html b/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
index 8a2aeb8..8bf8c8b 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/FilterList.html
[mangled HTML diff trimmed: regenerated Javadoc for the FilterList class page (constructors, getOperator, getFilters, addFilter, reset, filterRowKey, filterAllRemaining, transformCell, filterKeyValue); only generated anchor markup changes.]
[36/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/util/class-use/ByteRange.html
--
[mangled HTML diffs trimmed for three regenerated class-use pages: apidocs/org/apache/hadoop/hbase/util/class-use/ByteRange.html (index 13bf110..d6afd0c), class-use/Bytes.html (index 6d12dad..eca9e9e) and class-use/Order.html (index 0944d8c..2946a87); the diffs merely reorder generated method listings, e.g. swapping the order of SimpleByteRange and SimpleMutableByteRange entries.]
[07/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/deprecated-list.html
--
diff --git a/devapidocs/deprecated-list.html b/devapidocs/deprecated-list.html
index afdfb4b..13f153e 100644
--- a/devapidocs/deprecated-list.html
+++ b/devapidocs/deprecated-list.html
[mangled HTML diff trimmed: the regenerated deprecation list adds entries for the shaded protobuf PluginProtos PARSER fields and reorders existing generated entries.]
[29/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/client/Query.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/client/Query.html b/apidocs/src-html/org/apache/hadoop/hbase/client/Query.html
index ecf5411..417e13a 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/client/Query.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/client/Query.html
[mangled HTML source-page diff trimmed: the regenerated Query page swaps the org.apache.hadoop.hbase.protobuf.ProtobufUtil import for org.apache.hadoop.hbase.security.access.AccessControlUtil; the remainder is regenerated Javadoc markup.]
[04/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/CellUtil.FirstOnRowDeleteFamilyCell.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/CellUtil.FirstOnRowDeleteFamilyCell.html 
b/devapidocs/org/apache/hadoop/hbase/CellUtil.FirstOnRowDeleteFamilyCell.html
index 01a5fdd..947220c 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/CellUtil.FirstOnRowDeleteFamilyCell.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/CellUtil.FirstOnRowDeleteFamilyCell.html
@@ -123,7 +123,7 @@ var activeTableTab = "activeTableTab";
 
 
 @InterfaceAudience.Private
-private static class CellUtil.FirstOnRowDeleteFamilyCell
+private static class CellUtil.FirstOnRowDeleteFamilyCell
 extends CellUtil.EmptyCell
 
 
@@ -248,7 +248,7 @@ extends 
 
 row
-private finalbyte[] row
+private finalbyte[] row
 
 
 
@@ -257,7 +257,7 @@ extends 
 
 fam
-private finalbyte[] fam
+private finalbyte[] fam
 
 
 
@@ -274,7 +274,7 @@ extends 
 
 FirstOnRowDeleteFamilyCell
-publicFirstOnRowDeleteFamilyCell(byte[]row,
+publicFirstOnRowDeleteFamilyCell(byte[]row,
   byte[]fam)
 
 
@@ -292,7 +292,7 @@ extends 
 
 getRowArray
-publicbyte[]getRowArray()
+publicbyte[]getRowArray()
 Description copied from 
interface:Cell
 Contiguous raw bytes that may start at any index in the 
containing array. Max length is
  Short.MAX_VALUE which is 32,767 bytes.
@@ -312,7 +312,7 @@ extends 
 
 getRowLength
-publicshortgetRowLength()
+publicshortgetRowLength()
 
 Specified by:
 getRowLengthin
 interfaceCell
@@ -329,7 +329,7 @@ extends 
 
 getFamilyArray
-publicbyte[]getFamilyArray()
+publicbyte[]getFamilyArray()
 Description copied from 
interface:Cell
 Contiguous bytes composed of legal HDFS filename characters 
which may start at any index in the
  containing array. Max length is Byte.MAX_VALUE, which is 127 bytes.
@@ -349,7 +349,7 @@ extends 
 
 getFamilyLength
-publicbytegetFamilyLength()
+publicbytegetFamilyLength()
 
 Specified by:
 getFamilyLengthin
 interfaceCell
@@ -366,7 +366,7 @@ extends 
 
 getTimestamp
-publiclonggetTimestamp()
+publiclonggetTimestamp()
 
 Returns:
 Long value representing time at which this cell was "Put" into the row.  
Typically
@@ -380,7 +380,7 @@ extends 
 
 getTypeByte
-publicbytegetTypeByte()
+publicbytegetTypeByte()
 
 Returns:
 The byte representation of the KeyValue.TYPE of this cell: one of Put, 
Delete, etc

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/CellUtil.LastOnRowByteBufferedCell.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/CellUtil.LastOnRowByteBufferedCell.html 
b/devapidocs/org/apache/hadoop/hbase/CellUtil.LastOnRowByteBufferedCell.html
index 74dc30a..ba6c43d 100644
--- a/devapidocs/org/apache/hadoop/hbase/CellUtil.LastOnRowByteBufferedCell.html
+++ b/devapidocs/org/apache/hadoop/hbase/CellUtil.LastOnRowByteBufferedCell.html
@@ -132,7 +132,7 @@ var activeTableTab = "activeTableTab";
 
 
 @InterfaceAudience.Private
-private static class CellUtil.LastOnRowByteBufferedCell
+private static class CellUtil.LastOnRowByteBufferedCell
 extends CellUtil.EmptyByteBufferedCell
 
 
@@ -253,7 +253,7 @@ extends 
 
 rowBuff
-private finalhttp://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBuffer rowBuff
+private finalhttp://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBuffer rowBuff
 
 
 
@@ -262,7 +262,7 @@ extends 
 
 roffset
-private finalint roffset
+private finalint roffset
 
 
 
@@ -271,7 +271,7 @@ extends 
 
 rlength
-private finalshort rlength
+private finalshort rlength
 
 
 
@@ -288,7 +288,7 @@ extends 
 
 LastOnRowByteBufferedCell
-publicLastOnRowByteBufferedCell(http://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBufferrow,
+publicLastOnRowByteBufferedCell(http://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBufferrow,
  introffset,
  shortrlength)
 
@@ -307,7 +307,7 @@ extends 
 
 getRowByteBuffer
-publichttp://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBuffergetRowByteBuffer()
+publichttp://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html?is-external=true;
 title="class or interface in java.nio">ByteBuffergetRowByteBuffer()
 
 Overrides:
 getRowByteBufferin
 classCellUtil.EmptyByteBufferedCell
@@ -322,7 +322,7 @@ extends 
 
 getRowPosition
-public int getRowPosition()
+public int getRowPosition()

Overrides:
getRowPosition in class CellUtil.EmptyByteBufferedCell
@@ -337,7 +337,7 @@ extends 
 
 

[38/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-tree.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-tree.html 
b/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-tree.html
new file mode 100644
index 000..4242f3e
--- /dev/null
+++ 
b/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-tree.html
@@ -0,0 +1,128 @@
+org.apache.hadoop.hbase.shaded.com.google.protobuf Class Hierarchy 
(Apache HBase 2.0.0-SNAPSHOT API)
+Hierarchy For Package 
org.apache.hadoop.hbase.shaded.com.google.protobuf
+Package Hierarchies:
+
+All Packages
+
+Copyright © 2007–2016 The Apache Software Foundation (http://www.apache.org/). All rights reserved.

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-use.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-use.html 
b/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-use.html
new file mode 100644
index 000..0f77da0
--- /dev/null
+++ 
b/apidocs/org/apache/hadoop/hbase/shaded/com/google/protobuf/package-use.html
@@ -0,0 +1,125 @@
+Uses of Package org.apache.hadoop.hbase.shaded.com.google.protobuf 
(Apache HBase 2.0.0-SNAPSHOT API)
+Uses of Package org.apache.hadoop.hbase.shaded.com.google.protobuf
+
+No usage of 
org.apache.hadoop.hbase.shaded.com.google.protobuf
+Copyright © 2007–2016 The Apache Software Foundation (http://www.apache.org/). All rights reserved.

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/shaded/protobuf/package-frame.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/shaded/protobuf/package-frame.html 
b/apidocs/org/apache/hadoop/hbase/shaded/protobuf/package-frame.html
new file mode 100644
index 000..532a9b1
--- /dev/null
+++ b/apidocs/org/apache/hadoop/hbase/shaded/protobuf/package-frame.html
@@ -0,0 +1,14 @@
+org.apache.hadoop.hbase.shaded.protobuf (Apache HBase 2.0.0-SNAPSHOT 
API)
+
+
+
+
+org.apache.hadoop.hbase.shaded.protobuf
+
+

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/shaded/protobuf/package-summary.html
--
diff --git 

[16/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html 
b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
index ef7c1e1..a541dfa 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
@@ -26,51 +26,51 @@
 018 */
 019package 
org.apache.hadoop.hbase.mapreduce;
 020
-021import 
com.google.protobuf.InvalidProtocolBufferException;
-022import 
com.codahale.metrics.MetricRegistry;
-023import org.apache.commons.logging.Log;
-024import 
org.apache.commons.logging.LogFactory;
-025import 
org.apache.hadoop.conf.Configuration;
-026import org.apache.hadoop.fs.FileSystem;
-027import org.apache.hadoop.fs.Path;
-028import 
org.apache.hadoop.hbase.HBaseConfiguration;
-029import 
org.apache.hadoop.hbase.HConstants;
-030import 
org.apache.hadoop.hbase.MetaTableAccessor;
-031import 
org.apache.hadoop.hbase.TableName;
-032import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
-033import 
org.apache.hadoop.hbase.classification.InterfaceStability;
-034import 
org.apache.hadoop.hbase.client.Connection;
-035import 
org.apache.hadoop.hbase.client.ConnectionFactory;
-036import 
org.apache.hadoop.hbase.client.Put;
-037import 
org.apache.hadoop.hbase.client.Scan;
-038import 
org.apache.hadoop.hbase.io.ImmutableBytesWritable;
-039import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-040import 
org.apache.hadoop.hbase.protobuf.generated.ClientProtos;
-041import 
org.apache.hadoop.hbase.security.User;
-042import 
org.apache.hadoop.hbase.security.UserProvider;
-043import 
org.apache.hadoop.hbase.security.token.TokenUtil;
-044import 
org.apache.hadoop.hbase.util.Base64;
-045import 
org.apache.hadoop.hbase.util.Bytes;
-046import 
org.apache.hadoop.hbase.zookeeper.ZKConfig;
-047import org.apache.hadoop.io.Writable;
-048import 
org.apache.hadoop.mapreduce.InputFormat;
-049import org.apache.hadoop.mapreduce.Job;
-050import 
org.apache.hadoop.util.StringUtils;
-051
-052import java.io.File;
-053import java.io.IOException;
-054import java.net.URL;
-055import java.net.URLDecoder;
-056import java.util.ArrayList;
-057import java.util.Collection;
-058import java.util.Enumeration;
-059import java.util.HashMap;
-060import java.util.HashSet;
-061import java.util.List;
-062import java.util.Map;
-063import java.util.Set;
-064import java.util.zip.ZipEntry;
-065import java.util.zip.ZipFile;
+021import java.io.File;
+022import java.io.IOException;
+023import java.net.URL;
+024import java.net.URLDecoder;
+025import java.util.ArrayList;
+026import java.util.Collection;
+027import java.util.Enumeration;
+028import java.util.HashMap;
+029import java.util.HashSet;
+030import java.util.List;
+031import java.util.Map;
+032import java.util.Set;
+033import java.util.zip.ZipEntry;
+034import java.util.zip.ZipFile;
+035
+036import org.apache.commons.logging.Log;
+037import 
org.apache.commons.logging.LogFactory;
+038import 
org.apache.hadoop.conf.Configuration;
+039import org.apache.hadoop.fs.FileSystem;
+040import org.apache.hadoop.fs.Path;
+041import 
org.apache.hadoop.hbase.HBaseConfiguration;
+042import 
org.apache.hadoop.hbase.HConstants;
+043import 
org.apache.hadoop.hbase.MetaTableAccessor;
+044import 
org.apache.hadoop.hbase.TableName;
+045import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
+046import 
org.apache.hadoop.hbase.classification.InterfaceStability;
+047import 
org.apache.hadoop.hbase.client.Connection;
+048import 
org.apache.hadoop.hbase.client.ConnectionFactory;
+049import 
org.apache.hadoop.hbase.client.Put;
+050import 
org.apache.hadoop.hbase.client.Scan;
+051import 
org.apache.hadoop.hbase.io.ImmutableBytesWritable;
+052import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+053import 
org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos;
+054import 
org.apache.hadoop.hbase.security.User;
+055import 
org.apache.hadoop.hbase.security.UserProvider;
+056import 
org.apache.hadoop.hbase.security.token.TokenUtil;
+057import 
org.apache.hadoop.hbase.util.Base64;
+058import 
org.apache.hadoop.hbase.util.Bytes;
+059import 
org.apache.hadoop.hbase.zookeeper.ZKConfig;
+060import org.apache.hadoop.io.Writable;
+061import 
org.apache.hadoop.mapreduce.InputFormat;
+062import org.apache.hadoop.mapreduce.Job;
+063import 
org.apache.hadoop.util.StringUtils;
+064
+065import 
com.codahale.metrics.MetricRegistry;
 066
 067/**
 068 * Utility for {@link TableMapper} and 
{@link TableReducer}
@@ -583,461 +583,454 @@
 575   */
 576  public static Scan 
convertStringToScan(String base64) throws IOException {
 577byte [] decoded = 
Base64.decode(base64);
-578ClientProtos.Scan scan;
-579try {
-580  scan = 
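For context, a minimal sketch of the round trip this helper supports, paired with TableMapReduceUtil.convertScanToString (the Scan contents are placeholders; both calls throw IOException):

  Scan scan = new Scan();
  scan.addFamily(Bytes.toBytes("cf"));
  String encoded = TableMapReduceUtil.convertScanToString(scan);   // base64 form
  Scan decoded = TableMapReduceUtil.convertStringToScan(encoded);  // round trip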

[02/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/HRegionInfo.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/HRegionInfo.html 
b/devapidocs/org/apache/hadoop/hbase/HRegionInfo.html
index 3dfddfb..52feada 100644
--- a/devapidocs/org/apache/hadoop/hbase/HRegionInfo.html
+++ b/devapidocs/org/apache/hadoop/hbase/HRegionInfo.html
@@ -398,19 +398,19 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 
-(package private) 
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo
+(package private) 
org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo
 convert()
 Convert a HRegionInfo to the protobuf RegionInfo
 
 
 
 static HRegionInfo
-convert(org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfoproto)
+convert(org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfoproto)
 Convert a RegionInfo to a HRegionInfo
 
 
 
-static 
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo
+static 
org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo
 convert(HRegionInfoinfo)
 Convert a HRegionInfo to a RegionInfo
 
@@ -1717,7 +1717,7 @@ public
 
 convert
-org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo convert()
+org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo convert()
 Convert a HRegionInfo to the protobuf RegionInfo
 
 Returns:
@@ -1731,7 +1731,7 @@ public
 
 convert
-public static org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo convert(HRegionInfo info)
+public static org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo convert(HRegionInfo info)
 Convert a HRegionInfo to a RegionInfo
 
 Parameters:
@@ -1741,13 +1741,13 @@ public
+
 
 
 
 
 convert
-public static HRegionInfo convert(org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo proto)
+public static HRegionInfo convert(org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo proto)
 Convert a RegionInfo to a HRegionInfo
 
 Parameters:

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/HealthChecker.HealthCheckerExitStatus.html
--
diff --git 
a/devapidocs/org/apache/hadoop/hbase/HealthChecker.HealthCheckerExitStatus.html 
b/devapidocs/org/apache/hadoop/hbase/HealthChecker.HealthCheckerExitStatus.html
index 50d2b7b..0d1cde3 100644
--- 
a/devapidocs/org/apache/hadoop/hbase/HealthChecker.HealthCheckerExitStatus.html
+++ 
b/devapidocs/org/apache/hadoop/hbase/HealthChecker.HealthCheckerExitStatus.html
@@ -272,7 +272,7 @@ the order they are declared.
 
 
 values
-public static HealthChecker.HealthCheckerExitStatus[] values()
+public static HealthChecker.HealthCheckerExitStatus[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -292,7 +292,7 @@ for (HealthChecker.HealthCheckerExitStatus c : 
HealthChecker.HealthCheckerExitStatus.values())
 
 
 valueOf
-public static HealthChecker.HealthCheckerExitStatus valueOf(String name)
+public static HealthChecker.HealthCheckerExitStatus valueOf(String name)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are not permitted.)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html 
b/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
index 1aeb9d8..cef 100644
--- a/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
+++ b/devapidocs/org/apache/hadoop/hbase/KeepDeletedCells.html
@@ -263,7 +263,7 @@ the order they are declared.
 
 
 values
-public static KeepDeletedCells[] values()
+public static KeepDeletedCells[] values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -283,7 +283,7 @@ for (KeepDeletedCells c : KeepDeletedCells.values())
 
 
 valueOf
-public static KeepDeletedCells valueOf(String name)
+public static KeepDeletedCells valueOf(String name)

[35/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html 
b/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
index e626707..4c0ca38 100644
--- a/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
+++ b/apidocs/org/apache/hadoop/hbase/util/class-use/PositionedByteRange.html
@@ -124,52 +124,38 @@
 
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Short.html?is-external=true;
 title="class or interface in java.lang">Short
-RawShort.decode(PositionedByteRangesrc)
-
-
-http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[]
-Struct.decode(PositionedByteRangesrc)
-
-
-T
-TerminatedWrapper.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
+RawString.decode(PositionedByteRangesrc)
 
 
 http://docs.oracle.com/javase/8/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
 OrderedInt8.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
-OrderedString.decode(PositionedByteRangesrc)
+T
+DataType.decode(PositionedByteRange src)
+Read an instance of T from the buffer src.
+
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Double.html?is-external=true;
 title="class or interface in java.lang">Double
-OrderedFloat64.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/8/docs/api/java/lang/Short.html?is-external=true;
 title="class or interface in java.lang">Short
+RawShort.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
-OrderedInt32.decode(PositionedByteRangesrc)
-
-
 http://docs.oracle.com/javase/8/docs/api/java/lang/Number.html?is-external=true;
 title="class or interface in java.lang">Number
 OrderedNumeric.decode(PositionedByteRangesrc)
 
-
-http://docs.oracle.com/javase/8/docs/api/java/lang/Byte.html?is-external=true;
 title="class or interface in java.lang">Byte
-RawByte.decode(PositionedByteRangesrc)
-
 
 http://docs.oracle.com/javase/8/docs/api/java/lang/Float.html?is-external=true;
 title="class or interface in java.lang">Float
 RawFloat.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Float.html?is-external=true;
 title="class or interface in java.lang">Float
-OrderedFloat32.decode(PositionedByteRangesrc)
+byte[]
+OrderedBlobVar.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer
-RawInteger.decode(PositionedByteRangesrc)
+byte[]
+RawBytes.decode(PositionedByteRangesrc)
 
 
 byte[]
@@ -184,59 +170,73 @@
 FixedLengthWrapper.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
-RawString.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object[]
+Struct.decode(PositionedByteRangesrc)
 
 
-byte[]
-RawBytes.decode(PositionedByteRangesrc)
+http://docs.oracle.com/javase/8/docs/api/java/lang/Float.html?is-external=true;
 title="class or interface in java.lang">Float
+OrderedFloat32.decode(PositionedByteRangesrc)
 
 
-byte[]
-OrderedBlobVar.decode(PositionedByteRangesrc)
-
-
 http://docs.oracle.com/javase/8/docs/api/java/lang/Short.html?is-external=true;
 title="class or interface in java.lang">Short
 OrderedInt16.decode(PositionedByteRangesrc)
 
-
-http://docs.oracle.com/javase/8/docs/api/java/lang/Double.html?is-external=true;
 title="class or interface in java.lang">Double
-RawDouble.decode(PositionedByteRangesrc)
-
 
 http://docs.oracle.com/javase/8/docs/api/java/lang/Long.html?is-external=true;
 title="class or interface in java.lang">Long
 RawLong.decode(PositionedByteRangesrc)
 
 
-T
-DataType.decode(PositionedByteRangesrc)
-Read an instance of T from the buffer 
src.
-
+http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">String
+OrderedString.decode(PositionedByteRangesrc)
 
 
-http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?is-external=true;
 title="class or interface in java.lang">Object
-Struct.decode(PositionedByteRangesrc,
-  intindex)
-Read the field at index.
-
+http://docs.oracle.com/javase/8/docs/api/java/lang/Integer.html?is-external=true;
 title="class or interface in java.lang">Integer

[45/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/class-use/Cell.html 
b/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
index 52ebf28..dcfc443 100644
--- a/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
+++ b/apidocs/org/apache/hadoop/hbase/class-use/Cell.html
@@ -988,23 +988,23 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
+Append
+Append.add(Cellcell)
+Add column and value to this Append operation.
+
+
+
 Put
 Put.add(Cellkv)
 Add the specified KeyValue to this Put operation.
 
 
-
+
 Increment
 Increment.add(Cellcell)
 Add the specified KeyValue to this operation.
 
 
-
-Append
-Append.add(Cellcell)
-Add column and value to this Append operation.
-
-
 
 Delete
 Delete.addDeleteMarker(Cellkv)
@@ -1082,6 +1082,14 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
   booleanpartial)
 
 
+Append
+Append.setFamilyCellMap(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
+
+
+Delete
+Delete.setFamilyCellMap(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
+
+
 Put
 Put.setFamilyCellMap(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
 
@@ -1095,14 +1103,6 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 Method for setting the put's familyMap
 
 
-
-Append
-Append.setFamilyCellMap(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
-
-
-Delete
-Delete.setFamilyCellMap(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
-
 
 
 
@@ -1119,66 +1119,66 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 Cell
-ColumnPaginationFilter.getNextCellHint(Cellcell)
+MultipleColumnPrefixFilter.getNextCellHint(Cellcell)
 
 
 Cell
-FilterList.getNextCellHint(CellcurrentCell)
+TimestampsFilter.getNextCellHint(CellcurrentCell)
+Pick the next cell that the scanner should seek to.
+
 
 
-abstract Cell
-Filter.getNextCellHint(CellcurrentCell)
-If the filter returns the match code SEEK_NEXT_USING_HINT, 
then it should also tell which is
- the next key it must seek to.
-
+Cell
+ColumnPaginationFilter.getNextCellHint(Cellcell)
 
 
 Cell
-MultipleColumnPrefixFilter.getNextCellHint(Cellcell)
+FilterList.getNextCellHint(CellcurrentCell)
 
 
 Cell
-TimestampsFilter.getNextCellHint(CellcurrentCell)
-Pick the next cell that the scanner should seek to.
-
+MultiRowRangeFilter.getNextCellHint(CellcurrentKV)
 
 
 Cell
 FuzzyRowFilter.getNextCellHint(CellcurrentCell)
 
 
-Cell
-MultiRowRangeFilter.getNextCellHint(CellcurrentKV)
+abstract Cell
+Filter.getNextCellHint(CellcurrentCell)
+If the filter returns the match code SEEK_NEXT_USING_HINT, 
then it should also tell which is
+ the next key it must seek to.
+
 
 
 Cell
-ColumnPrefixFilter.getNextCellHint(Cellcell)
+ColumnRangeFilter.getNextCellHint(Cellcell)
 
 
 Cell
-ColumnRangeFilter.getNextCellHint(Cellcell)
+ColumnPrefixFilter.getNextCellHint(Cellcell)
 
 
 Cell
-KeyOnlyFilter.transformCell(Cellcell)
+SkipFilter.transformCell(Cellv)
 
 
 Cell
 FilterList.transformCell(Cellc)
 
 
-Cell
-WhileMatchFilter.transformCell(Cellv)
-
-
 abstract Cell
 Filter.transformCell(Cellv)
 Give the filter a chance to transform the passed 
KeyValue.
 
 
+
+Cell
+WhileMatchFilter.transformCell(Cellv)
+
 
 Cell
-SkipFilter.transformCell(Cellv)
+KeyOnlyFilter.transformCell(Cellcell)
 
 
 
@@ -1223,27 +1223,27 @@ Input/OutputFormats, a table indexing MapReduce job, 
and utility methods.
 
 
 Filter.ReturnCode
-InclusiveStopFilter.filterKeyValue(Cellv)
+MultipleColumnPrefixFilter.filterKeyValue(Cellkv)
 
 
 Filter.ReturnCode
-RandomRowFilter.filterKeyValue(Cellv)
+SingleColumnValueFilter.filterKeyValue(Cellc)
 
 
 Filter.ReturnCode
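A minimal sketch of the SEEK_NEXT_USING_HINT contract described above (MyHintingFilter and the target row are hypothetical, not from this commit):

  public class MyHintingFilter extends FilterBase {
    private static final byte[] TARGET = Bytes.toBytes("row-9000");
    @Override
    public ReturnCode filterKeyValue(Cell c) {
      // Ask the scanner to fast-forward instead of walking every cell.
      return ReturnCode.SEEK_NEXT_USING_HINT;
    }
    @Override
    public Cell getNextCellHint(Cell currentCell) {
      // Tell the scanner which key it must seek to next.
      return KeyValueUtil.createFirstOnRow(TARGET);
    }
  }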

[30/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/client/Mutation.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/client/Mutation.html 
b/apidocs/src-html/org/apache/hadoop/hbase/client/Mutation.html
index d2a7403..b36be41 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/client/Mutation.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/client/Mutation.html
@@ -26,512 +26,558 @@
 018
 019package org.apache.hadoop.hbase.client;
 020
-021import java.nio.ByteBuffer;
-022import java.util.ArrayList;
-023import java.util.Arrays;
-024import java.util.HashMap;
-025import java.util.List;
-026import java.util.Map;
-027import java.util.NavigableMap;
-028import java.util.TreeMap;
-029import java.util.UUID;
-030
-031import org.apache.hadoop.hbase.Cell;
-032import 
org.apache.hadoop.hbase.CellScannable;
-033import 
org.apache.hadoop.hbase.CellScanner;
-034import 
org.apache.hadoop.hbase.CellUtil;
-035import 
org.apache.hadoop.hbase.HConstants;
-036import 
org.apache.hadoop.hbase.KeyValue;
-037import org.apache.hadoop.hbase.Tag;
-038import org.apache.hadoop.hbase.TagUtil;
-039import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
-040import 
org.apache.hadoop.hbase.classification.InterfaceStability;
-041import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-042import 
org.apache.hadoop.hbase.io.HeapSize;
-043import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-044import 
org.apache.hadoop.hbase.security.access.AccessControlConstants;
-045import 
org.apache.hadoop.hbase.security.access.Permission;
-046import 
org.apache.hadoop.hbase.security.visibility.CellVisibility;
-047import 
org.apache.hadoop.hbase.security.visibility.VisibilityConstants;
-048import 
org.apache.hadoop.hbase.util.Bytes;
-049import 
org.apache.hadoop.hbase.util.ClassSize;
-050
-051import 
com.google.common.collect.ArrayListMultimap;
-052import 
com.google.common.collect.ListMultimap;
-053import 
com.google.common.io.ByteArrayDataInput;
-054import 
com.google.common.io.ByteArrayDataOutput;
-055import 
com.google.common.io.ByteStreams;
-056
-057@InterfaceAudience.Public
-058@InterfaceStability.Evolving
-059public abstract class Mutation extends 
OperationWithAttributes implements Row, CellScannable,
-060HeapSize {
-061  public static final long 
MUTATION_OVERHEAD = ClassSize.align(
-062  // This
-063  ClassSize.OBJECT +
-064  // row + 
OperationWithAttributes.attributes
-065  2 * ClassSize.REFERENCE +
-066  // Timestamp
-067  1 * Bytes.SIZEOF_LONG +
-068  // durability
-069  ClassSize.REFERENCE +
-070  // familyMap
-071  ClassSize.REFERENCE +
-072  // familyMap
-073  ClassSize.TREEMAP);
-074
-075  /**
-076   * The attribute for storing the list 
of clusters that have consumed the change.
-077   */
-078  private static final String 
CONSUMED_CLUSTER_IDS = "_cs.id";
-079
-080  /**
-081   * The attribute for storing TTL for 
the result of the mutation.
-082   */
-083  private static final String 
OP_ATTRIBUTE_TTL = "_ttl";
-084
-085  private static final String 
RETURN_RESULTS = "_rr_";
-086
-087  protected byte [] row = null;
-088  protected long ts = 
HConstants.LATEST_TIMESTAMP;
-089  protected Durability durability = 
Durability.USE_DEFAULT;
-090
-091  // A Map sorted by column family.
-092  protected NavigableMap<byte[], List<Cell>> familyMap =
-093    new TreeMap<byte[], List<Cell>>(Bytes.BYTES_COMPARATOR);
-094
-095  @Override
-096  public CellScanner cellScanner() {
-097return 
CellUtil.createCellScanner(getFamilyCellMap());
-098  }
-099
-100  /**
-101   * Creates an empty list if one doesn't 
exist for the given column family
-102   * or else it returns the associated 
list of Cell objects.
-103   *
-104   * @param family column family
-105   * @return a list of Cell objects, 
returns an empty list if one doesn't exist.
-106   */
-107  List<Cell> getCellList(byte[] family) {
-108    List<Cell> list = this.familyMap.get(family);
-109    if (list == null) {
-110      list = new ArrayList<Cell>();
-111    }
-112    return list;
-113  }
-114
-115  /*
-116   * Create a KeyValue with this objects 
row key and the Put identifier.
-117   *
-118   * @return a KeyValue with this objects 
row key and the Put identifier.
-119   */
-120  KeyValue createPutKeyValue(byte[] 
family, byte[] qualifier, long ts, byte[] value) {
-121return new KeyValue(this.row, family, 
qualifier, ts, KeyValue.Type.Put, value);
-122  }
-123
-124  /**
-125   * Create a KeyValue with this objects 
row key and the Put identifier.
-126   * @param family
-127   * @param qualifier
-128   * @param ts
-129   * @param value
-130   * @param tags - Specify the Tags as an 
Array
-131   * @return a KeyValue with this objects 
row key and the Put identifier.
-132   */
-133  KeyValue createPutKeyValue(byte[] 
family, byte[] qualifier, long ts, byte[] value, Tag[] tags) {
-134  
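For context, a hedged sketch of the family map described above, observed through a Put (column names are placeholders):

  Put put = new Put(Bytes.toBytes("row1"));
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
  // One entry per column family, sorted by Bytes.BYTES_COMPARATOR.
  NavigableMap<byte[], List<Cell>> familyMap = put.getFamilyCellMap();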

[22/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.RowRange.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.RowRange.html
 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.RowRange.html
index 070e398..b25492c 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.RowRange.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.RowRange.html
@@ -36,496 +36,495 @@
 028import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 029import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 030import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-031import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-032import 
org.apache.hadoop.hbase.util.ByteStringer;
-033import 
org.apache.hadoop.hbase.util.Bytes;
-034
-035import 
com.google.protobuf.InvalidProtocolBufferException;
-036
-037/**
-038 * Filter to support scan multiple row 
key ranges. It can construct the row key ranges from the
-039 * passed list which can be accessed by 
each region server.
-040 *
-041 * HBase is quite efficient when scanning 
only one small row key range. If user needs to specify
-042 * multiple row key ranges in one scan, 
the typical solutions are: 1. through FilterList which is a
-043 * list of row key Filters, 2. using the 
SQL layer over HBase to join with two table, such as hive,
-044 * phoenix etc. However, both solutions 
are inefficient. Both of them can't utilize the range info
-045 * to perform fast forwarding during scan 
which is quite time consuming. If the number of ranges
-046 * are quite big (e.g. millions), join is 
a proper solution though it is slow. However, there are
-047 * cases that user wants to specify a 
small number of ranges to scan (e.g. lt;1000 ranges). Both
-048 * solutions can't provide satisfactory 
performance in such case. MultiRowRangeFilter is to support
-049 * such usec ase (scan multiple row key 
ranges), which can construct the row key ranges from user
-050 * specified list and perform 
fast-forwarding during scan. Thus, the scan will be quite efficient.
-051 */
-052@InterfaceAudience.Public
-053@InterfaceStability.Evolving
-054public class MultiRowRangeFilter extends 
FilterBase {
-055
-056  private List<RowRange> rangeList;
-057
-058  private static final int 
ROW_BEFORE_FIRST_RANGE = -1;
-059  private boolean EXCLUSIVE = false;
-060  private boolean done = false;
-061  private boolean initialized = false;
-062  private int index;
-063  private RowRange range;
-064  private ReturnCode currentReturnCode;
-065
-066  /**
-067   * @param list A list of <code>RowRange</code>
-068   * @throws java.io.IOException
-069   *   throw an exception if the 
range list is not in a natural order or any
-070   *   <code>RowRange</code> is invalid
-071   */
-072  public 
MultiRowRangeFilter(List<RowRange> list) throws IOException {
-073this.rangeList = 
sortAndMerge(list);
-074  }
-075
-076  @Override
-077  public boolean filterAllRemaining() {
-078return done;
-079  }
-080
-081  public List<RowRange> getRowRanges() {
-082return this.rangeList;
-083  }
-084
-085  @Override
-086  public boolean filterRowKey(Cell 
firstRowCell) {
-087if (filterAllRemaining()) return 
true;
-088// If it is the first time of 
running, calculate the current range index for
-089// the row key. If index is out of 
bound which happens when the start row
-090// user sets is after the largest 
stop row of the ranges, stop the scan.
-091// If row key is after the current 
range, find the next range and update index.
-092byte[] rowArr = 
firstRowCell.getRowArray();
-093int length = 
firstRowCell.getRowLength();
-094int offset = 
firstRowCell.getRowOffset();
-095if (!initialized
-096|| !range.contains(rowArr, 
offset, length)) {
-097  byte[] rowkey = 
CellUtil.cloneRow(firstRowCell);
-098  index = 
getNextRangeIndex(rowkey);
-099  if (index >= rangeList.size()) 
{
-100done = true;
-101currentReturnCode = 
ReturnCode.NEXT_ROW;
-102return false;
-103  }
-104  if(index != ROW_BEFORE_FIRST_RANGE) 
{
-105range = rangeList.get(index);
-106  } else {
-107range = rangeList.get(0);
-108  }
-109  if (EXCLUSIVE) {
-110EXCLUSIVE = false;
-111currentReturnCode = 
ReturnCode.NEXT_ROW;
-112return false;
-113  }
-114  if (!initialized) {
-115if(index != 
ROW_BEFORE_FIRST_RANGE) {
-116  currentReturnCode = 
ReturnCode.INCLUDE;
-117} else {
-118  currentReturnCode = 
ReturnCode.SEEK_NEXT_USING_HINT;
-119}
-120initialized = true;
-121  } else {
-122if (range.contains(rowArr, 
offset, length)) {
-123  currentReturnCode = 
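A minimal usage sketch for this filter (row boundaries are placeholders; per the code above, the constructor throws IOException for unsorted or invalid ranges):

  List<MultiRowRangeFilter.RowRange> ranges = new ArrayList<MultiRowRangeFilter.RowRange>();
  ranges.add(new MultiRowRangeFilter.RowRange(
      Bytes.toBytes("a"), true, Bytes.toBytes("c"), false));
  ranges.add(new MultiRowRangeFilter.RowRange(
      Bytes.toBytes("m"), true, Bytes.toBytes("p"), false));
  Scan scan = new Scan();
  scan.setFilter(new MultiRowRangeFilter(ranges));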

[39/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html 
b/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
index 3ff66f1..e31b569 100644
--- a/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
+++ b/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
@@ -1007,7 +1007,7 @@ public staticvoid
 
 initTableReducerJob
-public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Class? extends TableReducerreducer,

org.apache.hadoop.mapreduce.Jobjob)
 throws http://docs.oracle.com/javase/8/docs/api/java/io/IOException.html?is-external=true;
 title="class or interface in java.io">IOException
@@ -1029,7 +1029,7 @@ public staticvoid
 
 initTableReducerJob
-public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Class? extends TableReducerreducer,

org.apache.hadoop.mapreduce.Jobjob,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Classpartitioner)
@@ -1054,7 +1054,7 @@ public staticvoid
 
 initTableReducerJob
-public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Class? extends TableReducerreducer,

org.apache.hadoop.mapreduce.Jobjob,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Classpartitioner,
@@ -1095,7 +1095,7 @@ public staticvoid
 
 initTableReducerJob
-public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public staticvoidinitTableReducerJob(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Class? extends TableReducerreducer,

org.apache.hadoop.mapreduce.Jobjob,
http://docs.oracle.com/javase/8/docs/api/java/lang/Class.html?is-external=true;
 title="class or interface in java.lang">Classpartitioner,
@@ -1139,7 +1139,7 @@ public staticvoid
 
 limitNumReduceTasks
-public staticvoidlimitNumReduceTasks(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public staticvoidlimitNumReduceTasks(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,

org.apache.hadoop.mapreduce.Jobjob)
 throws http://docs.oracle.com/javase/8/docs/api/java/io/IOException.html?is-external=true;
 title="class or interface in java.io">IOException
 Ensures that the given number of reduce tasks for the given 
job
@@ -1159,7 +1159,7 @@ public staticvoid
 
 setNumReduceTasks
-public staticvoidsetNumReduceTasks(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringtable,
+public 
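For context, a hedged sketch of wiring a reducer to an output table with the three-argument overload shown above (MyTableReducer and the table name are placeholders; the call throws IOException and by default also ships the dependency jars with the job):

  Configuration conf = HBaseConfiguration.create();
  Job job = Job.getInstance(conf, "example-job");
  TableMapReduceUtil.initTableReducerJob("output-table", MyTableReducer.class, job);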

[08/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/allclasses-frame.html
--
diff --git a/devapidocs/allclasses-frame.html b/devapidocs/allclasses-frame.html
index b673f22..756908b 100644
--- a/devapidocs/allclasses-frame.html
+++ b/devapidocs/allclasses-frame.html
@@ -30,7 +30,6 @@
 AbstractMultiOutputCompactor
 AbstractPositionedByteRange
 AbstractProtobufLogWriter
-AbstractRegionServerCallable
 AbstractResponse
 AbstractResponse.ResponseType
 AbstractRpcClient
@@ -46,6 +45,7 @@
 AccessController
 AccessController.OpType
 AccessControlLists
+AccessControlUtil
 AccessDeniedException
 Action
 ActiveMasterManager
@@ -336,6 +336,7 @@
 CellSearcher
 CellSet
 CellSetModel
+CellSink
 CellTypeEncoder
 CellUtil
 CellUtil.EmptyByteBufferedCell
@@ -380,6 +381,7 @@
 ClientExceptionsUtil
 ClientIdGenerator
 ClientScanner
+ClientServiceCallable
 ClientSideRegionScanner
 ClientSimpleScanner
 ClientSmallReversedScanner
@@ -452,7 +454,6 @@
 CompactionWindow
 CompactionWindowFactory
 Compactor
-Compactor.CellSink
 Compactor.CellSinkFactory
 Compactor.FileDetails
 Compactor.InternalScannerFactory
@@ -519,6 +520,7 @@
 CoprocessorHost.EnvironmentPriorityComparator
 CoprocessorRpcChannel
 CoprocessorRpcUtils
+CoprocessorRpcUtils.BlockingRpcCallback
 CoprocessorService
 CopyKeyDataBlockEncoder
 CopyKeyDataBlockEncoder.CopyKeyEncodingState
@@ -1175,7 +1177,6 @@
 MasterCoprocessorHost.CoprocessorOperation
 MasterCoprocessorHost.CoprocessorOperationWithResult
 MasterCoprocessorHost.MasterEnvironment
-MasterCoprocessorRpcChannel
 MasterDDLOperationHelper
 MasterDumpServlet
 MasterFileSystem
@@ -1535,6 +1536,16 @@
 Permission.Action
 PlainTextMessageBodyProducer
 PleaseHoldException
+PluginProtos
+PluginProtos.CodeGeneratorRequest
+PluginProtos.CodeGeneratorRequest.Builder
+PluginProtos.CodeGeneratorRequestOrBuilder
+PluginProtos.CodeGeneratorResponse
+PluginProtos.CodeGeneratorResponse.Builder
+PluginProtos.CodeGeneratorResponse.File
+PluginProtos.CodeGeneratorResponse.File.Builder
+PluginProtos.CodeGeneratorResponse.FileOrBuilder
+PluginProtos.CodeGeneratorResponseOrBuilder
 PoolMap
 PoolMap.Pool
 PoolMap.PoolType
@@ -1597,8 +1608,7 @@
 ProcedureSuspendedException
 ProcedureSyncWait
 ProcedureSyncWait.Predicate
-ProcedureUtil
-ProcedureUtil.ForeignExceptionMsg
+ProcedureUtil
 ProcedureWALFile
 ProcedureWALFormat
 ProcedureWALFormat.InvalidWALDataException
@@ -1692,7 +1702,7 @@
 RegionCoprocessorHost.RegionOperation
 RegionCoprocessorHost.RegionOperationWithResult
 RegionCoprocessorHost.TableCoprocessorAttribute
-RegionCoprocessorRpcChannel
+RegionCoprocessorRpcChannel
 RegionCoprocessorServiceExec
 RegionException
 RegionGroupingProvider
@@ -1753,7 +1763,6 @@
 RegionServerCoprocessorHost.CoprocessorOperation
 RegionServerCoprocessorHost.EnvironmentPriorityComparator
 RegionServerCoprocessorHost.RegionServerEnvironment
-RegionServerCoprocessorRpcChannel
 RegionServerFlushTableProcedureManager
 RegionServerFlushTableProcedureManager.FlushTableSubprocedurePool
 RegionServerListTmpl
@@ -1881,7 +1890,6 @@
 RetryCounterFactory
 RetryImmediatelyException
 RetryingCallable
-RetryingCallableBase
 RetryingCallerInterceptor
 RetryingCallerInterceptorContext
 RetryingCallerInterceptorFactory
@@ -2069,6 +2077,7 @@
 SettableTimestamp
 ShareableMemory
 Shipper
+ShipperListener
 ShutdownHook
 ShutdownHook.DoNothingStoppable
 ShutdownHook.DoNothingThread
@@ -2267,7 +2276,7 @@
 SweepReducer
 SweepReducer.MobFileStatus
 SweepReducer.PathPrefixFilter
-SyncCoprocessorRpcChannel
+SyncCoprocessorRpcChannel
 SyncFuture
 SyncTable
 SyncTable.SyncMapper
@@ -2559,6 +2568,7 @@
 ZKUtil.ZKUtilOp.SetData
 ZKVisibilityLabelWatcher
 ZNodeClearer
+ZNodePaths
 ZooKeeperConnectionException
 ZooKeeperKeepAliveConnection
 ZooKeeperListener

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/allclasses-noframe.html
--
diff --git a/devapidocs/allclasses-noframe.html 
b/devapidocs/allclasses-noframe.html
index af5296d..ee9bec8 100644
--- a/devapidocs/allclasses-noframe.html
+++ b/devapidocs/allclasses-noframe.html
@@ -30,7 +30,6 @@
 AbstractMultiOutputCompactor
 AbstractPositionedByteRange
 AbstractProtobufLogWriter
-AbstractRegionServerCallable
 AbstractResponse
 AbstractResponse.ResponseType
 AbstractRpcClient
@@ -46,6 +45,7 @@
 AccessController
 AccessController.OpType
 AccessControlLists
+AccessControlUtil
 AccessDeniedException
 Action
 ActiveMasterManager
@@ -336,6 +336,7 @@
 CellSearcher
 CellSet
 CellSetModel
+CellSink
 CellTypeEncoder
 CellUtil
 CellUtil.EmptyByteBufferedCell
@@ -380,6 +381,7 @@
 ClientExceptionsUtil
 ClientIdGenerator
 ClientScanner
+ClientServiceCallable
 ClientSideRegionScanner
 ClientSimpleScanner
 ClientSmallReversedScanner
@@ -452,7 +454,6 @@
 CompactionWindow
 CompactionWindowFactory
 Compactor
-Compactor.CellSink
 Compactor.CellSinkFactory
 

[43/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html 
b/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
index 55e58d1..00b7f4a 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
@@ -172,19 +172,19 @@
 
 
 int
-Increment.compareTo(Rowi)
+RowMutations.compareTo(Rowi)
 
 
 int
-Mutation.compareTo(Rowd)
+Increment.compareTo(Rowi)
 
 
 int
-Get.compareTo(Rowother)
+Mutation.compareTo(Rowd)
 
 
 int
-RowMutations.compareTo(Rowi)
+Get.compareTo(Rowother)
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html 
b/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
index ea1e717..c2ef864 100644
--- a/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
+++ b/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
@@ -597,19 +597,19 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 void
-TableInputFormatBase.setScan(Scanscan)
+TableRecordReaderImpl.setScan(Scanscan)
 Sets the scan defining the actual details like columns 
etc.
 
 
 
 void
-TableRecordReaderImpl.setScan(Scanscan)
+TableRecordReader.setScan(Scanscan)
 Sets the scan defining the actual details like columns 
etc.
 
 
 
 void
-TableRecordReader.setScan(Scanscan)
+TableInputFormatBase.setScan(Scanscan)
 Sets the scan defining the actual details like columns 
etc.
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/package-tree.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/package-tree.html 
b/apidocs/org/apache/hadoop/hbase/client/package-tree.html
index d68be23..d5c9291 100644
--- a/apidocs/org/apache/hadoop/hbase/client/package-tree.html
+++ b/apidocs/org/apache/hadoop/hbase/client/package-tree.html
@@ -204,13 +204,13 @@
 
 java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Enum.html?is-external=true;
 title="class or interface in java.lang">EnumE (implements java.lang.http://docs.oracle.com/javase/8/docs/api/java/lang/Comparable.html?is-external=true;
 title="class or interface in java.lang">ComparableT, java.io.http://docs.oracle.com/javase/8/docs/api/java/io/Serializable.html?is-external=true;
 title="class or interface in java.io">Serializable)
 
-org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.client.CompactType
+org.apache.hadoop.hbase.client.Consistency
 org.apache.hadoop.hbase.client.Durability
-org.apache.hadoop.hbase.client.MasterSwitchType
+org.apache.hadoop.hbase.client.CompactionState
 org.apache.hadoop.hbase.client.SnapshotType
-org.apache.hadoop.hbase.client.CompactType
 org.apache.hadoop.hbase.client.IsolationLevel
-org.apache.hadoop.hbase.client.Consistency
+org.apache.hadoop.hbase.client.MasterSwitchType
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignException.html
--
diff --git 
a/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignException.html 
b/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignException.html
index 3bb5566..bb38d42 100644
--- a/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignException.html
+++ b/apidocs/org/apache/hadoop/hbase/errorhandling/ForeignException.html
@@ -130,7 +130,7 @@ var activeTableTab = "activeTableTab";
 
 @InterfaceAudience.Public
  @InterfaceStability.Evolving
-public class ForeignException
+public class ForeignException
 extends http://docs.oracle.com/javase/8/docs/api/java/io/IOException.html?is-external=true;
 title="class or interface in java.io">IOException
 A ForeignException is an exception from another thread or 
process.
  
@@ -253,7 +253,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/io/IOException.ht
 
 
 ForeignException
-publicForeignException(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringsource,
+publicForeignException(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringsource,
 http://docs.oracle.com/javase/8/docs/api/java/lang/Throwable.html?is-external=true;
 title="class or interface in java.lang">Throwablecause)
 Create a new ForeignException that can be serialized.  It 
is assumed that this came form a
  local source.
@@ -270,7 +270,7 @@ extends 

[28/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/BinaryPrefixComparator.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/BinaryPrefixComparator.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/BinaryPrefixComparator.html
index 2aef512..1a36be8 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/BinaryPrefixComparator.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/BinaryPrefixComparator.html
@@ -32,82 +32,83 @@
 024import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 025import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 026import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-027import 
org.apache.hadoop.hbase.protobuf.generated.ComparatorProtos;
-028import 
org.apache.hadoop.hbase.util.ByteBufferUtils;
-029import 
org.apache.hadoop.hbase.util.Bytes;
-030
-031import 
com.google.protobuf.InvalidProtocolBufferException;
-032
-033/**
-034 * A comparator which compares against a 
specified byte array, but only compares
-035 * up to the length of this byte array. 
For the rest it is similar to
-036 * {@link BinaryComparator}.
-037 */
-038@InterfaceAudience.Public
-039@InterfaceStability.Stable
-040public class BinaryPrefixComparator 
extends ByteArrayComparable {
-041
-042  /**
-043   * Constructor
-044   * @param value value
-045   */
-046  public BinaryPrefixComparator(byte[] 
value) {
-047super(value);
-048  }
-049
-050  @Override
-051  public int compareTo(byte [] value, int 
offset, int length) {
-052return Bytes.compareTo(this.value, 0, 
this.value.length, value, offset,
-053this.value.length <= length ? 
this.value.length : length);
-054  }
-055
-056  @Override
-057  public int compareTo(ByteBuffer value, 
int offset, int length) {
-058if (this.value.length <= length) 
{
-059  length = this.value.length;
-060}
-061return 
ByteBufferUtils.compareTo(this.value, 0, this.value.length, value, offset, 
length);
-062  }
-063
-064  /**
-065   * @return The comparator serialized 
using pb
-066   */
-067  public byte [] toByteArray() {
-068
ComparatorProtos.BinaryPrefixComparator.Builder builder =
-069  
ComparatorProtos.BinaryPrefixComparator.newBuilder();
-070
builder.setComparable(super.convert());
-071return 
builder.build().toByteArray();
-072  }
-073
-074  /**
-075   * @param pbBytes A pb serialized 
{@link BinaryPrefixComparator} instance
-076   * @return An instance of {@link 
BinaryPrefixComparator} made from <code>bytes</code>
-077   * @throws DeserializationException
-078   * @see #toByteArray
-079   */
-080  public static BinaryPrefixComparator 
parseFrom(final byte [] pbBytes)
-081  throws DeserializationException {
-082
ComparatorProtos.BinaryPrefixComparator proto;
-083try {
-084  proto = 
ComparatorProtos.BinaryPrefixComparator.parseFrom(pbBytes);
-085} catch 
(InvalidProtocolBufferException e) {
-086  throw new 
DeserializationException(e);
-087}
-088return new 
BinaryPrefixComparator(proto.getComparable().getValue().toByteArray());
-089  }
-090
-091  /**
-092   * @param other
-093   * @return true if and only if the 
fields of the comparator that are serialized
-094   * are equal to the corresponding 
fields in other.  Used for testing.
-095   */
-096  boolean 
areSerializedFieldsEqual(ByteArrayComparable other) {
-097if (other == this) return true;
-098if (!(other instanceof 
BinaryPrefixComparator)) return false;
-099
-100return 
super.areSerializedFieldsEqual(other);
-101  }
-102}
+027import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+028import 
org.apache.hadoop.hbase.shaded.protobuf.generated.ComparatorProtos;
+029import 
org.apache.hadoop.hbase.util.ByteBufferUtils;
+030import 
org.apache.hadoop.hbase.util.Bytes;
+031
+032import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+033
+034/**
+035 * A comparator which compares against a 
specified byte array, but only compares
+036 * up to the length of this byte array. 
For the rest it is similar to
+037 * {@link BinaryComparator}.
+038 */
+039@InterfaceAudience.Public
+040@InterfaceStability.Stable
+041public class BinaryPrefixComparator 
extends ByteArrayComparable {
+042
+043  /**
+044   * Constructor
+045   * @param value value
+046   */
+047  public BinaryPrefixComparator(byte[] 
value) {
+048super(value);
+049  }
+050
+051  @Override
+052  public int compareTo(byte [] value, int 
offset, int length) {
+053return Bytes.compareTo(this.value, 0, 
this.value.length, value, offset,
+054this.value.length <= length ? 
this.value.length : length);
+055  }
+056
+057  @Override
+058  public int compareTo(ByteBuffer value, 
int offset, int length) {
+059if (this.value.length <= length) 
{
+060  length = this.value.length;
+061}
+062return 
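A minimal usage sketch for this comparator (the "user_" prefix is a placeholder):

  Scan scan = new Scan();
  scan.setFilter(new RowFilter(CompareFilter.CompareOp.EQUAL,
      new BinaryPrefixComparator(Bytes.toBytes("user_"))));
  // Matches rows whose key begins with "user_".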

[26/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/FamilyFilter.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/filter/FamilyFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FamilyFilter.html
index 507603e..59c5078 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/FamilyFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/FamilyFilter.html
@@ -34,108 +34,107 @@
 026import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 027import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 028import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-029import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-030import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-031
-032import 
com.google.protobuf.InvalidProtocolBufferException;
-033
-034/**
-035 * <p>
-036 * This filter is used to filter based on 
the column family. It takes an
-037 * operator (equal, greater, not equal, 
etc) and a byte [] comparator for the
-038 * column family portion of a key.
-039 * </p><p>
-040 * This filter can be wrapped with {@link 
org.apache.hadoop.hbase.filter.WhileMatchFilter} and {@link 
org.apache.hadoop.hbase.filter.SkipFilter}
-041 * to add more control.
-042 * </p><p>
-043 * Multiple filters can be combined using 
{@link org.apache.hadoop.hbase.filter.FilterList}.
-044 * </p>
-045 * If an already known column family is 
looked for, use {@link org.apache.hadoop.hbase.client.Get#addFamily(byte[])}
-046 * directly rather than a filter.
-047 */
-048@InterfaceAudience.Public
-049@InterfaceStability.Stable
-050public class FamilyFilter extends 
CompareFilter {
-051
-052  /**
-053   * Constructor.
-054   *
-055   * @param familyCompareOp  the compare 
op for column family matching
-056   * @param familyComparator the 
comparator for column family matching
-057   */
-058  public FamilyFilter(final CompareOp 
familyCompareOp,
-059  final 
ByteArrayComparable familyComparator) {
-060  super(familyCompareOp, 
familyComparator);
-061  }
-062
-063  @Override
-064  public ReturnCode filterKeyValue(Cell 
v) {
-065int familyLength = 
v.getFamilyLength();
-066if (familyLength > 0) {
-067  if (compareFamily(this.compareOp, 
this.comparator, v)) {
-068return ReturnCode.NEXT_ROW;
-069  }
-070}
-071return ReturnCode.INCLUDE;
-072  }
-073
-074  public static Filter 
createFilterFromArguments(ArrayList<byte[]> filterArguments) {
-075ArrayList<?> arguments = 
CompareFilter.extractArguments(filterArguments);
-076CompareOp compareOp = 
(CompareOp)arguments.get(0);
-077ByteArrayComparable comparator = 
(ByteArrayComparable)arguments.get(1);
-078return new FamilyFilter(compareOp, 
comparator);
-079  }
-080
-081  /**
-082   * @return The filter serialized using 
pb
-083   */
-084  public byte [] toByteArray() {
-085FilterProtos.FamilyFilter.Builder 
builder =
-086  
FilterProtos.FamilyFilter.newBuilder();
-087
builder.setCompareFilter(super.convert());
-088return 
builder.build().toByteArray();
-089  }
-090
-091  /**
-092   * @param pbBytes A pb serialized 
{@link FamilyFilter} instance
-093   * @return An instance of {@link 
FamilyFilter} made from codebytes/code
-094   * @throws DeserializationException
-095   * @see #toByteArray
-096   */
-097  public static FamilyFilter 
parseFrom(final byte [] pbBytes)
-098  throws DeserializationException {
-099FilterProtos.FamilyFilter proto;
-100try {
-101  proto = 
FilterProtos.FamilyFilter.parseFrom(pbBytes);
-102} catch 
(InvalidProtocolBufferException e) {
-103  throw new 
DeserializationException(e);
-104}
-105final CompareOp valueCompareOp =
-106  
CompareOp.valueOf(proto.getCompareFilter().getCompareOp().name());
-107ByteArrayComparable valueComparator = 
null;
-108try {
-109  if 
(proto.getCompareFilter().hasComparator()) {
-110valueComparator = 
ProtobufUtil.toComparator(proto.getCompareFilter().getComparator());
-111  }
-112} catch (IOException ioe) {
-113  throw new 
DeserializationException(ioe);
-114}
-115return new 
FamilyFilter(valueCompareOp,valueComparator);
-116  }
-117
-118  /**
-119   * @param other
-120   * @return true if and only if the 
fields of the filter that are serialized
-121   * are equal to the corresponding 
fields in other.  Used for testing.
-122   */
-123  boolean areSerializedFieldsEqual(Filter 
o) {
-124if (o == this) return true;
-125if (!(o instanceof FamilyFilter)) 
return false;
-126
-127FamilyFilter other = 
(FamilyFilter)o;
-128return 
super.areSerializedFieldsEqual(other);
-129 }
-130}
+029import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+030import 
org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
+031import 

[09/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/coc.html
--
diff --git a/coc.html b/coc.html
index 678dae0..cd135fe 100644
--- a/coc.html
+++ b/coc.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  
   Code of Conduct Policy
@@ -331,7 +331,7 @@ For flagrant violations requiring a firm response the PMC 
may opt to skip early
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-09-29
+  Last Published: 
2016-10-09
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/cygwin.html
--
diff --git a/cygwin.html b/cygwin.html
index 3e49549..7a9259b 100644
--- a/cygwin.html
+++ b/cygwin.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Installing Apache HBase (TM) on Windows using 
Cygwin
 
@@ -673,7 +673,7 @@ Now your HBase server is running, start 
coding and build that next
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-09-29
+  Last Published: 
2016-10-09
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/dependencies.html
--
diff --git a/dependencies.html b/dependencies.html
index 84fc8e2..c4feba0 100644
--- a/dependencies.html
+++ b/dependencies.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Project Dependencies
 
@@ -518,7 +518,7 @@
 http://www.apache.org/;>The Apache Software 
Foundation.
 All rights reserved.  
 
-  Last Published: 
2016-09-29
+  Last Published: 
2016-10-09
 
 
 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/dependency-convergence.html
--
diff --git a/dependency-convergence.html b/dependency-convergence.html
index 5de4cd0..88f7d2a 100644
--- a/dependency-convergence.html
+++ b/dependency-convergence.html
@@ -7,7 +7,7 @@
   
 
 
-
+
 
 Apache HBase  Reactor Dependency Convergence
 
@@ -284,22 +284,22 @@
 
 
 Number of sub-projects:
-29
+31
 
 Number of dependencies (NOD):
-87
+88
 
 Number of unique artifacts (NOA):
-87
+89
 
 Number of SNAPSHOT artifacts (NOS):
 0
 
 Convergence (NOD/NOA):
-100%
+98%
 
 Ready for Release (100% Convergence and no SNAPSHOTS):
-Success
+ErrorYou do not have 100% convergence.
 
 Dependencies used in 
sub-projects
 
@@ -321,6 +321,7 @@
 http://hbase.apache.org/hbase-archetypes/hbase-client-project;>org.apache.hbase:hbase-client-project
 http://hbase.apache.org/hbase-client;>org.apache.hbase:hbase-client
 http://hbase.apache.org/hbase-common;>org.apache.hbase:hbase-common
+http://hbase.apache.org/hbase-endpoint;>org.apache.hbase:hbase-endpoint
 http://hbase.apache.org/hbase-examples;>org.apache.hbase:hbase-examples
 http://hbase.apache.org/hbase-external-blockcache;>org.apache.hbase:hbase-external-blockcache
 http://hbase.apache.org/hbase-hadoop-compat;>org.apache.hbase:hbase-hadoop-compat
@@ -328,6 +329,7 @@
 http://hbase.apache.org/hbase-it;>org.apache.hbase:hbase-it
 http://hbase.apache.org/hbase-prefix-tree;>org.apache.hbase:hbase-prefix-tree
 http://hbase.apache.org/hbase-procedure;>org.apache.hbase:hbase-procedure
+http://hbase.apache.org/hbase-protocol-shaded;>org.apache.hbase:hbase-protocol-shaded
 http://hbase.apache.org/hbase-protocol;>org.apache.hbase:hbase-protocol
 http://hbase.apache.org/hbase-resource-bundle;>org.apache.hbase:hbase-resource-bundle
 http://hbase.apache.org/hbase-rest;>org.apache.hbase:hbase-rest
@@ -379,7 +381,7 @@
 com.google.protobuf:protobuf-java
 
 
-
+
 
 
 
@@ -394,15 +396,20 @@
 http://hbase.apache.org/hbase-rsgroup;>org.apache.hbase:hbase-rsgroup
 http://hbase.apache.org/hbase-server;>org.apache.hbase:hbase-server
 http://hbase.apache.org/hbase-spark;>org.apache.hbase:hbase-spark
-http://hbase.apache.org/hbase-thrift;>org.apache.hbase:hbase-thrift
+http://hbase.apache.org/hbase-thrift;>org.apache.hbase:hbase-thrift
+
+3.1.0
+
+
+http://hbase.apache.org/hbase-protocol-shaded;>org.apache.hbase:hbase-protocol-shaded
 
 com.lmax:disruptor
 
-
+
 
 
 
-
+
 3.3.0
 
 
@@ -410,11 +417,11 @@
 
 com.sun.jersey:jersey-client
 
-
+
 
 
 
-
+
 1.9
 
 
@@ -422,11 +429,11 @@
 
 com.sun.jersey:jersey-core
 
-
+
 
 
 
-
+
 1.9
 
 
@@ -435,11 +442,11 @@
 
 com.sun.jersey:jersey-json
 
-
+
 
 
 
-
+
 1.9
 
 
@@ -447,11 +454,11 @@
 
 com.sun.jersey:jersey-server
 
-
+
 
 
 
-
+
 1.9
 
 
@@ -460,11 +467,11 @@
 
 

[44/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/class-use/ServerName.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/class-use/ServerName.html 
b/apidocs/org/apache/hadoop/hbase/class-use/ServerName.html
index fe99dcf..bbf9a96 100644
--- a/apidocs/org/apache/hadoop/hbase/class-use/ServerName.html
+++ b/apidocs/org/apache/hadoop/hbase/class-use/ServerName.html
@@ -141,28 +141,22 @@
 
 
 static ServerName
-ServerName.parseFrom(byte[]data)
-Get a ServerName from the passed in data bytes.
-
-
-
-static ServerName
 ServerName.parseServerName(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringstr)
 
-
+
 static ServerName
 ServerName.parseVersionedServerName(byte[]versionedBytes)
 Use this method instantiating a ServerName from bytes
  gotten from a call to getVersionedBytes().
 
 
-
+
 static ServerName
 ServerName.valueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringserverName)
 Retrieve an instance of ServerName.
 
 
-
+
 static ServerName
 ServerName.valueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringhostname,
intport,
@@ -170,7 +164,7 @@
 Retrieve an instance of ServerName.
 
 
-
+
 static ServerName
 ServerName.valueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">StringhostAndPort,
longstartCode)

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/CompactionState.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/CompactionState.html 
b/apidocs/org/apache/hadoop/hbase/client/CompactionState.html
index ffea9b5..0dae31a 100644
--- a/apidocs/org/apache/hadoop/hbase/client/CompactionState.html
+++ b/apidocs/org/apache/hadoop/hbase/client/CompactionState.html
@@ -259,7 +259,7 @@ the order they are declared.
 
 
 values
-public staticCompactionState[]values()
+public staticCompactionState[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -279,7 +279,7 @@ for (CompactionState c : CompactionState.values())
 
 
 valueOf
-public staticCompactionStatevalueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
+public staticCompactionStatevalueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/Consistency.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/Consistency.html 
b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
index e713810..957225e 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Consistency.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Consistency.html
@@ -254,7 +254,7 @@ the order they are declared.
 
 
 values
-public staticConsistency[]values()
+public staticConsistency[]values()
 Returns an array containing the constants of this enum 
type, in
 the order they are declared.  This method may be used to iterate
 over the constants as follows:
@@ -274,7 +274,7 @@ for (Consistency c : Consistency.values())
 
 
 valueOf
-public staticConsistencyvalueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
+public staticConsistencyvalueOf(http://docs.oracle.com/javase/8/docs/api/java/lang/String.html?is-external=true;
 title="class or interface in java.lang">Stringname)
 Returns the enum constant of this type with the specified 
name.
 The string must match exactly an identifier used to declare an
 enum constant in this type.  (Extraneous whitespace characters are 

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/client/Durability.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/client/Durability.html 
b/apidocs/org/apache/hadoop/hbase/client/Durability.html
index 3d6bd6f..8854646 100644
--- a/apidocs/org/apache/hadoop/hbase/client/Durability.html
+++ b/apidocs/org/apache/hadoop/hbase/client/Durability.html
@@ -294,7 +294,7 @@ the order they are declared.
 
 

[18/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html
index c7a9ba3..3cd803c 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html
@@ -37,15 +37,15 @@
 029import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 030import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
 031import 
org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
-032import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-033import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-034import 
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos;
-035import 
org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.CompareType;
-036import 
org.apache.hadoop.hbase.util.ByteStringer;
-037import 
org.apache.hadoop.hbase.util.Bytes;
-038
-039import 
com.google.common.base.Preconditions;
-040import 
com.google.protobuf.InvalidProtocolBufferException;
+032import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+033import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
+034import 
org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+035import 
org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
+036import 
org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos;
+037import 
org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.CompareType;
+038import 
org.apache.hadoop.hbase.util.Bytes;
+039
+040import 
com.google.common.base.Preconditions;
 041
 042/**
 043 * This filter is used to filter cells 
based on value. It takes a {@link CompareFilter.CompareOp}
@@ -82,7 +82,7 @@
 074  protected byte [] columnFamily;
 075  protected byte [] columnQualifier;
 076  protected CompareOp compareOp;
-077  protected ByteArrayComparable 
comparator;
+077  protected 
org.apache.hadoop.hbase.filter.ByteArrayComparable comparator;
 078  protected boolean foundColumn = 
false;
 079  protected boolean matchedColumn = 
false;
 080  protected boolean filterIfMissing = 
false;
@@ -104,7 +104,7 @@
 096   */
 097  public SingleColumnValueFilter(final 
byte [] family, final byte [] qualifier,
 098  final CompareOp compareOp, final 
byte[] value) {
-099this(family, qualifier, compareOp, 
new BinaryComparator(value));
+099this(family, qualifier, compareOp, 
new org.apache.hadoop.hbase.filter.BinaryComparator(value));
 100  }
 101
 102  /**
@@ -122,283 +122,285 @@
 114   * @param comparator Comparator to 
use.
 115   */
 116  public SingleColumnValueFilter(final 
byte [] family, final byte [] qualifier,
-117  final CompareOp compareOp, final 
ByteArrayComparable comparator) {
-118this.columnFamily = family;
-119this.columnQualifier = qualifier;
-120this.compareOp = compareOp;
-121this.comparator = comparator;
-122  }
-123
-124  /**
-125   * Constructor for protobuf 
deserialization only.
-126   * @param family
-127   * @param qualifier
-128   * @param compareOp
-129   * @param comparator
-130   * @param filterIfMissing
-131   * @param latestVersionOnly
-132   */
-133  protected SingleColumnValueFilter(final 
byte[] family, final byte[] qualifier,
-134  final CompareOp compareOp, 
ByteArrayComparable comparator, final boolean filterIfMissing,
-135  final boolean latestVersionOnly) 
{
-136this(family, qualifier, compareOp, 
comparator);
-137this.filterIfMissing = 
filterIfMissing;
-138this.latestVersionOnly = 
latestVersionOnly;
-139  }
-140
-141  /**
-142   * @return operator
-143   */
-144  public CompareOp getOperator() {
-145return compareOp;
-146  }
-147
-148  /**
-149   * @return the comparator
-150   */
-151  public ByteArrayComparable 
getComparator() {
-152return comparator;
-153  }
-154
-155  /**
-156   * @return the family
-157   */
-158  public byte[] getFamily() {
-159return columnFamily;
-160  }
-161
-162  /**
-163   * @return the qualifier
-164   */
-165  public byte[] getQualifier() {
-166return columnQualifier;
-167  }
-168
-169  @Override
-170  public boolean filterRowKey(Cell cell) 
throws IOException {
-171// Impl in FilterBase might do 
unnecessary copy for Off heap backed Cells.
-172return false;
-173  }
-174
-175  @Override
-176  public ReturnCode filterKeyValue(Cell 
c) {
-177// System.out.println("REMOVE KEY=" + 
keyValue.toString() + ", value=" + Bytes.toString(keyValue.getValue()));
-178if (this.matchedColumn) {
-179  // We already found and matched the 
single column, all keys now pass
-180  return ReturnCode.INCLUDE;
-181} else if (this.latestVersionOnly 
 this.foundColumn) {
-182  // 

[11/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/book.html
--
diff --git a/book.html b/book.html
index d244150..876f41c 100644
--- a/book.html
+++ b/book.html
@@ -4,11 +4,11 @@
 
 
 
-
+
 
 Apache HBase  Reference Guide
 
-https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.2.0/css/font-awesome.min.css;>
+https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.4.0/css/font-awesome.min.css;>
 
 
 
@@ -261,17 +261,22 @@
 154. 
Integration Testing with an HBase Mini-Cluster
 
 
+Protobuf in HBase
+
+155. Protobuf
+
+
 ZooKeeper
 
-155. Using existing 
ZooKeeper ensemble
-156. SASL Authentication with ZooKeeper
+156. Using existing 
ZooKeeper ensemble
+157. SASL Authentication with ZooKeeper
 
 
 Community
 
-157. Decisions
-158. Community Roles
-159. Commit Message format
+158. Decisions
+159. Community Roles
+160. Commit Message format
 
 
 Appendix
@@ -281,7 +286,7 @@
 Appendix C: hbck In Depth
 Appendix D: Access Control Matrix
 Appendix E: Compression and Data Block Encoding In 
HBase
-160. Enable Data Block 
Encoding
+161. Enable Data Block 
Encoding
 Appendix F: SQL over HBase
 Appendix G: YCSB
 Appendix H: HFile format
@@ -290,8 +295,8 @@
 Appendix K: HBase and the Apache Software 
Foundation
 Appendix L: Apache HBase Orca
 Appendix M: Enabling Dapper-like Tracing in 
HBase
-161. Client Modifications
-162. Tracing from HBase Shell
+162. Client Modifications
+163. Tracing from HBase Shell
 Appendix N: 0.95 RPC Specification
 
 
@@ -5243,7 +5248,7 @@ example9
 
 
 # The java implementation to use.
-export JAVA_HOME=/usr/java/jdk1.7.0/
+export JAVA_HOME=/usr/java/jdk1.8.0/
 
 # The maximum amount of heap to use. Default is left to JVM default.
 export HBASE_HEAPSIZE=4G
@@ -5816,7 +5821,7 @@ It may be possible to skip across 
versionsfor example go fr
 APIs available in a patch version will be available in all later patch 
versions. However, new APIs may be added which will not be available in earlier 
patch versions.
 
 
-New APIs introduced in a patch version will only be added in a source 
compatible way [1]: i.e. code that 
implements public APIs will continue to compile.
+New APIs introduced in a patch version will only be added in a source 
compatible way [1]: i.e. code that 
implements public APIs will continue to compile.
 
 
 Example: A user using a newly deprecated API does not need to modify 
application code with HBase API calls until the next major version.
@@ -5880,7 +5885,7 @@ It may be possible to skip across 
versionsfor example go fr
 Summary
 
 
-A patch upgrade is a drop-in replacement. Any change that is not Java 
binary and source compatible would not be allowed.[2] Downgrading versions within patch releases may not be 
compatible.
+A patch upgrade is a drop-in replacement. Any change that is not Java 
binary and source compatible would not be allowed.[2] Downgrading versions within patch releases may not be 
compatible.
 
 
 A minor upgrade requires no application/client code modification. Ideally 
it would be a drop-in replacement but client code, coprocessors, filters, etc 
might have to be recompiled if new jars are used.
@@ -5891,7 +5896,7 @@ It may be possible to skip across 
versionsfor example go fr
 
 
 
-Table 3. Compatibility Matrix [3]
+Table 3. Compatibility Matrix [3]
 
 
 
@@ -5919,7 +5924,7 @@ It may be possible to skip across 
versionsfor example go fr
 
 
 File 
Format Compatibility
-N [4]
+N [4]
 Y
 Y
 
@@ -15757,31 +15762,51 @@ If you use HBase shell, the general command pattern 
is as follows:
 
 
 hbase.hstore.compaction.date.tiered.max.storefile.age.millis
-
+
+Contents
+
+
+
 Files with max-timestamp smaller than this will no longer be 
compacted.Default at Long.MAX_VALUE.
 
 
 
 hbase.hstore.compaction.date.tiered.base.window.millis
-
+
+Contents
+
+
+
 Base window size in milliseconds. Default at 6 hours.
 
 
 
 hbase.hstore.compaction.date.tiered.windows.per.tier
-
+
+Contents
+
+
+
 Number of windows per tier. Default at 4.
 
 
 
 hbase.hstore.compaction.date.tiered.incoming.window.min
-
+
+Contents
+
+
+
 Minimal number of files to compact in the incoming window. Set it to 
expected number of files in the window to avoid wasteful compaction. Default at 
6.
 
 
 
 hbase.hstore.compaction.date.tiered.window.policy.class
-
+
+Contents
+
+
+
 The policy to select store files within the same time window. It doesn’t 
apply to the incoming window. Default at exploring compaction. This is to avoid 
wasteful compaction.
 
 
@@ -15937,7 +15962,11 @@ For example, if your regions are 30 GB, 12 x 2.5 GB 
stripes might be a good star
 
 
 hbase.store.stripe.initialStripeCount
-
+
+Contents
+
+
+
 The number of stripes to create when stripe compaction is enabled. You can 
use it as follows:
 
 
@@ -15962,7 +15991,11 @@ one hash prefix per region, pre-splitting may make 
sense.
 
 
 hbase.store.stripe.sizeToSplit
-
+
+Contents
+
+
+
 The maximum size a stripe grows before splitting. Use 

[48/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apache_hbase_reference_guide.pdfmarks
--
diff --git a/apache_hbase_reference_guide.pdfmarks 
b/apache_hbase_reference_guide.pdfmarks
index 5ef666b..43c257d 100644
--- a/apache_hbase_reference_guide.pdfmarks
+++ b/apache_hbase_reference_guide.pdfmarks
@@ -1,9 +1,9 @@
-[ /Title (Apache HBase  Reference Guide)
+[ /Title 

   /Author (Apache HBase Team)
-  /Subject ()
-  /Keywords ()
-  /ModDate (D:20160929151030)
-  /CreationDate (D:20160929151030)
-  /Creator (Asciidoctor PDF 1.5.0.alpha.6, based on Prawn 1.2.1)
-  /Producer ()
+  /Subject null
+  /Keywords null
+  /ModDate (D:20161009074600)
+  /CreationDate (D:20161009074600)
+  /Creator (Asciidoctor PDF 1.5.0.alpha.11, based on Prawn 1.3.0)
+  /Producer null
   /DOCINFO pdfmark

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/allclasses-frame.html
--
diff --git a/apidocs/allclasses-frame.html b/apidocs/allclasses-frame.html
index 8c9dd06..492f79c 100644
--- a/apidocs/allclasses-frame.html
+++ b/apidocs/allclasses-frame.html
@@ -307,7 +307,6 @@
 StructIterator
 SubstringComparator
 Sweeper
-SyncCoprocessorRpcChannel
 Table
 TableExistsException
 TableInfoMissingException

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/allclasses-noframe.html
--
diff --git a/apidocs/allclasses-noframe.html b/apidocs/allclasses-noframe.html
index b40ba1f..0efd518 100644
--- a/apidocs/allclasses-noframe.html
+++ b/apidocs/allclasses-noframe.html
@@ -307,7 +307,6 @@
 StructIterator
 SubstringComparator
 Sweeper
-SyncCoprocessorRpcChannel
 Table
 TableExistsException
 TableInfoMissingException

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/deprecated-list.html
--
diff --git a/apidocs/deprecated-list.html b/apidocs/deprecated-list.html
index 5d7de15..d1e5315 100644
--- a/apidocs/deprecated-list.html
+++ b/apidocs/deprecated-list.html
@@ -323,13 +323,13 @@
 org.apache.hadoop.hbase.rest.client.RemoteHTable.getRpcTimeout()
 
 
-org.apache.hadoop.hbase.util.Bytes.getSize()
-use Bytes.getLength()
 instead
+org.apache.hadoop.hbase.io.ImmutableBytesWritable.getSize()
+use ImmutableBytesWritable.getLength()
 instead
 
 
 
-org.apache.hadoop.hbase.io.ImmutableBytesWritable.getSize()
-use ImmutableBytesWritable.getLength()
 instead
+org.apache.hadoop.hbase.util.Bytes.getSize()
+use Bytes.getLength()
 instead
 
 
 
@@ -454,16 +454,21 @@
 
 
 
-org.apache.hadoop.hbase.util.Bytes.toIntUnsafe(byte[],
 int)
+org.apache.hadoop.hbase.util.Bytes.toByteString()
 As of release 2.0.0, this 
will be removed in HBase 3.0.0.
 
 
 
-org.apache.hadoop.hbase.util.Bytes.toLongUnsafe(byte[],
 int)
+org.apache.hadoop.hbase.util.Bytes.toIntUnsafe(byte[],
 int)
 As of release 2.0.0, this 
will be removed in HBase 3.0.0.
 
 
 
+org.apache.hadoop.hbase.util.Bytes.toLongUnsafe(byte[],
 int)
+As of release 2.0.0, this 
will be removed in HBase 3.0.0.
+
+
+
 org.apache.hadoop.hbase.util.Bytes.toShortUnsafe(byte[],
 int)
 As of release 2.0.0, this 
will be removed in HBase 3.0.0.
 
@@ -484,54 +489,59 @@
 
 
 
+org.apache.hadoop.hbase.util.Bytes(ByteString)
+As of release 2.0.0, this 
will be removed in HBase 3.0.0.
+
+
+
 org.apache.hadoop.hbase.HBaseConfiguration()
 Please use create() 
instead.
 
 
-
+
 org.apache.hadoop.hbase.HBaseConfiguration(Configuration)
 Please user create(conf) 
instead.
 
 
-
+
 org.apache.hadoop.hbase.HTableDescriptor()
 As of release 0.96 (https://issues.apache.org/jira/browse/HBASE-5453;>HBASE-5453).
  This was made protected in 2.0.0 and will be removed in HBase 
3.0.0.
  Used by Writables and Writables are going away.
 
 
-
+
 org.apache.hadoop.hbase.HTableDescriptor(byte[])
 
-
+
 org.apache.hadoop.hbase.HTableDescriptor(String)
 
-
+
 org.apache.hadoop.hbase.io.TimeRange()
 This is made 
@InterfaceAudience.Private in the 2.0 line and above
 
 
-
+
 org.apache.hadoop.hbase.io.TimeRange(byte[])
 This is made 
@InterfaceAudience.Private in the 2.0 line and above
 
 
-
+
 org.apache.hadoop.hbase.io.TimeRange(byte[],
 byte[])
 This is made 
@InterfaceAudience.Private in the 2.0 line and above
 
 
-
+
 org.apache.hadoop.hbase.io.TimeRange(long)
 This is made 
@InterfaceAudience.Private in the 2.0 line and above
 
 
-
+
 org.apache.hadoop.hbase.io.TimeRange(long,
 long)
 This is made 
@InterfaceAudience.Private in the 2.0 line and above
 
 
-
+
 org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor()
 As of release 2.0.0. This 
will be removed in HBase 3.0.0.
   Use UnmodifyableHTableDescriptor.UnmodifyableHTableDescriptor(HTableDescriptor).

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/index-all.html

[40/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html 
b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
index a41faf9..644a013 100644
--- a/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
+++ b/apidocs/org/apache/hadoop/hbase/filter/class-use/Filter.html
@@ -390,23 +390,23 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-InclusiveStopFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+MultipleColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-DependentColumnFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+SingleColumnValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-FirstKeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+RowFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-KeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+ColumnCountGetFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-SingleColumnValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+TimestampsFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
@@ -414,11 +414,11 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-PageFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+DependentColumnFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-QualifierFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+InclusiveStopFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
@@ -426,47 +426,47 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 static Filter
-ValueFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+FirstKeyOnlyFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-MultipleColumnPrefixFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
+PageFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)
 
 
 static Filter
-TimestampsFilter.createFilterFromArguments(http://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html?is-external=true;
 title="class or interface in 
java.util">ArrayListbyte[]filterArguments)

[27/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html
index f8fb24d..bd0d99f 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html
@@ -35,12 +35,12 @@
 027import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 028import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 029import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-030import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-031import 
org.apache.hadoop.hbase.util.ByteStringer;
-032import 
org.apache.hadoop.hbase.util.Bytes;
-033
-034import 
com.google.common.base.Preconditions;
-035import 
com.google.protobuf.InvalidProtocolBufferException;
+030import 
org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
+031import 
org.apache.hadoop.hbase.util.Bytes;
+032
+033import 
com.google.common.base.Preconditions;
+034import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+035import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
 036
 037/**
 038 * A filter, based on the 
ColumnCountGetFilter, takes two arguments: limit and offset.
@@ -180,7 +180,7 @@
 172  builder.setOffset(this.offset);
 173}
 174if (this.columnOffset != null) {
-175  
builder.setColumnOffset(ByteStringer.wrap(this.columnOffset));
+175  
builder.setColumnOffset(UnsafeByteOperations.unsafeWrap(this.columnOffset));
 176}
 177return 
builder.build().toByteArray();
 178  }

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html
index 826e5c7..0c3cb13 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html
@@ -36,13 +36,13 @@
 028import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 029import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 030import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-031import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
+031import 
org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
 032import 
org.apache.hadoop.hbase.util.ByteBufferUtils;
-033import 
org.apache.hadoop.hbase.util.ByteStringer;
-034import 
org.apache.hadoop.hbase.util.Bytes;
-035
-036import 
com.google.common.base.Preconditions;
-037import 
com.google.protobuf.InvalidProtocolBufferException;
+033import 
org.apache.hadoop.hbase.util.Bytes;
+034
+035import 
com.google.common.base.Preconditions;
+036import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+037import 
org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
 038
 039/**
 040 * This filter is used for selecting only 
those keys with columns that matches
@@ -120,7 +120,7 @@
 112  public byte [] toByteArray() {
 113
FilterProtos.ColumnPrefixFilter.Builder builder =
 114  
FilterProtos.ColumnPrefixFilter.newBuilder();
-115if (this.prefix != null) 
builder.setPrefix(ByteStringer.wrap(this.prefix));
+115if (this.prefix != null) 
builder.setPrefix(UnsafeByteOperations.unsafeWrap(this.prefix));
 116return 
builder.build().toByteArray();
 117  }
 118

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html
index 0d25b14..e3cae69 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html
@@ -38,12 +38,12 @@
 030import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 031import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 032import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-033import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-034import 
org.apache.hadoop.hbase.util.ByteStringer;
-035import 
org.apache.hadoop.hbase.util.Bytes;
-036
-037import 
com.google.common.base.Preconditions;
-038import 
com.google.protobuf.InvalidProtocolBufferException;

[25/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html
index 98a111b..8b52a04 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.Operator.html
@@ -37,479 +37,477 @@
 029import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 030import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 031import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-032import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-033import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-034
-035import 
com.google.protobuf.InvalidProtocolBufferException;
-036
-037/**
-038 * Implementation of {@link Filter} that 
represents an ordered List of Filters
-039 * which will be evaluated with a 
specified boolean operator {@link Operator#MUST_PASS_ALL}
-040 * (codeAND/code) or 
{@link Operator#MUST_PASS_ONE} (codeOR/code).
-041 * Since you can use Filter Lists as 
children of Filter Lists, you can create a
-042 * hierarchy of filters to be 
evaluated.
-043 *
-044 * br
-045 * {@link Operator#MUST_PASS_ALL} 
evaluates lazily: evaluation stops as soon as one filter does
-046 * not include the KeyValue.
-047 *
-048 * br
-049 * {@link Operator#MUST_PASS_ONE} 
evaluates non-lazily: all filters are always evaluated.
-050 *
-051 * br
-052 * Defaults to {@link 
Operator#MUST_PASS_ALL}.
-053 */
-054@InterfaceAudience.Public
-055@InterfaceStability.Stable
-056final public class FilterList extends 
Filter {
-057  /** set operator */
-058  @InterfaceAudience.Public
-059  @InterfaceStability.Stable
-060  public static enum Operator {
-061/** !AND */
-062MUST_PASS_ALL,
-063/** !OR */
-064MUST_PASS_ONE
-065  }
-066
-067  private static final int 
MAX_LOG_FILTERS = 5;
-068  private Operator operator = 
Operator.MUST_PASS_ALL;
-069  private ListFilter filters = 
new ArrayListFilter();
-070  private Filter seekHintFilter = null;
-071
-072  /** Reference Cell used by {@link 
#transformCell(Cell)} for validation purpose. */
-073  private Cell referenceCell = null;
-074
-075  /**
-076   * When filtering a given Cell in 
{@link #filterKeyValue(Cell)},
-077   * this stores the transformed Cell to 
be returned by {@link #transformCell(Cell)}.
-078   *
-079   * Individual filters transformation 
are applied only when the filter includes the Cell.
-080   * Transformations are composed in the 
order specified by {@link #filters}.
-081   */
-082  private Cell transformedCell = null;
-083
-084  /**
-085   * Constructor that takes a set of 
{@link Filter}s. The default operator
-086   * MUST_PASS_ALL is assumed.
-087   *
-088   * @param rowFilters list of filters
-089   */
-090  public FilterList(final 
ListFilter rowFilters) {
-091if (rowFilters instanceof ArrayList) 
{
-092  this.filters = rowFilters;
-093} else {
-094  this.filters = new 
ArrayListFilter(rowFilters);
-095}
-096  }
-097
-098  /**
-099   * Constructor that takes a var arg 
number of {@link Filter}s. The fefault operator
-100   * MUST_PASS_ALL is assumed.
-101   * @param rowFilters
-102   */
-103  public FilterList(final Filter... 
rowFilters) {
-104this.filters = new 
ArrayListFilter(Arrays.asList(rowFilters));
-105  }
-106
-107  /**
-108   * Constructor that takes an 
operator.
-109   *
-110   * @param operator Operator to process 
filter set with.
-111   */
-112  public FilterList(final Operator 
operator) {
-113this.operator = operator;
-114  }
-115
-116  /**
-117   * Constructor that takes a set of 
{@link Filter}s and an operator.
-118   *
-119   * @param operator Operator to process 
filter set with.
-120   * @param rowFilters Set of row 
filters.
-121   */
-122  public FilterList(final Operator 
operator, final ListFilter rowFilters) {
-123this.filters = new 
ArrayListFilter(rowFilters);
-124this.operator = operator;
-125  }
-126
-127  /**
-128   * Constructor that takes a var arg 
number of {@link Filter}s and an operator.
-129   *
-130   * @param operator Operator to process 
filter set with.
-131   * @param rowFilters Filters to use
-132   */
-133  public FilterList(final Operator 
operator, final Filter... rowFilters) {
-134this.filters = new 
ArrayListFilter(Arrays.asList(rowFilters));
-135this.operator = operator;
-136  }
-137
-138  /**
-139   * Get the operator.
-140   *
-141   * @return operator
-142   */
-143  public Operator getOperator() {
-144return operator;
-145  }
-146
-147  /**
-148   * Get the filters.
-149   *
-150   * @return filters
-151   */
-152  public ListFilter getFilters() 
{
-153return filters;
-154  }
-155
-156  /**
-157   * Add a filter.
-158   *
-159 

[24/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.html
index 98a111b..8b52a04 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/FilterList.html
@@ -37,479 +37,477 @@
 029import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 030import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 031import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-032import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-033import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-034
-035import 
com.google.protobuf.InvalidProtocolBufferException;
-036
-037/**
-038 * Implementation of {@link Filter} that 
represents an ordered List of Filters
-039 * which will be evaluated with a 
specified boolean operator {@link Operator#MUST_PASS_ALL}
-040 * (codeAND/code) or 
{@link Operator#MUST_PASS_ONE} (codeOR/code).
-041 * Since you can use Filter Lists as 
children of Filter Lists, you can create a
-042 * hierarchy of filters to be 
evaluated.
-043 *
-044 * br
-045 * {@link Operator#MUST_PASS_ALL} 
evaluates lazily: evaluation stops as soon as one filter does
-046 * not include the KeyValue.
-047 *
-048 * br
-049 * {@link Operator#MUST_PASS_ONE} 
evaluates non-lazily: all filters are always evaluated.
-050 *
-051 * br
-052 * Defaults to {@link 
Operator#MUST_PASS_ALL}.
-053 */
-054@InterfaceAudience.Public
-055@InterfaceStability.Stable
-056final public class FilterList extends 
Filter {
-057  /** set operator */
-058  @InterfaceAudience.Public
-059  @InterfaceStability.Stable
-060  public static enum Operator {
-061/** !AND */
-062MUST_PASS_ALL,
-063/** !OR */
-064MUST_PASS_ONE
-065  }
-066
-067  private static final int 
MAX_LOG_FILTERS = 5;
-068  private Operator operator = 
Operator.MUST_PASS_ALL;
-069  private ListFilter filters = 
new ArrayListFilter();
-070  private Filter seekHintFilter = null;
-071
-072  /** Reference Cell used by {@link 
#transformCell(Cell)} for validation purpose. */
-073  private Cell referenceCell = null;
-074
-075  /**
-076   * When filtering a given Cell in 
{@link #filterKeyValue(Cell)},
-077   * this stores the transformed Cell to 
be returned by {@link #transformCell(Cell)}.
-078   *
-079   * Individual filters transformation 
are applied only when the filter includes the Cell.
-080   * Transformations are composed in the 
order specified by {@link #filters}.
-081   */
-082  private Cell transformedCell = null;
-083
-084  /**
-085   * Constructor that takes a set of 
{@link Filter}s. The default operator
-086   * MUST_PASS_ALL is assumed.
-087   *
-088   * @param rowFilters list of filters
-089   */
-090  public FilterList(final 
ListFilter rowFilters) {
-091if (rowFilters instanceof ArrayList) 
{
-092  this.filters = rowFilters;
-093} else {
-094  this.filters = new 
ArrayListFilter(rowFilters);
-095}
-096  }
-097
-098  /**
-099   * Constructor that takes a var arg 
number of {@link Filter}s. The fefault operator
-100   * MUST_PASS_ALL is assumed.
-101   * @param rowFilters
-102   */
-103  public FilterList(final Filter... 
rowFilters) {
-104this.filters = new 
ArrayListFilter(Arrays.asList(rowFilters));
-105  }
-106
-107  /**
-108   * Constructor that takes an 
operator.
-109   *
-110   * @param operator Operator to process 
filter set with.
-111   */
-112  public FilterList(final Operator 
operator) {
-113this.operator = operator;
-114  }
-115
-116  /**
-117   * Constructor that takes a set of 
{@link Filter}s and an operator.
-118   *
-119   * @param operator Operator to process 
filter set with.
-120   * @param rowFilters Set of row 
filters.
-121   */
-122  public FilterList(final Operator 
operator, final ListFilter rowFilters) {
-123this.filters = new 
ArrayListFilter(rowFilters);
-124this.operator = operator;
-125  }
-126
-127  /**
-128   * Constructor that takes a var arg 
number of {@link Filter}s and an operator.
-129   *
-130   * @param operator Operator to process 
filter set with.
-131   * @param rowFilters Filters to use
-132   */
-133  public FilterList(final Operator 
operator, final Filter... rowFilters) {
-134this.filters = new 
ArrayListFilter(Arrays.asList(rowFilters));
-135this.operator = operator;
-136  }
-137
-138  /**
-139   * Get the operator.
-140   *
-141   * @return operator
-142   */
-143  public Operator getOperator() {
-144return operator;
-145  }
-146
-147  /**
-148   * Get the filters.
-149   *
-150   * @return filters
-151   */
-152  public ListFilter getFilters() 
{
-153return filters;
-154  }
-155
-156  /**
-157   * Add a filter.
-158   *
-159   * @param filter another filter
-160   */

[47/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/CellUtil.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/CellUtil.html 
b/apidocs/org/apache/hadoop/hbase/CellUtil.html
index e2a2306..b17f669 100644
--- a/apidocs/org/apache/hadoop/hbase/CellUtil.html
+++ b/apidocs/org/apache/hadoop/hbase/CellUtil.html
@@ -1157,7 +1157,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 
 createCellScanner
-public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">List? extends 
org.apache.hadoop.hbase.CellScannablecellScannerables)
+public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">List? extends 
org.apache.hadoop.hbase.CellScannablecellScannerables)
 
 Parameters:
 cellScannerables - 
@@ -1172,7 +1172,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 
 createCellScanner
-public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/lang/Iterable.html?is-external=true;
 title="class or interface in java.lang">IterableCellcellIterable)
+public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/lang/Iterable.html?is-external=true;
 title="class or interface in java.lang">IterableCellcellIterable)
 
 Parameters:
 cellIterable - 
@@ -1187,7 +1187,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 
 createCellScanner
-public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/Iterator.html?is-external=true;
 title="class or interface in java.util">IteratorCellcells)
+public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/Iterator.html?is-external=true;
 title="class or interface in java.util">IteratorCellcells)
 
 Parameters:
 cells - 
@@ -1203,7 +1203,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 
 createCellScanner
-public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(Cell[]cellArray)
+public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(Cell[]cellArray)
 
 Parameters:
 cellArray - 
@@ -1218,7 +1218,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 
 createCellScanner
-public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
+public staticorg.apache.hadoop.hbase.CellScannercreateCellScanner(http://docs.oracle.com/javase/8/docs/api/java/util/NavigableMap.html?is-external=true;
 title="class or interface in java.util">NavigableMapbyte[],http://docs.oracle.com/javase/8/docs/api/java/util/List.html?is-external=true;
 title="class or interface in java.util">ListCellmap)
 Flatten the map of cells out under the CellScanner
 
 Parameters:
@@ -1236,7 +1236,7 @@ public statichttp://docs.oracle.com/javase/8/docs/api/java/nio/By
 
 matchingRow
 http://docs.oracle.com/javase/8/docs/api/java/lang/Deprecated.html?is-external=true;
 title="class or interface in java.lang">@Deprecated
-public staticbooleanmatchingRow(Cellleft,
+public staticbooleanmatchingRow(Cellleft,
   Cellright)
 Deprecated.As of release 2.0.0, this will be removed in HBase 
3.0.0.
  Instead use matchingRows(Cell,
 Cell)
@@ -1255,7 +1255,7 @@ public staticboolean
 
 matchingRow
-public staticbooleanmatchingRow(Cellleft,
+public staticbooleanmatchingRow(Cellleft,
   byte[]buf)
 
 
@@ -1265,7 +1265,7 @@ public staticboolean
 
 matchingRow
-public staticbooleanmatchingRow(Cellleft,
+public staticbooleanmatchingRow(Cellleft,
   byte[]buf,
   intoffset,
   intlength)
@@ -1277,7 +1277,7 @@ public staticboolean
 
 matchingFamily
-public staticbooleanmatchingFamily(Cellleft,
+public staticbooleanmatchingFamily(Cellleft,
  Cellright)
 
 
@@ -1287,7 +1287,7 @@ public staticboolean
 
 matchingFamily
-public staticbooleanmatchingFamily(Cellleft,
+public staticbooleanmatchingFamily(Cellleft,
  byte[]buf)
 
 
@@ -1297,7 +1297,7 @@ public staticboolean
 
 matchingFamily
-public staticbooleanmatchingFamily(Cellleft,
+public staticbooleanmatchingFamily(Cellleft,
  byte[]buf,
   

[21/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.html
index 070e398..b25492c 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultiRowRangeFilter.html
@@ -36,496 +36,495 @@
 028import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 029import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 030import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-031import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-032import 
org.apache.hadoop.hbase.util.ByteStringer;
-033import 
org.apache.hadoop.hbase.util.Bytes;
-034
-035import 
com.google.protobuf.InvalidProtocolBufferException;
-036
-037/**
-038 * Filter to support scan multiple row 
key ranges. It can construct the row key ranges from the
-039 * passed list which can be accessed by 
each region server.
-040 *
-041 * HBase is quite efficient when scanning 
only one small row key range. If user needs to specify
-042 * multiple row key ranges in one scan, 
the typical solutions are: 1. through FilterList which is a
-043 * list of row key Filters, 2. using the 
SQL layer over HBase to join with two table, such as hive,
-044 * phoenix etc. However, both solutions 
are inefficient. Both of them can't utilize the range info
-045 * to perform fast forwarding during scan 
which is quite time consuming. If the number of ranges
-046 * are quite big (e.g. millions), join is 
a proper solution though it is slow. However, there are
-047 * cases that user wants to specify a 
small number of ranges to scan (e.g. lt;1000 ranges). Both
-048 * solutions can't provide satisfactory 
performance in such case. MultiRowRangeFilter is to support
-049 * such usec ase (scan multiple row key 
ranges), which can construct the row key ranges from user
-050 * specified list and perform 
fast-forwarding during scan. Thus, the scan will be quite efficient.
-051 */
-052@InterfaceAudience.Public
-053@InterfaceStability.Evolving
-054public class MultiRowRangeFilter extends 
FilterBase {
-055
-056  private ListRowRange 
rangeList;
-057
-058  private static final int 
ROW_BEFORE_FIRST_RANGE = -1;
-059  private boolean EXCLUSIVE = false;
-060  private boolean done = false;
-061  private boolean initialized = false;
-062  private int index;
-063  private RowRange range;
-064  private ReturnCode currentReturnCode;
-065
-066  /**
-067   * @param list A list of 
codeRowRange/code
-068   * @throws java.io.IOException
-069   *   throw an exception if the 
range list is not in an natural order or any
-070   *   
codeRowRange/code is invalid
-071   */
-072  public 
MultiRowRangeFilter(ListRowRange list) throws IOException {
-073this.rangeList = 
sortAndMerge(list);
-074  }
-075
-076  @Override
-077  public boolean filterAllRemaining() {
-078return done;
-079  }
-080
-081  public ListRowRange 
getRowRanges() {
-082return this.rangeList;
-083  }
-084
-085  @Override
-086  public boolean filterRowKey(Cell 
firstRowCell) {
-087if (filterAllRemaining()) return 
true;
-088// If it is the first time of 
running, calculate the current range index for
-089// the row key. If index is out of 
bound which happens when the start row
-090// user sets is after the largest 
stop row of the ranges, stop the scan.
-091// If row key is after the current 
range, find the next range and update index.
-092byte[] rowArr = 
firstRowCell.getRowArray();
-093int length = 
firstRowCell.getRowLength();
-094int offset = 
firstRowCell.getRowOffset();
-095if (!initialized
-096|| !range.contains(rowArr, 
offset, length)) {
-097  byte[] rowkey = 
CellUtil.cloneRow(firstRowCell);
-098  index = 
getNextRangeIndex(rowkey);
-099  if (index = rangeList.size()) 
{
-100done = true;
-101currentReturnCode = 
ReturnCode.NEXT_ROW;
-102return false;
-103  }
-104  if(index != ROW_BEFORE_FIRST_RANGE) 
{
-105range = rangeList.get(index);
-106  } else {
-107range = rangeList.get(0);
-108  }
-109  if (EXCLUSIVE) {
-110EXCLUSIVE = false;
-111currentReturnCode = 
ReturnCode.NEXT_ROW;
-112return false;
-113  }
-114  if (!initialized) {
-115if(index != 
ROW_BEFORE_FIRST_RANGE) {
-116  currentReturnCode = 
ReturnCode.INCLUDE;
-117} else {
-118  currentReturnCode = 
ReturnCode.SEEK_NEXT_USING_HINT;
-119}
-120initialized = true;
-121  } else {
-122if (range.contains(rowArr, 
offset, length)) {
-123  currentReturnCode = 
ReturnCode.INCLUDE;
-124} else currentReturnCode 

[49/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apache_hbase_reference_guide.pdf
--
diff --git a/apache_hbase_reference_guide.pdf b/apache_hbase_reference_guide.pdf
index 068b048..41cd630 100644
--- a/apache_hbase_reference_guide.pdf
+++ b/apache_hbase_reference_guide.pdf
@@ -1,105950 +1,74 @@
 %PDF-1.4
 %����
 1 0 obj
-<< /Title (Apache HBase  Reference Guide)
+<< /Title 

 /Author (Apache HBase Team)
-/Creator (Asciidoctor PDF 1.5.0.alpha.6, based on Prawn 1.2.1)
+/Creator (Asciidoctor PDF 1.5.0.alpha.11, based on Prawn 1.3.0)
 /Producer (Apache HBase Team)
-/CreationDate (D:20160929150855+00'00')
-/ModDate (D:20160929150855+00'00')
+/CreationDate (D:20161009074353+00'00')
+/ModDate (D:20161009074353+00'00')
 >>
 endobj
 2 0 obj
 << /Type /Catalog
 /Pages 3 0 R
-/Names 25 0 R
-/Outlines 4009 0 R
-/PageLabels 4213 0 R
+/Names 24 0 R
+/Outlines 4237 0 R
+/PageLabels 4443 0 R
 /PageMode /UseOutlines
-/ViewerPreferences [/FitWindow]
+/OpenAction [7 0 R /FitH 842.89]
+/ViewerPreferences << /DisplayDocTitle true
 >>
-endobj
-3 0 obj
-<< /Type /Pages
-/Count 667
-/Kids [... 667 page object references elided; binary PDF page-tree data, truncated in the archive ...]

[32/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/ClusterStatus.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/ClusterStatus.html 
b/apidocs/src-html/org/apache/hadoop/hbase/ClusterStatus.html
index b17f59f..a527e60 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/ClusterStatus.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/ClusterStatus.html
@@ -38,14 +38,14 @@
 030import org.apache.hadoop.hbase.classification.InterfaceAudience;
 031import org.apache.hadoop.hbase.classification.InterfaceStability;
 032import org.apache.hadoop.hbase.master.RegionState;
-033import org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-034import org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos;
-035import org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.LiveServerInfo;
-036import org.apache.hadoop.hbase.protobuf.generated.ClusterStatusProtos.RegionInTransition;
-037import org.apache.hadoop.hbase.protobuf.generated.FSProtos.HBaseVersionFileContent;
-038import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos;
-039import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionSpecifier;
-040import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionSpecifier.RegionSpecifierType;
+033import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+034import org.apache.hadoop.hbase.shaded.protobuf.generated.ClusterStatusProtos;
+035import org.apache.hadoop.hbase.shaded.protobuf.generated.ClusterStatusProtos.LiveServerInfo;
+036import org.apache.hadoop.hbase.shaded.protobuf.generated.ClusterStatusProtos.RegionInTransition;
+037import org.apache.hadoop.hbase.shaded.protobuf.generated.FSProtos.HBaseVersionFileContent;
+038import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos;
+039import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionSpecifier;
+040import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionSpecifier.RegionSpecifierType;
 041import org.apache.hadoop.hbase.util.ByteStringer;
 042import org.apache.hadoop.hbase.util.Bytes;
 043import org.apache.hadoop.io.VersionedWritable;

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html 
b/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
index 874a0ff..0af2352 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/HColumnDescriptor.html
@@ -42,8 +42,8 @@
 034import org.apache.hadoop.hbase.exceptions.HBaseException;
 035import org.apache.hadoop.hbase.io.compress.Compression;
 036import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
-037import org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-038import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.ColumnFamilySchema;
+037import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+038import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.ColumnFamilySchema;
 039import org.apache.hadoop.hbase.regionserver.BloomType;
 040import org.apache.hadoop.hbase.util.Bytes;
 041import org.apache.hadoop.hbase.util.PrettyPrinter;

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/HRegionInfo.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/HRegionInfo.html 
b/apidocs/src-html/org/apache/hadoop/hbase/HRegionInfo.html
index f8c7ae6..6d65caf 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/HRegionInfo.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/HRegionInfo.html
@@ -41,11 +41,11 @@
 033import org.apache.hadoop.hbase.KeyValue.KVComparator;
 034import org.apache.hadoop.hbase.exceptions.DeserializationException;
 035import org.apache.hadoop.hbase.master.RegionState;
-036import org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-037import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos;
-038import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.RegionInfo;
-039import org.apache.hadoop.hbase.util.ByteArrayHashKey;
-040import org.apache.hadoop.hbase.util.ByteStringer;
+036import org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
+037import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
+038import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos;
+039import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionInfo;
+040import org.apache.hadoop.hbase.util.ByteArrayHashKey;
 041import org.apache.hadoop.hbase.util.Bytes;
 042import org.apache.hadoop.hbase.util.HashKey;
 043import org.apache.hadoop.hbase.util.JenkinsHash;
@@ -881,10 
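
The pattern across these generated-docs diffs is the same: HBase's internal protobuf types move from com.google.protobuf and org.apache.hadoop.hbase.protobuf to relocated packages under org.apache.hadoop.hbase.shaded, so the protobuf HBase bundles no longer collides with whatever protobuf version an application ships. A minimal sketch of code written against the relocated classes (assuming the hbase-shaded artifacts are on the classpath; not taken from this commit):

    // Before the shading, this would have been com.google.protobuf.ByteString.
    import org.apache.hadoop.hbase.shaded.com.google.protobuf.ByteString;

    public class ShadedProtobufSketch {
      public static void main(String[] args) {
        // Same protobuf API, relocated package: HBase internals and the
        // application can now depend on different protobuf versions.
        ByteString bs = ByteString.copyFromUtf8("value");
        System.out.println(bs.size());
      }
    }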

[06/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/index-all.html
--
diff --git a/devapidocs/index-all.html b/devapidocs/index-all.html
index bba004e..5b130a5 100644
--- a/devapidocs/index-all.html
+++ b/devapidocs/index-all.html
@@ -236,7 +236,7 @@
 
 abortProcedure(long, boolean) - Method in class org.apache.hadoop.hbase.master.HMaster
 
-abortProcedure(RpcController, MasterProtos.AbortProcedureRequest) - Method in class org.apache.hadoop.hbase.master.MasterRpcServices
+abortProcedure(RpcController, MasterProtos.AbortProcedureRequest) - Method in class org.apache.hadoop.hbase.master.MasterRpcServices
 
 abortProcedure(long, boolean) - Method in interface org.apache.hadoop.hbase.master.MasterServices
 
@@ -254,7 +254,7 @@
 
 AbortProcedureFuture(HBaseAdmin, Long, Boolean) - Constructor for class org.apache.hadoop.hbase.client.HBaseAdmin.AbortProcedureFuture
 
-abortProcedureResult(MasterProtos.AbortProcedureRequest) - Method in class org.apache.hadoop.hbase.client.HBaseAdmin.ProcedureFuture
+abortProcedureResult(MasterProtos.AbortProcedureRequest) - Method in class org.apache.hadoop.hbase.client.HBaseAdmin.ProcedureFuture
 
 abortProcess() - Method in class org.apache.hadoop.hbase.master.procedure.MasterProcedureEnv.MasterProcedureStoreListener
 
@@ -386,12 +386,6 @@
 
 AbstractProtobufLogWriter() - Constructor for class org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter
 
-AbstractRegionServerCallable<T> - Class in org.apache.hadoop.hbase.client
-
-Added by HBASE-15745 Refactor of RPC classes to better accept async changes.
-
-AbstractRegionServerCallable(Connection, TableName, byte[]) - Constructor for class org.apache.hadoop.hbase.client.AbstractRegionServerCallable
-
 AbstractResponse - Class in org.apache.hadoop.hbase.client
 
 This class is used to extend AP to process single action request, like delete, get etc.
@@ -541,6 +535,10 @@
 
 AccessControlLists() - Constructor for class org.apache.hadoop.hbase.security.access.AccessControlLists
 
+AccessControlUtil - Class in org.apache.hadoop.hbase.security.access
+
+AccessControlUtil() - Constructor for class org.apache.hadoop.hbase.security.access.AccessControlUtil
+
 accessCount - Variable in class org.apache.hadoop.hbase.io.hfile.bucket.BucketCache
 
 Cache access count (sequential ID)
@@ -870,8 +868,6 @@
 
 Attempt to add the specified entry to this queue.
 
-add(Cell) - Method in class org.apache.hadoop.hbase.io.hfile.CompoundBloomFilterWriter
-
 add(byte[], long, int, long) - Method in class org.apache.hadoop.hbase.io.hfile.HFileBlockIndex.BlockIndexChunk
 
 Adds a new entry to this block index chunk.
@@ -920,10 +916,12 @@
 Append the given message to this buffer, automatically evicting
  older messages until the desired memory limit is achieved.
 
-add(ProcedureProtos.Procedure) - Method in class org.apache.hadoop.hbase.procedure2.store.wal.ProcedureWALFormatReader.WalProcedureMap
+add(ProcedureProtos.Procedure) - Method in class org.apache.hadoop.hbase.procedure2.store.wal.ProcedureWALFormatReader.WalProcedureMap
 
 add(E) - Method in class org.apache.hadoop.hbase.procedure2.util.TimeoutBlockingQueue
 
+add(Iterable<Cell>) - Method in class org.apache.hadoop.hbase.regionserver.AbstractMemStore
+
 add(Cell) - Method in class org.apache.hadoop.hbase.regionserver.AbstractMemStore
 
 Write an update
@@ -934,10 +932,16 @@
 
 add(Cell) - Method in class org.apache.hadoop.hbase.regionserver.HStore
 
+add(Iterable<Cell>) - Method in class org.apache.hadoop.hbase.regionserver.HStore
+
 add(Cell) - Method in interface org.apache.hadoop.hbase.regionserver.MemStore
 
 Write an update
 
+add(Iterable<Cell>) - Method in interface org.apache.hadoop.hbase.regionserver.MemStore
+
+Write the updates
+
 add(Cell, boolean) - Method in class org.apache.hadoop.hbase.regionserver.MutableSegment
 
 Adds the given cell into the segment
@@ -954,6 +958,10 @@
 
 Adds a value to the memstore
 
+add(Iterable<Cell>) - Method in interface org.apache.hadoop.hbase.regionserver.Store
+
+Adds the specified value to the memstore
+
 add(Cell) - Method in class org.apache.hadoop.hbase.regionserver.wal.WALEdit
 
 add(String) - Method in class org.apache.hadoop.hbase.rest.client.Cluster
@@ -994,10 +1002,6 @@
 
 add(Cell) - Method in class org.apache.hadoop.hbase.util.BloomFilterChunk
 
-add(Cell) - Method in interface org.apache.hadoop.hbase.util.BloomFilterWriter
-
-Add the specified binary to the bloom filter.
-
 add(E) - Method in class org.apache.hadoop.hbase.util.BoundedPriorityBlockingQueue.PriorityQueue
 
 add(ByteRange) - Method in class org.apache.hadoop.hbase.util.byterange.ByteRangeSet
@@ -1093,8 +1097,23 @@
 
 addAll(int, Collection<? extends E>) - Method in class org.apache.hadoop.hbase.util.SortedList
 
+addAllFile(Iterable<? extends PluginProtos.CodeGeneratorResponse.File>) - Method in class 
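
The new add(Iterable<Cell>) overloads on AbstractMemStore, HStore, MemStore, and Store let a whole batch of cells be written in one call instead of one add(Cell) per cell. A hedged sketch of building such a batch (the KeyValue values are invented; the final memstore call is internal regionserver API and is shown only as a comment):

    import java.util.Arrays;
    import java.util.List;

    import org.apache.hadoop.hbase.Cell;
    import org.apache.hadoop.hbase.KeyValue;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BatchCellsSketch {
      public static void main(String[] args) {
        byte[] row = Bytes.toBytes("r1");
        byte[] fam = Bytes.toBytes("f");
        // Two cells for the same row, destined for the same store.
        List<Cell> cells = Arrays.<Cell>asList(
            new KeyValue(row, fam, Bytes.toBytes("q1"), Bytes.toBytes("v1")),
            new KeyValue(row, fam, Bytes.toBytes("q2"), Bytes.toBytes("v2")));
        // Inside the regionserver this batch could now go through one call,
        // e.g. memstore.add(cells), rather than a loop of add(cell).
        System.out.println(cells.size() + " cells batched");
      }
    }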

[13/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/util/Bytes.RowEndKeyComparator.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/util/Bytes.RowEndKeyComparator.html 
b/apidocs/src-html/org/apache/hadoop/hbase/util/Bytes.RowEndKeyComparator.html
index 3d30c3b..2ab24d4 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/util/Bytes.RowEndKeyComparator.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/util/Bytes.RowEndKeyComparator.html
@@ -36,2642 +36,2640 @@
 028import java.math.BigDecimal;
 029import java.math.BigInteger;
 030import java.nio.ByteBuffer;
-031import java.nio.charset.Charset;
-032import 
java.nio.charset.StandardCharsets;
-033import java.security.SecureRandom;
-034import java.util.Arrays;
-035import java.util.Collection;
-036import java.util.Comparator;
-037import java.util.Iterator;
-038import java.util.List;
-039
-040import org.apache.commons.logging.Log;
-041import 
org.apache.commons.logging.LogFactory;
-042import org.apache.hadoop.hbase.Cell;
-043import 
org.apache.hadoop.hbase.CellComparator;
-044import 
org.apache.hadoop.hbase.KeyValue;
-045import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
-046import 
org.apache.hadoop.hbase.classification.InterfaceStability;
-047import 
org.apache.hadoop.io.RawComparator;
-048import 
org.apache.hadoop.io.WritableComparator;
-049import 
org.apache.hadoop.io.WritableUtils;
-050
-051import sun.misc.Unsafe;
-052
-053import 
com.google.common.annotations.VisibleForTesting;
-054import com.google.common.collect.Lists;
-055import com.google.protobuf.ByteString;
-056
-057/**
-058 * Utility class that handles byte 
arrays, conversions to/from other types,
-059 * comparisons, hash code generation, 
manufacturing keys for HashMaps or
-060 * HashSets, and can be used as key in 
maps or trees.
-061 */
-062@SuppressWarnings("restriction")
-063@InterfaceAudience.Public
-064@InterfaceStability.Stable
-065@edu.umd.cs.findbugs.annotations.SuppressWarnings(
-066
value="EQ_CHECK_FOR_OPERAND_NOT_COMPATIBLE_WITH_THIS",
-067justification="It has been like this 
forever")
-068public class Bytes implements Comparable<Bytes> {
-069  //HConstants.UTF8_ENCODING should be 
updated if this changed
-070  /** When we encode strings, we always 
specify UTF8 encoding */
-071  private static final String 
UTF8_ENCODING = "UTF-8";
+031import 
java.nio.charset.StandardCharsets;
+032import java.security.SecureRandom;
+033import java.util.Arrays;
+034import java.util.Collection;
+035import java.util.Comparator;
+036import java.util.Iterator;
+037import java.util.List;
+038
+039import org.apache.commons.logging.Log;
+040import 
org.apache.commons.logging.LogFactory;
+041import org.apache.hadoop.hbase.Cell;
+042import 
org.apache.hadoop.hbase.CellComparator;
+043import 
org.apache.hadoop.hbase.KeyValue;
+044import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
+045import 
org.apache.hadoop.hbase.classification.InterfaceStability;
+046import 
org.apache.hadoop.io.RawComparator;
+047import 
org.apache.hadoop.io.WritableComparator;
+048import 
org.apache.hadoop.io.WritableUtils;
+049
+050import sun.misc.Unsafe;
+051
+052import 
com.google.common.annotations.VisibleForTesting;
+053import com.google.common.collect.Lists;
+054import com.google.protobuf.ByteString;
+055
+056/**
+057 * Utility class that handles byte 
arrays, conversions to/from other types,
+058 * comparisons, hash code generation, 
manufacturing keys for HashMaps or
+059 * HashSets, and can be used as key in 
maps or trees.
+060 */
+061@SuppressWarnings("restriction")
+062@InterfaceAudience.Public
+063@InterfaceStability.Stable
+064@edu.umd.cs.findbugs.annotations.SuppressWarnings(
+065
value="EQ_CHECK_FOR_OPERAND_NOT_COMPATIBLE_WITH_THIS",
+066justification="It has been like this 
forever")
+067public class Bytes implements Comparable<Bytes> {
+068
+069  // Using the charset canonical name for 
String/byte[] conversions is much
+070  // more efficient due to use of cached 
encoders/decoders.
+071  private static final String UTF8_CSN = 
StandardCharsets.UTF_8.name();
 072
-073  //HConstants.UTF8_CHARSET should be 
updated if this changed
-074  /** When we encode strings, we always 
specify UTF8 encoding */
-075  private static final Charset 
UTF8_CHARSET = Charset.forName(UTF8_ENCODING);
-076
-077  // Using the charset canonical name for 
String/byte[] conversions is much
-078  // more efficient due to use of cached 
encoders/decoders.
-079  private static final String UTF8_CSN = 
StandardCharsets.UTF_8.name();
-080
-081  //HConstants.EMPTY_BYTE_ARRAY should be 
updated if this changed
-082  private static final byte [] 
EMPTY_BYTE_ARRAY = new byte [0];
-083
-084  private static final Log LOG = 
LogFactory.getLog(Bytes.class);
-085
-086  /**
-087   * Size of boolean in bytes
-088   */
-089  public static final int SIZEOF_BOOLEAN 
= Byte.SIZE / 

[15/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableRecordReaderImpl.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableRecordReaderImpl.html 
b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableRecordReaderImpl.html
index abf6b1f..d649da0 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableRecordReaderImpl.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/mapreduce/TableRecordReaderImpl.html
@@ -48,277 +48,279 @@
 040import 
org.apache.hadoop.mapreduce.TaskAttemptContext;
 041import 
org.apache.hadoop.util.StringUtils;
 042
-043/**
-044 * Iterate over an HBase table data, 
return (ImmutableBytesWritable, Result)
-045 * pairs.
-046 */
-047@InterfaceAudience.Public
-048@InterfaceStability.Stable
-049public class TableRecordReaderImpl {
-050  public static final String 
LOG_PER_ROW_COUNT
-051= 
"hbase.mapreduce.log.scanner.rowcount";
-052
-053  private static final Log LOG = 
LogFactory.getLog(TableRecordReaderImpl.class);
+043import 
com.google.common.annotations.VisibleForTesting;
+044
+045/**
+046 * Iterate over an HBase table data, 
return (ImmutableBytesWritable, Result)
+047 * pairs.
+048 */
+049@InterfaceAudience.Public
+050@InterfaceStability.Stable
+051public class TableRecordReaderImpl {
+052  public static final String 
LOG_PER_ROW_COUNT
+053= 
"hbase.mapreduce.log.scanner.rowcount";
 054
-055  // HBASE_COUNTER_GROUP_NAME is the name 
of mapreduce counter group for HBase
-056  private static final String 
HBASE_COUNTER_GROUP_NAME =
-057"HBase Counters";
-058  private ResultScanner scanner = null;
-059  private Scan scan = null;
-060  private Scan currentScan = null;
-061  private Table htable = null;
-062  private byte[] lastSuccessfulRow = 
null;
-063  private ImmutableBytesWritable key = 
null;
-064  private Result value = null;
-065  private TaskAttemptContext context = 
null;
-066  private Method getCounter = null;
-067  private long numRestarts = 0;
-068  private long numStale = 0;
-069  private long timestamp;
-070  private int rowcount;
-071  private boolean logScannerActivity = 
false;
-072  private int logPerRowCount = 100;
-073
-074  /**
-075   * Restart from survivable exceptions 
by creating a new scanner.
-076   *
-077   * @param firstRow  The first row to 
start at.
-078   * @throws IOException When restarting 
fails.
-079   */
-080  public void restart(byte[] firstRow) 
throws IOException {
-081currentScan = new Scan(scan);
-082currentScan.setStartRow(firstRow);
-083
currentScan.setScanMetricsEnabled(true);
-084if (this.scanner != null) {
-085  if (logScannerActivity) {
-086LOG.info("Closing the previously 
opened scanner object.");
-087  }
-088  this.scanner.close();
-089}
-090this.scanner = 
this.htable.getScanner(currentScan);
-091if (logScannerActivity) {
-092  LOG.info("Current scan=" + 
currentScan.toString());
-093  timestamp = 
System.currentTimeMillis();
-094  rowcount = 0;
-095}
-096  }
-097
-098  /**
-099   * In new mapreduce APIs, 
TaskAttemptContext has two getCounter methods
-100   * Check if getCounter(String, String) 
method is available.
-101   * @return The getCounter method or 
null if not available.
-102   * @throws IOException
-103   */
-104  protected static Method 
retrieveGetCounterWithStringsParams(TaskAttemptContext context)
-105  throws IOException {
-106Method m = null;
-107try {
-108  m = 
context.getClass().getMethod("getCounter",
-109new Class [] {String.class, 
String.class});
-110} catch (SecurityException e) {
-111  throw new IOException("Failed test 
for getCounter", e);
-112} catch (NoSuchMethodException e) {
-113  // Ignore
-114}
-115return m;
-116  }
-117
-118  /**
-119   * Sets the HBase table.
-120   *
-121   * @param htable  The {@link 
org.apache.hadoop.hbase.HTableDescriptor} to scan.
-122   */
-123  public void setHTable(Table htable) {
-124Configuration conf = 
htable.getConfiguration();
-125logScannerActivity = 
conf.getBoolean(
-126  
ScannerCallable.LOG_SCANNER_ACTIVITY, false);
-127logPerRowCount = 
conf.getInt(LOG_PER_ROW_COUNT, 100);
-128this.htable = htable;
-129  }
-130
-131  /**
-132   * Sets the scan defining the actual 
details like columns etc.
-133   *
-134   * @param scan  The scan to set.
-135   */
-136  public void setScan(Scan scan) {
-137this.scan = scan;
-138  }
-139
-140  /**
-141   * Build the scanner. Not done in 
constructor to allow for extension.
-142   *
-143   * @throws IOException
-144   * @throws InterruptedException
-145   */
-146  public void initialize(InputSplit 
inputsplit,
-147  TaskAttemptContext context) throws 
IOException,
-148  InterruptedException {
-149if (context != null) {
-150  this.context = context;
-151  getCounter = 
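
restart(byte[] firstRow) is this reader's recovery path: it clones the configured Scan, repositions it at the last known-good row, and opens a fresh scanner. A minimal sketch of that pattern against the public client API (table, template, and old are caller-supplied; this is an approximation of the idea, not the class's actual code):

    import java.io.IOException;

    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.Table;

    public class ScannerRestartSketch {
      /** Reopen a scanner at firstRow, echoing TableRecordReaderImpl.restart(). */
      static ResultScanner restart(Table table, Scan template, ResultScanner old,
          byte[] firstRow) throws IOException {
        if (old != null) {
          old.close();                        // drop the stale scanner first
        }
        Scan current = new Scan(template);    // copy; the template stays untouched
        current.setStartRow(firstRow);        // resume from the last good row
        current.setScanMetricsEnabled(true);
        return table.getScanner(current);
      }
    }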

[37/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/util/Bytes.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/util/Bytes.html 
b/apidocs/org/apache/hadoop/hbase/util/Bytes.html
index 8276a12..a2152d9 100644
--- a/apidocs/org/apache/hadoop/hbase/util/Bytes.html
+++ b/apidocs/org/apache/hadoop/hbase/util/Bytes.html
@@ -18,7 +18,7 @@
 catch(err) {
 }
 //-->
-var methods = 
{"i0":9,"i1":9,"i2":9,"i3":9,"i4":41,"i5":41,"i6":9,"i7":9,"i8":10,"i9":9,"i10":9,"i11":10,"i12":9,"i13":9,"i14":9,"i15":9,"i16":10,"i17":9,"i18":9,"i19":9,"i20":9,"i21":9,"i22":10,"i23":9,"i24":9,"i25":10,"i26":9,"i27":10,"i28":10,"i29":42,"i30":9,"i31":10,"i32":9,"i33":9,"i34":9,"i35":9,"i36":9,"i37":9,"i38":9,"i39":9,"i40":9,"i41":9,"i42":9,"i43":9,"i44":9,"i45":9,"i46":9,"i47":9,"i48":9,"i49":9,"i50":9,"i51":9,"i52":9,"i53":9,"i54":9,"i55":9,"i56":41,"i57":9,"i58":41,"i59":9,"i60":41,"i61":9,"i62":9,"i63":9,"i64":9,"i65":9,"i66":9,"i67":9,"i68":41,"i69":9,"i70":9,"i71":10,"i72":10,"i73":9,"i74":9,"i75":9,"i76":9,"i77":9,"i78":9,"i79":9,"i80":9,"i81":9,"i82":9,"i83":9,"i84":9,"i85":9,"i86":9,"i87":9,"i88":9,"i89":9,"i90":9,"i91":9,"i92":9,"i93":9,"i94":9,"i95":9,"i96":10,"i97":9,"i98":9,"i99":9,"i100":9,"i101":9,"i102":9,"i103":9,"i104":9,"i105":9,"i106":41,"i107":9,"i108":9,"i109":9,"i110":41,"i111":9,"i112":9,"i113":9,"i114":41,"i115":10,"i116":9,"i117":9,"i118":9
 
,"i119":9,"i120":9,"i121":9,"i122":9,"i123":9,"i124":9,"i125":9,"i126":9,"i127":9,"i128":9,"i129":9,"i130":9,"i131":9};
+var methods = 
{"i0":9,"i1":9,"i2":9,"i3":9,"i4":41,"i5":41,"i6":9,"i7":9,"i8":10,"i9":9,"i10":9,"i11":10,"i12":9,"i13":9,"i14":9,"i15":9,"i16":10,"i17":9,"i18":9,"i19":9,"i20":9,"i21":9,"i22":10,"i23":9,"i24":9,"i25":10,"i26":9,"i27":10,"i28":10,"i29":42,"i30":9,"i31":10,"i32":9,"i33":9,"i34":9,"i35":9,"i36":9,"i37":9,"i38":9,"i39":9,"i40":9,"i41":9,"i42":9,"i43":9,"i44":9,"i45":9,"i46":9,"i47":9,"i48":9,"i49":9,"i50":9,"i51":9,"i52":9,"i53":9,"i54":9,"i55":9,"i56":41,"i57":9,"i58":41,"i59":9,"i60":41,"i61":9,"i62":9,"i63":9,"i64":9,"i65":9,"i66":9,"i67":9,"i68":41,"i69":9,"i70":9,"i71":10,"i72":10,"i73":9,"i74":9,"i75":9,"i76":9,"i77":9,"i78":9,"i79":9,"i80":9,"i81":9,"i82":9,"i83":9,"i84":9,"i85":9,"i86":9,"i87":9,"i88":9,"i89":9,"i90":9,"i91":9,"i92":9,"i93":9,"i94":9,"i95":9,"i96":42,"i97":9,"i98":9,"i99":9,"i100":9,"i101":9,"i102":9,"i103":9,"i104":9,"i105":9,"i106":41,"i107":9,"i108":9,"i109":9,"i110":41,"i111":9,"i112":9,"i113":9,"i114":41,"i115":10,"i116":9,"i117":9,"i118":9
 
,"i119":9,"i120":9,"i121":9,"i122":9,"i123":9,"i124":9,"i125":9,"i126":9,"i127":9,"i128":9,"i129":9,"i130":9,"i131":9};
 var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],8:["t4","Concrete 
Methods"],32:["t6","Deprecated Methods"]};
 var altColor = "altColor";
 var rowColor = "rowColor";
@@ -115,7 +115,7 @@ var activeTableTab = "activeTableTab";
 
 @InterfaceAudience.Public
  @InterfaceStability.Stable
-public class Bytes
+public class Bytes
 extends Object
 implements Comparable<Bytes>
 Utility class that handles byte arrays, conversions to/from other types,
@@ -277,7 +277,9 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 Bytes(com.google.protobuf.ByteString byteString)
-Copy bytes from ByteString instance.
+Deprecated.
+As of release 2.0.0, this will be removed in HBase 3.0.0.
+
 
 
 
@@ -923,7 +925,11 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 com.google.protobuf.ByteString
-toByteString()
+toByteString()
+Deprecated.
+As of release 2.0.0, this will be removed in HBase 3.0.0.
+
+
 
 
 static double
@@ -1199,7 +1205,7 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 SIZEOF_BOOLEAN
-public static finalint SIZEOF_BOOLEAN
+public static finalint SIZEOF_BOOLEAN
 Size of boolean in bytes
 
 See Also:
@@ -1213,7 +1219,7 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 SIZEOF_BYTE
-public static finalint SIZEOF_BYTE
+public static finalint SIZEOF_BYTE
 Size of byte in bytes
 
 See Also:
@@ -1227,7 +1233,7 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 SIZEOF_CHAR
-public static finalint SIZEOF_CHAR
+public static finalint SIZEOF_CHAR
 Size of char in bytes
 
 See Also:
@@ -1241,7 +1247,7 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 SIZEOF_DOUBLE
-public static finalint SIZEOF_DOUBLE
+public static finalint SIZEOF_DOUBLE
 Size of double in bytes
 
 See Also:
@@ -1255,7 +1261,7 @@ implements http://docs.oracle.com/javase/8/docs/api/java/lang/Comparabl
 
 
 SIZEOF_FLOAT
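
Both the Bytes(ByteString) constructor and toByteString() are now flagged for removal in HBase 3.0.0, since they tie the public Bytes wrapper to the unshaded protobuf ByteString. A sketch of staying off the deprecated bridge by converting through plain byte arrays (assuming the instantiable org.apache.hadoop.hbase.util.Bytes wrapper and stock protobuf; not from the commit):

    import com.google.protobuf.ByteString;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ByteStringBridgeSketch {
      public static void main(String[] args) {
        ByteString bs = ByteString.copyFromUtf8("row-1");
        // Deprecated bridge: new Bytes(bs) and wrapped.toByteString().
        // Portable alternative: round-trip through byte[].
        Bytes wrapped = new Bytes(bs.toByteArray());
        ByteString back = ByteString.copyFrom(wrapped.get());
        System.out.println(back.toStringUtf8());
      }
    }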

[03/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/devapidocs/org/apache/hadoop/hbase/CellUtil.html
--
diff --git a/devapidocs/org/apache/hadoop/hbase/CellUtil.html 
b/devapidocs/org/apache/hadoop/hbase/CellUtil.html
index f2a8a69..1174513 100644
--- a/devapidocs/org/apache/hadoop/hbase/CellUtil.html
+++ b/devapidocs/org/apache/hadoop/hbase/CellUtil.html
@@ -1011,7 +1011,7 @@ extends http://docs.oracle.com/javase/8/docs/api/java/lang/Object.html?
 
 
 EMPTY_TAGS_ITR
-private static final Iterator<Tag> EMPTY_TAGS_ITR
+private static final Iterator<Tag> EMPTY_TAGS_ITR
 
 
 
@@ -1462,7 +1462,7 @@ public static
 
 createCellScanner
-public static CellScanner createCellScanner(List<? extends CellScannable> cellScannerables)
+public static CellScanner createCellScanner(List<? extends CellScannable> cellScannerables)
 
 Parameters:
 cellScannerables - 
@@ -1477,7 +1477,7 @@ public static
 
 createCellScanner
-public static CellScanner createCellScanner(Iterable<Cell> cellIterable)
+public static CellScanner createCellScanner(Iterable<Cell> cellIterable)
 
 Parameters:
 cellIterable - 
@@ -1492,7 +1492,7 @@ public static
 
 createCellScanner
-public static CellScanner createCellScanner(Iterator<Cell> cells)
+public static CellScanner createCellScanner(Iterator<Cell> cells)
 
 Parameters:
 cells - 
@@ -1508,7 +1508,7 @@ public static
 
 createCellScanner
-public static CellScanner createCellScanner(Cell[] cellArray)
+public static CellScanner createCellScanner(Cell[] cellArray)
 
 Parameters:
 cellArray - 
@@ -1523,7 +1523,7 @@ public static
 
 createCellScanner
-public static CellScanner createCellScanner(NavigableMap<byte[], List<Cell>> map)
+public static CellScanner createCellScanner(NavigableMap<byte[], List<Cell>> map)
 Flatten the map of cells out under the CellScanner
 
 Parameters:
@@ -1541,7 +1541,7 @@ public static
 
 matchingRow
 @Deprecated
-public static boolean matchingRow(Cell left,
+public static boolean matchingRow(Cell left,
   Cell right)
 Deprecated. As of release 2.0.0, this will be removed in HBase 3.0.0.
  Instead use matchingRows(Cell, Cell)
@@ -1560,7 +1560,7 @@ public static boolean
 
 matchingRow
-public static boolean matchingRow(Cell left,
+public static boolean matchingRow(Cell left,
   byte[] buf)
 
 
@@ -1570,7 +1570,7 @@ public static boolean
 
 matchingRow
-public static boolean matchingRow(Cell left,
+public static boolean matchingRow(Cell left,
   byte[] buf,
   int offset,
   int length)
@@ -1582,7 +1582,7 @@ public static boolean
 
 matchingFamily
-public static boolean matchingFamily(Cell left,
+public static boolean matchingFamily(Cell left,
  Cell right)
 
 
@@ -1592,7 +1592,7 @@ public static boolean
 
 matchingFamily
-public static boolean matchingFamily(Cell left,
+public static boolean matchingFamily(Cell left,
  byte[] buf)
 
 
@@ -1602,7 +1602,7 @@ public static boolean
 
 matchingFamily
-public static boolean matchingFamily(Cell left,
+public static boolean matchingFamily(Cell left,
  byte[] buf,
  int offset,
  int length)

[52/52] hbase-site git commit: Empty commit

2016-10-09 Thread tedyu
Empty commit


Project: http://git-wip-us.apache.org/repos/asf/hbase-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/hbase-site/commit/344fa326
Tree: http://git-wip-us.apache.org/repos/asf/hbase-site/tree/344fa326
Diff: http://git-wip-us.apache.org/repos/asf/hbase-site/diff/344fa326

Branch: refs/heads/asf-site
Commit: 344fa3264f079a988e6bcfbff544bae01940972e
Parents: c7e8462
Author: tedyu 
Authored: Sun Oct 9 08:11:31 2016 -0700
Committer: tedyu 
Committed: Sun Oct 9 08:11:31 2016 -0700

--

--




[33/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/CellUtil.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/CellUtil.html 
b/apidocs/src-html/org/apache/hadoop/hbase/CellUtil.html
index a78e2b5..e0212b1 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/CellUtil.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/CellUtil.html
@@ -551,341 +551,341 @@
 543  }
 544  return len;
 545}
-546  }
-547
-548  /**
-549   * Version of TagRewriteCell where the 
original Cell is ShareableMemory type.
-550   */
-551  private static class 
ShareableMemoryTagRewriteCell extends TagRewriteCell implements
-552  ShareableMemory {
-553
-554public 
ShareableMemoryTagRewriteCell(Cell cell, byte[] tags) {
-555  super(cell, tags);
-556  assert cell instanceof 
ShareableMemory;
-557}
-558
-559@Override
-560public Cell cloneToCell() {
-561  Cell clonedBaseCell = 
((ShareableMemory) this.cell).cloneToCell();
-562  return new 
TagRewriteCell(clonedBaseCell, this.tags);
-563}
-564  }
-565
-566  /**
-567   * @param cellScannerables
-568   * @return CellScanner interface over <code>cellIterables</code>
-569   */
-570  public static CellScanner createCellScanner(
-571      final List<? extends CellScannable> cellScannerables) {
-572    return new CellScanner() {
-573      private final Iterator<? extends CellScannable> iterator = cellScannerables.iterator();
-574  private CellScanner cellScanner = 
null;
-575
-576  @Override
-577  public Cell current() {
-578return this.cellScanner != null? 
this.cellScanner.current(): null;
-579  }
-580
-581  @Override
-582  public boolean advance() throws 
IOException {
-583while (true) {
-584  if (this.cellScanner == null) 
{
-585if (!this.iterator.hasNext()) 
return false;
-586this.cellScanner = 
this.iterator.next().cellScanner();
-587  }
-588  if (this.cellScanner.advance()) 
return true;
-589  this.cellScanner = null;
-590}
-591  }
-592};
-593  }
-594
-595  /**
-596   * @param cellIterable
-597   * @return CellScanner interface over <code>cellIterable</code>
-598   */
-599  public static CellScanner createCellScanner(final Iterable<Cell> cellIterable) {
-600if (cellIterable == null) return 
null;
-601return 
createCellScanner(cellIterable.iterator());
+546
+547@Override
+548public void write(byte[] buf, int 
offset) {
+549  offset = 
KeyValueUtil.appendToByteArray(this.cell, buf, offset, false);
+550  int tagsLen = this.tags.length;
+551      assert tagsLen > 0;
+552  offset = Bytes.putAsShort(buf, 
offset, tagsLen);
+553  System.arraycopy(this.tags, 0, buf, 
offset, tagsLen);
+554}
+555  }
+556
+557  /**
+558   * Version of TagRewriteCell where the 
original Cell is ShareableMemory type.
+559   */
+560  private static class 
ShareableMemoryTagRewriteCell extends TagRewriteCell implements
+561  ShareableMemory {
+562
+563public 
ShareableMemoryTagRewriteCell(Cell cell, byte[] tags) {
+564  super(cell, tags);
+565  assert cell instanceof 
ShareableMemory;
+566}
+567
+568@Override
+569public Cell cloneToCell() {
+570  Cell clonedBaseCell = 
((ShareableMemory) this.cell).cloneToCell();
+571  return new 
TagRewriteCell(clonedBaseCell, this.tags);
+572}
+573  }
+574
+575  /**
+576   * @param cellScannerables
+577   * @return CellScanner interface over <code>cellIterables</code>
+578   */
+579  public static CellScanner createCellScanner(
+580      final List<? extends CellScannable> cellScannerables) {
+581    return new CellScanner() {
+582      private final Iterator<? extends CellScannable> iterator = cellScannerables.iterator();
+583  private CellScanner cellScanner = 
null;
+584
+585  @Override
+586  public Cell current() {
+587return this.cellScanner != null? 
this.cellScanner.current(): null;
+588  }
+589
+590  @Override
+591  public boolean advance() throws 
IOException {
+592while (true) {
+593  if (this.cellScanner == null) 
{
+594if (!this.iterator.hasNext()) 
return false;
+595this.cellScanner = 
this.iterator.next().cellScanner();
+596  }
+597  if (this.cellScanner.advance()) 
return true;
+598  this.cellScanner = null;
+599}
+600  }
+601};
 602  }
 603
 604  /**
-605   * @param cells
-606   * @return CellScanner interface over <code>cellIterable</code> or null if <code>cells</code> is
-607   * null
-608   */
-609  public static CellScanner createCellScanner(final Iterator<Cell> cells) {
-610if (cells == null) return null;
-611return new CellScanner() {
-612      private final Iterator<Cell> iterator = cells;
-613  private Cell current = null;
-614
-615  @Override
-616  public Cell current() {
-617
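
All of these createCellScanner overloads return the same tiny adapter: a CellScanner whose advance()/current() pair walks the underlying collection. A short usage sketch over a hand-built cell list (values invented for illustration):

    import java.io.IOException;
    import java.util.Arrays;
    import java.util.List;

    import org.apache.hadoop.hbase.Cell;
    import org.apache.hadoop.hbase.CellScanner;
    import org.apache.hadoop.hbase.CellUtil;
    import org.apache.hadoop.hbase.KeyValue;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CellScannerSketch {
      public static void main(String[] args) throws IOException {
        List<Cell> cells = Arrays.<Cell>asList(
            new KeyValue(Bytes.toBytes("r1"), Bytes.toBytes("f"),
                Bytes.toBytes("q"), Bytes.toBytes("v")));
        CellScanner scanner = CellUtil.createCellScanner(cells.iterator());
        while (scanner.advance()) {          // false once the iterator is drained
          Cell c = scanner.current();
          System.out.println(Bytes.toString(CellUtil.cloneRow(c)));
        }
      }
    }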

[19/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueExcludeFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueExcludeFilter.html
 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueExcludeFilter.html
index 7b2aec8..1ecaccd 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueExcludeFilter.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/SingleColumnValueExcludeFilter.html
@@ -38,157 +38,156 @@
 030import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 031import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
 032import 
org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
-033import 
org.apache.hadoop.hbase.protobuf.ProtobufUtil;
-034import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-035
-036import 
com.google.protobuf.InvalidProtocolBufferException;
-037
-038/**
-039 * A {@link Filter} that checks a single 
column value, but does not emit the
-040 * tested column. This will enable a 
performance boost over
-041 * {@link SingleColumnValueFilter}, if 
the tested column value is not actually
-042 * needed as input (besides for the 
filtering itself).
-043 */
-044@InterfaceAudience.Public
-045@InterfaceStability.Stable
-046public class 
SingleColumnValueExcludeFilter extends SingleColumnValueFilter {
-047
-048  /**
-049   * Constructor for binary compare of 
the value of a single column. If the
-050   * column is found and the condition 
passes, all columns of the row will be
-051   * emitted; except for the tested 
column value. If the column is not found or
-052   * the condition fails, the row will 
not be emitted.
-053   *
-054   * @param family name of column 
family
-055   * @param qualifier name of column 
qualifier
-056   * @param compareOp operator
-057   * @param value value to compare column 
values against
-058   */
-059  public 
SingleColumnValueExcludeFilter(byte[] family, byte[] qualifier,
-060  CompareOp compareOp, byte[] value) 
{
-061super(family, qualifier, compareOp, 
value);
-062  }
-063
-064  /**
-065   * Constructor for binary compare of 
the value of a single column. If the
-066   * column is found and the condition 
passes, all columns of the row will be
-067   * emitted; except for the tested 
column value. If the condition fails, the
-068   * row will not be emitted.
-069   * p
-070   * Use the filterIfColumnMissing flag 
to set whether the rest of the columns
-071   * in a row will be emitted if the 
specified column to check is not found in
-072   * the row.
-073   *
-074   * @param family name of column 
family
-075   * @param qualifier name of column 
qualifier
-076   * @param compareOp operator
-077   * @param comparator Comparator to 
use.
-078   */
-079  public 
SingleColumnValueExcludeFilter(byte[] family, byte[] qualifier,
-080  CompareOp compareOp, 
ByteArrayComparable comparator) {
-081super(family, qualifier, compareOp, 
comparator);
-082  }
-083
-084  /**
-085   * Constructor for protobuf 
deserialization only.
-086   * @param family
-087   * @param qualifier
-088   * @param compareOp
-089   * @param comparator
-090   * @param filterIfMissing
-091   * @param latestVersionOnly
-092   */
-093  protected 
SingleColumnValueExcludeFilter(final byte[] family, final byte[] qualifier,
-094  final CompareOp compareOp, 
ByteArrayComparable comparator, final boolean filterIfMissing,
-095  final boolean latestVersionOnly) 
{
-096super(family, qualifier, compareOp, 
comparator, filterIfMissing, latestVersionOnly);
-097  }
-098
-099  // We cleaned result row in FilterRow 
to be consistent with scanning process.
-100  public boolean hasFilterRow() {
-101   return true;
-102  }
-103
-104  // Here we remove from row all key 
values from testing column
-105  @Override
-106  public void filterRowCells(List<Cell> kvs) {
-107    Iterator<? extends Cell> it = kvs.iterator();
-108    while (it.hasNext()) {
-109  // If the current column is 
actually the tested column,
-110  // we will skip it instead.
-111  if 
(CellUtil.matchingColumn(it.next(), this.columnFamily, this.columnQualifier)) 
{
-112it.remove();
-113  }
-114}
-115  }
-116
-117  public static Filter createFilterFromArguments(ArrayList<byte []> filterArguments) {
-118SingleColumnValueFilter tempFilter = 
(SingleColumnValueFilter)
-119  
SingleColumnValueFilter.createFilterFromArguments(filterArguments);
-120SingleColumnValueExcludeFilter filter 
= new SingleColumnValueExcludeFilter (
-121  tempFilter.getFamily(), 
tempFilter.getQualifier(),
-122  tempFilter.getOperator(), 
tempFilter.getComparator());
-123
-124if (filterArguments.size() == 6) {
-125  
filter.setFilterIfMissing(tempFilter.getFilterIfMissing());
-126  
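
As the javadoc in the hunk says, this filter behaves like SingleColumnValueFilter but drops the tested column from the emitted row. A hedged client-side sketch of wiring it into a Scan (family, qualifier, and value are invented):

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
    import org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ExcludeFilterSketch {
      public static void main(String[] args) {
        SingleColumnValueExcludeFilter filter = new SingleColumnValueExcludeFilter(
            Bytes.toBytes("f"), Bytes.toBytes("flag"),
            CompareOp.EQUAL, Bytes.toBytes("on"));
        filter.setFilterIfMissing(true); // rows without f:flag are dropped entirely
        Scan scan = new Scan();
        scan.setFilter(filter);          // matching rows come back minus f:flag
        System.out.println(scan);
      }
    }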

[34/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/overview-frame.html
--
diff --git a/apidocs/overview-frame.html b/apidocs/overview-frame.html
index 8f2ef06..cf4f3d4 100644
--- a/apidocs/overview-frame.html
+++ b/apidocs/overview-frame.html
@@ -60,6 +60,9 @@
 org.apache.hadoop.hbase.rest.client
 org.apache.hadoop.hbase.rsgroup
 org.apache.hadoop.hbase.security
+org.apache.hadoop.hbase.shaded.com.google.protobuf
+org.apache.hadoop.hbase.shaded.com.google.protobuf.compiler
+org.apache.hadoop.hbase.shaded.protobuf
 org.apache.hadoop.hbase.snapshot
 org.apache.hadoop.hbase.spark
 org.apache.hadoop.hbase.spark.example.hbasecontext

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/overview-summary.html
--
diff --git a/apidocs/overview-summary.html b/apidocs/overview-summary.html
index 30656ad..65cd8e2 100644
--- a/apidocs/overview-summary.html
+++ b/apidocs/overview-summary.html
@@ -298,18 +298,30 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
 
 
 
-org.apache.hadoop.hbase.snapshot
+org.apache.hadoop.hbase.shaded.com.google.protobuf
 
 
 
-org.apache.hadoop.hbase.spark
+org.apache.hadoop.hbase.shaded.com.google.protobuf.compiler
 
 
 
-org.apache.hadoop.hbase.spark.example.hbasecontext
+org.apache.hadoop.hbase.shaded.protobuf
+
+
+
+org.apache.hadoop.hbase.snapshot
+
+
+
+org.apache.hadoop.hbase.spark
 
 
 
+org.apache.hadoop.hbase.spark.example.hbasecontext
+
+
+
 org.apache.hadoop.hbase.types
 
 
@@ -317,23 +329,23 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
  extensible data type API.
 
 
-
+
 org.apache.hadoop.hbase.util
 
 
-
+
 org.apache.hadoop.hbase.util.hbck
 
 
-
+
 org.apache.hadoop.hbase.wal
 
 
-
+
 org.apache.hadoop.hbase.zookeeper
 
 
-
+
 org.apache.hbase.archetypes.exemplars.client
 
 This package provides fully-functional exemplar Java code 
demonstrating
@@ -341,7 +353,7 @@ Input/OutputFormats, a table indexing MapReduce job, and 
utility methods.
  archetype with hbase-client dependency.
 
 
-
+
 org.apache.hbase.archetypes.exemplars.shaded_client
 
 This package provides fully-functional exemplar Java code 
demonstrating

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/overview-tree.html
--
diff --git a/apidocs/overview-tree.html b/apidocs/overview-tree.html
index b4b03d2..b5dea80 100644
--- a/apidocs/overview-tree.html
+++ b/apidocs/overview-tree.html
@@ -120,6 +120,9 @@
 org.apache.hadoop.hbase.rest.client,
 
 org.apache.hadoop.hbase.rsgroup,
 
 org.apache.hadoop.hbase.security,
 
+org.apache.hadoop.hbase.shaded.com.google.protobuf,
 
+org.apache.hadoop.hbase.shaded.com.google.protobuf.compiler,
 
+org.apache.hadoop.hbase.shaded.protobuf,
 
 org.apache.hadoop.hbase.snapshot,
 
 org.apache.hadoop.hbase.spark,
 
 org.apache.hadoop.hbase.spark.example.hbasecontext,
 
@@ -481,7 +484,6 @@
 org.apache.hadoop.hbase.types.Struct (implements org.apache.hadoop.hbase.types.DataType<T>)
 org.apache.hadoop.hbase.types.StructBuilder
 org.apache.hadoop.hbase.types.StructIterator (implements java.util.Iterator<E>)
-org.apache.hadoop.hbase.ipc.SyncCoprocessorRpcChannel (implements org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel)
 org.apache.hadoop.hbase.mapred.TableInputFormatBase (implements org.apache.hadoop.mapred.InputFormat<K,V>)
 
 org.apache.hadoop.hbase.mapred.TableInputFormat (implements 
org.apache.hadoop.mapred.JobConfigurable)
@@ -846,28 +848,28 @@
 
 java.lang.Enum<E> (implements java.lang.Comparable<T>, java.io.Serializable)
 
-org.apache.hadoop.hbase.util.Order
 org.apache.hadoop.hbase.KeepDeletedCells
 org.apache.hadoop.hbase.ProcedureState
 org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
+org.apache.hadoop.hbase.util.Order
+org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
 org.apache.hadoop.hbase.filter.FilterList.Operator
+org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
 org.apache.hadoop.hbase.filter.CompareFilter.CompareOp
 org.apache.hadoop.hbase.filter.Filter.ReturnCode
-org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
-org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
 org.apache.hadoop.hbase.regionserver.BloomType
-org.apache.hadoop.hbase.client.CompactionState
+org.apache.hadoop.hbase.client.CompactType

[31/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/ServerName.html
--
diff --git a/apidocs/src-html/org/apache/hadoop/hbase/ServerName.html 
b/apidocs/src-html/org/apache/hadoop/hbase/ServerName.html
index 12f49be..97ffe8c 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/ServerName.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/ServerName.html
@@ -26,401 +26,354 @@
 018 */
 019package org.apache.hadoop.hbase;
 020
-021import 
com.google.common.net.HostAndPort;
-022import 
com.google.common.net.InetAddresses;
-023import 
com.google.protobuf.InvalidProtocolBufferException;
-024
-025import java.io.Serializable;
-026import java.util.ArrayList;
-027import java.util.List;
-028import java.util.Locale;
-029import java.util.regex.Pattern;
-030
-031import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
-032import 
org.apache.hadoop.hbase.classification.InterfaceStability;
-033import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-034import 
org.apache.hadoop.hbase.protobuf.ProtobufMagic;
-035import 
org.apache.hadoop.hbase.protobuf.generated.ZooKeeperProtos;
-036import 
org.apache.hadoop.hbase.util.Addressing;
-037import 
org.apache.hadoop.hbase.util.Bytes;
-038
-039/**
-040 * Instance of an HBase ServerName.
-041 * A server name is used uniquely identifying a server instance in a cluster and is made
-042 * of the combination of hostname, port, and startcode.  The startcode distinguishes restarted
-043 * servers on same hostname and port (startcode is usually timestamp of server startup). The
-044 * {@link #toString()} format of ServerName is safe to use in the filesystem and as znode name
-045 * up in ZooKeeper.  Its format is:
-046 * <code>&lt;hostname&gt; '{@link #SERVERNAME_SEPARATOR}' &lt;port&gt;
-047 * '{@link #SERVERNAME_SEPARATOR}' &lt;startcode&gt;</code>.
-048 * For example, if hostname is <code>www.example.org</code>, port is <code>1234</code>,
-049 * and the startcode for the regionserver is <code>1212121212</code>, then
-050 * the {@link #toString()} would be <code>www.example.org,1234,1212121212</code>.
-051 *
-052 * <p>You can obtain a versioned serialized form of this class by calling
-053 * {@link #getVersionedBytes()}.  To deserialize, call {@link #parseVersionedServerName(byte[])}
-054 *
-055 * <p>Immutable.
-056 */
-057@InterfaceAudience.Public
-058@InterfaceStability.Evolving
-059  public class ServerName implements Comparable<ServerName>, Serializable {
-060  private static final long 
serialVersionUID = 1367463982557264981L;
-061
-062  /**
-063   * Version for this class.
-064   * Its a short rather than a byte so I 
can for sure distinguish between this
-065   * version of this class and the 
version previous to this which did not have
-066   * a version.
-067   */
-068  private static final short VERSION = 
0;
-069  static final byte [] VERSION_BYTES = 
Bytes.toBytes(VERSION);
-070
-071  /**
-072   * What to use if no startcode 
supplied.
-073   */
-074  public static final int NON_STARTCODE = 
-1;
-075
-076  /**
-077   * This character is used as separator 
between server hostname, port and
-078   * startcode.
-079   */
-080  public static final String 
SERVERNAME_SEPARATOR = ",";
-081
-082  public static final Pattern 
SERVERNAME_PATTERN =
-083Pattern.compile("[^" + 
SERVERNAME_SEPARATOR + "]+" +
-084  SERVERNAME_SEPARATOR + 
Addressing.VALID_PORT_REGEX +
-085  SERVERNAME_SEPARATOR + 
Addressing.VALID_PORT_REGEX + "$");
-086
-087  /**
-088   * What to use if server name is 
unknown.
-089   */
-090  public static final String 
UNKNOWN_SERVERNAME = "#unknown#";
-091
-092  private final String servername;
-093  private final String hostnameOnly;
-094  private final int port;
-095  private final long startcode;
-096  private transient HostAndPort 
hostAndPort;
-097
-098  /**
-099   * Cached versioned bytes of this 
ServerName instance.
-100   * @see #getVersionedBytes()
-101   */
-102  private byte [] bytes;
-103  public static final List<ServerName> EMPTY_SERVER_LIST = new ArrayList<ServerName>(0);
-104
-105  private ServerName(final String 
hostname, final int port, final long startcode) {
-106// Drop the domain is there is one; 
no need of it in a local cluster.  With it, we get long
-107// unwieldy names.
-108this.hostnameOnly = hostname;
-109this.port = port;
-110this.startcode = startcode;
-111this.servername = 
getServerName(hostname, port, startcode);
-112  }
-113
-114  /**
-115   * @param hostname
-116   * @return hostname minus the domain, 
if there is one (will do pass-through on ip addresses)
-117   */
-118  static String 
getHostNameMinusDomain(final String hostname) {
-119if 
(InetAddresses.isInetAddress(hostname)) return hostname;
-120String [] parts = 
hostname.split("\\.");
-121if (parts == null || parts.length == 
0) return hostname;
-122return parts[0];
-123  }
-124
-125  private 
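
The javadoc above pins down the wire format: hostname, port, and startcode joined by commas, safe for filesystem paths and znode names. A small sketch of building and re-parsing one (example values taken from the javadoc itself):

    import org.apache.hadoop.hbase.ServerName;

    public class ServerNameSketch {
      public static void main(String[] args) {
        ServerName sn = ServerName.valueOf("www.example.org", 1234, 1212121212L);
        // toString() yields the "hostname,port,startcode" form described above.
        System.out.println(sn);  // www.example.org,1234,1212121212
        ServerName parsed = ServerName.valueOf(sn.toString());
        System.out.println(parsed.getHostname() + ":" + parsed.getPort());
      }
    }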

[23/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/FirstKeyValueMatchingQualifiersFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/FirstKeyValueMatchingQualifiersFilter.html
 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FirstKeyValueMatchingQualifiersFilter.html
index 93471d1..6548544 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/FirstKeyValueMatchingQualifiersFilter.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FirstKeyValueMatchingQualifiersFilter.html
@@ -34,12 +34,12 @@
 026import org.apache.hadoop.hbase.classification.InterfaceAudience;
 027import org.apache.hadoop.hbase.classification.InterfaceStability;
 028import org.apache.hadoop.hbase.exceptions.DeserializationException;
-029import org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-030import org.apache.hadoop.hbase.util.ByteStringer;
-031import org.apache.hadoop.hbase.util.Bytes;
-032
-033import com.google.protobuf.ByteString;
-034import com.google.protobuf.InvalidProtocolBufferException;
+029import org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
+030import org.apache.hadoop.hbase.util.Bytes;
+031
+032import org.apache.hadoop.hbase.shaded.com.google.protobuf.ByteString;
+033import org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+034import org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
 035
 036/**
 037 * The filter looks for the given columns 
in KeyValue. Once there is a match for
@@ -96,7 +96,7 @@
 088
FilterProtos.FirstKeyValueMatchingQualifiersFilter.Builder builder =
 089  
FilterProtos.FirstKeyValueMatchingQualifiersFilter.newBuilder();
 090for (byte[] qualifier : qualifiers) 
{
-091  if (qualifier != null) 
builder.addQualifiers(ByteStringer.wrap(qualifier));
+091  if (qualifier != null) 
builder.addQualifiers(UnsafeByteOperations.unsafeWrap(qualifier));
 092}
 093return 
builder.build().toByteArray();
 094  }

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/FuzzyRowFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/FuzzyRowFilter.html 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/FuzzyRowFilter.html
index d416cfa..3e67195 100644
--- a/apidocs/src-html/org/apache/hadoop/hbase/filter/FuzzyRowFilter.html
+++ b/apidocs/src-html/org/apache/hadoop/hbase/filter/FuzzyRowFilter.html
@@ -37,16 +37,16 @@
 029import org.apache.hadoop.hbase.classification.InterfaceAudience;
 030import org.apache.hadoop.hbase.classification.InterfaceStability;
 031import org.apache.hadoop.hbase.exceptions.DeserializationException;
-032import org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-033import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos.BytesBytesPair;
-034import org.apache.hadoop.hbase.util.ByteStringer;
-035import org.apache.hadoop.hbase.util.Bytes;
-036import org.apache.hadoop.hbase.util.Pair;
-037import org.apache.hadoop.hbase.util.UnsafeAccess;
-038import org.apache.hadoop.hbase.util.UnsafeAvailChecker;
-039
-040import com.google.common.annotations.VisibleForTesting;
-041import com.google.protobuf.InvalidProtocolBufferException;
+032import org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException;
+033import org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;
+034import org.apache.hadoop.hbase.shaded.protobuf.generated.FilterProtos;
+035import org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.BytesBytesPair;
+036import org.apache.hadoop.hbase.util.Bytes;
+037import org.apache.hadoop.hbase.util.Pair;
+038import org.apache.hadoop.hbase.util.UnsafeAccess;
+039import org.apache.hadoop.hbase.util.UnsafeAvailChecker;
+040
+041import com.google.common.annotations.VisibleForTesting;
 042
 043/**
 044 * This is optimized version of a standard FuzzyRowFilter Filters data based on fuzzy row key.
@@ -265,8 +265,8 @@
 257    FilterProtos.FuzzyRowFilter.Builder builder = FilterProtos.FuzzyRowFilter.newBuilder();
 258    for (Pair<byte[], byte[]> fuzzyData : fuzzyKeysData) {
 259      BytesBytesPair.Builder bbpBuilder = BytesBytesPair.newBuilder();
-260      bbpBuilder.setFirst(ByteStringer.wrap(fuzzyData.getFirst()));
-261      bbpBuilder.setSecond(ByteStringer.wrap(fuzzyData.getSecond()));
+260      bbpBuilder.setFirst(UnsafeByteOperations.unsafeWrap(fuzzyData.getFirst()));
+261      bbpBuilder.setSecond(UnsafeByteOperations.unsafeWrap(fuzzyData.getSecond()));
 262      builder.addFuzzyKeysData(bbpBuilder);
 263    }
 264    return builder.build().toByteArray();
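
The substantive change in this hunk is ByteStringer.wrap giving way to UnsafeByteOperations.unsafeWrap, protobuf 3's sanctioned zero-copy wrapper. A sketch of the difference between the copying and wrapping constructions (assuming the shaded protobuf classes; array contents invented):

    import org.apache.hadoop.hbase.shaded.com.google.protobuf.ByteString;
    import org.apache.hadoop.hbase.shaded.com.google.protobuf.UnsafeByteOperations;

    public class UnsafeWrapSketch {
      public static void main(String[] args) {
        byte[] key = {0x01, 0x02, 0x03};
        // copyFrom duplicates the array; safe even if 'key' is mutated later.
        ByteString copied = ByteString.copyFrom(key);
        // unsafeWrap shares the array (zero copy); the caller must treat
        // 'key' as frozen from this point on.
        ByteString wrapped = UnsafeByteOperations.unsafeWrap(key);
        System.out.println(copied.size() + " " + wrapped.size());
      }
    }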


[46/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/org/apache/hadoop/hbase/ProcedureInfo.html
--
diff --git a/apidocs/org/apache/hadoop/hbase/ProcedureInfo.html 
b/apidocs/org/apache/hadoop/hbase/ProcedureInfo.html
index 31a58b2..76ea893 100644
--- a/apidocs/org/apache/hadoop/hbase/ProcedureInfo.html
+++ b/apidocs/org/apache/hadoop/hbase/ProcedureInfo.html
@@ -18,7 +18,7 @@
 catch(err) {
 }
 //-->
-var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10,"i17":10,"i18":10};
+var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10};
 var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
 var altColor = "altColor";
 var rowColor = "rowColor";
@@ -115,7 +115,7 @@ var activeTableTab = "activeTableTab";
 
 @InterfaceAudience.Public
  @InterfaceStability.Evolving
-public class ProcedureInfo
+public class ProcedureInfo
 extends java.lang.Object
 implements java.lang.Cloneable
 Procedure information
@@ -137,13 +137,13 @@ implements java.lang.Cloneable
 
 Constructor and Description
 
-ProcedureInfo(long procId,
+ProcedureInfo(long procId,
  String procName,
  String procOwner,
  ProcedureState procState,
  long parentId,
  org.apache.hadoop.hbase.util.NonceKey nonceKey,
- org.apache.hadoop.hbase.ProcedureUtil.ForeignExceptionMsg exception,
+ java.io.IOException exception,
  long lastUpdate,
  long startTime,
  byte[] result)
@@ -177,65 +177,57 @@ implements java.lang.Cloneable
 
-String getExceptionCause()
 String getExceptionFullMessage()
-String getExceptionMessage()
 long getLastUpdate()
 org.apache.hadoop.hbase.util.NonceKey getNonceKey()
 long getParentId()
 long getProcId()
 String getProcName()
 String getProcOwner()
 ProcedureState getProcState()
 byte[] getResult()
 long getStartTime()
 boolean hasParentId()
 boolean hasResultData()
 boolean isFailed()
 String toString()
 (remaining method-table rows renumbered after the two removals)
@@ -261,20 +253,20 @@ implements java.lang.Cloneable
 
 Constructor Detail
 
 ProcedureInfo
 @InterfaceAudience.Private
-public ProcedureInfo(long procId,
+public ProcedureInfo(long procId,
 String procName,
 String procOwner,
 ProcedureState procState,
 long parentId,
 org.apache.hadoop.hbase.util.NonceKey nonceKey,
-org.apache.hadoop.hbase.ProcedureUtil.ForeignExceptionMsg exception,
+java.io.IOException exception,
 long lastUpdate,

[20/52] [partial] hbase-site git commit: Published site at e06c3676f1273f033e3e185ee9c1ec52c1c7cb31.

2016-10-09 Thread tedyu
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/c7e84622/apidocs/src-html/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html
--
diff --git 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html
 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html
index b1295ab..bc1d9db 100644
--- 
a/apidocs/src-html/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html
+++ 
b/apidocs/src-html/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html
@@ -35,181 +35,180 @@
 027import 
org.apache.hadoop.hbase.classification.InterfaceAudience;
 028import 
org.apache.hadoop.hbase.classification.InterfaceStability;
 029import 
org.apache.hadoop.hbase.exceptions.DeserializationException;
-030import 
org.apache.hadoop.hbase.protobuf.generated.FilterProtos;
-031import 
org.apache.hadoop.hbase.util.ByteStringer;
-032import 
org.apache.hadoop.hbase.util.Bytes;
-033
-034import 
com.google.protobuf.InvalidProtocolBufferException;
-035
-036/**
-037 * This filter is used for selecting only 
those keys with columns that match
-038 * a particular prefix. For example, if 
prefix is 'an', it will pass keys with
-039 * columns like 'and', 'anti' but not 
keys with columns like 'ball', 'act'.
-040 */
-041@InterfaceAudience.Public
-042@InterfaceStability.Stable
-043public class MultipleColumnPrefixFilter 
extends FilterBase {
-044  protected byte [] hint = null;
-045  protected TreeSet<byte []> 
sortedPrefixes = createTreeSet();
-046  private final static int 
MAX_LOG_PREFIXES = 5;
-047
-048  public MultipleColumnPrefixFilter(final 
byte [][] prefixes) {
-049if (prefixes != null) {
-050  for (int i = 0; i < 
prefixes.length; i++) {
-051if 
(!sortedPrefixes.add(prefixes[i]))
-052  throw new 
IllegalArgumentException ("prefixes must be distinct");
-053  }
-054}
-055  }
-056
-057  public byte [][] getPrefix() {
-058int count = 0;
-059byte [][] temp = new byte 
[sortedPrefixes.size()][];
-060for (byte [] prefixes : 
sortedPrefixes) {
-061  temp [count++] = prefixes;
-062}
-063return temp;
-064  }
-065
-066  @Override
-067  public boolean filterRowKey(Cell cell) 
throws IOException {
-068// Impl in FilterBase might do 
unnecessary copy for Off heap backed Cells.
-069return false;
-070  }
-071
-072  @Override
-073  public ReturnCode filterKeyValue(Cell 
kv) {
-074if (sortedPrefixes.size() == 0) {
-075  return ReturnCode.INCLUDE;
-076} else {
-077  return filterColumn(kv);
-078}
-079  }
-080
-081  public ReturnCode filterColumn(Cell 
cell) {
-082byte [] qualifier = 
CellUtil.cloneQualifier(cell);
-083TreeSet<byte []> 
lesserOrEqualPrefixes =
-084  (TreeSet<byte []>) 
sortedPrefixes.headSet(qualifier, true);
-085
-086if (lesserOrEqualPrefixes.size() != 
0) {
-087  byte [] 
largestPrefixSmallerThanQualifier = lesserOrEqualPrefixes.last();
-088  
-089  if (Bytes.startsWith(qualifier, 
largestPrefixSmallerThanQualifier)) {
-090return ReturnCode.INCLUDE;
-091  }
-092  
-093  if (lesserOrEqualPrefixes.size() == 
sortedPrefixes.size()) {
-094return ReturnCode.NEXT_ROW;
-095  } else {
-096hint = 
sortedPrefixes.higher(largestPrefixSmallerThanQualifier);
-097return 
ReturnCode.SEEK_NEXT_USING_HINT;
-098  }
-099} else {
-100  hint = sortedPrefixes.first();
-101  return 
ReturnCode.SEEK_NEXT_USING_HINT;
-102}
-103  }
-104
-105  public static Filter 
createFilterFromArguments(ArrayList<byte []> filterArguments) {
-106byte [][] prefixes = new byte 
[filterArguments.size()][];
-107for (int i = 0 ; i < 
filterArguments.size(); i++) {
-108  byte [] columnPrefix = 
ParseFilter.removeQuotesFromByteArray(filterArguments.get(i));
-109  prefixes[i] = columnPrefix;
-110}
-111return new 
MultipleColumnPrefixFilter(prefixes);
-112  }
-113
-114  /**
-115   * @return The filter serialized using 
pb
-116   */
-117  public byte [] toByteArray() {
-118
FilterProtos.MultipleColumnPrefixFilter.Builder builder =
-119  
FilterProtos.MultipleColumnPrefixFilter.newBuilder();
-120for (byte [] element : 
sortedPrefixes) {
-121  if (element != null) 
builder.addSortedPrefixes(ByteStringer.wrap(element));
-122}
-123return 
builder.build().toByteArray();
-124  }
-125
-126  /**
-127   * @param pbBytes A pb serialized 
{@link MultipleColumnPrefixFilter} instance
-128   * @return An instance of {@link 
MultipleColumnPrefixFilter} made from <code>bytes</code>
-129   * @throws DeserializationException
-130   * @see #toByteArray
-131   */
-132  public static 
MultipleColumnPrefixFilter parseFrom(final byte [] pbBytes)
-133  throws DeserializationException {
-134
FilterProtos.MultipleColumnPrefixFilter proto;
-135try {
-136  proto = 
FilterProtos.MultipleColumnPrefixFilter.parseFrom(pbBytes);
-137} catch