shahrs87 commented on a change in pull request #3244:
URL: https://github.com/apache/hbase/pull/3244#discussion_r630367777
##########
File path:
hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/wal/WALCellCodec.java
##########
@@ -256,6 +267,43 @@ public void write(Cell cell) throws IOException {
}
}
}
+
+  private byte[] compressValue(Cell cell) throws IOException {
+    Deflater deflater = compression.getValueCompressor().getDeflater();
+    if (cell instanceof ByteBufferExtendedCell) {
+      deflater.setInput(((ByteBufferExtendedCell)cell).getValueByteBuffer().array(),
+        ((ByteBufferExtendedCell)cell).getValueByteBuffer().arrayOffset() +
+          ((ByteBufferExtendedCell)cell).getValuePosition(),
+        cell.getValueLength());
+    } else {
+      deflater.setInput(cell.getValueArray(), cell.getValueOffset(), cell.getValueLength());
+    }
+    ByteArrayOutputStream baos = new ByteArrayOutputStream();
Review comment:
@apurtell I increased the buffer size to 100 and the input string length to
500, and everything works fine: the decompressed string matched the original
raw string.
On a side note, `Deflater#deflateBytes(long addr, byte[] b, int off, int
len, int flush)`, which is a native method, doesn't have much documentation to
help newcomers understand what's going on, but that's unrelated to this PR.
Sorry for any noise I may have created. :(
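For anyone following along, here is a minimal, self-contained sketch of the `Deflater`/`Inflater` round trip that the snippet and the buffer experiment above are exercising. The class name, the 100-byte scratch buffer, and the 500-byte input are illustrative choices for this sketch, not code from the PR:

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class DeflateRoundTrip {

  // Compress raw bytes, draining the Deflater through a small scratch buffer
  // into a ByteArrayOutputStream, as in the compressValue() snippet above.
  static byte[] compress(byte[] raw) {
    Deflater deflater = new Deflater();
    deflater.setInput(raw);
    deflater.finish(); // signal end of input so deflate() can flush everything
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] scratch = new byte[100]; // deliberately small, like the 100-byte buffer in the test
    while (!deflater.finished()) {
      int n = deflater.deflate(scratch);
      baos.write(scratch, 0, n);
    }
    deflater.end();
    return baos.toByteArray();
  }

  // Decompress, assuming the caller knows the original length up front
  // (the WAL codec writes value lengths, so this is a fair assumption here).
  static byte[] decompress(byte[] packed, int originalLength) throws DataFormatException {
    Inflater inflater = new Inflater();
    inflater.setInput(packed);
    byte[] out = new byte[originalLength];
    int off = 0;
    while (!inflater.finished() && off < out.length) {
      off += inflater.inflate(out, off, out.length - off);
    }
    inflater.end();
    return out;
  }

  public static void main(String[] args) throws Exception {
    byte[] raw = new byte[500]; // 500-byte input, as in the experiment described above
    for (int i = 0; i < raw.length; i++) {
      raw[i] = (byte) ('a' + (i % 26));
    }
    byte[] packed = compress(raw);
    byte[] unpacked = decompress(packed, raw.length);
    System.out.println(Arrays.equals(raw, unpacked)); // prints true
  }
}
```

The scratch buffer size only affects how many loop iterations the copy takes, not correctness, which matches the observation above that growing it from a smaller value to 100 still round-trips the data intact.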
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]