[ https://issues.apache.org/jira/browse/BEAM-10464?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17272360#comment-17272360 ]

Marc Catrisse commented on BEAM-10464:
--------------------------------------

[~iemejia] Sure, in the end I made a shaded copy of
{{org.apache.hadoop.hbase.shaded.com.google.protobuf.CodedInputStream}},
changing the size-limit field to {{Integer.MAX_VALUE}}. I'm sure it's not
the best solution, but for the moment it fits my needs...

Here is the relevant code:
{code:java}
//
// Source code recreated from a .class file by IntelliJ IDEA
// (powered by Fernflower decompiler)
//

package org.apache.hadoop.hbase.shaded.com.google.protobuf;

import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Iterator;
import org.apache.hadoop.hbase.shaded.com.google.protobuf.MessageLite.Builder;

public final class CodedInputStream {
    private final byte[] buffer;
    private int bufferSize;
    private int bufferSizeAfterLimit;
    private int bufferPos;
    private final InputStream input;
    private int lastTag;
    private int totalBytesRetired;
    private int currentLimit = 2147483647;
    private int recursionDepth;
    private int recursionLimit = 64;
    private int sizeLimit = Integer.MAX_VALUE; //SIZE LIMIT MOD
    private static final int DEFAULT_RECURSION_LIMIT = 64;
    private static final int DEFAULT_SIZE_LIMIT = Integer.MAX_VALUE; //SIZE LIMIT MOD
    private static final int BUFFER_SIZE = 4096;
{code}
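For context, stock protobuf's default size limit is 64 MB, and the modified fields above simply raise it to {{Integer.MAX_VALUE}}. Here is a minimal sketch of the bounds check that trips the "Protocol message was too large" error; the names and structure are illustrative only, not the actual shaded protobuf source:
{code:java}
// Illustrative sketch of the size-limit check inside CodedInputStream;
// names are hypothetical, not the real refillBuffer() implementation.
public class SizeLimitSketch {
    // protobuf's historical default size limit: 64 MB
    static final int DEFAULT_SIZE_LIMIT = 64 << 20;

    // Returns true when the bytes consumed so far stay within the limit.
    static boolean fits(int totalBytesConsumed, int sizeLimit) {
        return totalBytesConsumed <= sizeLimit;
    }

    public static void main(String[] args) {
        int messageSize = 70 << 20; // a ~70 MB serialized row
        // Under the 64 MB default the parse fails...
        System.out.println("default limit ok: " + fits(messageSize, DEFAULT_SIZE_LIMIT));
        // ...but passes once the limit is Integer.MAX_VALUE, as in the
        // shaded-class workaround.
        System.out.println("MAX_VALUE limit ok: " + fits(messageSize, Integer.MAX_VALUE));
    }
}
{code}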

> [HBaseIO] - Protocol message was too large.  May be malicious.
> --------------------------------------------------------------
>
>                 Key: BEAM-10464
>                 URL: https://issues.apache.org/jira/browse/BEAM-10464
>             Project: Beam
>          Issue Type: Bug
>          Components: io-java-hbase
>            Reporter: Marc Catrisse
>            Priority: P1
>
> Hi! I just got the following error performing an HBaseIO.read() from a scan.
> {code:java}
> Caused by: 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException:
>  Protocol message was too large.  May be malicious.  Use 
> CodedInputStream.setSizeLimit() to increase the size limit. at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
>  at 
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result.<init>(ClientProtos.java:4694)
>  at 
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result.<init>(ClientProtos.java:4658)
>  at 
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result$1.parsePartialFrom(ClientProtos.java:4767)
>  at 
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result$1.parsePartialFrom(ClientProtos.java:4762)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259)
>  at 
> org.apache.hadoop.hbase.shaded.com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
>  at 
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result.parseDelimitedFrom(ClientProtos.java:5131)
>  at 
> org.apache.beam.sdk.io.hbase.HBaseResultCoder.decode(HBaseResultCoder.java:50)
>  at 
> org.apache.beam.sdk.io.hbase.HBaseResultCoder.decode(HBaseResultCoder.java:34)
>  at org.apache.beam.sdk.coders.Coder.decode(Coder.java:159) at 
> org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:602)
>  at 
> org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:593)
>  at 
> org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:539)
>  at 
> org.apache.beam.runners.spark.translation.ValueAndCoderLazySerializable.getOrDecode(ValueAndCoderLazySerializable.java:73)
>  ... 61 more
> {code}
> It seems I'm scanning a column family bigger than 64 MB, but HBaseIO 
> doesn't provide any way to change the current sizeLimit of the protobuf 
> decoder. How should we manage large datasets stored in HBase?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
