Re: Odd cell result

2018-06-11 Thread Ted Yu
bq. is it available in version 1.2.6?

If you were talking about
hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala,
it is not in 1.2.6, since the hbase-spark module was not part of the 1.2.6 release.

FYI
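For 1.2.6 without the hbase-spark module, the generic Spark Hadoop API is one alternative; a minimal sketch (the table name is assumed, and a TableInputFormat subclass can be passed in its place):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf())

// TableInputFormat reads the table name from the configuration
val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "my_table") // assumed table name

// classOf[TableInputFormat] can be replaced by a subclass
// (e.g. a custom TableInputFormat) without touching the rest
val rdd = sc.newAPIHadoopRDD(
  conf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

rdd.count()
```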



Re: Odd cell result

2018-06-11 Thread Kang Minwoo
Thank you for suggesting a good approach.
But HBaseContext does not seem to exist in version 1.2.6.
Is it available in version 1.2.6?

Another reason I cannot use HBaseContext is that I am using a
CustomTableInputFormat, which extends TableInputFormat.

Best regards,
Minwoo Kang




Re: Odd cell result

2018-06-09 Thread Juan Jose Escobar
Hello,

Are you trying to read exported files or something similar? Otherwise, I think
you need to indicate the format of the data you are reading. I think what you
want to do is easier like this:

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.spark.{SparkConf, SparkContext}

val sparkConf = new SparkConf()
val sc = new SparkContext(sparkConf)

val conf = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, conf)
val scan = new Scan()
// ... scan config
val rdd = hbaseContext.hbaseRDD(TableName.valueOf(tableName), scan)
rdd.count()

or use a Spark-HBase connector which encapsulates the details
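Given the (ImmutableBytesWritable, Result) pairs the snippet above produces, one way to check whether the family bytes actually arrive is to inspect a few cells directly (a sketch, assuming the rdd above):

```scala
import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.util.Bytes

// print family/qualifier for the first few rows to verify the
// family bytes are present in each Cell
rdd.take(5).foreach { case (_, result) =>
  result.rawCells().foreach { cell =>
    val family    = Bytes.toString(CellUtil.cloneFamily(cell))
    val qualifier = Bytes.toString(CellUtil.cloneQualifier(cell))
    println(s"family=$family qualifier=$qualifier")
  }
}
```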

Regards




Re: Odd cell result

2018-06-08 Thread Kang Minwoo
1) I am using just the InputFormat. (I am not sure this is the right answer to
the question.)

2) code snippet

```
val rdd = sc.newAPIHadoopFile(...)
rdd.count()
```

3) hbase version 1.2.6

Best regards,
Minwoo Kang




Re: Odd cell result

2018-06-08 Thread Ted Yu
Which connector do you use for Spark 2.1.2?

Is there any code snippet which may reproduce what you experienced?

Which HBase release are you using?

Thanks



Odd cell result

2018-06-08 Thread Kang Minwoo
Hello, Users

I recently ran into an unusual situation: a cell result that does not contain a
column family.

I thought the cell was the smallest unit of data transferred in HBase.
But if a cell does not contain its column family, then the cell would not be
the smallest unit. Am I wrong?

It occurred in Spark 2.1.2 and did not occur in MapReduce.
And I cannot reproduce it now.

Best regards,
Minwoo Kang
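For reference, a Cell in the client API does carry its own family bytes alongside row, qualifier, timestamp, and value; a minimal standalone sketch, assuming hbase-client is on the classpath:

```scala
import org.apache.hadoop.hbase.{CellUtil, KeyValue}
import org.apache.hadoop.hbase.util.Bytes

// KeyValue is the standard Cell implementation; the family
// bytes travel inside the cell itself
val kv = new KeyValue(
  Bytes.toBytes("row1"),
  Bytes.toBytes("cf"),
  Bytes.toBytes("q"),
  Bytes.toBytes("v"))

println(Bytes.toString(CellUtil.cloneFamily(kv)))    // prints "cf"
println(Bytes.toString(CellUtil.cloneQualifier(kv))) // prints "q"
```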