Hi Team,
I need some help writing a Scala program to bulk load some data into HBase.
*Env:*
HBase 0.94
Spark 1.0.2
I am trying the code below to bulk load some data into the HBase table “t1”.
import org.apache.spark._
import org.apache.spark.rdd.NewHadoopRDD
import ...
Here is the method signature used by HFileOutputFormat:
public void write(ImmutableBytesWritable row, KeyValue kv)
Meaning, a KeyValue is expected, not a Put.
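For illustration, a single record matching that write signature could be built as in the short sketch below; the row key "r1", column family "cf", qualifier "c1", and value "v1" are made-up placeholders, not values from the original code.

import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.util.Bytes

// One cell for row "r1": the (key, value) pair HFileOutputFormat expects
val rowKey = Bytes.toBytes("r1")
val kv = new KeyValue(rowKey, Bytes.toBytes("cf"), Bytes.toBytes("c1"), Bytes.toBytes("v1"))
val record = (new ImmutableBytesWritable(rowKey), kv)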
Thanks, Ted. Could you give me a simple example of loading one row of data into HBase? How should I generate the KeyValue?
I tried multiple times and still cannot figure it out.
I used the code below, and it still failed with the same error.
Does anyone have experience with bulk loading using Scala?
Thanks.
import org.apache.spark._
import org.apache.spark.rdd.NewHadoopRDD
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import ...
)
List(kv)
}
Thanks,
Sun
fightf...@163.com
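For reference, here is a minimal end-to-end sketch of the saveAsNewAPIHadoopFile + HFileOutputFormat approach against HBase 0.94 / Spark 1.0.2. The sample rows, column family "cf", qualifier "c1", and output path /tmp/hfiles are assumptions for illustration, not the code from the messages above; the generated HFiles would then be moved into the table with the completebulkload tool (LoadIncrementalHFiles).

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
import org.apache.hadoop.hbase.util.Bytes

object BulkLoadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "hbase-bulk-load-sketch")
    val conf = HBaseConfiguration.create()

    // Sample rows. HFiles must be written in sorted row-key order,
    // so sort by the row key before generating the KeyValues.
    val cells = sc.parallelize(Seq(("r1", "v1"), ("r2", "v2")))
      .sortByKey()
      .map { case (row, value) =>
        val rowKey = Bytes.toBytes(row)
        val kv = new KeyValue(rowKey, Bytes.toBytes("cf"), Bytes.toBytes("c1"), Bytes.toBytes(value))
        // The pair HFileOutputFormat expects: (ImmutableBytesWritable, KeyValue)
        (new ImmutableBytesWritable(rowKey), kv)
      }

    // Write HFiles to HDFS. For a table with multiple regions you would
    // normally also set up the job with HFileOutputFormat.configureIncrementalLoad
    // so the output is partitioned to match region boundaries.
    cells.saveAsNewAPIHadoopFile(
      "/tmp/hfiles",
      classOf[ImmutableBytesWritable],
      classOf[KeyValue],
      classOf[HFileOutputFormat],
      conf)

    sc.stop()
  }
}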
From: Jim Green
Date: 2015-01-28 04:44
To: Ted Yu
CC: user
Subject: Re: Bulk loading into hbase using saveAsNewAPIHadoopFile