Hi Wei,

You can do something like this:

import scala.collection.JavaConverters._
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory

rdd.foreachPartition { part =>      // rdd: an RDD[Put] of rows to write
  val conn = ConnectionFactory.createConnection(HBaseConfiguration.create())
  val table = conn.getTable(TableName.valueOf(tablename))
  // part.foreach(inp => { println(inp); table.put(inp) })  // line-by-line put
  table.put(part.toList.asJava)     // batch put for the whole partition
  table.close()
  conn.close()
}
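
The partition iterator above needs to yield HBase Put objects. As a rough
sketch (the records RDD and the "cf"/"col" column family and qualifier names
are placeholders of mine, not from your code), you could build them like this
before calling foreachPartition:

import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.util.Bytes

// Hypothetical mapping: records is an RDD[(String, String)] of (rowKey, value)
// pairs from the MQTT stream; "cf"/"col" are placeholder family/qualifier names.
val rdd = records.map { case (rowKey, value) =>
  val put = new Put(Bytes.toBytes(rowKey))
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
  put
}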


Now, if you want to wrap it inside a DAO, that's up to you. A DAO will give
you an abstraction layer, but ultimately it will run the same code underneath.
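
For example, a minimal sketch of such a wrapper could look like this (the
HBaseResultDao class and its save method are just my illustration, not an
established API; it only hides the same per-partition batch put behind a
method call):

import scala.collection.JavaConverters._
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.spark.rdd.RDD

// Hypothetical DAO wrapping the same foreachPartition write shown above.
class HBaseResultDao(tablename: String) extends Serializable {
  def save(puts: RDD[Put]): Unit = {
    puts.foreachPartition { part =>
      val conn = ConnectionFactory.createConnection(HBaseConfiguration.create())
      val table = conn.getTable(TableName.valueOf(tablename))
      table.put(part.toList.asJava)   // batch put per partition
      table.close()
      conn.close()
    }
  }
}

Usage would then just be new HBaseResultDao(tablename).save(rdd), and the
reading side could get its own method on the same class.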

Note: always use the HBase ConnectionFactory to obtain connections, and write
the data on a per-partition basis rather than record by record.

Regards,
Rabin Banerjee


On Wed, Jul 20, 2016 at 12:06 PM, Yu Wei <yu20...@hotmail.com> wrote:

> I need to write all the data received from MQTT into HBase for further
> processing.
>
> They're not the final result.  I also need to read the data back from HBase
> for analysis.
>
>
> Is it a good choice to use a DAO in such a situation?
>
>
> Thx,
>
> Jared
>
>
> ------------------------------
> *From:* Deepak Sharma <deepakmc...@gmail.com>
> *Sent:* Wednesday, July 20, 2016 12:34:07 PM
> *To:* Yu Wei
> *Cc:* spark users
> *Subject:* Re: Is it good choice to use DAO to store results generated by
> spark application?
>
>
> I am using a DAO in a Spark application to write the final computation to
> Cassandra, and it performs well.
> What kinds of issues do you foresee using a DAO for HBase?
>
> Thanks
> Deepak
>
> On 19 Jul 2016 10:04 pm, "Yu Wei" <yu20...@hotmail.com> wrote:
>
>> Hi guys,
>>
>>
>> I am writing a Spark application and want to store the results it
>> generates in HBase.
>>
>> Do I need to access HBase via the Java API directly?
>>
>> Or is it a better choice to use a DAO, similar to a traditional RDBMS?  I
>> suspect there would be a major performance downgrade and other negative
>> impacts from using a DAO. However, I have little knowledge in this field.
>>
>>
>> Any advice?
>>
>>
>> Thanks,
>>
>> Jared
>>
>>
>>
>>
