Re: Writing to HBase table from Spark

2016-08-30 Thread Todd Nist
Have you looked at spark-packages.org? There are several different HBase
connectors there; not sure if any meets your need or not.

https://spark-packages.org/?q=hbase
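
One package from that list, the Hortonworks spark-hbase connector (shc), exposes a DataFrame-level API much like the Parquet one quoted below. A rough sketch only (the catalog layout, table name "sales", and column family "cf1" are made-up for illustration, and the shc package plus a running HBase cluster are assumed):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

// Catalog mapping DataFrame columns to the HBase row key and
// column family/qualifier pairs. Names here are illustrative only.
val catalog = """{
  "table":{"namespace":"default", "name":"sales"},
  "rowkey":"key",
  "columns":{
    "id":{"cf":"rowkey", "col":"key", "type":"string"},
    "amount":{"cf":"cf1", "col":"amount", "type":"double"}
  }
}"""

val spark = SparkSession.builder().appName("hbase-write").getOrCreate()
val df = spark.read.parquet("oraclehadoop.sales2")

df.write
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog,
               HBaseTableCatalog.newTable -> "5")) // create table with 5 regions if absent
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .save()
```

This needs a live cluster and the connector jar on the classpath, so it is a sketch rather than something runnable as-is.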

HTH,

-Todd

On Tue, Aug 30, 2016 at 5:23 AM, ayan guha  wrote:

> You can use the RDD-level new Hadoop format API and pass in the
> appropriate classes.
> On 30 Aug 2016 19:13, "Mich Talebzadeh"  wrote:
>
>> Hi,
>>
>> Is there an existing interface to read from and write to an HBase table
>> in Spark?
>>
>> Similar to below for Parquet
>>
>> val s = spark.read.parquet("oraclehadoop.sales2")
>> s.write.mode("overwrite").parquet("oraclehadoop.sales4")
>>
>> Or do I need to write to a Hive table which is already defined over HBase?
>>
>>
>> Thanks
>>
>>
>> Dr Mich Talebzadeh
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>> http://talebzadehmich.wordpress.com
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>


Re: Writing to HBase table from Spark

2016-08-30 Thread ayan guha
You can use the RDD-level new Hadoop format API and pass in the
appropriate classes.
On 30 Aug 2016 19:13, "Mich Talebzadeh"  wrote:

> Hi,
>
> Is there an existing interface to read from and write to an HBase table
> in Spark?
>
> Similar to below for Parquet
>
> val s = spark.read.parquet("oraclehadoop.sales2")
> s.write.mode("overwrite").parquet("oraclehadoop.sales4")
>
> Or do I need to write to a Hive table which is already defined over HBase?
>
>
> Thanks
>
>
>
>
>


Writing to HBase table from Spark

2016-08-30 Thread Mich Talebzadeh
Hi,

Is there an existing interface to read from and write to an HBase table
in Spark?

Similar to below for Parquet

val s = spark.read.parquet("oraclehadoop.sales2")
s.write.mode("overwrite").parquet("oraclehadoop.sales4")

Or do I need to write to a Hive table which is already defined over HBase?


Thanks




