OK, thanks, let me check it.

So your primary storage layer is HBase, with Phoenix as the SQL layer on top.

Sounds interesting. I will get back to you on this.

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com



On 5 May 2016 at 13:26, Divya Gehlot <divya.htco...@gmail.com> wrote:

>
> http://blog.cloudera.com/blog/2015/07/how-to-do-data-quality-checks-using-apache-spark-dataframes/
> I am looking for something similar to above solution .
> ---------- Forwarded message ----------
> From: "Divya Gehlot" <divya.htco...@gmail.com>
> Date: May 5, 2016 6:51 PM
> Subject: package for data quality in Spark 1.5.2
> To: "user @spark" <user@spark.apache.org>
> Cc:
>
> Hi,
>
> Is there any package or project in Spark/Scala which supports data quality
> checks? For instance, checking for null values or foreign key constraints.
>
> I would really appreciate it if somebody who has already done this could
> share their approach, or point me to an open source package.
>
>
> Thanks,
> Divya
>
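In the absence of a dedicated package, both checks from the question can be expressed directly with the Spark 1.5.2 DataFrame API, in the spirit of the Cloudera post linked above. A minimal sketch; the `orders` and `countries` DataFrames and their column names are made-up examples, not from this thread:

```scala
// Hedged sketch: data-quality checks with the Spark 1.5.2 DataFrame API.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.col

object DataQualitySketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("dq-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Hypothetical data: orders referencing a country code.
    val orders = Seq((1, "SG"), (2, null), (3, "XX")).toDF("id", "country_code")
    // Hypothetical reference (primary-key) table for the foreign-key check.
    val countries = Seq("SG", "MY", "ID").toDF("code")

    // Null check: count rows missing a mandatory column.
    val nullCount = orders.filter(col("country_code").isNull).count()

    // Foreign-key style check: rows whose non-null country_code
    // has no matching key in the reference table.
    val orphanCount = orders
      .join(countries, orders("country_code") === countries("code"), "left_outer")
      .filter(countries("code").isNull && orders("country_code").isNotNull)
      .count()

    println(s"null country_code rows: $nullCount, orphan rows: $orphanCount")
    sc.stop()
  }
}
```

The left outer join plus a null filter on the joined key is the usual way to find orphan rows without a real foreign-key constraint; the separate `isNotNull` guard keeps the two checks from double-counting the null rows.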
