Re: Spark hang on load phoenix table

2016-11-16 Thread Josh Mahonin
Hi, are there any logs from the Spark driver and executors which would help provide some context? For diagnosing, increasing the log level to DEBUG might be useful as well. Also, the snippet you posted is a 'lazy' operation. In theory it should return quickly, and only evaluate when some sort of action is performed on it.
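For illustration, a minimal sketch (not from the original thread) of what 'lazy' means here, using the phoenix-spark DataFrame load; TABLE1 and the column names are placeholders:

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext
    import org.apache.phoenix.spark._

    val sc = new SparkContext("local", "phoenix-lazy-example")
    val sqlContext = new SQLContext(sc)
    val configuration = new Configuration()

    // Lazy: this only defines the relation; no HBase/Phoenix scan runs yet.
    val df = sqlContext.phoenixTableAsDataFrame(
      "TABLE1", Seq("ID", "COL1"), conf = configuration)

    // Work (and any hang) only happens once an action forces evaluation.
    println(df.count())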

Re: Inserting into Temporary Phoenix table using Spark plugin

2016-11-16 Thread Josh Mahonin
Hi Hussain, I'm not familiar with the Spark temporary table syntax. Perhaps you can work around it by using other options, such as the DataFrame.save() functionality which is documented [1] and unit tested [2]. I suspect what you're encountering is a valid use case. If you could also file a JIRA
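For reference, a short sketch of the DataFrame.save() route Josh mentions, as documented for the phoenix-spark plugin; OUTPUT_TABLE and the ZooKeeper URL are placeholders, and the target table must already exist in Phoenix:

    import org.apache.spark.sql.SaveMode

    // Write the DataFrame directly to an existing Phoenix table,
    // bypassing the Spark temporary-table INSERT path.
    df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
      Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "localhost:2181"))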

Re: Why is there a spark plugin?

2016-11-16 Thread Christopher Tarnas
It is not an either/or; you can use both, hence the plugin. Phoenix is great at OLTP type workloads and Spark is better at OLAP and machine learning. -chris > On Nov 16, 2016, at 6:56 PM, Cheyenne Forbes wrote: > > so why would I choose Phoenix over Spark?

Re: Why is there a spark plugin?

2016-11-16 Thread Cheyenne Forbes
so why would I choose Phoenix over Spark?

Re: Why is there a spark plugin?

2016-11-16 Thread Christopher Tarnas
Spark is much, much more than just a way to perform SQL. -chris > On Nov 16, 2016, at 7:13 AM, Cheyenne Forbes wrote: > > Why would/should I care about spark/spark plugin when I already have phoenix?

Why is there a spark plugin?

2016-11-16 Thread Cheyenne Forbes
Why would/should I care about spark/spark plugin when I already have phoenix?

Inserting into Temporary Phoenix table using Spark plugin

2016-11-16 Thread Hussain Pirosha
I am trying to insert into a temporary table created on a Spark (v1.6) DataFrame loaded using the Phoenix-Spark (v4.4) plugin. Below is the code:

    val sc = new SparkContext("local", "phoenix-test")
    val configuration = new Configuration()
    configuration.set("zookeeper.znode.parent", "/hbase-unsecure")
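The archived message is cut off; below is a rough reconstruction (not Hussain's actual code) of the kind of setup the fragment suggests, with placeholder table and column names, showing where the Spark temporary table comes in:

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext
    import org.apache.phoenix.spark._

    val sc = new SparkContext("local", "phoenix-test")
    val configuration = new Configuration()
    configuration.set("zookeeper.znode.parent", "/hbase-unsecure")

    val sqlContext = new SQLContext(sc)

    // Load an existing Phoenix table as a DataFrame (placeholder table/columns).
    val df = sqlContext.phoenixTableAsDataFrame(
      "INPUT_TABLE", Seq("ID", "COL1"), conf = configuration)

    // Register it as a Spark temporary table so it can be queried with SQL.
    df.registerTempTable("input_tmp")

    // Reading from the temp table works...
    sqlContext.sql("SELECT * FROM input_tmp WHERE ID > 10").show()

    // ...inserting into it is what this thread is about; the workaround
    // Josh suggests in his reply is the DataFrame save path instead.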