Hi Krishna,

I've not tried it in Java at all, but as of Spark 1.4+ the DataFrame API
should be unified between Scala and Java, so the following may work for you:

DataFrame df = sqlContext.read()
    .format("org.apache.phoenix.spark")
    .option("table", "TABLE1")
    .option("zkUrl", "<phoenix-server:2181>")
    .load();

Note that 'zkUrl' must be set to the ZooKeeper URL for your Phoenix
cluster, and that passing a 'conf' parameter isn't supported through this
API. Please let us know back here if this works out for you; I'd love to
update the documentation and unit tests if it does.
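If you also want to restrict the result to specific columns, as your Scala
call does with Array("ID", "COL1"), a plain select() on the resulting
DataFrame should be the equivalent (untested on my end; the "ID" and "COL1"
column names are just taken from your snippet):

    // Mirrors the Array("ID", "COL1") argument from the Scala API
    DataFrame selected = df.select("ID", "COL1");

Spark should push that column pruning down to Phoenix, so it ought to
behave the same as the Scala helper.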

Josh

On Tue, Dec 1, 2015 at 6:30 PM, Krishna <research...@gmail.com> wrote:

> Hi,
>
> Is there a working example for using spark plugin in Java? Specifically,
> what's the java equivalent for creating a dataframe as shown here in scala:
>
> val df = sqlContext.phoenixTableAsDataFrame("TABLE1", Array("ID", "COL1"), 
> conf = configuration)
>
>
