Hi Ben,

I'm not sure about the format of those command-line options you're passing.
I've had success with spark-shell just by setting the
'spark.executor.extraClassPath' and 'spark.driver.extraClassPath' options
on the spark config, as per the docs [1].
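For reference, this is roughly what I mean; the jar path below is just an example, so substitute wherever your Phoenix client jar actually lives:

```properties
# spark-defaults.conf -- example paths, adjust to your install
spark.executor.extraClassPath  /opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
spark.driver.extraClassPath    /opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
```

You can also pass the same two settings as `--conf` flags on the spark-shell command line instead of editing spark-defaults.conf.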

I'm not sure if there's anything special needed for CDH or not though. I
also have a docker image I've been toying with which has a working
Spark/Phoenix setup using the Phoenix 4.7.0 RC and Spark 1.6.0. It might be
a useful reference for you as well [2].

Good luck,

Josh

[1] https://phoenix.apache.org/phoenix_spark.html
[2] https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
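For what it's worth, once the classpath is sorted out, the load itself should look something like this in spark-shell. This is just a sketch based on the docs above; the table name and ZooKeeper quorum are placeholders for your own:

```scala
// Run inside spark-shell, with the Phoenix client jar on both the
// driver and executor classpaths (see the config settings above).
// "TEST.MY_TEST" and the zkUrl value are placeholders -- use your own.
val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "TEST.MY_TEST",
      "zkUrl" -> "zk1,zk2,zk3:2181")
)
df.show()
```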

On Mon, Feb 8, 2016 at 4:29 PM, Benjamin Kim <bbuil...@gmail.com> wrote:

> Hi Pierre,
>
> I tried to run in spark-shell using spark 1.6.0 by running this:
>
> spark-shell --master yarn-client --driver-class-path
> /opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar --driver-java-options
> "-Dspark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar"
>
> The version of HBase is the one in CDH5.4.8, which is 1.0.0-cdh5.4.8.
>
> When I get to the line:
>
> val df = sqlContext.load("org.apache.phoenix.spark", Map("table" ->
> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))
>
> I get this error:
>
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.spark.rdd.RDDOperationScope$
>
> Any ideas?
>
> Thanks,
> Ben
>
>
> On Feb 5, 2016, at 1:36 PM, pierre lacave <pie...@lacave.me> wrote:
>
> I don't know when the full release will be; RC1 was just pulled, and RC2
> is expected soon
>
> you can find them here
>
> https://dist.apache.org/repos/dist/dev/phoenix/
>
>
> there is a new phoenix-4.7.0-HBase-1.1-client-spark.jar, which is all you
> need on the Spark classpath
>
>
> *Pierre Lacave*
> 171 Skellig House, Custom House, Lower Mayor street, Dublin 1, Ireland
> Phone: +353879128708
>
> On Fri, Feb 5, 2016 at 9:28 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>
>> Hi Pierre,
>>
>> When will I be able to download this version?
>>
>> Thanks,
>> Ben
>>
>>
>> On Friday, February 5, 2016, pierre lacave <pie...@lacave.me> wrote:
>>
>>> This was addressed in Phoenix 4.7 (currently in RC)
>>> https://issues.apache.org/jira/browse/PHOENIX-2503
>>>
>>>
>>>
>>>
>>>
>>> On Fri, Feb 5, 2016 at 6:17 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>>
>>>> I cannot get this plugin to work in CDH 5.4.8 using Phoenix 4.5.2 and
>>>> Spark 1.6. When I try to launch spark-shell, I get:
>>>>
>>>>         java.lang.RuntimeException: java.lang.RuntimeException: Unable
>>>> to instantiate 
>>>> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>>>
>>>> I continue on and run the example code. When I get to the line below:
>>>>
>>>>         val df = sqlContext.load("org.apache.phoenix.spark",
>>>> Map("table" -> "TEST.MY_TEST", "zkUrl" ->
>>>> "zookeeper1,zookeeper2,zookeeper3:2181"))
>>>>
>>>> I get this error:
>>>>
>>>>         java.lang.NoSuchMethodError:
>>>> com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
>>>>
>>>> Can someone help?
>>>>
>>>> Thanks,
>>>> Ben
>>>
>>>
>>>
>
>
