Pierre,

I got it to work using phoenix-4.7.0-HBase-1.0-client-spark.jar. But now I 
get this error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in 
stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 
3, prod-dc1-datanode151.pdc1i.gradientx.com): java.lang.IllegalStateException: 
unread block data

It happens when I do:

df.show()
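
For the full picture, the sequence I'm running in spark-shell is below (just a 
sketch putting together the same table name and zkUrl from my earlier message):

import org.apache.phoenix.spark._

// Loading the Phoenix table as a DataFrame works now.
val df = sqlContext.load("org.apache.phoenix.spark",
  Map("table" -> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))

// The "unread block data" error shows up when the job actually runs.
df.show()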

Getting closer…

Thanks,
Ben



> On Feb 8, 2016, at 2:57 PM, pierre lacave <pie...@lacave.me> wrote:
> 
> This is the wrong client jar; try the one named 
> phoenix-4.7.0-HBase-1.1-client-spark.jar 
> 
> 
> On Mon, 8 Feb 2016, 22:29 Benjamin Kim <bbuil...@gmail.com> wrote:
> Hi Josh,
> 
> I tried again by putting the settings in spark-defaults.conf.
> 
> spark.driver.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
> spark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
> 
> I still get the same error using the code below.
> 
> import org.apache.phoenix.spark._
> val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> 
> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))
> 
> Can you tell me what else you’re doing?
> 
> Thanks,
> Ben
> 
> 
>> On Feb 8, 2016, at 1:44 PM, Josh Mahonin <jmaho...@gmail.com> wrote:
>> 
>> Hi Ben,
>> 
>> I'm not sure about the format of those command line options you're passing. 
>> I've had success with spark-shell just by setting the 
>> 'spark.executor.extraClassPath' and 'spark.driver.extraClassPath' options on 
>> the spark config, as per the docs [1].
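>> 
>> For example, something like this when launching spark-shell (just a sketch; 
>> adjust the jar path to wherever the Phoenix client-spark jar lives on your 
>> nodes):
>> 
>> spark-shell \
>>   --conf "spark.driver.extraClassPath=/path/to/phoenix-4.7.0-HBase-1.1-client-spark.jar" \
>>   --conf "spark.executor.extraClassPath=/path/to/phoenix-4.7.0-HBase-1.1-client-spark.jar"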
>> 
>> I'm not sure if there's anything special needed for CDH or not though. I 
>> also have a docker image I've been toying with which has a working 
>> Spark/Phoenix setup using the Phoenix 4.7.0 RC and Spark 1.6.0. It might be 
>> a useful reference for you as well [2].
>> 
>> Good luck,
>> 
>> Josh
>> 
>> [1] https://phoenix.apache.org/phoenix_spark.html
>> [2] https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
>> 
>> On Mon, Feb 8, 2016 at 4:29 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>> Hi Pierre,
>> 
>> I tried to run it in spark-shell using Spark 1.6.0 by running this:
>> 
>> spark-shell --master yarn-client --driver-class-path 
>> /opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar --driver-java-options 
>> "-Dspark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar”
>> 
>> The version of HBase is the one in CDH5.4.8, which is 1.0.0-cdh5.4.8.
>> 
>> When I get to the line:
>> 
>> val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> 
>> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))
>> 
>> I get this error:
>> 
>> java.lang.NoClassDefFoundError: Could not initialize class 
>> org.apache.spark.rdd.RDDOperationScope$
>> 
>> Any ideas?
>> 
>> Thanks,
>> Ben
>> 
>> 
>>> On Feb 5, 2016, at 1:36 PM, pierre lacave <pie...@lacave.me> wrote:
>>> 
>>> I don't know when the full release will be. RC1 just got pulled out, and RC2 
>>> is expected soon.
>>> 
>>> You can find them here:
>>> 
>>> https://dist.apache.org/repos/dist/dev/phoenix/
>>> 
>>> 
>>> There is a new phoenix-4.7.0-HBase-1.1-client-spark.jar, and that is all you 
>>> need to have on the Spark classpath.
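>>> 
>>> For example, in spark-defaults.conf (the path below is only a placeholder for 
>>> wherever you put the jar):
>>> 
>>> spark.driver.extraClassPath=/path/to/phoenix-4.7.0-HBase-1.1-client-spark.jar
>>> spark.executor.extraClassPath=/path/to/phoenix-4.7.0-HBase-1.1-client-spark.jar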
>>> 
>>> 
>>> Pierre Lacave
>>> 171 Skellig House, Custom House, Lower Mayor street, Dublin 1, Ireland
>>> Phone: +353879128708
>>> 
>>> On Fri, Feb 5, 2016 at 9:28 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>> Hi Pierre,
>>> 
>>> When will I be able to download this version?
>>> 
>>> Thanks,
>>> Ben
>>> 
>>> 
>>> On Friday, February 5, 2016, pierre lacave <pie...@lacave.me> wrote:
>>> This was addressed in Phoenix 4.7 (currently in RC) 
>>> https://issues.apache.org/jira/browse/PHOENIX-2503
>>> 
>>> 
>>> 
>>> 
>>> Pierre Lacave
>>> 171 Skellig House, Custom House, Lower Mayor street, Dublin 1, Ireland
>>> Phone: +353879128708
>>> 
>>> On Fri, Feb 5, 2016 at 6:17 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>> I cannot get this plugin to work in CDH 5.4.8 using Phoenix 4.5.2 and Spark 
>>> 1.6. When I try to launch spark-shell, I get:
>>> 
>>>         java.lang.RuntimeException: java.lang.RuntimeException: Unable to 
>>> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>> 
>>> I continue on and run the example code. When I get to the line below:
>>> 
>>>         val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> 
>>> "TEST.MY_TEST", "zkUrl" -> "zookeeper1,zookeeper2,zookeeper3:2181"))
>>> 
>>> I get this error:
>>> 
>>>         java.lang.NoSuchMethodError: 
>>> com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
>>> 
>>> Can someone help?
>>> 
>>> Thanks,
>>> Ben
>>> 
>>> 
>> 
>> 
> 
