Hello, I used Phoenix-4.9-HBase-1.2; the relevant entry in my pom file is:

<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-client</artifactId>
    <version>${phoenix.version}</version>
</dependency>

The save() method is not available in Phoenix 4.9, so use the following
code instead. Here "BIGJOY.TRAJS" is my destination table,
imoDF.sparkSession.sparkContext.hadoopConfiguration supplies the Hadoop
configuration needed, and traces is a Dataset of class *Trajectory* that
corresponds to the schema of the table "BIGJOY.TRAJS".

traces.toDF().saveToPhoenix("BIGJOY.TRAJS",
  imoDF.sparkSession.sparkContext.hadoopConfiguration,
  Some("hadoop-master:2181"))
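One detail the snippet above assumes: saveToPhoenix is an implicit method that
phoenix-spark adds to DataFrame, so the corresponding import must be in scope
or the call will not compile:

```scala
// Brings the implicit saveToPhoenix method into scope for DataFrames.
import org.apache.phoenix.spark._
```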

Hope this helps.
On 15 February 2017 at 23:59:57, Josh Mahonin (jmaho...@gmail.com) wrote:

Hi,

Spark is unable to load the Phoenix classes it needs. If you're using a
recent version of Phoenix, please ensure the "fat" *client* JAR (or for
older versions of Phoenix, the Phoenix *client*-spark JAR) is on your Spark
driver and executor classpath [1]. The 'phoenix-spark' JAR alone does not
provide Spark with all of the classes it needs.

[1] https://phoenix.apache.org/phoenix_spark.html
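As a sketch of what that looks like at submit time, the fat client JAR can be
put on both classpaths via spark-submit. The JAR path and application JAR name
below are examples only; the main class is taken from the stack trace further
down, and you should adjust everything to your own installation:

```shell
# Example paths -- substitute the actual location and version of the
# Phoenix fat client JAR from your cluster's Phoenix installation.
spark-submit \
  --conf "spark.driver.extraClassPath=/opt/phoenix/phoenix-4.9.0-HBase-1.2-client.jar" \
  --conf "spark.executor.extraClassPath=/opt/phoenix/phoenix-4.9.0-HBase-1.2-client.jar" \
  --class com.pelephone.TrueCallLoader \
  true-call-loader.jar
```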

On Wed, Feb 15, 2017 at 10:29 AM, Nimrod Oren <
nimrod.o...@veracity-group.com> wrote:

> Hi,
>
>
>
> I'm trying to write a simple dataframe to Phoenix:
>
>      df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
>        Map("table" -> "TEST_SAVE", "zkUrl" -> "zk.internal:2181"))
>
>
>
> I have the following in my pom.xml:
>
>         <dependency>
>             <groupId>org.apache.phoenix</groupId>
>             <artifactId>phoenix-spark</artifactId>
>             <version>${phoenix-version}</version>
>             <scope>provided</scope>
>         </dependency>
>
>
>
> and phoenix-spark is in spark-defaults.conf on all servers. However, I'm
> getting the following error:
>
>
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/phoenix/util/SchemaUtil
>         at org.apache.phoenix.spark.DataFrameFunctions$$anonfun$1.apply(DataFrameFunctions.scala:33)
>         at org.apache.phoenix.spark.DataFrameFunctions$$anonfun$1.apply(DataFrameFunctions.scala:33)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>         at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
>         at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:33)
>         at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:47)
>         at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
>         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
>         at org.apache.spark.sql.DataFrame.save(DataFrame.scala:2045)
>         at com.pelephone.TrueCallLoader$.main(TrueCallLoader.scala:184)
>         at com.pelephone.TrueCallLoader.main(TrueCallLoader.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.phoenix.util.SchemaUtil
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>
>
> Am I missing something?
>
> Nimrod
