Has anyone tried saving a DataFrame to an HBase table using Phoenix? I am able
to load and read (a working load is included in the session below for reference),
but I can't save.
spark-shell --jars \
  /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/phoenix-spark-4.7.0-clabs-phoenix1.3.0.jar,/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar
import org.apache.spark.sql._
import org.apache.phoenix.spark._
val hbaseConnectionString = "<zookeeper-quorum>"
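// For reference, reading works with a plain load like this (a sketch of my
// session; INPUT_TABLE is just a placeholder for my real table name):
val inputDf = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "INPUT_TABLE", "zkUrl" -> hbaseConnectionString))
inputDf.show()
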
// Save to OUTPUT_TABLE
df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
  Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString))
java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
  at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
  at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
Thanks,
Ben