Hi Robert,
I'm glad you've found a solution that works for you.
Here's an attempt to answer your questions:
*1. What is the difference between "--jars" and "spark.driver.extraClassPath"? What does each one do?*
As I understand it, the 'extraClassPath' setting adds the JARs to the Spark
driver's JVM classpath at launch time, so the classes are available before
any application code runs; it doesn't copy the JARs anywhere, so they must
already exist at that path on the driver machine. By contrast, '--jars'
ships the listed JARs out to the cluster and adds them to the classpaths of
both the driver and the executors.
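For example, a minimal spark-submit invocation that sets both (the paths
and class name here are illustrative placeholders, not the exact HDP
locations):

spark-submit \
  --jars /path/to/phoenix-spark.jar,/path/to/phoenix-core.jar \
  --conf spark.driver.extraClassPath=/path/to/phoenix-client.jar \
  --conf spark.executor.extraClassPath=/path/to/phoenix-client.jar \
  --class com.example.MyJob my-job.jar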
For HDP 2.4.2, this is the set of JARs we ended up needing to get it to work:
/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar
/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar
/usr/hdp/2.4.2.0-258/phoenix/lib/hbase-client.jar
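If you'd rather set this once instead of per job, the same JARs can go into
spark-defaults.conf (a sketch using the paths above; the executor setting
mirrors the driver one, and entries are colon-separated):

spark.driver.extraClassPath /usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/phoenix/lib/hbase-client.jar
spark.executor.extraClassPath /usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/phoenix/lib/hbase-client.jar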
Robert,
you should use the phoenix-4*-spark.jar that is located in the root phoenix
directory.
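On an HDP install you can usually locate it with something like this (the
path is inferred from the lib paths above; verify on your own cluster):

ls /usr/hdp/*/phoenix/phoenix-*-spark.jar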
Thanks,
Sergey
On Tue, Jul 5, 2016 at 8:06 AM, Josh Elser wrote:
Looking into this on the HDP side. Please feel free to reach out via HDP
channels instead of Apache channels.
Thanks for letting us know as well.
Josh Mahonin wrote:
Hi Robert,
I recommend following up with HDP on this issue.
The underlying problem is that the 'phoenix-spark-4.4.0.2.4.0.0-169.jar'
they've provided isn't actually a fat client JAR; it's missing many of the
required dependencies. They might be able to provide the correct JAR for
you, but you'd need to take that up with them directly.
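A quick way to check whether a given JAR really bundles the HBase classes
(a generic check, nothing HDP-specific):

jar tf phoenix-spark-4.4.0.2.4.0.0-169.jar | grep HBaseConfiguration

If that prints nothing, the JAR doesn't contain
org/apache/hadoop/hbase/HBaseConfiguration, and you'll get exactly the
NoClassDefFoundError shown below.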
Robert wrote:
I'm trying to use Phoenix on Spark and can't get around this error:
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:82)
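For reference, the call that triggers this looks roughly like the following
(a minimal sketch; the table name, columns, and ZooKeeper URL are
placeholders):

import org.apache.phoenix.spark._

// Load a Phoenix table as a DataFrame. This path goes through
// PhoenixRDD, which needs HBaseConfiguration on the classpath.
val df = sqlContext.phoenixTableAsDataFrame(
  "MY_TABLE", Array("ID", "COL1"), zkUrl = Some("zkhost:2181"))
df.show()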
DETAILS:
1. I'm running HDP 2.4.0.0-169
2. Using