Hi,

Just out of sheer curiosity - why are you using Spark 1.6? Spark has made
significant advances and improvements since then, so why not take advantage
of that?


Regards,
Gourav

On Wed, Aug 9, 2017 at 10:41 AM, toletum <tole...@toletum.org> wrote:

> Thanks Matteo
>
> I fixed it
>
> Regards,
> JCS
>
> On Wed., Aug. 9, 2017 at 11:22, Matteo Cossu <elco...@gmail.com> wrote:
>
> Hello,
> try to use these options when starting Spark:
>
> --conf "spark.driver.userClassPathFirst=true" --conf
> "spark.executor.userClassPathFirst=true"
> This way you can be sure that the Spark driver and executors
> will use the classpath you define.
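
A sketch of how those options might be combined with --jars on the spark-sql command line (the jar paths are assumptions based on the layout described later in this thread; adjust to your install):

```shell
# Sketch only: start spark-sql with the Derby client jars shipped to the
# driver and executors, and with user-supplied jars taking precedence
# over Spark's own bundled classes.
$SPARK_HOME/bin/spark-sql \
  --conf "spark.driver.userClassPathFirst=true" \
  --conf "spark.executor.userClassPathFirst=true" \
  --jars "$SPARK_HOME/lib/derby.jar,$SPARK_HOME/lib/derbyclient.jar,$SPARK_HOME/lib/derbytools.jar"
```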
>
> Best Regards,
> Matteo Cossu
>
>
> On 5 August 2017 at 23:04, toletum <tole...@toletum.org> wrote:
>
> Hi everybody
>
> I'm trying to connect Spark to Hive.
>
> Hive uses Derby Server for metastore_db.
>
> $SPARK_HOME/conf/hive-site.xml
>
> <configuration>
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>   <value>jdbc:derby://derby:1527/metastore_db;create=true</value>
>   <description>JDBC connect string for a JDBC metastore</description>
> </property>
>
> <property>
>   <name>javax.jdo.option.ConnectionDriverName</name>
>   <value>org.apache.derby.jdbc.ClientDriver</value>
>   <description>Driver class name for a JDBC metastore</description>
> </property>
> </configuration>
>
> I have copied to $SPARK_HOME/lib derby.jar, derbyclient.jar, derbytools.jar
>
> Added to CLASSPATH the 3 jars too
>
> $SPARK_HOME/lib/derby.jar:$SPARK_HOME/lib/derbytools.jar:$SPARK_HOME/lib/derbyclient.jar
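
A sketch of building and exporting that CLASSPATH entry in the shell (the /opt/spark fallback is only an example; adjust SPARK_HOME to your install):

```shell
# Sketch: build the classpath entry for the three Derby jars and prepend
# it to any existing CLASSPATH. /opt/spark is an assumed default.
SPARK_HOME=${SPARK_HOME:-/opt/spark}
DERBY_CP="$SPARK_HOME/lib/derby.jar:$SPARK_HOME/lib/derbytools.jar:$SPARK_HOME/lib/derbyclient.jar"
export CLASSPATH="$DERBY_CP${CLASSPATH:+:$CLASSPATH}"
echo "$CLASSPATH"
```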
>
> But spark-sql says:
>
> org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException:
> The specified datastore driver ("org.apache.derby.jdbc.ClientDriver") was
> not found in the CLASSPATH. Please check your CLASSPATH specification, and
> the name of the driver.
>
> java finds the class
>
> java org.apache.derby.jdbc.ClientDriver
> Error: Main method not found in class org.apache.derby.jdbc.ClientDriver,
> please define the main method as:
>    public static void main(String[] args)
> or a JavaFX application class must extend javafx.application.Application
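
That check only proves the class resolves from the shell's CLASSPATH. A sketch of verifying that a specific jar resolves the class on its own, independent of the environment (the derbyclient.jar path is an assumption):

```shell
# Sketch: resolve the driver class from one explicit jar.
# "Main method not found" means the class WAS found (a JDBC driver has no
# main); "Could not find or load main class" means the jar does not
# contain it or the path is wrong.
java -cp "$SPARK_HOME/lib/derbyclient.jar" org.apache.derby.jdbc.ClientDriver
```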
>
> It seems Spark can't find the driver.
>
