Deenar, I have not resolved this issue. Why do you think it's caused by
different versions of Derby? I was playing with this as a fun experiment,
and my setup was on a clean machine -- no other versions of
Hive/Hadoop/etc. installed.
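
If it helps, this is the kind of check I would run to rule that out (the
paths and SPARK_HOME are just from my setup):
```
# look for any Derby jars that the Spark distribution itself ships
find "$SPARK_HOME" -iname 'derby*.jar'
# and the ones from the standalone Derby download
find ~/db-derby-10.12.1.1-bin/lib -iname 'derby*.jar'
```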

On Sun, Dec 20, 2015 at 12:17 AM, Deenar Toraskar <deenar.toras...@gmail.com>
wrote:

> Apparently it is down to different versions of Derby on the classpath, but
> I am unsure where the other version is coming from. The setup worked
> perfectly with Spark 1.3.1.
>
> Deenar
>
> On 20 December 2015 at 04:41, Deenar Toraskar <deenar.toras...@gmail.com>
> wrote:
>
>> Hi Yana/All
>>
>> I am getting the same exception. Did you make any progress?
>>
>> Deenar
>>
>> On 5 November 2015 at 17:32, Yana Kadiyska <yana.kadiy...@gmail.com>
>> wrote:
>>
>>> Hi folks, I'm trying to experiment with a minimal external metastore.
>>>
>>> I am following the instructions here:
>>> https://cwiki.apache.org/confluence/display/Hive/HiveDerbyServerMode
>>>
>>> I grabbed Derby 10.12.1.1 and started a network server instance, verified
>>> I can connect via the ij tool, and confirmed the process is listening on
>>> port 1527.
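>>>
>>> For reference, starting and checking the server looked roughly like this
>>> (assuming the stock db-derby-10.12.1.1-bin layout; adjust paths as needed):
>>> ```
>>> # start the Derby network server (listens on port 1527 by default)
>>> ~/db-derby-10.12.1.1-bin/bin/startNetworkServer &
>>>
>>> # verify it accepts connections by piping a connect command to ij
>>> echo "connect 'jdbc:derby://localhost:1527/metastore_db;create=true';" \
>>>   | ~/db-derby-10.12.1.1-bin/bin/ij
>>> ```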
>>>
>>> I put the following hive-site.xml under conf:
>>> ```
>>> <?xml version="1.0"?>
>>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>> <configuration>
>>> <property>
>>>   <name>javax.jdo.option.ConnectionURL</name>
>>>   <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
>>>   <description>JDBC connect string for a JDBC metastore</description>
>>> </property>
>>> <property>
>>>   <name>javax.jdo.option.ConnectionDriverName</name>
>>>   <value>org.apache.derby.jdbc.ClientDriver</value>
>>>   <description>Driver class name for a JDBC metastore</description>
>>> </property>
>>> </configuration>
>>> ```
>>>
>>> I then try to run spark-shell thusly:
>>> bin/spark-shell --driver-class-path
>>> /home/yana/db-derby-10.12.1.1-bin/lib/derbyclient.jar
>>>
>>> and I get an ugly stack trace like so...
>>>
>>> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
>>> org.apache.derby.jdbc.EmbeddedDriver
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>> at java.lang.Class.newInstance(Class.java:379)
>>> at
>>> org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)
>>> at
>>> org.datanucleus.store.rdbms.connectionpool.DBCPConnectionPoolFactory.createConnectionPool(DBCPConnectionPoolFactory.java:50)
>>> at
>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>>> at
>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>>> at
>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>> ... 114 more
>>>
>>> <console>:10: error: not found: value sqlContext
>>>        import sqlContext.implicits._
>>>
>>>
>>> What am I doing wrong? I'm not sure why it's looking for the embedded
>>> driver at all -- I'm specifically trying not to use the embedded server...
>>> But I do know my hive-site.xml is being read, because starting without
>>> --driver-class-path complains that it can't load
>>> org.apache.derby.jdbc.ClientDriver.
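>>>
>>> For what it's worth, a quick way to sanity-check the jar passed via
>>> --driver-class-path is to list its contents and look for the client
>>> driver class (it should show org/apache/derby/jdbc/ClientDriver.class):
>>> ```
>>> unzip -l /home/yana/db-derby-10.12.1.1-bin/lib/derbyclient.jar \
>>>   | grep -i 'jdbc/ClientDriver'
>>> ```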
>>>
>>
>>
>
