[ https://issues.apache.org/jira/browse/SPARK-17126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15426198#comment-15426198 ]

Ozioma Ihekwoaba commented on SPARK-17126:
------------------------------------------

Yes, the entries make it onto the classpath; I verified this in the web UI.
However, the jars are not loaded during startup and the failure persists.
These are the current entries I have:

spark.driver.extraClassPath
C:\\hadoop\\spark\\v161\\lib\\mysql-connector-java-5.1.25-bin.jar;C:\\hadoop\\spark\\v161\\lib\\commons-csv-1.4.jar;C:\\hadoop\\spark\\v161\\lib\\spark-csv_2.11-1.4.0.jar
spark.executor.extraClassPath       
C:\\hadoop\\spark\\v161\\lib\\mysql-connector-java-5.1.25-bin.jar;C:\\hadoop\\spark\\v161\\lib\\commons-csv-1.4.jar;C:\\hadoop\\spark\\v161\\lib\\spark-csv_2.11-1.4.0.jar

spark.driver.extraClassPath     C:\\hadoop\\spark\\v161\\lib\\*
spark.executor.extraClassPath   C:\\hadoop\\spark\\v161\\lib\\*
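
For reference, the equivalent command-line form should be something like the
following (a sketch only; the paths assume the lib folder layout above, and
Windows uses ; as the classpath separator):

bin\spark-shell --driver-class-path "C:\hadoop\spark\v161\lib\mysql-connector-java-5.1.25-bin.jar;C:\hadoop\spark\v161\lib\commons-csv-1.4.jar;C:\hadoop\spark\v161\lib\spark-csv_2.11-1.4.0.jar"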

I think the issue is a straightforward one: if I have the above jars in the 
lib folder, how do I add them to the driver classpath ON WINDOWS?
Like so:

spark.driver.extraClassPath       jar1 jar2 ...etc
spark.executor.extraClassPath     jar1 jar2 ...etc
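
My understanding is that extraClassPath is passed through as a single JVM
classpath string, so the separator would be the platform path separator
(; on Windows, : on Linux) rather than spaces -- for example (C:\libs here
is just a placeholder path):

spark.driver.extraClassPath     C:\libs\jar1.jar;C:\libs\jar2.jar
spark.executor.extraClassPath   C:\libs\jar1.jar;C:\libs\jar2.jar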

I know there should be a way to do this, unless Apache Spark is strictly 
restricted to Linux environments.
Meanwhile, is there any reason why Spark could detect the DataNucleus jars in 
the lib folder but not the other jars in the same folder?

Kindly advise.

Thanks,
Ozzy


> Errors setting driver classpath in spark-defaults.conf on Windows 7
> -------------------------------------------------------------------
>
>                 Key: SPARK-17126
>                 URL: https://issues.apache.org/jira/browse/SPARK-17126
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Shell, SQL
>    Affects Versions: 1.6.1
>         Environment: Windows 7
>            Reporter: Ozioma Ihekwoaba
>
> I am having issues starting up Spark shell with a local hive-site.xml on 
> Windows 7.
> I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
> The Hive instance is working fine.
> I copied over the hive-site.xml to my local instance of Spark 1.6.1 conf 
> folder and also copied over mysql-connector-java-5.1.25-bin.jar to the lib 
> folder.
> I was expecting Spark to pick up jar files in the lib folder automatically, 
> but found out Spark expects the spark.driver.extraClassPath and 
> spark.executor.extraClassPath settings to resolve jars.
> The thing is, this has failed on Windows for me with a 
> DatastoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be 
> found.
> Here are some of the different file paths I've tried:
> C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
> ".;C:\hadoop\spark\v161\lib\*"
> ...none of them has worked so far.
> Please, what is the correct way to set driver classpaths on Windows?
> Also, what is the correct file path format on Windows?
> I have it working fine on Linux but my current engagement requires me to run 
> Spark on a Windows box.
> Is there a way for Spark to automatically resolve jars from the lib folder in 
> all modes?
> Thanks.
> Ozzy


