[ https://issues.apache.org/jira/browse/SPARK-17126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15426391#comment-15426391 ]

Ozioma Ihekwoaba commented on SPARK-17126:
------------------------------------------

Hi Sean,

Sorry, but could you be more direct in your responses? I would really appreciate it.
I have done my share of googling for answers on this topic, and as I noted in my reply,
ALL of the file paths I set are NOT working.
So I don't know which of the file paths you were referring to as correct.
Is it this?
spark.driver.extraClassPath C:\\hadoop\\spark\\v161\\lib\\*

Or this?
spark.driver.extraClassPath 
C:\\hadoop\\spark\\v161\\lib\\mysql-connector-java-5.1.25-bin.jar;C:\\hadoop\\spark\\v161\\lib\\commons-csv-1.4.jar;C:\\hadoop\\spark\\v161\\lib\\spark-csv_2.11-1.4.0.jar
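
For reference, a quick way to check from spark-shell whether either setting was actually picked up (plain Spark/JVM calls; a diagnostic sketch only, not a fix):

  // Does the driver see the setting at all?
  sc.getConf.getOption("spark.driver.extraClassPath").foreach(println)
  // What is actually on the driver's JVM classpath?
  println(System.getProperty("java.class.path"))
  // If the MySQL driver jar made it onto the classpath, this should not throw:
  Class.forName("com.mysql.jdbc.Driver")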

Also, I find this confusing: "It also looks like you're referring to a local file 
on remote executors, so I imagine that could be the problem". I'm thinking a 
straight-up example would do the trick.

I understand you are attending to so many people, but please could you help with 
this.
If my SPARK_HOME = C:\\hadoop\\spark\\v161
and I need to add the following jars from the %SPARK_HOME%\lib folder to the 
driver classpath:
mysql-connector-java-5.1.25-bin.jar
spark-csv_2.11-1.4.0.jar

Could you give AN EXAMPLE of the correct file path format on WINDOWS?
spark.driver.extraClassPath        {correct format}
spark.executor.extraClassPath      {correct format}
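
For concreteness, this is the kind of entry I am guessing at, assuming ';' is the classpath separator on Windows and that spark-defaults.conf is parsed as a Java properties file (so backslashes would need to be doubled, or forward slashes used instead). Both of those are assumptions I would like confirmed:

  spark.driver.extraClassPath      C:\\hadoop\\spark\\v161\\lib\\mysql-connector-java-5.1.25-bin.jar;C:\\hadoop\\spark\\v161\\lib\\spark-csv_2.11-1.4.0.jar
  spark.executor.extraClassPath    C:\\hadoop\\spark\\v161\\lib\\mysql-connector-java-5.1.25-bin.jar;C:\\hadoop\\spark\\v161\\lib\\spark-csv_2.11-1.4.0.jar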


Thanks again,
Ozzy

> Errors setting driver classpath in spark-defaults.conf on Windows 7
> -------------------------------------------------------------------
>
>                 Key: SPARK-17126
>                 URL: https://issues.apache.org/jira/browse/SPARK-17126
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Shell, SQL
>    Affects Versions: 1.6.1
>         Environment: Windows 7
>            Reporter: Ozioma Ihekwoaba
>
> I am having issues starting up Spark shell with a local hive-site.xml on 
> Windows 7.
> I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
> The Hive instance is working fine.
> I copied over the hive-site.xml to my local instance of Spark 1.6.1 conf 
> folder and also copied over mysql-connector-java-5.1.25-bin.jar to the lib 
> folder.
> I was expecting Spark to pick up jar files in the lib folder automatically, 
> but found out Spark expects spark.driver.extraClassPath and 
> spark.executor.extraClassPath settings to resolve jars.
> The thing is, this has failed on Windows for me with a 
> DataStoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be 
> found.
> Here are some of the different file paths I've tried:
> C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
> ".;C:\hadoop\spark\v161\lib\*"
> ....NONE has worked so far.
> Please, what is the correct way to set driver classpaths on Windows?
> Also, what is the correct file path format on Windows?
> I have it working fine on Linux but my current engagement requires me to run 
> Spark on a Windows box.
> Is there a way for Spark to automatically resolve jars from the lib folder in 
> all modes?
> Thanks.
> Ozzy



