[ 
https://issues.apache.org/jira/browse/SPARK-17126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15432767#comment-15432767
 ] 

Ozioma Ihekwoaba commented on SPARK-17126:
------------------------------------------

It works, all my jars get listed in the Spark Web UI.
I think from Java 6 upwards you can use the wildcard option to specify all jars 
in a directory on the classpath.
I get your point; my scenario was a spark-shell tutorial session for Spark SQL 
using a custom Hive instance.
I needed a way to add the MySQL connector jar to the classpath for the Hive 
metastore, as well as other jars like the Spark CSV jar.
It works like a charm on Linux but failed repeatedly on Windows.
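For illustration, a minimal sketch of the conf/spark-defaults.conf settings being discussed, using the lib paths from this report; the `\*` form is the Java 6+ classpath wildcard mentioned above, and whether Spark 1.6 resolves it correctly on Windows is precisely what failed here:

```
# spark-defaults.conf -- illustrative sketch only, not a confirmed fix.

# Linux (':' is the classpath separator):
spark.driver.extraClassPath    /opt/spark/lib/*
spark.executor.extraClassPath  /opt/spark/lib/*

# Windows (';' is the classpath separator):
spark.driver.extraClassPath    C:\hadoop\spark\v161\lib\*
spark.executor.extraClassPath  C:\hadoop\spark\v161\lib\*
```

The same path can also be passed on the command line, e.g. `spark-shell --driver-class-path "C:\hadoop\spark\v161\lib\*"`, which avoids the conf file while experimenting.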
Just curious, do you know of any company running production Spark clusters on 
Windows?
Because it appears Spark is not built for Windows, and all the examples assume 
a Linux setting.
The thing is, lots of up-and-coming young devs are totally flummoxed by the 
Linux command line, and since they use Windows by default, Windows should be 
supported at a minimum... as a dev platform.
You know, like the sbin folder scripts, which are all bash scripts.

Ok, that was a subtle rant; maybe I should adapt the scripts myself to run on 
Windows.
Thanks for the awesome work!

> Errors setting driver classpath in spark-defaults.conf on Windows 7
> -------------------------------------------------------------------
>
>                 Key: SPARK-17126
>                 URL: https://issues.apache.org/jira/browse/SPARK-17126
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Shell, SQL
>    Affects Versions: 1.6.1
>         Environment: Windows 7
>            Reporter: Ozioma Ihekwoaba
>
> I am having issues starting up Spark shell with a local hive-site.xml on 
> Windows 7.
> I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
> The Hive instance is working fine.
> I copied over the hive-site.xml to my local instance of Spark 1.6.1 conf 
> folder and also copied over mysql-connector-java-5.1.25-bin.jar to the lib 
> folder.
> I was expecting Spark to pick up jar files in the lib folder automatically, 
> but found out Spark expects the spark.driver.extraClassPath and 
> spark.executor.extraClassPath settings to resolve jars.
> The thing is, this has failed on Windows for me with a 
> DataStoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be 
> found.
> Here are some of the different file paths I've tried:
> C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
> ".;C:\hadoop\spark\v161\lib\*"
> ...none has worked so far.
> Please, what is the correct way to set driver classpaths on Windows?
> Also, what is the correct file path format on Windows?
> I have it working fine on Linux but my current engagement requires me to run 
> Spark on a Windows box.
> Is there a way for Spark to automatically resolve jars from the lib folder in 
> all modes?
> Thanks.
> Ozzy



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
