[
https://issues.apache.org/jira/browse/SPARK-17126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15432715#comment-15432715
]
Sean Owen commented on SPARK-17126:
-----------------------------------
Hm, I'm not sure that "*" works as a classpath entry on any JVM. Maybe I'm
missing a reason it works for the env variable. But in general you would not
specify it this way, which could be the problem. You would also not normally
set app jar dependencies this way, but rather build them into your app.
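For reference, here is a sketch of the kind of spark-defaults.conf entries
that are usually expected on Windows (paths are taken from the report above
and are illustrative; adjust to your install):

```properties
# spark-defaults.conf -- illustrative Windows setup, not a verified fix.
# Point at the concrete jar rather than a "*" wildcard; forward slashes
# are generally accepted by the JVM on Windows, and ';' (not ':') is the
# Windows classpath separator if you list multiple jars.
spark.driver.extraClassPath   C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar
spark.executor.extraClassPath C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar
```

Alternatively, the jar can be passed at launch with
`spark-shell --jars C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar`,
which distributes it to the executors as well.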
> Errors setting driver classpath in spark-defaults.conf on Windows 7
> -------------------------------------------------------------------
>
> Key: SPARK-17126
> URL: https://issues.apache.org/jira/browse/SPARK-17126
> Project: Spark
> Issue Type: Question
> Components: Spark Shell, SQL
> Affects Versions: 1.6.1
> Environment: Windows 7
> Reporter: Ozioma Ihekwoaba
>
> I am having issues starting up Spark shell with a local hive-site.xml on
> Windows 7.
> I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
> The Hive instance is working fine.
> I copied over the hive-site.xml to my local instance of Spark 1.6.1 conf
> folder and also copied over mysql-connector-java-5.1.25-bin.jar to the lib
> folder.
> I was expecting Spark to pick up jar files in the lib folder automatically,
> but found out Spark expects spark.driver.extraClassPath and
> spark.executor.extraClassPath settings to resolve jars.
> The thing is, this has failed on Windows for me with a
> DataStoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be
> found.
> Here are some of the different file paths I've tried:
> C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
> ".;C:\hadoop\spark\v161\lib\*"
> None of these has worked so far.
> Please, what is the correct way to set driver classpaths on Windows?
> Also, what is the correct file path format on Windows?
> I have it working fine on Linux but my current engagement requires me to run
> Spark on a Windows box.
> Is there a way for Spark to automatically resolve jars from the lib folder in
> all modes?
> Thanks.
> Ozzy
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)