No, those have to be local paths.
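For example, something along these lines should work (a sketch, not
tested here; /opt/libs/mysql-jdbc.jar is a hypothetical path that must
exist at the same location on every node):

  ./bin/spark-shell --master yarn-client \
    --conf spark.driver.extraClassPath=/opt/libs/mysql-jdbc.jar \
    --conf spark.executor.extraClassPath=/opt/libs/mysql-jdbc.jar

Since these options modify the system class path, the driver becomes
visible to java.sql.DriverManager, which --jars can't guarantee.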

On Thu, Apr 23, 2015 at 6:53 PM, Night Wolf <nightwolf...@gmail.com> wrote:
> Thanks Marcelo, can this be a path on HDFS?
>
> On Fri, Apr 24, 2015 at 11:52 AM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
>>
>> You'd have to use spark.{driver,executor}.extraClassPath to modify the
>> system class loader. But that also means you have to manually
>> distribute the jar to the nodes in your cluster, into a common
>> location.
>>
>> On Thu, Apr 23, 2015 at 6:38 PM, Night Wolf <nightwolf...@gmail.com>
>> wrote:
>> > Hi guys,
>> >
>> > Having a problem building a DataFrame in Spark SQL from a JDBC data
>> > source when running with --master yarn-client and adding the JDBC
>> > driver JAR with --jars. If I run with a local[*] master, everything
>> > works fine.
>> >
>> > ./bin/spark-shell --jars /tmp/libs/mysql-jdbc.jar --master yarn-client
>> >
>> > sqlContext.load("jdbc", Map("url" ->
>> > "jdbc:mysql://mysqlsvr:3306/MyDB?user=usr&password=pwd", "driver" ->
>> > "com.mysql.jdbc.Driver", "dbtable" -> "MY_TBL"))
>> >
>> >
>> > This throws a ClassNotFoundException when running through Spark SQL,
>> > but if I load the driver class manually on the driver or the workers,
>> > it is found without any problem. So I'm guessing this is an issue
>> > with the primordial class loader / Java security in DriverManager
>> > versus the class loader Spark SQL uses when running on YARN.
>> >
>> > Any ideas? The only thing I have found that works is merging my
>> > MySQL JDBC driver into the Spark assembly JAR that's shipped to
>> > YARN, since adding it with --jars doesn't work.
>> >
>> > Cheers!
>>
>>
>>
>> --
>> Marcelo
>
>
-- 
Marcelo
