I fought with a ClassNotFoundException for quite some time, though in my
case it was for Kafka.

The final configuration that got everything working was running spark-submit
with the following options:

--jars "/path/to/.ivy2/jars/package.jar" \
--driver-class-path "/path/to/.ivy2/jars/package.jar" \
--conf "spark.executor.extraClassPath=/path/to/.ivy2/jars/package.jar" \
--packages org.some.package:package_name:version

While I needed this to run in cluster mode, it works equally well in
client mode.

One other note when you need to supply multiple items to these args:
--jars and --packages should be comma-separated; --driver-class-path and
extraClassPath should be colon-separated.
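To illustrate, here is a sketch of a submit with two jars; the jar paths,
package coordinates, and application file are all placeholders, not the
actual values from my setup:

```shell
# Hypothetical sketch -- every path and coordinate below is a placeholder.
# --jars and --packages take comma-separated lists;
# --driver-class-path and spark.executor.extraClassPath take colon-separated lists.
spark-submit \
  --jars "/path/to/a.jar,/path/to/b.jar" \
  --driver-class-path "/path/to/a.jar:/path/to/b.jar" \
  --conf "spark.executor.extraClassPath=/path/to/a.jar:/path/to/b.jar" \
  --packages org.some.package:package_name:version \
  my_app.py
```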

HTH

On Fri, Apr 13, 2018 at 4:28 AM, jb44 <jbo...@gmail.com> wrote:

> Haoyuan -
>
> As I mentioned below, I've been through the documentation already.  It has
> not helped me to resolve the issue.
>
> Here is what I have tried so far:
>
> - setting extraClassPath as explained below
> - adding fs.alluxio.impl through sparkconf
> - adding spark.sql.hive.metastore.sharedPrefixes (though I don't believe
> this matters in my case)
> - compiling the client from source
>
> Do you have any other suggestions on how to get this working?
>
> Thanks
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
>
