Here is a script I use to submit a directory of jar files. It assumes the jar
files are in target/dependency or lib/.
DRIVER_PATH=
DEPEND_PATH=
if [ -d "lib" ]; then
  DRIVER_PATH="lib"
  DEPEND_PATH="lib"
else
  DRIVER_PATH="target"
  DEPEND_PATH="target/dependency"
fi

# Start with log4j.properties, then append every jar in DEPEND_PATH,
# comma-separated, as spark-submit's --jars option expects.
DEPEND_JARS=log4j.properties
for f in "$DEPEND_PATH"/*.jar; do
  DEPEND_JARS="$DEPEND_JARS,$f"
done
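The comma-joining step can be sanity-checked on its own, outside of any Spark setup. Here is a minimal sketch; the directory and jar names are made up purely for the demo:

```shell
# Demo: build a comma-separated jar list from a directory, the same
# technique used above (demo paths only).
mkdir -p /tmp/demo_jars
touch /tmp/demo_jars/a.jar /tmp/demo_jars/b.jar

DEPEND_JARS=""
for f in /tmp/demo_jars/*.jar; do
  DEPEND_JARS="$DEPEND_JARS,$f"
done
DEPEND_JARS="${DEPEND_JARS#,}"   # strip the leading comma
echo "$DEPEND_JARS"              # /tmp/demo_jars/a.jar,/tmp/demo_jars/b.jar
```

The `${DEPEND_JARS#,}` expansion trims the stray leading comma so the list can be passed straight to --jars.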
One more question: how would I submit additional jars to the spark-submit
job? I used the --jars option, but it does not seem to work, as explained
earlier.
Thanks for the help,
-D
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-sql-SQLException-No-suitable
Hi All,
I tried to build combined.jar in a shell script. It works when I use
spark-shell, but with spark-submit I hit the same issue.
Help is highly appreciated.
Thanks
-D
With JDBC you often need to load the driver class at the beginning of your
program so it can register itself with DriverManager. Usually this is
something like:
Class.forName("com.mysql.jdbc.Driver");
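A quick way to check whether the driver jar actually reached the classpath is to wrap that call and report the failure. A minimal sketch (the class name is the usual MySQL Connector/J one; adjust it for your driver):

```java
// Sketch: verify the JDBC driver class is visible on the classpath.
// If Class.forName succeeds, the driver registers itself with
// java.sql.DriverManager; if the jar was not actually shipped
// (e.g. --jars did not take effect), you get ClassNotFoundException.
public class DriverCheck {
    public static void main(String[] args) {
        try {
            Class.forName("com.mysql.jdbc.Driver");
            System.out.println("driver loaded");
        } catch (ClassNotFoundException e) {
            System.out.println("driver missing: " + e.getMessage());
        }
    }
}
```

Running this from the same spark-submit invocation tells you whether the "No suitable driver" error comes from the jar being absent or from the driver not being registered.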
On Fri, Dec 19, 2014 at 3:47 PM, durga wrote:
> Hi I am facing an issue with mysql jars with spark-submit.
>
> I a