Here is a script I use to submit a directory of jar files. It assumes the jar
files are in target/dependency/ or lib/.
DRIVER_PATH=
DEPEND_PATH=
if [ -d "lib" ]; then
  DRIVER_PATH="lib"
  DEPEND_PATH="lib"
else
  DRIVER_PATH="target"
  DEPEND_PATH="target/dependency"
fi
DEPEND_JARS="log4j.properties"
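The script is cut off after initializing DEPEND_JARS. A minimal sketch of how the remainder might build the comma-separated jar list and hand it to spark-submit — the loop, the main class, the master URL, and the application jar name are assumptions, not from the original:

```shell
#!/bin/sh
# Demo setup so the glob below has something to match; in the real script
# these jars already exist under target/dependency or lib/.
DEPEND_PATH="target/dependency"
DRIVER_PATH="target"
mkdir -p "$DEPEND_PATH"
touch "$DEPEND_PATH/a.jar" "$DEPEND_PATH/b.jar"

# Append every jar under $DEPEND_PATH to the comma-separated --jars value.
DEPEND_JARS="log4j.properties"
for f in "$DEPEND_PATH"/*.jar; do
  DEPEND_JARS="$DEPEND_JARS,$f"
done
echo "$DEPEND_JARS"

# The submission itself would then look something like (placeholders; not run here):
# spark-submit --class com.example.Main --master spark://host:7077 \
#   --jars "$DEPEND_JARS" "$DRIVER_PATH/app.jar"
```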
One more question: how would I submit additional jars to the spark-submit job?
I used the --jars option, but it does not seem to work, as explained earlier.
Thanks for the help,
-D
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-sql-SQLException-No-suitable
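A common cause of this in the JDBC case: --jars distributes jars to the executors, but java.sql.DriverManager runs in the driver JVM, so the JDBC driver jar also has to be on the driver's own classpath via --driver-class-path (or the spark.driver.extraClassPath conf). A sketch — the main class, master URL, and jar names are placeholders:

```shell
# Placeholder names; the connector jar must match your MySQL driver version.
spark-submit \
  --class com.example.Main \
  --master spark://host:7077 \
  --jars mysql-connector-java-5.1.34-bin.jar \
  --driver-class-path mysql-connector-java-5.1.34-bin.jar \
  app.jar
```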
Hi All,
I tried building combined.jar in the shell script. It works when I use
spark-shell, but with spark-submit I hit the same issue.
Help is highly appreciated.
Thanks
-D
--
.1.34-bin.jar with
> timestamp 1419032530456
> 14/12/19 23:42:10 INFO SparkContext: Added JAR file:/root/abc/myjar.jar at
> http://10.61.187.176:57956/jars/filesplitter_2.10-1.0.jar with timestamp
> 1419032530459
> 14/12/19 23:42:10 INFO AppClient$ClientActor: Connecting to master
>
ws.com:7077...
Exception in thread "main" java.sql.SQLException: No suitable driver found
for "jdbc:mysql://192.168.20.45:3306/abcdb"?user="root"&password="admin"
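Beyond the classpath issue, the URL echoed in the exception looks malformed: if those quote characters are really part of the string handed to DriverManager, no registered driver will accept a URL that begins with a quote, which produces exactly this "No suitable driver" error. A cleaned-up form of the URL, rebuilt from the credentials shown in the log above:

```shell
# JDBC URL without the stray quote characters (host/db/credentials from the log)
JDBC_URL='jdbc:mysql://192.168.20.45:3306/abcdb?user=root&password=admin'
echo "$JDBC_URL"
```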
--