Thanks for the reply.
Actually, I don't think excluding spark-hive from spark-submit --packages
is a good idea.
I don't want to recompile Spark from source for my cluster every time a
new Spark release comes out.
I prefer using the binary distribution of Spark and then adding the jars
needed for job execution.
I want to add spark-hive as a dependency to submit my job, but it seems that
spark-submit cannot resolve it:
$ ./bin/spark-submit \
  --packages org.apache.spark:spark-hive_2.10:1.4.0,org.postgresql:postgresql:9.3-1103-jdbc3,joda-time:joda-time:2.8.1 \
  --class
spark-hive is excluded when using --packages, because it can be included in
the spark-assembly by adding -Phive during mvn package or sbt assembly.
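For reference, the build Burak describes looks roughly like the following. This is a sketch based on the Spark 1.4-era Maven build; the exact profiles (e.g. -Phive-thriftserver, Hadoop version flags) depend on your cluster setup:

```shell
# Build a Spark assembly that bundles spark-hive, run from the Spark
# source root. -Phive pulls Hive support into the assembly jar, so
# spark-hive no longer needs to be resolved via --packages at submit time.
mvn -Phive -Phive-thriftserver -DskipTests clean package
```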
Best,
Burak
On Tue, Jul 7, 2015 at 8:06 AM, Hao Ren inv...@gmail.com wrote:
I want to add spark-hive as a dependency to submit my job, but it