The issue is solved. There was a problem in my hive codebase. Once that was
fixed, Spark built with -Phive-provided is working fine against my hive jars.
On 27 April 2015 at 08:00, Manku Timma manku.tim...@gmail.com wrote:
Made some progress on this. Adding the hive jars to the system classpath is
needed. But it looks like they need to be towards the end of the classpath,
after the system classes. Manually adding the hive classpath into
Client.populateHadoopClasspath solved the issue. But a new issue has come
up. It looks like some hive ...
Setting SPARK_CLASSPATH is triggering other errors. Not working.
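For reference, a minimal sketch of the kind of change described above for
Client.populateHadoopClasspath. The helper name and the direct handling of the
CLASSPATH entry are illustrative only, not the actual Spark 1.3 yarn/Client.scala code:

  import scala.collection.mutable

  // Append the hive jars AFTER the Hadoop/system entries already present in the
  // container's CLASSPATH, so they end up towards the end of the classpath.
  def appendHiveClasspath(env: mutable.HashMap[String, String], hiveLibDir: String): Unit = {
    val sep = java.io.File.pathSeparator              // ":" on Linux
    val hiveEntry = s"$hiveLibDir/*"                  // e.g. /usr/lib/hive/lib/*
    env("CLASSPATH") = env.get("CLASSPATH") match {
      case Some(existing) => existing + sep + hiveEntry
      case None           => hiveEntry
    }
  }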
On 25 April 2015 at 09:16, Manku Timma manku.tim...@gmail.com wrote:
Actually found the culprit. JavaSerializerInstance.deserialize is called with
a classloader (of type MutableURLClassLoader) which has access to all the hive
classes. But internally it triggers a call to loadClass with the default
classloader. Below is the stacktrace (line numbers in the ...
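To illustrate the failure mode (a sketch only, not Spark's actual JavaSerializer
code): if the nested class lookup does not go through the supplied classloader, it
falls back to the default application classloader, which cannot see the hive jars.
A loader-aware ObjectInputStream is the shape of code that avoids this:

  import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

  // Resolve classes against an explicitly supplied loader instead of the default one.
  class LoaderAwareObjectInputStream(in: InputStream, loader: ClassLoader)
      extends ObjectInputStream(in) {
    override def resolveClass(desc: ObjectStreamClass): Class[_] =
      Class.forName(desc.getName, false, loader)
  }

  // Deserialize with the given loader (e.g. the MutableURLClassLoader that can see
  // the hive classes); a plain ObjectInputStream would use the default loader.
  def deserialize[T](bytes: Array[Byte], loader: ClassLoader): T = {
    val in = new LoaderAwareObjectInputStream(new ByteArrayInputStream(bytes), loader)
    try in.readObject().asInstanceOf[T] finally in.close()
  }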
I see. Now try a slightly tricky approach: add the hive jar to SPARK_CLASSPATH
(in the conf/spark-env.sh file on all machines) and make sure that jar is
available on all the machines in the cluster at the same path.
Thanks
Best Regards
On Wed, Apr 22, 2015 at 11:24 AM, Manku Timma wrote:
Akhil, Thanks for the suggestions.
I tried out sc.addJar, --jars, and --conf spark.executor.extraClassPath, and
none of them helped. I added entries to compute-classpath.sh. That did not
change anything. I checked the classpath of the running executor and made
sure that the hive jars are in that directory.
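For completeness, roughly what those attempts look like in code. The paths and
jar names below are placeholders, not the actual ones from this job:

  import org.apache.spark.{SparkConf, SparkContext}

  // Equivalent of passing --conf spark.executor.extraClassPath=... to spark-submit.
  val conf = new SparkConf()
    .setAppName("hive-classpath-test")
    .set("spark.executor.extraClassPath", "/path/to/hive/lib/*")

  val sc = new SparkContext(conf)

  // Equivalent of listing the jar via --jars, or calling addJar from the driver.
  sc.addJar("/path/to/hive/lib/hive-exec-0.13.1.jar")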
Can you try sc.addJar("/path/to/your/hive/jar")? I think it will resolve it.
Thanks
Best Regards
On Mon, Apr 20, 2015 at 12:26 PM, Manku Timma manku.tim...@gmail.com
wrote:
Akhil,
But the first case of creating HiveConf on the executor works fine (map
case). Only the second case fails. I was suspecting some foul play with
classloaders.
I am using spark-1.3 with hadoop-provided and hive-provided and hive-0.13.1
profiles. I am running a simple spark job on a yarn cluster by adding all
hadoop2 and hive13 jars to the spark classpaths.
If I remove the hive-provided while building spark, I don't face any issue.
But with hive-provided ...
Looks like a missing jar, try to print the classpath and make sure the hive
jar is present.
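Something like this (a sketch, assuming a live SparkContext sc) will show what
the executor JVM actually has on its classpath:

  // Run a single task and report the executor's classpath and classloader URLs.
  val report = sc.parallelize(Seq(0), 1).map { _ =>
    val cp = sys.props.getOrElse("java.class.path", "<empty>")
    val loaderUrls = Thread.currentThread().getContextClassLoader match {
      case u: java.net.URLClassLoader => u.getURLs.mkString(", ")
      case other                      => s"<${other.getClass.getName}>"
    }
    s"java.class.path = $cp\ncontext loader URLs = $loaderUrls"
  }.first()
  println(report)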
Thanks
Best Regards
On Mon, Apr 20, 2015 at 11:52 AM, Manku Timma manku.tim...@gmail.com
wrote:
I am using spark-1.3 with hadoop-provided and hive-provided and
hive-0.13.1 profiles. I am running a simple spark job on a yarn cluster by
adding all hadoop2 and hive13 jars to the spark classpaths.
Akhil,
But the first case of creating HiveConf on the executor works fine (map
case). Only the second case fails. I was suspecting some foul play with
classloaders.
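For context, the working case looks roughly like the sketch below. The actual
job code is not in this thread, so this only illustrates creating HiveConf
inside a map task; the second, failing case is not reproduced here. It assumes
a live SparkContext sc and hive jars visible to the executors:

  import org.apache.hadoop.hive.conf.HiveConf

  // Case 1 (works): instantiate HiveConf inside a map task on the executor.
  val metastoreUris = sc.parallelize(1 to 2).map { _ =>
    val hiveConf = new HiveConf()
    hiveConf.get("hive.metastore.uris", "<unset>")
  }.collect()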
On 20 April 2015 at 12:20, Akhil Das ak...@sigmoidanalytics.com wrote:
Looks like a missing jar, try to print the classpath and make sure the hive
jar is present.