There is an existing Jira to allow specifying the necessary JARs directly in the 
CREATE TABLE command, but it has not been merged yet: 
HIVE-9252 <https://issues.apache.org/jira/browse/HIVE-9252>
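
In the meantime, the per-script route mentioned in the original question below 
does work; a minimal sketch (the HDFS path and SerDe class name are placeholders):

    -- register the SerDe jar from HDFS for the current session
    ADD JAR hdfs:///user/hive/jars/my-serde.jar;

    -- use the SerDe in a table definition (class name is a placeholder)
    CREATE TABLE example_table (col1 STRING, col2 INT)
    ROW FORMAT SERDE 'com.example.MySerDe'
    STORED AS TEXTFILE;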


________________________________
From: Jörn Franke <jornfra...@gmail.com>
Sent: Wednesday, April 12, 2017 8:04 AM
To: user@hive.apache.org
Subject: Re: Using JAR files located on HDFS for SerDe

I do not think it is supported. The jar for Hive must be on the local filesystem 
of the Hive server (though not necessarily on all nodes).
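
In practice that means copying the jar to the Hive server's local filesystem and 
pointing hive.aux.jars.path at it in hive-site.xml; a sketch with a placeholder path:

    <property>
      <name>hive.aux.jars.path</name>
      <!-- local path on the Hive server host (placeholder) -->
      <value>file:///opt/hive/auxlib/my-serde.jar</value>
    </property>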

On 12. Apr 2017, at 16:57, Mahdi Mohammadinasab <mah...@gmail.com> wrote:

Hello,

I am trying to add a JAR file located on HDFS so that it can later be used as a 
SerDe. This is completely possible using the "ADD JAR" command, but I would prefer 
to use the hive.aux.jars.path setting in "hive-site.xml" or the "HIVE_AUX_JARS_PATH" 
environment variable (because then I don't need to update all of my scripts with 
an ADD JAR command).
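
For instance, instead of per-script ADD JAR statements, something like this in 
hive-env.sh (the HDFS path is a placeholder):

    # hive-env.sh
    export HIVE_AUX_JARS_PATH=hdfs:///user/hive/jars/my-serde.jar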

Unfortunately, neither of these approaches seems to work. I tried different ways of 
addressing the JAR (hdfs://, hdfs:///, hdfs://cluster_name/path/file.jar, 
hdfs://localhost:90000/path/file.jar, ...), but none of them works.

So how can I do this?

Thanks,
Mahdi
