[ 
https://issues.apache.org/jira/browse/HIVE-18177?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aegeaner updated HIVE-18177:
----------------------------
    Description: 
When creating a table in Hive with the following statement:

{code:sql}
create database if not exists ${DB};
use ${DB};

drop table if exists date_dim;

create table date_dim
stored as ${FILE}
as select * from ${SOURCE}.date_dim;
{code}

The statement fails with:

{code}
FAILED: SemanticException 0:0 Error creating temporary folder on: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
encountered near token 'TOK_TMP_FILE'
FAILED: SemanticException 0:0 Error creating temporary folder on: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
encountered near token 'TOK_TMP_FILE'
{code}


The full exception stack is:

{code}
2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
parse.CalcitePlanner: Completed phase 1 of Semantic Analysis
2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
parse.CalcitePlanner: Get metadata for source tables
2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
metastore.HiveMetaStore: 0: get_table : db=tpcds_text_2 tbl=date_dim
2017-11-29T17:32:47,647  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_table : 
db=tpcds_text_2 tbl=date_dim
2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
parse.CalcitePlanner: Get metadata for subqueries
2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
parse.CalcitePlanner: Get metadata for destination tables
2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
metastore.HiveMetaStore: 0: get_database: tpcds_bin_partitioned_orc_2
2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
HiveMetaStore.audit: ugi=hadoop       ip=unknown-ip-addr      cmd=get_database: 
tpcds_bin_partitioned_orc_2
2017-11-29T17:32:48,308  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
common.FileUtils: Creating directory if it doesn't exist: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1
2017-11-29T17:32:48,330 ERROR [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
parse.CalcitePlanner: org.apache.hadoop.hive.ql.parse.SemanticException: 0:0 
Error creating temporary folder on: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
encountered near token 'TOK_TMP_FILE'
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2211)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1934)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11080)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
        at 
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
        at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
        at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
        at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
        at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.RuntimeException: Cannot create staging directory 
'hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1':
 Invalid host name: local host is: (unknown); destination host is: 
"ns-offline":8020; java.net.UnknownHostException; For more details see:  
http://wiki.apache.org/hadoop/UnknownHost
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:374)
        at 
org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:632)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2208)
        ... 25 more
Caused by: java.net.UnknownHostException: Invalid host name: local host is: 
(unknown); destination host is: "ns-offline":8020; 
java.net.UnknownHostException; For more details see:  
http://wiki.apache.org/hadoop/UnknownHost
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
        at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:445)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
        at org.apache.hadoop.ipc.Client.call(Client.java:1373)
        at org.apache.hadoop.ipc.Client.call(Client.java:1337)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy36.getFileInfo(Unknown Source)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:787)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
        at com.sun.proxy.$Proxy37.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1700)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1436)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1433)
        at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1433)
        at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:528)
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:366)
        ... 27 more
Caused by: java.net.UnknownHostException
        ... 52 more

2017-11-29T17:32:48,331 ERROR [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
ql.Driver: FAILED: SemanticException 0:0 Error creating temporary folder on: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
encountered near token 'TOK_TMP_FILE'
org.apache.hadoop.hive.ql.parse.SemanticException: 0:0 Error creating temporary 
folder on: 
hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
encountered near token 'TOK_TMP_FILE'
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2211)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1934)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11080)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
        at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
        at 
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
        at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
        at 
org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
        at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
        at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.RuntimeException: Cannot create staging directory 
'hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1':
 Invalid host name: local host is: (unknown); destination host is: 
"ns-offline":8020; java.net.UnknownHostException; For more details see:  
http://wiki.apache.org/hadoop/UnknownHost
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:374)
        at 
org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:632)
        at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2208)
        ... 25 more
Caused by: java.net.UnknownHostException: Invalid host name: local host is: 
(unknown); destination host is: "ns-offline":8020; 
java.net.UnknownHostException; For more details see:  
http://wiki.apache.org/hadoop/UnknownHost
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
        at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:445)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
        at org.apache.hadoop.ipc.Client.call(Client.java:1373)
        at org.apache.hadoop.ipc.Client.call(Client.java:1337)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy36.getFileInfo(Unknown Source)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:787)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
        at com.sun.proxy.$Proxy37.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1700)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1436)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1433)
        at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1433)
        at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:528)
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:366)
        ... 27 more
Caused by: java.net.UnknownHostException
        ... 52 more
{code}

The Hadoop core-site.xml contains:

{code:xml}
<property>
  <name>fs.default.name</name>
  <value>hdfs://ns-offline</value>
</property>
{code}
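For an hdfs:// URI with a logical nameservice like {{ns-offline}} to resolve, the client-side configuration must also carry the HDFS HA nameservice mapping; the UnknownHostException above is the typical symptom when the Configuration actually handed to the FileSystem lacks these properties. A minimal sketch of the required properties (the NameNode IDs and host names below are illustrative placeholders, not taken from this cluster; note also that {{fs.default.name}} is deprecated in favor of {{fs.defaultFS}}):

{code:xml}
<!-- Sketch: HDFS HA client settings that map the logical nameservice
     "ns-offline" to concrete NameNode RPC addresses.
     nn1/nn2 and the example.com hosts are placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>ns-offline</value>
</property>
<property>
  <name>dfs.ha.namenodes.ns-offline</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ns-offline.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ns-offline.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.ns-offline</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
{code}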

The exception is thrown in Hive QL at:
[https://github.com/apache/hive/blob/07fe7e210cb444aec43cb5adda37f8f7cd26f243/ql/src/java/org/apache/hadoop/hive/ql/Context.java#L396]
which invokes the HDFS mkdir operation at:
[https://github.com/apache/hive/blob/07fe7e210cb444aec43cb5adda37f8f7cd26f243/common/src/java/org/apache/hadoop/hive/common/FileUtils.java#L579]
That call takes a _+conf+_ parameter but never uses it.

In addition, the PathInfo class obtains its FileSystem instance at:
[https://github.com/apache/hive/blob/32e854ef1c25f21d53f7932723cfc76bf75a71cd/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/util/PathInfo.java#L69]
{code:java}
FileSystem fileSystem = inputPath.getFileSystem(hiveConf);
{code}
which passes the Hive configuration instead of the HDFS configuration.
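A minimal sketch of the direction this report suggests (an illustration only, not the committed fix): thread the _+conf+_ parameter through to the FileSystem lookup, so the nameservice settings it carries are actually honored when resolving the path:

{code:java}
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public final class MkdirSketch {
  // Hypothetical helper: resolve the FileSystem from the same Configuration
  // that carries the HDFS nameservice settings, instead of ignoring the
  // conf parameter as the current mkdir path does.
  public static boolean mkdirWithConf(Path dir, Configuration conf) throws IOException {
    FileSystem fs = dir.getFileSystem(conf); // conf must contain dfs.nameservices etc.
    return fs.mkdirs(dir);
  }
}
{code}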


  was: (previous revision of this description; identical except that it lacked the final paragraph about the PathInfo class)

> Error creating temporary staging folder on HDFS when creating Hive table
> ------------------------------------------------------------------------
>
>                 Key: HIVE-18177
>                 URL: https://issues.apache.org/jira/browse/HIVE-18177
>             Project: Hive
>          Issue Type: Bug
>          Components: Parser
>    Affects Versions: 2.3.1
>            Reporter: Aegeaner
>
> When creating a table in hive using statement:
> {code:java}
> create database if not exists ${DB};
> use ${DB};
> drop table if exists date_dim;
> create table date_dim
> stored as ${FILE}
> as select * from ${SOURCE}.date_dim;
> {code}
> The statement execution failed as:
> {code:java}
> FAILED: SemanticException 0:0 Error creating temporary folder on: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
> encountered near token 'TOK_TMP_FILE'
> FAILED: SemanticException 0:0 Error creating temporary folder on: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
> encountered near token 'TOK_TMP_FILE'
> {code}
> We get this exception stack:
> {code:java}
> 2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> parse.CalcitePlanner: Completed phase 1 of Semantic Analysis
> 2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> parse.CalcitePlanner: Get metadata for source tables
> 2017-11-29T17:32:47,646  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> metastore.HiveMetaStore: 0: get_table : db=tpcds_text_2 tbl=date_dim
> 2017-11-29T17:32:47,647  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> HiveMetaStore.audit: ugi=hadoop     ip=unknown-ip-addr      cmd=get_table : 
> db=tpcds_text_2 tbl=date_dim
> 2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> parse.CalcitePlanner: Get metadata for subqueries
> 2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> parse.CalcitePlanner: Get metadata for destination tables
> 2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> metastore.HiveMetaStore: 0: get_database: tpcds_bin_partitioned_orc_2
> 2017-11-29T17:32:47,748  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> HiveMetaStore.audit: ugi=hadoop     ip=unknown-ip-addr      cmd=get_database: 
> tpcds_bin_partitioned_orc_2
> 2017-11-29T17:32:48,308  INFO [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> common.FileUtils: Creating directory if it doesn't exist: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1
> 2017-11-29T17:32:48,330 ERROR [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> parse.CalcitePlanner: org.apache.hadoop.hive.ql.parse.SemanticException: 0:0 
> Error creating temporary folder on: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
> encountered near token 'TOK_TMP_FILE'
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2211)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1934)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11080)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>       at 
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>       at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>       at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
>       at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> Caused by: java.lang.RuntimeException: Cannot create staging directory 
> 'hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1':
>  Invalid host name: local host is: (unknown); destination host is: 
> "ns-offline":8020; java.net.UnknownHostException; For more details see:  
> http://wiki.apache.org/hadoop/UnknownHost
>       at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:374)
>       at 
> org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:632)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2208)
>       ... 25 more
> Caused by: java.net.UnknownHostException: Invalid host name: local host is: 
> (unknown); destination host is: "ns-offline":8020; 
> java.net.UnknownHostException; For more details see:  
> http://wiki.apache.org/hadoop/UnknownHost
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>       at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
>       at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:445)
>       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1373)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1337)
>       at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
>       at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>       at com.sun.proxy.$Proxy36.getFileInfo(Unknown Source)
>       at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:787)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
>       at com.sun.proxy.$Proxy37.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1700)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1436)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1433)
>       at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1433)
>       at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:528)
>       at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:366)
>       ... 27 more
> Caused by: java.net.UnknownHostException
>       ... 52 more
> 2017-11-29T17:32:48,331 ERROR [4d9462cf-43b0-4fea-b0c2-c1a9969d9763 main] 
> ql.Driver: FAILED: SemanticException 0:0 Error creating temporary folder on: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
> encountered near token 'TOK_TMP_FILE'
> org.apache.hadoop.hive.ql.parse.SemanticException: 0:0 Error creating 
> temporary folder on: 
> hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db. Error 
> encountered near token 'TOK_TMP_FILE'
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2211)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1934)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11080)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
>       at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>       at 
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>       at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>       at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
>       at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
>       at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> Caused by: java.lang.RuntimeException: Cannot create staging directory 
> 'hdfs://ns-offline/user/hive2/warehouse/tpcds_bin_partitioned_orc_2.db/.hive-staging_hive_2017-11-29_17-32-47_541_2322222506518783479-1':
>  Invalid host name: local host is: (unknown); destination host is: 
> "ns-offline":8020; java.net.UnknownHostException; For more details see:  
> http://wiki.apache.org/hadoop/UnknownHost
>       at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:374)
>       at 
> org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:632)
>       at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2208)
>       ... 25 more
> Caused by: java.net.UnknownHostException: Invalid host name: local host is: 
> (unknown); destination host is: "ns-offline":8020; 
> java.net.UnknownHostException; For more details see:  
> http://wiki.apache.org/hadoop/UnknownHost
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>       at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
>       at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:445)
>       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1373)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1337)
>       at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
>       at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>       at com.sun.proxy.$Proxy36.getFileInfo(Unknown Source)
>       at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:787)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>       at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
>       at com.sun.proxy.$Proxy37.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1700)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1436)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1433)
>       at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1433)
>       at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:528)
>       at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:366)
>       ... 27 more
> Caused by: java.net.UnknownHostException
>       ... 52 more
> {code}
> In the Hadoop core-site.xml:
> {code:java}
> <property>
>   <name>fs.default.name</name>
>   <value>hdfs://ns-offline</value>
> </property>
> {code}
> The exception is thrown in Hive QL at:
> [https://github.com/apache/hive/blob/07fe7e210cb444aec43cb5adda37f8f7cd26f243/ql/src/java/org/apache/hadoop/hive/ql/Context.java#L396]
> which calls the HDFS mkdir operation at:
> [https://github.com/apache/hive/blob/07fe7e210cb444aec43cb5adda37f8f7cd26f243/common/src/java/org/apache/hadoop/hive/common/FileUtils.java#L579]
> The mkdir call takes a _+conf+_ parameter but never uses it.
> Additionally, in the PathInfo class, where the FileSystem instance is obtained:
> [https://github.com/apache/hive/blob/32e854ef1c25f21d53f7932723cfc76bf75a71cd/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/util/PathInfo.java#L69]
> {code:java}
> FileSystem fileSystem = inputPath.getFileSystem(hiveConf);
> {code}
> a Hive configuration is passed instead of the HDFS configuration.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
