[ https://issues.apache.org/jira/browse/HDDS-671?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16652475#comment-16652475 ]

Namit Maheshwari commented on HDDS-671:
---------------------------------------

{code:java}
-bash-4.2$ beeline
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.3.0-63/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.3.0-63/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://ctr-e138-1518143905142-510793-01-000011.hwx.site:2181,ctr-e138-1518143905142-510793-01-000006.hwx.site:2181,ctr-e138-1518143905142-510793-01-000008.hwx.site:2181,ctr-e138-1518143905142-510793-01-000010.hwx.site:2181,ctr-e138-1518143905142-510793-01-000007.hwx.site:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
Enter username for jdbc:hive2://ctr-e138-1518143905142-510793-01-000011.hwx.site:2181,ctr-e138-1518143905142-510793-01-000006.hwx.site:2181,ctr-e138-1518143905142-510793-01-000008.hwx.site:2181,ctr-e138-1518143905142-510793-01-000010.hwx.site:2181,ctr-e138-1518143905142-510793-01-000007.hwx.site:2181/default:
Enter password for jdbc:hive2://ctr-e138-1518143905142-510793-01-000011.hwx.site:2181,ctr-e138-1518143905142-510793-01-000006.hwx.site:2181,ctr-e138-1518143905142-510793-01-000008.hwx.site:2181,ctr-e138-1518143905142-510793-01-000010.hwx.site:2181,ctr-e138-1518143905142-510793-01-000007.hwx.site:2181/default:
18/10/16 21:09:32 [main]: INFO jdbc.HiveConnection: Connected to ctr-e138-1518143905142-510793-01-000004.hwx.site:10000
Connected to: Apache Hive (version 3.1.0.3.0.3.0-63)
Driver: Hive JDBC (version 3.1.0.3.0.3.0-63)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.0.3.0.3.0-63 by Apache Hive
0: jdbc:hive2://ctr-e138-1518143905142-510793> describe formatted testo3;
INFO : Compiling command(queryId=hive_20181016210256_3400c0cb-a1d3-4384-8af4-7b95678030e4): describe formatted testo3
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:col_name, type:string, comment:from deserializer), FieldSchema(name:data_type, type:string, comment:from deserializer), FieldSchema(name:comment, type:string, comment:from deserializer)], properties:null)
INFO : Completed compiling command(queryId=hive_20181016210256_3400c0cb-a1d3-4384-8af4-7b95678030e4); Time taken: 1.616 seconds
INFO : Executing command(queryId=hive_20181016210256_3400c0cb-a1d3-4384-8af4-7b95678030e4): describe formatted testo3
INFO : Starting task [Stage-0:DDL] in serial mode
INFO : Completed executing command(queryId=hive_20181016210256_3400c0cb-a1d3-4384-8af4-7b95678030e4); Time taken: 0.294 seconds
INFO : OK
+-------------------------------+----------------------------------------------------+-----------------------+
| col_name                      | data_type                                          | comment               |
+-------------------------------+----------------------------------------------------+-----------------------+
| # col_name                    | data_type                                          | comment               |
| i                             | int                                                |                       |
| s                             | string                                             |                       |
| d                             | float                                              |                       |
|                               | NULL                                               | NULL                  |
| # Detailed Table Information  | NULL                                               | NULL                  |
| Database:                     | default                                            | NULL                  |
| OwnerType:                    | USER                                               | NULL                  |
| Owner:                        | anonymous                                          | NULL                  |
| CreateTime:                   | Mon Oct 15 22:25:33 UTC 2018                       | NULL                  |
| LastAccessTime:               | UNKNOWN                                            | NULL                  |
| Retention:                    | 0                                                  | NULL                  |
| Location:                     | o3://bucket2.volume2/testo3                        | NULL                  |
| Table Type:                   | EXTERNAL_TABLE                                     | NULL                  |
| Table Parameters:             | NULL                                               | NULL                  |
|                               | EXTERNAL                                           | TRUE                  |
|                               | bucketing_version                                  | 2                     |
|                               | transient_lastDdlTime                              | 1539642333            |
|                               | NULL                                               | NULL                  |
| # Storage Information         | NULL                                               | NULL                  |
| SerDe Library:                | org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe | NULL                  |
| InputFormat:                  | org.apache.hadoop.mapred.TextInputFormat           | NULL                  |
| OutputFormat:                 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat | NULL                  |
| Compressed:                   | No                                                 | NULL                  |
| Num Buckets:                  | -1                                                 | NULL                  |
| Bucket Columns:               | []                                                 | NULL                  |
| Sort Columns:                 | []                                                 | NULL                  |
| Storage Desc Params:          | NULL                                               | NULL                  |
|                               | serialization.format                               | 1                     |
+-------------------------------+----------------------------------------------------+-----------------------+
29 rows selected (2.65 seconds)
0: jdbc:hive2://ctr-e138-1518143905142-510793> insert into testo3 values(1, "aa", 3.0);
INFO : Compiling command(queryId=hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167): insert into testo3 values(1, "aa", 3.0)
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null), FieldSchema(name:_col1, type:string, comment:null), FieldSchema(name:_col2, type:float, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167); Time taken: 6.272 seconds
INFO : Executing command(queryId=hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167): insert into testo3 values(1, "aa", 3.0)
INFO : Query ID = hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167
INFO : Total jobs = 1
INFO : Launching Job 1 out of 1
INFO : Starting task [Stage-1:MAPRED] in serial mode
INFO : Subscribed to counters: [] for queryId: hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167
INFO : Tez session hasn't been created yet. Opening session
ERROR : Failed to execute tez graph.
org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1850)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1834)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1793)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_181]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_181]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_181]
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2411) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2385) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1325) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1322) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1339) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1314) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2326) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:1000) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:1152) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:896) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.makeCombinedJarMap(TezSessionState.java:349) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:418) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.openInternal(TezSessionPoolSession.java:124) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:373) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:372) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:199) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2707) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2378) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2054) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1752) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:318) ~[hive-service-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:338) ~[hive-service-3.1.0.3.0.3.0-63.jar:3.1.0.3.0.3.0-63]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_181]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_181]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_181]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=anonymous, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1850)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1834)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1793)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)

at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1443) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1353) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at com.sun.proxy.$Proxy32.mkdirs(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:653) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-3.1.1.3.0.3.0-63.jar:?]
at com.sun.proxy.$Proxy33.mkdirs(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2409) ~[hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:?]
... 38 more
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
INFO : Completed executing command(queryId=hive_20181016212028_c9c9dd7d-0f14-4d72-80cf-2177cc468167); Time taken: 0.827 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)
0: jdbc:hive2://ctr-e138-1518143905142-510793>

{code}
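
From the trace, the failed write appears to be Hive's Tez jar-staging directory in HDFS rather than the Ozone table path: DagUtils.getDefaultDestDir() calls FileSystem.mkdirs() under "/user", and with doAs enabled that mkdir runs as the impersonated end user ("anonymous"), who has no WRITE access to /user (hdfs:hdfs drwxr-xr-x). Until the staging location is made Ozone-aware, one possible workaround is to pre-create the impersonated user's directory; a minimal sketch, assuming shell access as an HDFS superuser and Hive's default hive.user.install.directory of "/user":
{code:bash}
# Hypothetical workaround sketch: give the impersonated user a writable
# directory under hive.user.install.directory (default /user) so the Tez
# session can stage its jars there instead of failing on mkdir.
sudo -u hdfs hdfs dfs -mkdir -p /user/anonymous
sudo -u hdfs hdfs dfs -chown anonymous /user/anonymous
{code}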
 

> Hive HSI insert tries to create data in HDFS for Ozone external table
> ---------------------------------------------------------------------
>
>                 Key: HDDS-671
>                 URL: https://issues.apache.org/jira/browse/HDDS-671
>             Project: Hadoop Distributed Data Store
>          Issue Type: Task
>            Reporter: Namit Maheshwari
>            Priority: Major
>              Labels: app-compat
>
> Hive HSI insert tries to create data in HDFS for an Ozone external table
> when "hive.server2.enable.doAs" is set to true.


