[jira] [Commented] (SPARK-18584) multiple Spark Thrift Servers running in the same machine throws org.apache.hadoop.security.AccessControlException

2016-11-25 Thread tanxinz (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-18584?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15697237#comment-15697237 ]

tanxinz commented on SPARK-18584:
-

The two Spark Thrift Servers ran in different YARN queues:
the etl Spark Thrift Server ran in the root.etl queue;
the dev Spark Thrift Server ran in the root.dev queue.

I found a Spark executor process like the one below. Can I tell from it which user the work is performed as?

3004 CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@machine_ip:33035 --executor-id 154 --hostname slave198 --cores 3 --app-id application_1479797390730_2433 --user-class-path file:/data5/yn_loc/usercache/etl/appcache/application_1479797390730_2433/container_1479797390730_2433_01_000608/__app__.jar
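
A quick way to check (a sketch; PID 3004 is taken from the listing above and must still be alive) is to ask the OS which user owns the container process. Note that the /usercache/etl/ path in the listing already indicates the application was submitted as etl, even though under YARN's DefaultContainerExecutor the container process itself runs as the NodeManager's Unix user:

# OS user, PID and command line of the executor process from the listing.
ps -o user=,pid=,args= -p 3004

# The YARN container path also encodes the submitting user:
# .../usercache/<submitting-user>/appcache/<application-id>/...
ls -ld /data5/yn_loc/usercache/etl/appcache/application_1479797390730_2433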



[jira] [Commented] (SPARK-18584) multiple Spark Thrift Servers running in the same machine throws org.apache.hadoop.security.AccessControlException

2016-11-25 Thread tanxinz (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-18584?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15697216#comment-15697216 ]

tanxinz commented on SPARK-18584:
-

Different users have different permissions on different HDFS resources. Right now I have two users (etl and dev) running two Spark Thrift Servers:

etl Spark Thrift Server:
/home/etl/app/spark-2.0.2-bin-spark_hadoop250/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10111 \
  --properties-file /home/etl/app/spark-2.0.2-bin-spark_hadoop250/conf/spark-etl.conf \
  --conf spark.executor.instances=130 \
  --name spark_etl

dev Spark Thrift Server:
/home/dev/app/spark-2.0.1-bin-spark_hadoop250/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10001 \
  --properties-file /home/dev/app/spark-2.0.1-bin-spark_hadoop250/conf/spark-dev.conf \
  --driver-memory 10G \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.port=7337 \
  --conf spark.dynamicAllocation.maxExecutors=100 \
  --conf spark.dynamicAllocation.sustainedSchedulerBacklogTimeout=5s \
  --conf spark.dynamicAllocation.executorIdleTimeout=30s \
  --name sparkedw_dynamic
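
As noted in the comment above, the two servers run in different YARN queues (root.etl and root.dev); presumably that is set in the per-user properties files. A minimal sketch of what spark-etl.conf might contain (hypothetical — the actual file is not shown; spark.yarn.queue is the standard Spark property for pinning a queue):

# Hypothetical sketch of conf/spark-etl.conf; only spark.yarn.queue and
# spark.executor.cores (visible as --cores 3 in the executor listing above)
# are grounded in this thread.
spark.master          yarn
spark.yarn.queue      root.etl
spark.executor.cores  3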

When I connected to the etl STS via beeline to execute a command:

beeline -u jdbc:hive2://machine_ip:10111 -n etl -p passwd --verbose=true -e "${sql_text}"

it threw org.apache.hadoop.security.AccessControlException. I don't understand why the operation was performed as the dev user rather than etl.
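
One way to narrow this down (a sketch; it assumes each driver can be found by its distinct --name value and that you can read its /proc environment, i.e. you run this as root or as the owning user) is to check the OS user and any HADOOP_USER_NAME override of each Thrift Server driver process:

# For each driver, report the OS user it runs as and any
# HADOOP_USER_NAME / USER entries in its environment.
for name in spark_etl sparkedw_dynamic; do
  pid=$(pgrep -f -- "--name ${name}" | head -n 1)
  echo "== ${name} (pid: ${pid:-not found}) =="
  if [ -n "${pid}" ]; then
    ps -o user= -p "${pid}"
    tr '\0' '\n' < "/proc/${pid}/environ" | grep -E '^(HADOOP_USER_NAME|USER)='
  fi
done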








[jira] [Created] (SPARK-18584) multiple Spark Thrift Servers running in the same machine throws org.apache.hadoop.security.AccessControlException

2016-11-24 Thread tanxinz (JIRA)
tanxinz created SPARK-18584:
---

 Summary: multiple Spark Thrift Servers running in the same machine 
throws org.apache.hadoop.security.AccessControlException
 Key: SPARK-18584
 URL: https://issues.apache.org/jira/browse/SPARK-18584
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 2.0.2
 Environment: hadoop-2.5.0-cdh5.2.1-och4.0.0
spark2.0.2
Reporter: tanxinz
 Fix For: 2.0.2


In Spark 2.0.2, I have two users (etl and dev), each running a Spark Thrift Server on the same machine. I connected to the etl STS via beeline to execute a command, and it threw org.apache.hadoop.security.AccessControlException. I don't understand why the operation was performed as the dev user rather than etl.

```
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=dev, access=EXECUTE, inode="/user/hive/warehouse/tb_spark_sts/etl_cycle_id=20161122":etl:supergroup:drwxr-x---,group:etl:rwx,group:oth_dev:rwx,default:user:data_mining:r-x,default:group::rwx,default:group:etl:rwx,default:group:oth_dev:rwx,default:mask::rwx,default:other::---
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkAccessAcl(DefaultAuthorizationProvider.java:335)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:231)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:178)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6250)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3942)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:811)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:502)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:815)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:587)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
```
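
The inode line in the trace already shows the directory's owner and ACLs. A quick way to inspect them and reproduce the access check from the command line (a sketch; the HADOOP_USER_NAME override only takes effect on a non-Kerberized cluster, which the simple user=dev identity above suggests this is):

# Show owner, permissions and ACL entries on the partition directory
# named in the exception.
hdfs dfs -getfacl /user/hive/warehouse/tb_spark_sts/etl_cycle_id=20161122

# Reproduce the access check as each user (simple auth only).
HADOOP_USER_NAME=etl hdfs dfs -ls /user/hive/warehouse/tb_spark_sts/etl_cycle_id=20161122
HADOOP_USER_NAME=dev hdfs dfs -ls /user/hive/warehouse/tb_spark_sts/etl_cycle_id=20161122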



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org