Hi Guys,

I've developed a PXF plugin and have it working for reads from our data
source. I then implemented a WriteResolver and a WriteAccessor for the write
path.
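
Stripped down, the write side looks roughly like this (a simplified sketch,
not the real code: FooBarClient and its connect/write/close calls are
stand-ins for our data-source client, and the PXF types are the
com.pivotal.pxf.api write interfaces as I understand them; each class lives
in its own file):

import java.util.List;

import com.pivotal.pxf.api.OneField;
import com.pivotal.pxf.api.OneRow;
import com.pivotal.pxf.api.WriteAccessor;
import com.pivotal.pxf.api.WriteResolver;
import com.pivotal.pxf.api.utilities.InputData;
import com.pivotal.pxf.api.utilities.Plugin;

// Turns each tuple (a List<OneField>) into a OneRow for the accessor.
public class FooBarWriteResolver extends Plugin implements WriteResolver {
    public FooBarWriteResolver(InputData input) {
        super(input);
    }

    @Override
    public OneRow setFields(List<OneField> record) throws Exception {
        // Serialize the tuple the way our data source expects;
        // a CSV-ish string here is just a placeholder.
        StringBuilder row = new StringBuilder();
        for (OneField field : record) {
            if (row.length() > 0) {
                row.append(',');
            }
            row.append(field.val);
        }
        return new OneRow(null, row.toString());
    }
}

// Writes each resolved OneRow to our (non-HDFS) data source.
public class FooBarWriteAccessor extends Plugin implements WriteAccessor {
    private final InputData input;
    private FooBarClient client; // placeholder client for our data source

    public FooBarWriteAccessor(InputData input) {
        super(input);
        this.input = input;
    }

    @Override
    public boolean openForWrite() throws Exception {
        client = FooBarClient.connect(input.getDataSource());
        return true;
    }

    @Override
    public boolean writeNextObject(OneRow onerow) throws Exception {
        client.write((String) onerow.getData());
        return true;
    }

    @Override
    public void closeForWrite() throws Exception {
        client.close();
    }
}

Reads through the plugin work fine, but as soon as I insert into the
writable table I get the following exception: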



postgres=# CREATE EXTERNAL TABLE t3 (id int, total int, comments varchar)
LOCATION ('pxf://localhost:51200/foo.bar?PROFILE=XXXX')
FORMAT 'custom' (formatter='pxfwritable_import');
CREATE EXTERNAL TABLE

postgres=# select * from t3;
 id  | total | comments
-----+-------+----------
 100 |   500 |
 100 |  5000 | abcdfe
     |  5000 | 100
(3 rows)

postgres=# drop external table t3;
DROP EXTERNAL TABLE

postgres=# CREATE WRITABLE EXTERNAL TABLE t3 (id int, total int, comments varchar)
LOCATION ('pxf://localhost:51200/foo.bar?PROFILE=XXXX')
FORMAT 'custom' (formatter='pxfwritable_export');
CREATE EXTERNAL TABLE

postgres=# insert into t3 values (1, 2, 'hello');
ERROR:  remote component error (500) from '127.0.0.1:51200':  type  Exception report   message
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user pxf. Superuser privilege is required    description
The server encountered an internal error that prevented it from fulfilling this request.    exception   javax.servlet.ServletException:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user pxf. Superuser privilege is required (libchurl.c:852)  (seg6 localhost.localdomain:40000 pid=19701) (dispatcher.c:1681)

Nov 07, 2015 11:40:08 AM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException


The log shows:

SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user pxf. Superuser privilege is required
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSuperuserPrivilege(FSPermissionChecker.java:122)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkSuperuserPrivilege(FSNamesystem.java:5906)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.datanodeReport(FSNamesystem.java:4941)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDatanodeReport(NameNodeRpcServer.java:1033)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDatanodeReport(ClientNamenodeProtocolServerSideTranslatorPB.java:698)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    at org.apache.hadoop.ipc.Client.call(Client.java:1476)
    at org.apache.hadoop.ipc.Client.call(Client.java:1407)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy63.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:626)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy64.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2562)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDataNodeStats(DistributedFileSystem.java:1196)
    at com.pivotal.pxf.service.rest.ClusterNodesResource.read(ClusterNodesResource.java:62)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:957)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:423)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1079)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:620)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)

Since our data source is completely independent of HDFS, I'm not sure why PXF
is still trying to access HDFS at all, let alone why it needs superuser
rights. From the trace, the call comes from PXF's own ClusterNodesResource.read
(a getDatanodeReport RPC to the NameNode), so it seems to fail before my
WriteAccessor is ever invoked.
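
If it helps narrow things down, I believe the same failure can be reproduced
without my plugin at all by issuing, as the pxf user, the same call the trace
shows PXF making (sketch only; the NameNode URI is a placeholder):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

// Stand-alone repro of the NameNode call in the trace
// (ClusterNodesResource.read -> DistributedFileSystem.getDataNodeStats).
public class DatanodeReportRepro {
    public static void main(String[] args) throws Exception {
        // Placeholder URI -- point it at the real NameNode.
        URI nameNode = URI.create("hdfs://localhost:8020");
        try (FileSystem fs = FileSystem.get(nameNode, new Configuration())) {
            // getDatanodeReport on the NameNode requires HDFS superuser,
            // which is exactly the check that rejects user 'pxf' above.
            for (DatanodeInfo node : ((DistributedFileSystem) fs).getDataNodeStats()) {
                System.out.println(node.getHostName());
            }
        }
    }
}

If that throws the same AccessControlException, the problem would seem to be
in PXF's write setup rather than in my plugin classes.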
Please let me know if there's anything missing here.
Cheers

