[ https://issues.apache.org/jira/browse/DRILL-5733?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998323#comment-16998323 ]

Stefan Hammer commented on DRILL-5733:
--------------------------------------

Interestingly,
{noformat}
SELECT * FROM `hdfs.default`.`./user/foo/*.parquet`{noformat}
works perfectly, while 
{noformat}
SELECT * FROM `hdfs.default`.`./user/foo/test.parquet`{noformat}
fails.

Running
{noformat}
hdfs dfs -test -e /user/foo/test.parquet && echo $?{noformat}
prints 0 (the exit status of the successful {{-test -e}} check), which shows that this path does exist.
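As a side note, the {{&& echo $?}} idiom can only ever print 0, because the {{echo}} runs only when the preceding test already succeeded. A minimal local sketch makes this visible, using POSIX {{test}} as a stand-in for {{hdfs dfs -test}} (both follow the same exit-code convention; the missing path below is hypothetical):

```shell
# `&& echo $?` runs the echo only when the preceding test succeeded,
# so whenever it prints anything at all, it prints 0; on failure it is silent.
test -e / && echo $?              # "/" always exists -> prints 0

# `; echo $?` prints the real exit status either way.
test -e /no/such/path; echo $?    # prints 1 (path missing)
```

Either way, a printed 0 in the check above does confirm the path exists.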

Tested versions:
 * Drill 1.16.0 with hadoop 2.7.4
 * Drill 1.16.0 with hadoop 2.7.3

The query *fails for both tested Hadoop versions*!

Server version: CDH 6.3.1

I guess a full stack trace will help here:
{code:java}
{"queryId":"22071257-fa26-11a8-4896-eca9f008be75","schema":"","queryText":"SELECT * FROM `hdfs.default`.`./user/foo/*.parquet`","start":1576594855195,"finish":1576594863220,"outcome":"COMPLETED","username":"bar","remoteAddress":"10.255.0.3:60765"}
{"queryId":"22071245-97ae-18b4-be56-98a556666aa7","schema":"","queryText":"SELECT * FROM `hdfs.default`.`./user/foo/test.parquet`","start":1576594873427,"finish":1576594874094,"outcome":"FAILED","username":"bar","remoteAddress":"10.255.0.3:60765"}
2019-12-17 15:00:55,327 [22071257-fa26-11a8-4896-eca9f008be75:foreman] INFO  o.a.drill.exec.work.foreman.Foreman - Query text for query with id 22071257-fa26-11a8-4896-eca9f008be75 issued by bar: SELECT * FROM `hdfs.default`.`./user/foo/*.parquet`
2019-12-17 15:01:00,295 [22071257-fa26-11a8-4896-eca9f008be75:frag:0:0] INFO  
o.a.d.e.w.fragment.FragmentExecutor - 22071257-fa26-11a8-4896-eca9f008be75:0:0: 
State change requested AWAITING_ALLOCATION --> RUNNING
2019-12-17 15:01:00,310 [22071257-fa26-11a8-4896-eca9f008be75:frag:0:0] INFO  
o.a.d.e.w.f.FragmentStatusReporter - 22071257-fa26-11a8-4896-eca9f008be75:0:0: 
State to report: RUNNING
2019-12-17 15:01:03,028 [22071257-fa26-11a8-4896-eca9f008be75:frag:0:0] INFO  
o.a.d.e.c.ClassCompilerSelector - Java compiler policy: DEFAULT, Debug option: 
true
2019-12-17 15:01:03,204 [22071257-fa26-11a8-4896-eca9f008be75:frag:0:0] INFO  
o.a.d.e.w.fragment.FragmentExecutor - 22071257-fa26-11a8-4896-eca9f008be75:0:0: 
State change requested RUNNING --> FINISHED
2019-12-17 15:01:03,208 [22071257-fa26-11a8-4896-eca9f008be75:frag:0:0] INFO  
o.a.d.e.w.f.FragmentStatusReporter - 22071257-fa26-11a8-4896-eca9f008be75:0:0: 
State to report: FINISHED
2019-12-17 15:01:13,439 [22071245-97ae-18b4-be56-98a556666aa7:foreman] INFO  
o.a.drill.exec.work.foreman.Foreman - Query text for query with id 
22071245-97ae-18b4-be56-98a556666aa7 issued by bar: SELECT * FROM 
`hdfs.default`.`./user/foo/test.parquet`
2019-12-17 15:01:14,099 [22071245-97ae-18b4-be56-98a556666aa7:foreman] ERROR 
o.a.drill.exec.work.foreman.Foreman - SYSTEM ERROR: RemoteException: 
/user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
Please, refer to logs for more information.
[Error Id: d14b1507-5490-49a4-b853-7525783ac2c5 on 60a9e24ebec7:31010]
org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: 
RemoteException: /user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
Please, refer to logs for more information.
[Error Id: d14b1507-5490-49a4-b853-7525783ac2c5 on 60a9e24ebec7:31010]
        at 
org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:630)
 ~[drill-common-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.work.foreman.Foreman$ForemanResult.close(Foreman.java:789)
 [drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.work.foreman.QueryStateProcessor.checkCommonStates(QueryStateProcessor.java:325)
 [drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.work.foreman.QueryStateProcessor.planning(QueryStateProcessor.java:221)
 [drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.work.foreman.QueryStateProcessor.moveToState(QueryStateProcessor.java:83)
 [drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:304) 
[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[na:1.8.0_222]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[na:1.8.0_222]
        at java.lang.Thread.run(Thread.java:748) [na:1.8.0_222]
Caused by: org.apache.drill.exec.work.foreman.ForemanException: Unexpected 
exception during fragment initialization: Error while applying rule 
DrillTableRule, args 
[rel#108:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs.default, 
./user/foo/test.parquet])]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:305) 
[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        ... 3 common frames omitted
Caused by: java.lang.RuntimeException: Error while applying rule 
DrillTableRule, args 
[rel#108:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs.default, 
./user/foo/test.parquet])]
        at 
org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:236)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.calcite.plan.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:643)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.calcite.tools.Programs$RuleSetProgram.run(Programs.java:339) 
~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:430)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:370)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRawDrel(DefaultSqlHandler.java:250)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToDrel(DefaultSqlHandler.java:319)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:177)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:216)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:130)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:87)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:593) 
[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:276) 
[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        ... 3 common frames omitted
Caused by: org.apache.drill.common.exceptions.DrillRuntimeException: Failure 
creating scan.
        at 
org.apache.drill.exec.planner.common.DrillScanRelBase.<init>(DrillScanRelBase.java:53)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:79) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:66) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:59) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRule.onMatch(DrillScanRule.java:38)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:212)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        ... 15 common frames omitted
Caused by: org.apache.hadoop.security.AccessControlException: 
/user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_222]
        at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
 ~[na:1.8.0_222]
        at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 ~[na:1.8.0_222]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
~[na:1.8.0_222]
        at 
org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
 ~[hadoop-common-2.7.4.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2110) 
~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426) 
~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.drill.exec.store.dfs.DrillFileSystem.exists(DrillFileSystem.java:639)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.fileExists(ParquetTableMetadataProviderImpl.java:149)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.populateMetaPaths(ParquetTableMetadataProviderImpl.java:135)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.expandIfNecessary(ParquetTableMetadataProviderImpl.java:228)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.<init>(ParquetTableMetadataProviderImpl.java:98)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.<init>(ParquetTableMetadataProviderImpl.java:47)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl$Builder.build(ParquetTableMetadataProviderImpl.java:473)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl$Builder.build(ParquetTableMetadataProviderImpl.java:386)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:145)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:115)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:200)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:77)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:184)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:172)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillTable.getGroupScan(DrillTable.java:117)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.common.DrillScanRelBase.<init>(DrillScanRelBase.java:51)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        ... 20 common frames omitted
Caused by: org.apache.hadoop.ipc.RemoteException: /user/foo/test.parquet (is 
not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)        at 
org.apache.hadoop.ipc.Client.call(Client.java:1476) 
~[hadoop-common-2.7.4.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1413) 
~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
 ~[hadoop-common-2.7.4.jar:na]
        at com.sun.proxy.$Proxy87.getFileInfo(Unknown Source) ~[na:na]
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source) ~[na:na]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_222]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_222]
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 ~[hadoop-common-2.7.4.jar:na]
        at com.sun.proxy.$Proxy88.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108) 
~[hadoop-hdfs-2.7.4.jar:na]
        ... 41 common frames omitted
2019-12-17 15:01:14,124 [qtp369156800-57] ERROR 
o.a.d.e.server.rest.QueryResources - Query from Web UI Failed: {}
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: 
RemoteException: /user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
Please, refer to logs for more information.
[Error Id: d14b1507-5490-49a4-b853-7525783ac2c5 on 60a9e24ebec7:31010]
        at 
org.apache.drill.exec.server.rest.QueryWrapper.run(QueryWrapper.java:126) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.server.rest.QueryResources.submitQueryJSON(QueryResources.java:74)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.server.rest.QueryResources.submitQuery(QueryResources.java:90)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_222]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_222]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_222]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_222]
        at 
org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) 
[jersey-server-2.25.1.jar:na]
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) 
[jersey-common-2.25.1.jar:na]
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) 
[jersey-common-2.25.1.jar:na]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:315) 
[jersey-common-2.25.1.jar:na]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:297) 
[jersey-common-2.25.1.jar:na]
        at org.glassfish.jersey.internal.Errors.process(Errors.java:267) 
[jersey-common-2.25.1.jar:na]
        at 
org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
 [jersey-common-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) 
[jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154)
 [jersey-server-2.25.1.jar:na]
        at 
org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:473) 
[jersey-container-servlet-core-2.25.1.jar:na]
        at 
org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427) 
[jersey-container-servlet-core-2.25.1.jar:na]
        at 
org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
 [jersey-container-servlet-core-2.25.1.jar:na]
        at 
org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
 [jersey-container-servlet-core-2.25.1.jar:na]
        at 
org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
 [jersey-container-servlet-core-2.25.1.jar:na]
        at 
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) 
[jetty-servlet-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585) 
[jetty-servlet-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:524) 
[jetty-security-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.apache.drill.exec.server.rest.auth.DrillHttpSecurityHandlerProvider.handle(DrillHttpSecurityHandlerProvider.java:151)
 [drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
 [jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
 [jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:513) 
[jetty-servlet-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
 [jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
 [jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at org.eclipse.jetty.server.Server.handle(Server.java:539) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) 
[jetty-server-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
 [jetty-io-9.3.25.v20180904.jar:9.3.25.v20180904]
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) 
[jetty-io-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) 
[jetty-io-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
 [jetty-util-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
 [jetty-util-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
 [jetty-util-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
 [jetty-util-9.3.25.v20180904.jar:9.3.25.v20180904]
        at 
org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) 
[jetty-util-9.3.25.v20180904.jar:9.3.25.v20180904]
        at java.lang.Thread.run(Thread.java:748) [na:1.8.0_222]
org.apache.drill.exec.work.foreman.ForemanException: Unexpected exception 
during fragment initialization: Error while applying rule DrillTableRule, args 
[rel#108:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs.default, 
./user/foo/test.parquet])]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:305) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at .......(:0) ~[na:na]
Caused by: java.lang.RuntimeException: Error while applying rule 
DrillTableRule, args 
[rel#108:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs.default, 
./user/foo/test.parquet])]
        at 
org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:236)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.calcite.plan.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:643)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.calcite.tools.Programs$RuleSetProgram.run(Programs.java:339) 
~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:430)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:370)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRawDrel(DefaultSqlHandler.java:250)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToDrel(DefaultSqlHandler.java:319)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:177)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:216)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.convertPlan(DrillSqlWorker.java:130)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:87)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:593) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:276) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        ... 1 common frames omitted
Caused by: org.apache.drill.common.exceptions.DrillRuntimeException: Failure 
creating scan.
        at 
org.apache.drill.exec.planner.common.DrillScanRelBase.<init>(DrillScanRelBase.java:53)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:79) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:66) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:59) 
~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillScanRule.onMatch(DrillScanRule.java:38)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:212)
 ~[calcite-core-1.18.0-drill-r0.jar:1.18.0-drill-r0]
        ... 13 common frames omitted
Caused by: java.lang.Exception: /user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
        at .......(:0) ~[na:na]
        at 
org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
 ~[hadoop-common-2.7.4.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2110) 
~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426) 
~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.drill.exec.store.dfs.DrillFileSystem.exists(DrillFileSystem.java:639)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.fileExists(ParquetTableMetadataProviderImpl.java:149)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.populateMetaPaths(ParquetTableMetadataProviderImpl.java:135)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.expandIfNecessary(ParquetTableMetadataProviderImpl.java:228)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.<init>(ParquetTableMetadataProviderImpl.java:98)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl.<init>(ParquetTableMetadataProviderImpl.java:47)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl$Builder.build(ParquetTableMetadataProviderImpl.java:473)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetTableMetadataProviderImpl$Builder.build(ParquetTableMetadataProviderImpl.java:386)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:145)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:115)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:200)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:77)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:184)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:172)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.logical.DrillTable.getGroupScan(DrillTable.java:117)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        at 
org.apache.drill.exec.planner.common.DrillScanRelBase.<init>(DrillScanRelBase.java:51)
 ~[drill-java-exec-1.16.0-1.jar:1.16.0-1]
        ... 18 common frames omitted
Caused by: java.lang.Exception: /user/foo/test.parquet (is not a directory)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:677)
        at 
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:112)
        at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3060)
        at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1151)
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:940)
        at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
        at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
        at org.apache.hadoop.ipc.Client.call(Client.java:1476) 
~[hadoop-common-2.7.4.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1413) 
~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
 ~[hadoop-common-2.7.4.jar:na]
        at .......(:0) ~[na:na]
        at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776)
 ~[hadoop-hdfs-2.7.4.jar:na]
        at .......(:0) ~[na:na]
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
 ~[hadoop-common-2.7.4.jar:na]
        at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 ~[hadoop-common-2.7.4.jar:na]
        at .......(:0) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108) 
~[hadoop-hdfs-2.7.4.jar:na]
        ... 39 common frames omitted
{code}

> Unable to SELECT from parquet file with Hadoop 2.7.4
> ----------------------------------------------------
>
>                 Key: DRILL-5733
>                 URL: https://issues.apache.org/jira/browse/DRILL-5733
>             Project: Apache Drill
>          Issue Type: Bug
>    Affects Versions: 1.11.0
>            Reporter: Michele Lamarca
>            Priority: Major
>
> {{SELECT * FROM hdfs.`/user/drill/nation.parquet`;}} fails with Hadoop 2.7.4 
> with {noformat}
> 1/2          SELECT * FROM hdfs.`/user/drill/nation.parquet`;
> Error: SYSTEM ERROR: RemoteException: /user/drill/nation.parquet (is
> not a directory)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:272)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:215)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:100)
>         at 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3820)
>         at 
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
>         at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:855)
>         at 
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>         at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
> {noformat}
> The query executes correctly with Hadoop 2.7.3, while it fails with:
> - Hadoop 2.7.4 with Drill 1.11 (default pom.xml)
> - Hadoop 2.7.4 with Drill 1.11 (with -Dhadoop.version=2.7.4)
> - Hadoop 2.8.0 with Drill 1.11 (default pom.xml)
> - Hadoop 3.0.0-alpha4 with Drill 1.11 (default pom.xml)
> so it appears related to https://issues.apache.org/jira/browse/HDFS-10673
> A temporary workaround is to query an enclosing directory instead, as 
> suggested by [~kkhatua] on the drill-user mailing list.
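> For illustration only (assuming the file sits directly under {{/user/drill}}), either of the following query forms avoids the failing single-file lookup; the glob variant matches the behaviour reported in the comments above:
> {noformat}
> SELECT * FROM hdfs.`/user/drill/`;
> SELECT * FROM hdfs.`/user/drill/*.parquet`;
> {noformat}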
> Relevant stack trace from the drillbit log:
> {noformat}
> 2017-08-19 09:00:45,570 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.drill.exec.work.foreman.Foreman - Query text for query id 
> 26681de9-2b48-2c3a-cc7c-2c7ceeb1beae: SELECT * FROM 
> hdfs.`/user/drill/nation.parquet`
> 2017-08-19 09:00:45,571 [UserServer-1] WARN  
> o.a.drill.exec.rpc.user.UserServer - Message of mode REQUEST of rpc type 3 
> took longer than 500ms.  Actual duration was 7137ms.
> 2017-08-19 09:00:45,617 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 7 classes for 
> org.apache.drill.exec.store.dfs.FormatPlugin took 0ms
> 2017-08-19 09:00:45,618 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,619 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,619 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,648 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 7 classes for 
> org.apache.drill.exec.store.dfs.FormatPlugin took 0ms
> 2017-08-19 09:00:45,649 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,649 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,650 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.c.s.persistence.ScanResult - loading 8 classes for 
> org.apache.drill.common.logical.FormatPluginConfig took 0ms
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,726 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] INFO  
> o.a.d.exec.store.dfs.FileSelection - FileSelection.getStatuses() took 0 ms, 
> numFiles: 1
> 2017-08-19 09:00:45,775 [26681de9-2b48-2c3a-cc7c-2c7ceeb1beae:foreman] ERROR 
> o.a.drill.exec.work.foreman.Foreman - SYSTEM ERROR: RemoteException: 
> /user/drill/nation.parquet (is not a directory)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:272)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:215)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:100)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3820)
>     at 
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
>     at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:855)
>     at 
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>     at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
> [Error Id: 8f351c63-d3f7-4b61-a5e6-1a09c6c2ba8d on node001.cm.cluster:31010]
> org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: 
> RemoteException: /user/drill/nation.parquet (is not a directory)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:272)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:215)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:100)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3820)
>     at 
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
>     at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:855)
>     at 
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>     at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
> [Error Id: 8f351c63-d3f7-4b61-a5e6-1a09c6c2ba8d on node001.cm.cluster:31010]
>     at 
> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:550)
>  ~[drill-common-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.work.foreman.Foreman$ForemanResult.close(Foreman.java:847)
>  [drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.work.foreman.Foreman.moveToState(Foreman.java:977) 
> [drill-java-exec-1.11.0.jar:1.11.0]
>     at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:297) 
> [drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  [na:1.8.0_141]
>     at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  [na:1.8.0_141]
>     at java.lang.Thread.run(Thread.java:748) [na:1.8.0_141]
> Caused by: org.apache.drill.exec.work.foreman.ForemanException: Unexpected 
> exception during fragment initialization: Internal error: Error while 
> applying rule DrillTableRule, args 
> [rel#152:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs, 
> /user/drill/nation.parquet])]
>     ... 4 common frames omitted
> Caused by: java.lang.AssertionError: Internal error: Error while applying 
> rule DrillTableRule, args 
> [rel#152:EnumerableTableScan.ENUMERABLE.ANY([]).[](table=[hdfs, 
> /user/drill/nation.parquet])]
>     at org.apache.calcite.util.Util.newInternal(Util.java:792) 
> ~[calcite-core-1.4.0-drill-r21.jar:1.4.0-drill-r21]
>     at 
> org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:251)
>  ~[calcite-core-1.4.0-drill-r21.jar:1.4.0-drill-r21]
>     at 
> org.apache.calcite.plan.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:811)
>  ~[calcite-core-1.4.0-drill-r21.jar:1.4.0-drill-r21]
>     at 
> org.apache.calcite.tools.Programs$RuleSetProgram.run(Programs.java:310) 
> ~[calcite-core-1.4.0-drill-r21.jar:1.4.0-drill-r21]
>     at 
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:401)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.transform(DefaultSqlHandler.java:343)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToRawDrel(DefaultSqlHandler.java:242)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToDrel(DefaultSqlHandler.java:292)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:169)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan(DrillSqlWorker.java:131)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:79)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1050) 
> [drill-java-exec-1.11.0.jar:1.11.0]
>     at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:280) 
> [drill-java-exec-1.11.0.jar:1.11.0]
>     ... 3 common frames omitted
> Caused by: org.apache.drill.common.exceptions.DrillRuntimeException: Failure 
> creating scan.
>     at 
> org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:92)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:70)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:63)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.logical.DrillScanRule.onMatch(DrillScanRule.java:37)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:228)
>  ~[calcite-core-1.4.0-drill-r21.jar:1.4.0-drill-r21]
>     ... 14 common frames omitted
> Caused by: org.apache.hadoop.security.AccessControlException: 
> /user/drill/nation.parquet (is not a directory)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:272)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:215)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:100)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3820)
>     at 
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
>     at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:855)
>     at 
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>     at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
> ~[na:1.8.0_141]
>     at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  ~[na:1.8.0_141]
>     at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  ~[na:1.8.0_141]
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
> ~[na:1.8.0_141]
>     at 
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>  ~[hadoop-common-2.7.1.jar:na]
>     at 
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
>  ~[hadoop-common-2.7.1.jar:na]
>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2118) 
> ~[hadoop-hdfs-2.7.1.jar:na]
>     at 
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
>  ~[hadoop-hdfs-2.7.1.jar:na]
>     at 
> org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
>  ~[hadoop-hdfs-2.7.1.jar:na]
>     at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>  ~[hadoop-common-2.7.1.jar:na]
>     at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
>  ~[hadoop-hdfs-2.7.1.jar:na]
>     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424) 
> ~[hadoop-common-2.7.1.jar:na]
>     at 
> org.apache.drill.exec.store.dfs.DrillFileSystem.exists(DrillFileSystem.java:603)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.parquet.ParquetGroupScan.expandIfNecessary(ParquetGroupScan.java:270)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:207)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:186)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:170)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:66)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:144)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.store.AbstractStoragePlugin.getPhysicalScan(AbstractStoragePlugin.java:100)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.logical.DrillTable.getGroupScan(DrillTable.java:85)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     at 
> org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:90)
>  ~[drill-java-exec-1.11.0.jar:1.11.0]
>     ... 18 common frames omitted
> Caused by: org.apache.hadoop.ipc.RemoteException: /user/drill/nation.parquet 
> (is not a directory)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:272)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:215)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:100)
>     at 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3820)
>     at 
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)
>     at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:855)
>     at 
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>     at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1476) 
> ~[hadoop-common-2.7.1.jar:na]
>     at org.apache.hadoop.ipc.Client.call(Client.java:1407) 
> ~[hadoop-common-2.7.1.jar:na]
>     at 
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>  ~[hadoop-common-2.7.1.jar:na]
>     at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source) ~[na:na]
>     at 
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
>  ~[hadoop-hdfs-2.7.1.jar:na]
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
> ~[na:1.8.0_141]
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> ~[na:1.8.0_141]
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  ~[na:1.8.0_141]
>     at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_141]
>     at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>  ~[hadoop-common-2.7.1.jar:na]
>     at 
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>  ~[hadoop-common-2.7.1.jar:na]
>     at com.sun.proxy.$Proxy66.getFileInfo(Unknown Source) ~[na:na]
>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116) 
> ~[hadoop-hdfs-2.7.1.jar:na]
>     ... 33 common frames omitted
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
