Hello,

Could this please be documented somewhere? I think it is important enough to
be visible to users.

Greetings,

Uwe



-----Original Message-----
From: Abhishek Girish [mailto:[email protected]] 
Sent: Tuesday, September 15, 2015 18:22
To: [email protected]
Subject: Re: PERMISSION ERROR: Not authorized to list or query tables in schema [dfs.default]

Hello Narayan,

The dfs.default workspace is a hidden workspace that points to the root of the
file system. You can override it (by adding a "default" workspace entry to the
dfs plugin) so that it points to a different location where the Drill user has
permissions, for example:

    "default": {
      "location": "/user/abc/tmp",
      "writable": true,
      "defaultInputFormat": null
    }
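
For reference, that entry sits inside the "workspaces" section of the dfs
storage plugin, alongside any workspaces already defined there. A minimal
sketch, reusing the /user/abc/tmp path from the config quoted below (the
"formats" section is unchanged and elided here):

    {
      "type": "file",
      "enabled": true,
      "connection": "hdfs://<NAMENODE-HOST>:<PORT>",
      "workspaces": {
        "default": {
          "location": "/user/abc/tmp",
          "writable": true,
          "defaultInputFormat": null
        },
        "tmp": {
          "location": "/user/abc/tmp",
          "writable": true,
          "defaultInputFormat": null
        }
      },
      "formats": { ... }
    }

Once the default workspace points to a directory the Drill user can read, an
unqualified path such as select * from dfs.`test.csv` should resolve against
/user/abc/tmp rather than the root of the file system.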

Let me know if this works.

Regards,
Abhishek

On Thu, Sep 10, 2015 at 3:06 PM, Narayanan K <[email protected]> wrote:

> Hi all,
>
> We are trying to use Drill to access HDFS which is secured by Kerberos.
>
> We are executing a query against an HDFS file through a user account
> "abc" which already has Kerberos tickets fetched, i.e. we are able to
> use the hadoop command to read/write files on HDFS.
>
> But we are getting the following exception when we try to run this
> command:
>
> bin/sqlline -f test.sql --verbose=true --force=true -u jdbc:drill:schema=dfs;zk=localhost:2181
>
> test.sql :
>
> select * from dfs.`/user/abc/tmp/test.csv` limit 10
>
> [Error Id: 304156bf-0217-4f94-a7ea-cfc5e266e9fe on xyz.def.com:31010] (state=,code=0)
> java.sql.SQLException: PERMISSION ERROR: Not authorized to list or query tables in schema [dfs.default]
>
>
> [Error Id: 304156bf-0217-4f94-a7ea-cfc5e266e9fe on xyz.def.com:31010]
>   at org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:214)
>   at org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:257)
>   at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1362)
>   at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:72)
>   at net.hydromatic.avatica.AvaticaConnection.executeQueryInternal(AvaticaConnection.java:404)
>   at net.hydromatic.avatica.AvaticaStatement.executeQueryInternal(AvaticaStatement.java:351)
>   at net.hydromatic.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:338)
>   at net.hydromatic.avatica.AvaticaStatement.execute(AvaticaStatement.java:69)
>   at org.apache.drill.jdbc.impl.DrillStatementImpl.execute(DrillStatementImpl.java:85)
>   at sqlline.Commands.execute(Commands.java:841)
>   at sqlline.Commands.sql(Commands.java:751)
>   at sqlline.SqlLine.dispatch(SqlLine.java:738)
>   at sqlline.SqlLine.runCommands(SqlLine.java:1641)
>   at sqlline.Commands.run(Commands.java:1304)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
>   at sqlline.SqlLine.dispatch(SqlLine.java:734)
>   at sqlline.SqlLine.initArgs(SqlLine.java:544)
>   at sqlline.SqlLine.begin(SqlLine.java:587)
>   at sqlline.SqlLine.start(SqlLine.java:366)
>   at sqlline.SqlLine.main(SqlLine.java:259)
> Caused by: org.apache.drill.common.exceptions.UserRemoteException: PERMISSION ERROR: Not authorized to list or query tables in schema [dfs.default]
>
>
> Our dfs storage plugin config looks like:
>
> {
>   "type": "file",
>   "enabled": true,
>   "connection": "hdfs://<NAMENODE-HOST>:<PORT>",
>   "workspaces": {
>     "root": {
>       "location": "/user/abc",
>       "writable": false,
>       "defaultInputFormat": null
>     },
>     "tmp": {
>       "location": "/user/abc/tmp",
>       "writable": true,
>       "defaultInputFormat": null
>     }
>   },
>   "formats": {
>     "psv": {
>       "type": "text",
>       "extensions": [
>         "tbl"
>       ],
>       "delimiter": "|"
>     },
>     "csv": {
>       "type": "text",
>       "extensions": [
>         "csv"
>       ],
>       "delimiter": ","
>     },
>     "tsv": {
>       "type": "text",
>       "delimiter": "\u0001"
>     },
>     "parquet": {
>       "type": "parquet"
>     },
>     "json": {
>       "type": "json"
>     },
>     "avro": {
>       "type": "avro"
>     }
>   }
> }
>
> Thanks
> Narayan
>
