Because you already defined the dfs root workspace as '/localdata', 

you might just need to say dfs.root.`testdata.csv` in your FROM clause.
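
For example (a sketch, assuming testdata.csv sits directly under /localdata on the node running the drillbit):

~~~
-- the root workspace already maps to /localdata, so reference the file
-- relative to the workspace instead of by absolute path:
SELECT * FROM dfs.root.`testdata.csv`;
~~~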

Sent from my iPhone

> On May 24, 2015, at 1:56 PM, Matt <[email protected]> wrote:
> 
> I have used a single node install (unzip and run) to query local text / CSV 
> files, but on a 3 node cluster (installed via MapR CE), a query against local 
> files fails with:
> 
> ~~~
> sqlline version 1.1.6
> 0: jdbc:drill:> select * from dfs.
> Query failed: PARSE ERROR: From line 1, column 15 to line 1, column 17: Table 
> 'dfs./localdata/testdata.csv' not found
> 
> 0: jdbc:drill:> select * from dfs.`/localdata/testdata.csv`;
> Query failed: PARSE ERROR: From line 1, column 15 to line 1, column 17: Table 
> 'dfs./localdata/testdata.csv' not found
> ~~~
> 
> Is there a special config for querying local files? An initial doc search did 
> not point me to a solution, but I may simply not have found the relevant 
> sections.
> 
> I have tried modifying the default dfs config to no avail:
> 
> ~~~
> {
>   "type": "file",
>   "enabled": true,
>   "connection": "file:///",
>   "workspaces": {
>     "root": {
>       "location": "/localdata",
>       "writable": false,
>       "defaultInputFormat": null
>     }
>   }
> }
> ~~~
