Scott,

Try the following.

In the SP config

"connection": "s3n://sccapachedrill”,

or you can also try 

"connection": "s3://sccapachedrill”,


For the core-site.xml in the conf directory ($DRILL_HOME/conf)

<property>
<name>fs.s3.awsAccessKeyId</name>
<value>ID</value>
</property>

<property>
<name>fs.s3.awsSecretAccessKey</name>
<value>SECRET</value>
</property>

<property>
<name>fs.s3n.awsAccessKeyId</name>
<value>ID</value>
</property>

<property>
<name>fs.s3n.awsSecretAccessKey</name>
<value>SECRET</value>
</property>

Where ID and SECRET are the credentials for your AWS account (just verify the 
files/directories are readable with that account), or test with a set of files 
to which you have granted public access.
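
In case it helps, those properties need to sit inside the <configuration> root 
element of core-site.xml, i.e. something like:

<?xml version="1.0"?>
<configuration>
  <!-- the four fs.s3 / fs.s3n properties above go here -->
</configuration>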

Check the permissions on all the conf files - I have my user own all the files 
in the Drill install directory, so there is no need for sudo.
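
For example, assuming Drill is installed under /opt/apache-drill-1.1.0 (adjust 
the path to wherever your install actually lives), something like this takes 
care of ownership:

sudo chown -R $(whoami) /opt/apache-drill-1.1.0   # path is just an example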


RESTART Drill using the command below

$DRILL_HOME/bin/drill-embedded
https://drill.apache.org/docs/starting-drill-on-linux-and-mac-os-x/ 


in sqlline

use s3sccapachedrill.root;

show files;
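
If show files now lists your CSV, a quick sanity check would be something like 
(file name taken from your earlier message):

select * from s3sccapachedrill.root.`oletv_server_event.log.2014-12-16-10-55.csv` limit 10;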


See if this works.

—Andries




> On Sep 22, 2015, at 4:31 PM, scott cote <[email protected]> wrote:
> 
> Andries and others,
> 
> I launch drill on my mac with the command:
> 
> sudo bin/sqlline -u jdbc:drill:zk=local
> 
> My core-site.xml is located in the conf folder and has not been modified or 
> moved since my last post.
> 
> the configuration of the s3sccapachedrill storage plugin is now (based on your suggestion):
> 
> 
> {
>  "type": "file",
>  "enabled": true,
>  "connection": "s3n://sccapachedrill/",
>  "workspaces": {
>    "root": {
>      "location": "/",
>      "writable": false,
>      "defaultInputFormat": null
>    },
>    "tmp": {
>      "location": "/tmp",
>      "writable": true,
>      "defaultInputFormat": null
>    }
>  },
>  "formats": {
>    "psv": {
>      "type": "text",
>      "extensions": [
>        "tbl"
>      ],
>      "delimiter": "|"
>    },
>    "csv": {
>      "type": "text",
>      "extensions": [
>        "csv"
>      ],
>      "delimiter": ","
>    },
>    "tsv": {
>      "type": "text",
>      "extensions": [
>        "tsv"
>      ],
>      "delimiter": "\t"
>    },
>    "parquet": {
>      "type": "parquet"
>    },
>    "json": {
>      "type": "json"
>    },
>    "avro": {
>      "type": "avro"
>    }
>  }
> }
> 
> 
> Issuing the command:
> 
> use s3sccapachedrill
> 
> yields:
> 
> 
> 0: jdbc:drill:zk=local> use s3sccapachedrill
> . . . . . . . . . . . > ;
> +-------+-----------------------------------------------+
> |  ok   |                    summary                    |
> +-------+-----------------------------------------------+
> | true  | Default schema changed to [s3sccapachedrill]  |
> +-------+-----------------------------------------------+
> 1 row selected (1.38 seconds)
> 0: jdbc:drill:zk=local> 
> 
> 
> 
> The command:
> 
> show tables;
> 
> yields:
> 
> +--+
> |  |
> +--+
> +--+
> No rows selected (1.251 seconds)
> 
> The command:
> 
> show files;
> 
> yields:
> 
> Error: SYSTEM ERROR: NullPointerException
> 
> 
> [Error Id: d96f65d6-16dd-4e4d-97c4-7bcb49f12295 on 
> macbookair-7f99.home:31010] (state=,code=0)
> 0: jdbc:drill:zk=local> 
> 
> 
> 
> 
> What am I missing here?
> 
> Thanks in advance.
> 
> Regards,
> 
> SCott
> 
> 
> 
> 
> 
>> On Sep 22, 2015, at 1:23 PM, Steven Phillips <[email protected]> wrote:
>> 
>> You need to change s3 to s3n in the URI:
>> 
>> See the discussion in the comments of this blog post:
>> 
>> http://drill.apache.org/blog/2014/12/09/running-sql-queries-on-amazon-s3/
>> 
>> Hopefully that helps. Let me know if you are still having problems.
>> 
>> On Tue, Sep 22, 2015 at 8:47 AM, Andries Engelbrecht <
>> [email protected]> wrote:
>> 
>>> Scott,
>>> 
>>> In your SP configuration change "connection": "s3://sccapachedrill/ <s3://sccapachedrill/>",
>>> to "connection": "s3://sccapachedrill/",
>>> I think you may have misunderstood the instructions on the page.
>>> 
>>> 
>>> Also when querying the metadata in S3 you are better off using show files
>>> instead of show tables.
>>> 
>>> In some cases I used core-site.xml with the credentials and placed it in
>>> $DRILL_HOME/conf
>>> 
>>> See if this works for you.
>>> 
>>> 
>>> —Andries
>>> 
>>> 
>>>> On Sep 21, 2015, at 7:29 PM, scott cote <[email protected]> wrote:
>>>> 
>>>> Drillers,
>>>> 
>>>> I run a User Group for MongoDB and am attempting to demonstrate the
>>> ability to join data in S3 with data located in a MongoDB Collection.
>>> Unfortunately, I am unable to properly query a csv file that I placed in S3.
>>>> 
>>>> As reference, I am using the MapR S3 page. I followed the directions with one
>>> modification: one may not place a hyphen within the name of the bucket or
>>> the name of the storage plugin.  Doing so blows up the SQL parser….  So
>>> with that deviation, I did everything else listed on that page.  The
>>> system is configured per
>>> https://drill.apache.org/blog/2014/12/09/running-sql-queries-on-amazon-s3/
>>>> 
>>>> 
>>>> Problem:
>>>> 
>>>> The s3sccapachedrill storage plugin shows enabled and I can switch to it
>>> for use, but viewing a list of files/tables returns an empty list, and
>>> attempting to query a specific file by name causes stack traces.
>>>> 
>>>> 
>>>> Inside this email are the following key pieces of information:
>>>> 
>>>> 1. Command/Response of using Apache Drill to access S3 csv log files.
>>>> 2. Configuration of s3sccapachedrill storage plugin
>>>> 3. Path to the s3 file in the sccapachedrill bucket
>>>> 4. Contents of my hadoop_excludes.txt
>>>> 5. View of the end of the sqlline.log
>>>> 
>>>> 
>>>> Please let me know what I should change.
>>>> 
>>>> Thanks,
>>>> 
>>>> SCott
>>>> Scott C. Cote
>>>> [email protected] <mailto:[email protected]>
>>>> 972.672.6484
>>>> 
>>>> 
>>>> =====>>>>> Part 1 <<<<=========
>>>> What I see when I request for a list of databases:
>>>> 
>>>> 
>>>> 0: jdbc:drill:zk=local> show databases;
>>>> +---------------------------+
>>>> |        SCHEMA_NAME        |
>>>> +---------------------------+
>>>> | INFORMATION_SCHEMA        |
>>>> | cp.default                |
>>>> | dfs.default               |
>>>> | dfs.root                  |
>>>> | dfs.tmp                   |
>>>> | s3sccapachedrill.default  |
>>>> | s3sccapachedrill.root     |
>>>> | s3sccapachedrill.tmp      |
>>>> | sys                       |
>>>> +---------------------------+
>>>> 9 rows selected (0.083 seconds)
>>>> 
>>>> 
>>>> Here is what I see when I request a list of tables (first sign of
>>> trouble):
>>>> 
>>>> 
>>>> 0: jdbc:drill:zk=local> use s3sccapachedrill;
>>>> +-------+-----------------------------------------------+
>>>> |  ok   |                    summary                    |
>>>> +-------+-----------------------------------------------+
>>>> | true  | Default schema changed to [s3sccapachedrill]  |
>>>> +-------+-----------------------------------------------+
>>>> 1 row selected (0.069 seconds)
>>>> 0: jdbc:drill:zk=local> show tables;
>>>> +--+
>>>> |  |
>>>> +--+
>>>> +--+
>>>> No rows selected (1.504 seconds)
>>>> 
>>>> 
>>>> Here is what I see when I try to query a specific file in s3 -
>>> oletv_server_event.log.2014-12-16-10-55.csv
>>>> 
>>>> 0: jdbc:drill:zk=local> select * from
>>> s3sccapachedrill.root.`oletv_server_event.log.2014-12-16-10-55.csv`;
>>>> Sep 16, 2015 12:07:22 PM
>>> org.apache.calcite.sql.validate.SqlValidatorException <init>
>>>> SEVERE: org.apache.calcite.sql.validate.SqlValidatorException: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> Sep 16, 2015 12:07:22 PM org.apache.calcite.runtime.CalciteException
>>> <init>
>>>> SEVERE: org.apache.calcite.runtime.CalciteContextException: From line 1,
>>> column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> Error: PARSE ERROR: From line 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> 
>>>> 
>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d on
>>> macbookair-7f99.home:31010] (state=,code=0)
>>>> 
>>>> 
>>>> See “Part 3” for the full http url for the file.
>>>> 
>>>> =====>>>>> Part 2 <<<<=========
>>>> 
>>>> Here is the configuration that I’m using for the s3sccapachedrill
>>> storage plugin
>>>> 
>>>> 
>>>> {
>>>> "type": "file",
>>>> "enabled": true,
>>>> "connection": "s3://sccapachedrill/ <s3://sccapachedrill/>",
>>>> "workspaces": {
>>>>  "root": {
>>>>    "location": "/",
>>>>    "writable": false,
>>>>    "defaultInputFormat": null
>>>>  },
>>>>  "tmp": {
>>>>    "location": "/tmp",
>>>>    "writable": true,
>>>>    "defaultInputFormat": null
>>>>  }
>>>> },
>>>> "formats": {
>>>>  "psv": {
>>>>    "type": "text",
>>>>    "extensions": [
>>>>      "tbl"
>>>>    ],
>>>>    "delimiter": "|"
>>>>  },
>>>>  "csv": {
>>>>    "type": "text",
>>>>    "extensions": [
>>>>      "csv"
>>>>    ],
>>>>    "delimiter": ","
>>>>  },
>>>>  "tsv": {
>>>>    "type": "text",
>>>>    "extensions": [
>>>>      "tsv"
>>>>    ],
>>>>    "delimiter": "\t"
>>>>  },
>>>>  "parquet": {
>>>>    "type": "parquet"
>>>>  },
>>>>  "json": {
>>>>    "type": "json"
>>>>  },
>>>>  "avro": {
>>>>    "type": "avro"
>>>>  }
>>>> }
>>>> }
>>>> 
>>>> =====>>>>> Part 3 <<<<=========
>>>> Here is the path to the s3 file
>>>> 
>>>> 
>>> https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-55.csv
>>>> 
>>>> 
>>>> 
>>>> =====>>>>> Part 4 <<<<=========
>>>> Here is the contents of my hadoop_excludes.txt
>>>> 
>>>> asm
>>>> jackson
>>>> mockito
>>>> log4j
>>>> logging
>>>> slf4j
>>>> jetty
>>>> jasper
>>>> jersey
>>>> eclipse
>>>> common
>>>> guava
>>>> servlet
>>>> parquet
>>>> 
>>>> =====>>>>> Part 5 <<<<=========
>>>> A view of the end of the sqlline.log where the command to view the file
>>> as a table blew up looks like:
>>>> 
>>>> 
>>>> 2015-09-16 12:07:22,093 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>>> INFO  o.a.d.e.planner.sql.DrillSqlWorker - User Error Occurred
>>>> org.apache.drill.common.exceptions.UserException: PARSE ERROR: From line
>>> 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> 
>>>> 
>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>>>>     at
>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>>> ~[drill-common-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:181)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> [na:1.8.0_45]
>>>>     at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> [na:1.8.0_45]
>>>>     at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>>>> Caused by: org.apache.calcite.tools.ValidationException:
>>> org.apache.calcite.runtime.CalciteContextException: From line 1, column 15
>>> to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:176)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validateAndGetType(PlannerImpl.java:185)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:428)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:188)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:157)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     ... 5 common frames omitted
>>>> Caused by: org.apache.calcite.runtime.CalciteContextException: From line
>>> 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method) ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> ~[na:1.8.0_45]
>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> ~[na:1.8.0_45]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:689)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:674)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:3750)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:106)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2745)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2730)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2953)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:210)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:837)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:552)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:174)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     ... 10 common frames omitted
>>>> Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method) ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> ~[na:1.8.0_45]
>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> ~[na:1.8.0_45]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:457)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     ... 28 common frames omitted
>>>> 2015-09-16 12:07:22,095 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>>> INFO  o.a.drill.exec.work.foreman.Foreman - State change requested.
>>> PENDING --> FAILED
>>>> org.apache.drill.exec.work.foreman.ForemanException: Unexpected
>>> exception during fragment initialization: PARSE ERROR: From line 1, column
>>> 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> 
>>>> 
>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>>>>     at
>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:253)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> [na:1.8.0_45]
>>>>     at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> [na:1.8.0_45]
>>>>     at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>>>> Caused by: org.apache.drill.common.exceptions.UserException: PARSE
>>> ERROR: From line 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> 
>>>> 
>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>>>>     at
>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>>> ~[drill-common-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:181)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     ... 3 common frames omitted
>>>> Caused by: org.apache.calcite.tools.ValidationException:
>>> org.apache.calcite.runtime.CalciteContextException: From line 1, column 15
>>> to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:176)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validateAndGetType(PlannerImpl.java:185)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:428)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:188)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:157)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>>>     ... 5 common frames omitted
>>>> Caused by: org.apache.calcite.runtime.CalciteContextException: From line
>>> 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method) ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> ~[na:1.8.0_45]
>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> ~[na:1.8.0_45]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:689)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:674)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:3750)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:106)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2745)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2730)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2953)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:210)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:837)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:552)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:174)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     ... 10 common frames omitted
>>>> Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method) ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> ~[na:1.8.0_45]
>>>>     at
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>> ~[na:1.8.0_45]
>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> ~[na:1.8.0_45]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     at
>>> org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:457)
>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>>>>     ... 28 common frames omitted
>>>> 2015-09-16 12:07:22,096 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>>> INFO  o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>>>> 2015-09-16 12:07:22,106 [Client-1] INFO
>>> o.a.d.j.i.DrillResultSetImpl$ResultsListener - [#18] Query failed:
>>>> org.apache.drill.common.exceptions.UserRemoteException: PARSE ERROR:
>>> From line 1, column 15 to line 1, column 30: Table
>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>>> found
>>>> 
>>>> 
>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d on
>>> macbookair-7f99.home:31010]
>>>>     at
>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:118)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:111)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:47)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:32)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:61)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:233)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:205)
>>> [drill-java-exec-1.1.0.jar:1.1.0]
>>>>     at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>>> [netty-handler-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>> [netty-common-4.0.27.Final.jar:4.0.27.Final]
>>>>     at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>>> 
>>> 
> 
