Scott,

FYI: You may not want to publish your AWS key on a public list ;-)


Anyway, I grabbed your core-site.xml file and put it on my Mac.
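
For reference, the only pieces Drill needs from that file for s3n are the credential properties. With placeholder values (not reposting your real key here), the relevant part of core-site.xml looks roughly like this:

<configuration>
  <!-- placeholder credentials; if you use the s3:// scheme instead of s3n://,
       the properties are fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey -->
  <property>
    <name>fs.s3n.awsAccessKeyId</name>
    <value>YOUR_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3n.awsSecretAccessKey</name>
    <value>YOUR_SECRET_ACCESS_KEY</value>
  </property>
</configuration>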

Started Drill

$ ./drill-embedded

Then created a storage plugin (s3scott) that looked like this:
{
  "type": "file",
  "enabled": true,
  "connection": "s3n://sccapachedrill",
  "workspaces": {
    "root": {
      "location": "/",
      "writable": false,
      "defaultInputFormat": null
    }
  },
  "formats": {
    "psv": {
      "type": "text",
      "extensions": [
        "tbl"
      ],
      "delimiter": "|"
    },
    "csv": {
      "type": "text",
      "extensions": [
        "csv"
      ],
      "delimiter": ","
    },
    "tsv": {
      "type": "text",
      "extensions": [
        "tsv"
      ],
      "delimiter": "\t"
    },
    "parquet": {
      "type": "parquet"
    },
    "json": {
      "type": "json"
    }
  }
}


Ran show databases, switched to the s3scott schema, and did a quick show files:

0: jdbc:drill:zk=local> show databases;
+---------------------+
|     SCHEMA_NAME     |
+---------------------+
| INFORMATION_SCHEMA  |
| cp.default          |
| dfs.default         |
| dfs.root            |
| dfs.tmp             |
| s3scott.default     |
| s3scott.root        |
| s3test.default      |
| s3test.reviews      |
| sys                 |
+---------------------+
10 rows selected (1.742 seconds)
0: jdbc:drill:zk=local> use s3scott;
+-------+--------------------------------------+
|  ok   |               summary                |
+-------+--------------------------------------+
| true  | Default schema changed to [s3scott]  |
+-------+--------------------------------------+
1 row selected (0.115 seconds)
0: jdbc:drill:zk=local> show files;
+----------------------------------------------+--------------+---------+---------+--------+--------+--------------+------------------------+------------------------+
|                     name                     | isDirectory  | isFile  | length  | owner  | group  | permissions  |       accessTime       |    modificationTime    |
+----------------------------------------------+--------------+---------+---------+--------+--------+--------------+------------------------+------------------------+
| oletv_server_event.log.2014-12-16-10-55.csv  | false        | true    | 17353   |        |        | rw-rw-rw-    | 1969-12-31 16:00:00.0  | 2015-09-09 20:21:17.0  |
| oletv_server_event.log.2014-12-16-10-59.csv  | false        | true    | 19325   |        |        | rw-rw-rw-    | 1969-12-31 16:00:00.0  | 2015-09-09 20:21:03.0  |
+----------------------------------------------+--------------+---------+---------+--------+--------+--------------+------------------------+------------------------+
2 rows selected (1.704 seconds)
0: jdbc:drill:zk=local>

So it seems to be working fine on my system.
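
Once show files works on your end, a quick sanity query against one of those CSVs is the next thing I'd try. Just a sketch using the first file name from the listing above, via the s3scott.root workspace; since the csv format here doesn't extract headers, the rows should come back as the columns[] array:

-- sketch: query one of the CSVs listed above through the s3scott.root workspace
SELECT *
FROM s3scott.root.`oletv_server_event.log.2014-12-16-10-55.csv`
LIMIT 5;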

Make sure to remove the jets3t entry from the bin/hadoop-excludes.txt file.

I also used the latest jets3t jar, jets3t-0.9.4.jar.
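
A quick way to double-check both of those on your install (assuming /opt/drill, which is what your logs show):

$ grep jets3t /opt/drill/bin/hadoop-excludes.txt   # should print nothing once the entry is removed
$ ls /opt/drill/jars/3rdparty/ | grep jets3t       # should show jets3t-0.9.4.jar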

What is in your hadoop-excludes.txt and what jar are you using?

BTW: I’m using Drill 1.1


—Andries



> On Sep 22, 2015, at 11:19 PM, scott cote <[email protected]> wrote:
> 
> Thanks.
> 
> I don’t have the aws cli installed or hadoop installed.  Will install the AWS 
> cli onto my mac ….  
> 
>   No attachment was present in your email.  I’ll take a couple of screen 
> shots of my console so you can see what I see.  If you don’t get the 
> attachments, would you kindly send your direct email address to me so I can 
> send you the screen shots?  I’ll refrain from sending subsequent email 
> directly to you.
> 
> One attachment shows the properties of the bucket.
> The second attachment shows the properties of one of the files in the bucket - 
> the other file has the same properties.
> 
> <Screen Shot 2015-09-23 at 1.14.51 AM.png>
> 
> 
> <Screen Shot 2015-09-23 at 1.15.35 AM.png>
> 
> 
> SCott
> 
>> On Sep 23, 2015, at 12:50 AM, Amit Hadke <[email protected] 
>> <mailto:[email protected]>> wrote:
>> 
>> Hi Scott,
>> 
>> It seems like there is some problem with aws key/secret or permission on 
>> bucket.
>> 
>> I tried to access sccapachedrill using hadoop command (hadoop fs -ls 
>> s3n://sccapachedrill/ <s3n://sccapachedrill/>) and
>> aws cli (aws s3 ls s3://sccapachedrill <s3://sccapachedrill>) both didn't 
>> work.
>> 
>> Are you able to list sccapachedrill using the aws cli?
>> (Usually it may be that ListBucket permission is not set on the s3 bucket, 
>> see attached screenshot)
>> 
>> Let me know.
>> 
>> Thanks,
>> ~ Amit.
>> 
>> 
>> On Tue, Sep 22, 2015 at 9:48 PM, scott cote <[email protected] 
>> <mailto:[email protected]>> wrote:
>> Andries,
>> 
>> I’m trying to go for “maximum” reproducibility.  Obviously S3 storage works, 
>> otherwise it wouldn’t be so prominently displayed on the page and demos ….  
>> So I am assuming that I’m hurting myself somehow …..
>> 
>> In this email revision are two separate attempts to follow your 
>> instructions.  Am trying to provide everything you need for maximum 
>> reproducibility.  I made an attempt with the “s3” prefix and the “s3n” 
>> prefix in the connection value.
>> 
>> Both attempts used the embedded drill bit.  Both attempts used a drill 
>> installation where the owner/group was changed to my user with my group.
>> 
>> Summary of parts of this message  (since it is long):
>> 1. First Attempt:  Log shows null pointer errors after s3n attempt
>> 2. Second Attempt: Log shows 403 errors after s3 attempt
>> 3. S3 File Urls and notes
>> 4. content of s3sccapachedrill.sys.drill
>> 5. content of conf dir and core-site.xml
>> 6. content of sqlline_queries.json
>> 
>> 
>> You can navigate to each section by using text find for each enumerated 
>> bullet point.
>> 
>> I can take this one step further and tar up the “/tmp/drill” folder and 
>> stick it in a drop box.  Do the same with the conf folder and the log 
>> folder. Let me know if you want those artifacts or if you want me to turn on 
>> a debug/verbose flag.
>> 
>> Thanks,
>> 
>> SCott
>> 
>> ——>>>>>>1.  First Attempt (use s3n)  <<<<<<<<<<<<<<--------------------
>> with the connections string doing the “s3n” prefix - null pointer errors
>> 
>> Here is the command response sequence that generated the errors in the log :
>> 
>> ———>>>>> begin sqlline.log <<<<------
>> 
>> MACBOOKAIR-7F99:log scottccote$ more sqlline.log
>> 2015-09-22 22:54:07,723 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: zookeeper.version=3.4.5-1392090, built on 09/30/2012 
>> 17:52 GMT
>> 2015-09-22 22:54:07,725 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: host.name <http://host.name/>=macbookair-7f99.home
>> 2015-09-22 22:54:07,725 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: java.version=1.7.0_75
>> 2015-09-22 22:54:07,725 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: java.vendor=Oracle Corporation
>> 2015-09-22 22:54:07,725 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: 
>> java.home=/Library/Java/JavaVirtualMachines/jdk1.7.0_75.jdk/Contents/Home/jre
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: 
>> java.class.path=/opt/drill/conf:/opt/drill/jars/drill-common-1.1.0.jar:/opt/drill/jars/drill-hive-exec-shaded-1.1.0.jar:/opt/drill/jars/drill-java-exec-1.1.0.jar:/opt/drill/jars/drill-jdbc-1.1.0.jar:/opt/drill/jars/drill-mongo-storage-1.1.0.jar:/opt/drill/jars/drill-protocol-1.1.0.jar:/opt/drill/jars/drill-storage-hbase-1.1.0.jar:/opt/drill/jars/drill-storage-hive-core-1.1.0.jar:/opt/drill/jars/tpch-sample-data-1.1.0.jar:/opt/drill/jars/ext/zookeeper-3.4.5.jar:/opt/drill/jars/3rdparty/antlr-2.7.7.jar:/opt/drill/jars/3rdparty/antlr-runtime-3.4.jar:/opt/drill/jars/3rdparty/asm-debug-all-5.0.3.jar:/opt/drill/jars/3rdparty/avro-1.7.7.jar:/opt/drill/jars/3rdparty/avro-ipc-1.7.7-tests.jar:/opt/drill/jars/3rdparty/avro-ipc-1.7.7.jar:/opt/drill/jars/3rdparty/avro-mapred-1.7.7.jar:/opt/drill/jars/3rdparty/bonecp-0.8.0.RELEASE.jar:/opt/drill/jars/3rdparty/calcite-avatica-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/calcite-core-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/calcite-linq4j-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/commons-beanutils-1.7.0.jar:/opt/drill/jars/3rdparty/commons-beanutils-core-1.8.0.jar:/opt/drill/jars/3rdparty/commons-cli-1.2.jar:/opt/drill/jars/3rdparty/commons-codec-1.4.jar:/opt/drill/jars/3rdparty/commons-collections-3.2.1.jar:/opt/drill/jars/3rdparty/commons-compiler-2.7.6.jar:/opt/drill/jars/3rdparty/commons-compiler-jdk-2.7.4.jar:/opt/drill/jars/3rdparty/commons-compress-1.4.1.jar:/opt/drill/jars/3rdparty/commons-configuration-1.6.jar:/opt/drill/jars/3rdparty/commons-dbcp-1.4.jar:/opt/drill/jars/3rdparty/commons-digester-1.8.jar:/opt/drill/jars/3rdparty/commons-el-1.0.jar:/opt/drill/jars/3rdparty/commons-httpclient-3.1.jar:/opt/drill/jars/3rdparty/commons-io-2.4.jar:/opt/drill/jars/3rdparty/commons-lang-2.6.jar:/opt/drill/jars/3rdparty/commons-lang3-3.1.jar:/opt/drill/jars/3rdparty/commons-math-2.2.jar:/opt/drill/jars/3rdparty/commons-math3-3.1.1.jar:/opt/drill/jars/3rdparty/commons-net-3.1.jar:/opt/drill/jars/3rdparty/commons-pool-1.5.4.jar:/opt/drill/jars/3rdparty/commons-pool2-2.1.jar:/opt/drill/jars/3rdparty/config-1.0.0.jar:/opt/drill/jars/3rdparty/curator-client-2.5.0.jar:/opt/drill/jars/3rdparty/curator-framework-2.5.0.jar:/opt/drill/jars/3rdparty/curator-recipes-2.5.0.jar:/opt/drill/jars/3rdparty/curator-x-discovery-2.5.0.jar:/opt/drill/jars/3rdparty/datanucleus-api-jdo-3.2.6.jar:/opt/drill/jars/3rdparty/datanucleus-core-3.2.10.jar:/opt/drill/jars/3rdparty/datanucleus-rdbms-3.2.9.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.converter-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.eventsource-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.logging-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.logging.protobuf-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.appender.multiplex-classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.appender.multiplex-core-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.converter-classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.sender-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.sulky.codec-0.9.17.jar:/opt/drill/jars/3rdparty/de.huxhorn.sulky.formatting-0.9.17.jar:/opt/drill/jars/3rdparty/de.huxhorn.sulky.io-0.9.17.jar:/opt/drill/jars/3rdparty/derby-10.10.1.1.jar:/opt/drill/jars/3rdparty/dom4j-1.6.1.jar:/opt/drill/jars/3rdparty/eigenbase-properties-1.1.4.jar:/opt/drill/jars/3rdparty/findbugs-annotations-1.3.9-1.jar:/opt/drill/jars/3rdparty/foodmart-data-json
-0.4.jar:/opt/drill/jars/3rdparty/freemarker-2.3.19.jar:/opt/drill/jars/3rdparty/guava-14.0.1.jar:/opt/drill/jars/3rdparty/hadoop-annotations-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-auth-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-client-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-common-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-hdfs-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-app-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-common-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-core-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-jobclient-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-shuffle-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-yarn-client-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-yarn-common-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-yarn-server-common-2.4.1.jar:/opt/drill/jars/3rdparty/hamcrest-core-1.1.jar:/opt/drill/jars/3rdparty/hbase-annotations-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hbase-client-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hbase-common-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hbase-protocol-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hibernate-validator-4.3.1.Final.jar:/opt/drill/jars/3rdparty/hive-contrib-1.0.0.jar:/opt/drill/jars/3rdparty/hive-hbase-handler-1.0.0.jar:/opt/drill/jars/3rdparty/hive-metastore-1.0.0.jar:/opt/drill/jars/3rdparty/hppc-0.4.2.jar:/opt/drill/jars/3rdparty/htrace-core-2.04.jar:/opt/drill/jars/3rdparty/httpclient-4.2.5.jar:/opt/drill/jars/3rdparty/httpcore-4.2.4.jar:/opt/drill/jars/3rdparty/jackson-annotations-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-core-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-core-asl-1.9.13.jar:/opt/drill/jars/3rdparty/jackson-databind-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-jaxrs-base-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-jaxrs-json-provider-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-mapper-asl-1.9.11.jar:/opt/drill/jars/3rdparty/jackson-module-jaxb-annotations-2.4.3.jar:/opt/drill/jars/3rdparty/janino-2.7.4.jar:/opt/drill/jars/3rdparty/jasper-compiler-5.5.23.jar:/opt/drill/jars/3rdparty/jasper-runtime-5.5.23.jar:/opt/drill/jars/3rdparty/javassist-3.12.1.GA.jar:/opt/drill/jars/3rdparty/javassist-3.16.1-GA.jar:/opt/drill/jars/3rdparty/javax.inject-1.jar:/opt/drill/jars/3rdparty/jcl-over-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/jcodings-1.0.8.jar:/opt/drill/jars/3rdparty/jcommander-1.30.jar:/opt/drill/jars/3rdparty/jdk.tools-1.7.jar:/opt/drill/jars/3rdparty/jdo-api-3.0.1.jar:/opt/drill/jars/3rdparty/jets3t-0.9.3.jar:/opt/drill/jars/3rdparty/jetty-continuation-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jetty-http-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-io-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-security-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-server-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-servlet-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-util-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-webapp-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jetty-xml-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jline-2.10.jar:/opt/drill/jars/3rdparty/joda-time-2.3.jar:/opt/drill/jars/3rdparty/joni-2.1.2.jar:/opt/drill/jars/3rdparty/jpam-1.1.jar:/opt/drill/jars/3rdparty/jsch-0.1.42.jar:/opt/drill/jars/3rdparty/json-simple-1.1.1.jar:/opt/drill/jars/3rdparty/jsp-api-2.1.jar:/opt/drill/jars/3rdparty/jsr305-1.3.9.jar:/opt/drill/jars/3rdparty/jta-1.1.jar:/opt/drill/jars/3rdparty/jul-to-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/libfb303-0.9.0.jar:/opt/drill/jars/3rdparty/libthrift-0.9.0.jar:/opt/drill/jars
/3rdparty/linq4j-0.4.jar:/opt/drill/jars/3rdparty/log4j-over-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/metrics-core-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-healthchecks-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-json-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-jvm-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-servlets-3.0.1.jar:/opt/drill/jars/3rdparty/mockito-core-1.9.5.jar:/opt/drill/jars/3rdparty/mongo-java-driver-3.0.2.jar:/opt/drill/jars/3rdparty/msgpack-0.6.6.jar:/opt/drill/jars/3rdparty/netty-3.6.6.Final.jar:/opt/drill/jars/3rdparty/netty-buffer-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-codec-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-common-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-handler-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-transport-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-transport-native-epoll-4.0.27.Final-linux-x86_64.jar:/opt/drill/jars/3rdparty/objenesis-1.0.jar:/opt/drill/jars/3rdparty/optiq-avatica-0.9-drill-r20.jar:/opt/drill/jars/3rdparty/paranamer-2.5.6.jar:/opt/drill/jars/3rdparty/parquet-column-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-common-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-encoding-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-format-2.1.1-drill-r1.jar:/opt/drill/jars/3rdparty/parquet-generator-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-hadoop-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-jackson-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/opt/drill/jars/3rdparty/protobuf-java-2.5.0.jar:/opt/drill/jars/3rdparty/protostuff-api-1.0.8.jar:/opt/drill/jars/3rdparty/protostuff-core-1.0.8.jar:/opt/drill/jars/3rdparty/protostuff-json-1.0.8.jar:/opt/drill/jars/3rdparty/serializer-2.7.1.jar:/opt/drill/jars/3rdparty/slf4j-api-1.7.6.jar:/opt/drill/jars/3rdparty/snappy-java-1.0.5-M3.jar:/opt/drill/jars/3rdparty/sqlline-1.1.9-drill-r5.jar:/opt/drill/jars/3rdparty/stax-api-1.0-2.jar:/opt/drill/jars/3rdparty/stringtemplate-3.2.1.jar:/opt/drill/jars/3rdparty/tpch-sample-data-1.1.0.jar:/opt/drill/jars/3rdparty/univocity-parsers-1.3.0.jar:/opt/drill/jars/3rdparty/validation-api-1.0.0.GA.jar:/opt/drill/jars/3rdparty/velocity-1.7.jar:/opt/drill/jars/3rdparty/xalan-2.7.1.jar:/opt/drill/jars/3rdparty/xercesImpl-2.11.0.jar:/opt/drill/jars/3rdparty/xml-apis-1.4.01.jar:/opt/drill/jars/3rdparty/xmlenc-0.52.jar:/opt/drill/jars/3rdparty/xz-1.0.jar:/opt/drill/jars/classb/activation-1.1.jar:/opt/drill/jars/classb/aopalliance-repackaged-2.2.0.jar:/opt/drill/jars/classb/codemodel-2.6.jar:/opt/drill/jars/classb/hk2-api-2.2.0.jar:/opt/drill/jars/classb/hk2-locator-2.2.0.jar:/opt/drill/jars/classb/hk2-utils-2.2.0.jar:/opt/drill/jars/classb/javax.annotation-api-1.2.jar:/opt/drill/jars/classb/javax.inject-2.2.0.jar:/opt/drill/jars/classb/javax.servlet-api-3.1.0.jar:/opt/drill/jars/classb/javax.ws.rs
>>  
>> <http://javax.ws.rs/>-api-2.0.jar:/opt/drill/jars/classb/jaxb-api-2.2.2.jar:/opt/drill/jars/classb/jersey-client-2.8.jar:/opt/drill/jars/classb/jersey-common-2.8.jar:/opt/drill/jars/classb/jersey-container-jetty-http-2.8.jar:/opt/drill/jars/classb/jersey-container-jetty-servlet-2.8.jar:/opt/drill/jars/classb/jersey-container-servlet-2.8.jar:/opt/drill/jars/classb/jersey-container-servlet-core-2.8.jar:/opt/drill/jars/classb/jersey-guava-2.8.jar:/opt/drill/jars/classb/jersey-media-multipart-2.8.jar:/opt/drill/jars/classb/jersey-mvc-2.8.jar:/opt/drill/jars/classb/jersey-mvc-freemarker-2.8.jar:/opt/drill/jars/classb/jersey-server-2.8.jar:/opt/drill/jars/classb/jetty-6.1.26.jar:/opt/drill/jars/classb/jetty-util-6.1.26.jar:/opt/drill/jars/classb/logback-classic-1.0.13.jar:/opt/drill/jars/classb/logback-core-1.0.13.jar:/opt/drill/jars/classb/mimepull-1.9.3.jar:/opt/drill/jars/classb/osgi-resource-locator-1.0.1.jar:/opt/drill/jars/classb/reflections-0.9.8.jar
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: 
>> java.library.path=/Users/scottccote/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: 
>> java.io.tmpdir=/var/folders/ls/nw9l_52x7xs_h_pj_854g9hr0000gn/T/
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: java.compiler=<NA>
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: os.name <http://os.name/>=Mac OS X
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: os.arch=x86_64
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: os.version=10.10.5
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: user.name <http://user.name/>=scottccote
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: user.home=/Users/scottccote
>> 2015-09-22 22:54:07,726 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Drillbit environment: user.dir=/opt/drill
>> 2015-09-22 22:54:08,426 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Construction completed (700 ms).
>> 2015-09-22 22:54:11,817 [main] INFO  
>> o.a.d.e.e.f.FunctionImplementationRegistry - Function registry loaded.  2413 
>> functions loaded in 3067 ms.
>> 2015-09-22 22:54:12,789 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Startup completed (4363 ms).
>> 2015-09-22 22:54:21,347 [29fddd93-3036-63b9-7d06-5b7074e883cd:foreman] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  PENDING --> 
>> RUNNING
>> 2015-09-22 22:54:21,410 [29fddd93-3036-63b9-7d06-5b7074e883cd:frag:0:0] INFO 
>>  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddd93-3036-63b9-7d06-5b7074e883cd:0:0: State change requested from 
>> AWAITING_ALLOCATION --> RUNNING for
>> 2015-09-22 22:54:21,417 [29fddd93-3036-63b9-7d06-5b7074e883cd:frag:0:0] INFO 
>>  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddd93-3036-63b9-7d06-5b7074e883cd:0:0. New state: RUNNING
>> 2015-09-22 22:54:21,729 [29fddd93-3036-63b9-7d06-5b7074e883cd:frag:0:0] INFO 
>>  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddd93-3036-63b9-7d06-5b7074e883cd:0:0: State change requested from 
>> RUNNING --> FINISHED for
>> 2015-09-22 22:54:21,730 [29fddd93-3036-63b9-7d06-5b7074e883cd:frag:0:0] INFO 
>>  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddd93-3036-63b9-7d06-5b7074e883cd:0:0. New state: FINISHED
>> 2015-09-22 22:54:21,735 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  RUNNING --> 
>> COMPLETED
>> 2015-09-22 22:54:21,735 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>> 2015-09-22 22:54:36,236 [29fddd82-9ae6-bb73-38bc-930be828013d:foreman] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  PENDING --> 
>> RUNNING
>> 2015-09-22 22:54:36,240 [29fddd82-9ae6-bb73-38bc-930be828013d:frag:0:0] INFO 
>>  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddd82-9ae6-bb73-38bc-930be828013d:0:0: State change requested from 
>> AWAITING_ALLOCATION --> RUNNING for
>> 2015-09-22 22:54:36,240 [29fddd82-9ae6-bb73-38bc-930be828013d:frag:0:0] INFO 
>>  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddd82-9ae6-bb73-38bc-930be828013d:0:0. New state: RUNNING
>> 2015-09-22 22:54:36,244 [29fddd82-9ae6-bb73-38bc-930be828013d:frag:0:0] INFO 
>>  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddd82-9ae6-bb73-38bc-930be828013d:0:0: State change requested from 
>> RUNNING --> FINISHED for
>> 2015-09-22 22:54:36,244 [29fddd82-9ae6-bb73-38bc-930be828013d:frag:0:0] INFO 
>>  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddd82-9ae6-bb73-38bc-930be828013d:0:0. New state: FINISHED
>> 2015-09-22 22:54:36,246 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  RUNNING --> 
>> COMPLETED
>> 2015-09-22 22:54:36,246 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>> 2015-09-22 22:54:44,549 [29fddd7b-c1b6-07fb-b7ca-9e9ca8135950:foreman] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  PENDING --> 
>> FAILED
>> org.apache.drill.exec.work.foreman.ForemanException: Unexpected exception 
>> during fragment initialization: null
>>         at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:253) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>  [na:1.7.0_75]
>>         at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>  [na:1.7.0_75]
>>         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>> Caused by: java.lang.NullPointerException: null
>>         at 
>> org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:479)
>>  ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1483) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1560) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1540) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at 
>> org.apache.drill.exec.store.dfs.DrillFileSystem.list(DrillFileSystem.java:697)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.planner.sql.handlers.ShowFileHandler.getPlan(ShowFileHandler.java:97)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         ... 3 common frames omitted
>> 2015-09-22 22:54:44,550 [29fddd7b-c1b6-07fb-b7ca-9e9ca8135950:foreman] INFO  
>> o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>> 2015-09-22 22:54:44,553 [29fddd7b-c1b6-07fb-b7ca-9e9ca8135950:foreman] ERROR 
>> o.a.drill.exec.work.foreman.Foreman - SYSTEM ERROR: NullPointerException
>> 
>> 
>> [Error Id: f38973d8-36b1-41f5-b1cc-6e6444316ba9 on 
>> macbookair-7f99.home:31010]
>> org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: 
>> NullPointerException
>> 
>> 
>> [Error Id: f38973d8-36b1-41f5-b1cc-6e6444316ba9 on 
>> macbookair-7f99.home:31010]
>>         at 
>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>>  ~[drill-common-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman$ForemanResult.close(Foreman.java:737)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.processEvent(Foreman.java:839)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.processEvent(Foreman.java:781)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.common.EventProcessor.sendEvent(EventProcessor.java:73) 
>> [drill-common-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.moveToState(Foreman.java:783)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman.moveToState(Foreman.java:892) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:253) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>  [na:1.7.0_75]
>>         at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>  [na:1.7.0_75]
>>         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>> Caused by: org.apache.drill.exec.work.foreman.ForemanException: Unexpected 
>> exception during fragment initialization: null
>>         ... 4 common frames omitted
>> Caused by: java.lang.NullPointerException: null
>>         at 
>> org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:479)
>>  ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1483) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1560) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1540) 
>> ~[hadoop-common-2.4.1.jar:na]
>>         at 
>> org.apache.drill.exec.store.dfs.DrillFileSystem.list(DrillFileSystem.java:697)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.planner.sql.handlers.ShowFileHandler.getPlan(ShowFileHandler.java:97)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         ... 3 common frames omitted
>> 2015-09-22 22:54:44,566 [Client-1] INFO  
>> o.a.d.j.i.DrillResultSetImpl$ResultsListener - [#3] Query failed:
>> org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: 
>> NullPointerException
>> 
>> 
>> [Error Id: f38973d8-36b1-41f5-b1cc-6e6444316ba9 on 
>> macbookair-7f99.home:31010]
>>         at 
>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:118)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:111) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:47)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:32)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>         at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:61) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:233) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:205) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>         at 
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>>  [netty-handler-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>>         at 
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>  [netty-common-4.0.27.Final.jar:4.0.27.Final]
>>         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>> 2015-09-22 22:55:38,902 [Client-1] INFO  o.a.drill.exec.rpc.user.UserClient 
>> - Channel closed /192.168.1.4:57765 <http://192.168.1.4:57765/> <--> 
>> MACBOOKAIR-7F99.home/192.168.1.4:31010 <http://192.168.1.4:31010/>.
>> 2015-09-22 22:55:42,031 [BitServer-3] INFO  
>> o.a.d.exec.rpc.control.ControlClient - Channel closed /192.168.1.4:57766 
>> <http://192.168.1.4:57766/> <--> MACBOOKAIR-7F99.home/192.168.1.4:31011 
>> <http://192.168.1.4:31011/>.
>> 2015-09-22 22:55:45,250 [main] INFO  o.apache.drill.exec.server.Drillbit - 
>> Shutdown completed (6348 ms).
>> 2015-09-22 22:55:45,255 [Drillbit-ShutdownHook#0] INFO  
>> o.apache.drill.exec.server.Drillbit - Received shutdown request.
>> 
>> ———>>>>> end sqlline.log <<<<———
>> 
>> 
>> 
>> ——>>>>>> 2. Second Attempt (use s3) <<<<<<<<<<<<<<——————————
>> ———>>>>> begin sqlline.log <<<<———
>> 
>>   1 2015-09-22 23:04:36,252 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: zookeeper.version=3.4.5-1392090, built on 09/30/2012 
>> 17:52 GMT
>>   2 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: host.name <http://host.name/>=macbookair-7f99.home
>>   3 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: java.version=1.7.0_75
>>   4 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: java.vendor=Oracle Corporation
>>   5 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: 
>> java.home=/Library/Java/JavaVirtualMachines/jdk1.7.0_75.jdk/Contents/Home/jre
>>  6 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: 
>> java.class.path=/opt/drill/conf:/opt/drill/jars/drill-common-1.1.0.jar:/opt/drill/jars/drill-hive-exec-shaded-1.1.0.jar:/opt/drill/jars/
>>     
>> drill-java-exec-1.1.0.jar:/opt/drill/jars/drill-jdbc-1.1.0.jar:/opt/drill/jars/drill-mongo-storage-1.1.0.jar:/opt/drill/jars/drill-protocol-1.1.0.jar:/opt/drill/jars/drill-storage-hbase-1.1.0.jar:/opt/drill/jars/drill-storage-hive-co
>>     
>> re-1.1.0.jar:/opt/drill/jars/tpch-sample-data-1.1.0.jar:/opt/drill/jars/ext/zookeeper-3.4.5.jar:/opt/drill/jars/3rdparty/antlr-2.7.7.jar:/opt/drill/jars/3rdparty/antlr-runtime-3.4.jar:/opt/drill/jars/3rdparty/asm-debug-all-5.0.3.jar:
>>     
>> /opt/drill/jars/3rdparty/avro-1.7.7.jar:/opt/drill/jars/3rdparty/avro-ipc-1.7.7-tests.jar:/opt/drill/jars/3rdparty/avro-ipc-1.7.7.jar:/opt/drill/jars/3rdparty/avro-mapred-1.7.7.jar:/opt/drill/jars/3rdparty/bonecp-0.8.0.RELEASE.jar:/o
>>     
>> pt/drill/jars/3rdparty/calcite-avatica-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/calcite-core-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/calcite-linq4j-1.1.0-drill-r14.jar:/opt/drill/jars/3rdparty/commons-beanutils-1.7.0.jar:/op
>>     
>> t/drill/jars/3rdparty/commons-beanutils-core-1.8.0.jar:/opt/drill/jars/3rdparty/commons-cli-1.2.jar:/opt/drill/jars/3rdparty/commons-codec-1.4.jar:/opt/drill/jars/3rdparty/commons-collections-3.2.1.jar:/opt/drill/jars/3rdparty/common
>>     
>> s-compiler-2.7.6.jar:/opt/drill/jars/3rdparty/commons-compiler-jdk-2.7.4.jar:/opt/drill/jars/3rdparty/commons-compress-1.4.1.jar:/opt/drill/jars/3rdparty/commons-configuration-1.6.jar:/opt/drill/jars/3rdparty/commons-dbcp-1.4.jar:/op
>>     
>> t/drill/jars/3rdparty/commons-digester-1.8.jar:/opt/drill/jars/3rdparty/commons-el-1.0.jar:/opt/drill/jars/3rdparty/commons-httpclient-3.1.jar:/opt/drill/jars/3rdparty/commons-io-2.4.jar:/opt/drill/jars/3rdparty/commons-lang-2.6.jar:
>>     
>> /opt/drill/jars/3rdparty/commons-lang3-3.1.jar:/opt/drill/jars/3rdparty/commons-math-2.2.jar:/opt/drill/jars/3rdparty/commons-math3-3.1.1.jar:/opt/drill/jars/3rdparty/commons-net-3.1.jar:/opt/drill/jars/3rdparty/commons-pool-1.5.4.ja
>>     
>> r:/opt/drill/jars/3rdparty/commons-pool2-2.1.jar:/opt/drill/jars/3rdparty/config-1.0.0.jar:/opt/drill/jars/3rdparty/curator-client-2.5.0.jar:/opt/drill/jars/3rdparty/curator-framework-2.5.0.jar:/opt/drill/jars/3rdparty/curator-recipe
>>     
>> s-2.5.0.jar:/opt/drill/jars/3rdparty/curator-x-discovery-2.5.0.jar:/opt/drill/jars/3rdparty/datanucleus-api-jdo-3.2.6.jar:/opt/drill/jars/3rdparty/datanucleus-core-3.2.10.jar:/opt/drill/jars/3rdparty/datanucleus-rdbms-3.2.9.jar:/opt/
>>     
>> drill/jars/3rdparty/de.huxhorn.lilith.data.converter-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.eventsource-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.data.logging-0.9.44.jar:/opt/drill/jars/3rdparty/de.hux
>>     
>> horn.lilith.data.logging.protobuf-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.appender.multiplex-classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.appender.multiplex-core-0.9.44.jar:/opt/drill/ja
>>     
>> rs/3rdparty/de.huxhorn.lilith.logback.classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.logback.converter-classic-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn.lilith.sender-0.9.44.jar:/opt/drill/jars/3rdparty/de.huxhorn
>>     
>> .sulky.codec-0.9.17.jar:/opt/drill/jars/3rdparty/de.huxhorn.sulky.formatting-0.9.17.jar:/opt/drill/jars/3rdparty/de.huxhorn.sulky.io-0.9.17.jar:/opt/drill/jars/3rdparty/derby-10.10.1.1.jar:/opt/drill/jars/3rdparty/dom4j-1.6.1.jar:/op
>>     
>> t/drill/jars/3rdparty/eigenbase-properties-1.1.4.jar:/opt/drill/jars/3rdparty/findbugs-annotations-1.3.9-1.jar:/opt/drill/jars/3rdparty/foodmart-data-json-0.4.jar:/opt/drill/jars/3rdparty/freemarker-2.3.19.jar:/opt/drill/jars/3rdpart
>>     
>> y/guava-14.0.1.jar:/opt/drill/jars/3rdparty/hadoop-annotations-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-auth-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-client-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-common-2.4.1.jar:/opt/drill/jars
>>     
>> /3rdparty/hadoop-hdfs-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-app-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-common-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-core-2.4.1.jar:/opt/dri
>>     
>> ll/jars/3rdparty/hadoop-mapreduce-client-jobclient-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-mapreduce-client-shuffle-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-yarn-client-2.4.1.jar:/opt/drill/jars/3rdparty/hadoop-yarn-common-2.4.1.j
>>     
>> ar:/opt/drill/jars/3rdparty/hadoop-yarn-server-common-2.4.1.jar:/opt/drill/jars/3rdparty/hamcrest-core-1.1.jar:/opt/drill/jars/3rdparty/hbase-annotations-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hbase-client-0.98.7-hadoop2.jar:/op
>>     
>> t/drill/jars/3rdparty/hbase-common-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hbase-protocol-0.98.7-hadoop2.jar:/opt/drill/jars/3rdparty/hibernate-validator-4.3.1.Final.jar:/opt/drill/jars/3rdparty/hive-contrib-1.0.0.jar:/opt/drill/
>>     
>> jars/3rdparty/hive-hbase-handler-1.0.0.jar:/opt/drill/jars/3rdparty/hive-metastore-1.0.0.jar:/opt/drill/jars/3rdparty/hppc-0.4.2.jar:/opt/drill/jars/3rdparty/htrace-core-2.04.jar:/opt/drill/jars/3rdparty/httpclient-4.2.5.jar:/opt/dri
>>     
>> ll/jars/3rdparty/httpcore-4.2.4.jar:/opt/drill/jars/3rdparty/jackson-annotations-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-core-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-core-asl-1.9.13.jar:/opt/drill/jars/3rdparty/jackson-databind
>>     
>> -2.4.3.jar:/opt/drill/jars/3rdparty/jackson-jaxrs-base-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-jaxrs-json-provider-2.4.3.jar:/opt/drill/jars/3rdparty/jackson-mapper-asl-1.9.11.jar:/opt/drill/jars/3rdparty/jackson-module-jaxb-annot
>>     
>> ations-2.4.3.jar:/opt/drill/jars/3rdparty/janino-2.7.4.jar:/opt/drill/jars/3rdparty/jasper-compiler-5.5.23.jar:/opt/drill/jars/3rdparty/jasper-runtime-5.5.23.jar:/opt/drill/jars/3rdparty/javassist-3.12.1.GA.jar:/opt/drill/jars/3rdpar
>>     
>> ty/javassist-3.16.1-GA.jar:/opt/drill/jars/3rdparty/javax.inject-1.jar:/opt/drill/jars/3rdparty/jcl-over-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/jcodings-1.0.8.jar:/opt/drill/jars/3rdparty/jcommander-1.30.jar:/opt/drill/jars/3rdpart
>>     
>> y/jdk.tools-1.7.jar:/opt/drill/jars/3rdparty/jdo-api-3.0.1.jar:/opt/drill/jars/3rdparty/jets3t-0.9.3.jar:/opt/drill/jars/3rdparty/jetty-continuation-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jetty-http-9.1.5.v20140505.jar:/opt/dri
>>     
>> ll/jars/3rdparty/jetty-io-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-security-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-server-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-servlet-9.1.5.v20140505.jar:/opt/drill
>>     
>> /jars/3rdparty/jetty-util-9.1.5.v20140505.jar:/opt/drill/jars/3rdparty/jetty-webapp-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jetty-xml-9.1.1.v20140108.jar:/opt/drill/jars/3rdparty/jline-2.10.jar:/opt/drill/jars/3rdparty/joda-time
>>     
>> -2.3.jar:/opt/drill/jars/3rdparty/joni-2.1.2.jar:/opt/drill/jars/3rdparty/jpam-1.1.jar:/opt/drill/jars/3rdparty/jsch-0.1.42.jar:/opt/drill/jars/3rdparty/json-simple-1.1.1.jar:/opt/drill/jars/3rdparty/jsp-api-2.1.jar:/opt/drill/jars/3
>>     
>> rdparty/jsr305-1.3.9.jar:/opt/drill/jars/3rdparty/jta-1.1.jar:/opt/drill/jars/3rdparty/jul-to-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/libfb303-0.9.0.jar:/opt/drill/jars/3rdparty/libthrift-0.9.0.jar:/opt/drill/jars/3rdparty/linq4j-0.
>>     
>> 4.jar:/opt/drill/jars/3rdparty/log4j-over-slf4j-1.7.6.jar:/opt/drill/jars/3rdparty/metrics-core-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-healthchecks-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-json-3.0.1.jar:/opt/drill/jars/3rdpart
>>     
>> y/metrics-jvm-3.0.1.jar:/opt/drill/jars/3rdparty/metrics-servlets-3.0.1.jar:/opt/drill/jars/3rdparty/mockito-core-1.9.5.jar:/opt/drill/jars/3rdparty/mongo-java-driver-3.0.2.jar:/opt/drill/jars/3rdparty/msgpack-0.6.6.jar:/opt/drill/ja
>>     
>> rs/3rdparty/netty-3.6.6.Final.jar:/opt/drill/jars/3rdparty/netty-buffer-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-codec-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-common-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-hand
>>     
>> ler-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-transport-4.0.27.Final.jar:/opt/drill/jars/3rdparty/netty-transport-native-epoll-4.0.27.Final-linux-x86_64.jar:/opt/drill/jars/3rdparty/objenesis-1.0.jar:/opt/drill/jars/3rdparty/op
>>     
>> tiq-avatica-0.9-drill-r20.jar:/opt/drill/jars/3rdparty/paranamer-2.5.6.jar:/opt/drill/jars/3rdparty/parquet-column-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-common-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parqu
>>     
>> et-encoding-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-format-2.1.1-drill-r1.jar:/opt/drill/jars/3rdparty/parquet-generator-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/parquet-hadoop-1.6.0rc3-drill-r0.3.jar:/opt/dr
>>     
>> ill/jars/3rdparty/parquet-jackson-1.6.0rc3-drill-r0.3.jar:/opt/drill/jars/3rdparty/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/opt/drill/jars/3rdparty/protobuf-java-2.5.0.jar:/opt/drill/jars/3rdparty/protostuff-api-1.0.8.jar:/opt/
>>     
>> drill/jars/3rdparty/protostuff-core-1.0.8.jar:/opt/drill/jars/3rdparty/protostuff-json-1.0.8.jar:/opt/drill/jars/3rdparty/serializer-2.7.1.jar:/opt/drill/jars/3rdparty/slf4j-api-1.7.6.jar:/opt/drill/jars/3rdparty/snappy-java-1.0.5-M3
>>     
>> .jar:/opt/drill/jars/3rdparty/sqlline-1.1.9-drill-r5.jar:/opt/drill/jars/3rdparty/stax-api-1.0-2.jar:/opt/drill/jars/3rdparty/stringtemplate-3.2.1.jar:/opt/drill/jars/3rdparty/tpch-sample-data-1.1.0.jar:/opt/drill/jars/3rdparty/univo
>>     
>> city-parsers-1.3.0.jar:/opt/drill/jars/3rdparty/validation-api-1.0.0.GA.jar:/opt/drill/jars/3rdparty/velocity-1.7.jar:/opt/drill/jars/3rdparty/xalan-2.7.1.jar:/opt/drill/jars/3rdparty/xercesImpl-2.11.0.jar:/opt/drill/jars/3rdparty/xm
>>     
>> l-apis-1.4.01.jar:/opt/drill/jars/3rdparty/xmlenc-0.52.jar:/opt/drill/jars/3rdparty/xz-1.0.jar:/opt/drill/jars/classb/activation-1.1.jar:/opt/drill/jars/classb/aopalliance-repackaged-2.2.0.jar:/opt/drill/jars/classb/codemodel-2.6.jar
>>     
>> :/opt/drill/jars/classb/hk2-api-2.2.0.jar:/opt/drill/jars/classb/hk2-locator-2.2.0.jar:/opt/drill/jars/classb/hk2-utils-2.2.0.jar:/opt/drill/jars/classb/javax.annotation-api-1.2.jar:/opt/drill/jars/classb/javax.inject-2.2.0.jar:/opt/
>>     
>> drill/jars/classb/javax.servlet-api-3.1.0.jar:/opt/drill/jars/classb/javax.ws.rs
>>  
>> <http://javax.ws.rs/>-api-2.0.jar:/opt/drill/jars/classb/jaxb-api-2.2.2.jar:/opt/drill/jars/classb/jersey-client-2.8.jar:/opt/drill/jars/classb/jersey-common-2.8.jar:/opt/dri
>>     
>> ll/jars/classb/jersey-container-jetty-http-2.8.jar:/opt/drill/jars/classb/jersey-container-jetty-servlet-2.8.jar:/opt/drill/jars/classb/jersey-container-servlet-2.8.jar:/opt/drill/jars/classb/jersey-container-servlet-core-2.8.jar:/op
>>     
>> t/drill/jars/classb/jersey-guava-2.8.jar:/opt/drill/jars/classb/jersey-media-multipart-2.8.jar:/opt/drill/jars/classb/jersey-mvc-2.8.jar:/opt/drill/jars/classb/jersey-mvc-freemarker-2.8.jar:/opt/drill/jars/classb/jersey-server-2.8.ja
>>     
>> r:/opt/drill/jars/classb/jetty-6.1.26.jar:/opt/drill/jars/classb/jetty-util-6.1.26.jar:/opt/drill/jars/classb/logback-classic-1.0.13.jar:/opt/drill/jars/classb/logback-core-1.0.13.jar:/opt/drill/jars/classb/mimepull-1.9.3.jar:/opt/dr
>> 
>>   7 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: 
>> java.library.path=/Users/scottccote/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Ja
>>     va/Extensions:/usr/lib/java:.
>>   8 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: 
>> java.io.tmpdir=/var/folders/ls/nw9l_52x7xs_h_pj_854g9hr0000gn/T/
>>   9 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: java.compiler=<NA>
>>  10 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: os.name <http://os.name/>=Mac OS X
>>  11 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: os.arch=x86_64
>>  12 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: os.version=10.10.5
>>  13 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: user.name <http://user.name/>=scottccote
>>  14 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: user.home=/Users/scottccote
>>  15 2015-09-22 23:04:36,255 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Drillbit environment: user.dir=/opt/drill
>>  16 2015-09-22 23:04:36,853 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Construction completed (598 ms).
>>  17 2015-09-22 23:04:40,052 [main] INFO  
>> o.a.d.e.e.f.FunctionImplementationRegistry - Function registry loaded.  2413 
>> functions loaded in 2877 ms.
>>  18 2015-09-22 23:04:41,028 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Startup completed (4175 ms).
>>  19 2015-09-22 23:04:57,239 [29fddb17-700b-68e8-a75c-76010ba852a6:foreman] 
>> INFO  o.a.drill.exec.work.foreman.Foreman - State change requested.  PENDING 
>> --> RUNNING
>>  20 2015-09-22 23:04:57,278 [29fddb17-700b-68e8-a75c-76010ba852a6:frag:0:0] 
>> INFO  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddb17-700b-68e8-a75c-76010ba852a6:0:0: State change requested from 
>> AWAITING_ALLOCATION --> RUNNING for
>>  21 2015-09-22 23:04:57,283 [29fddb17-700b-68e8-a75c-76010ba852a6:frag:0:0] 
>> INFO  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddb17-700b-68e8-a75c-76010ba852a6:0:0. New state: RUNNING
>>  22 2015-09-22 23:04:57,363 [29fddb17-700b-68e8-a75c-76010ba852a6:frag:0:0] 
>> INFO  o.a.d.e.w.fragment.FragmentExecutor - 
>> 29fddb17-700b-68e8-a75c-76010ba852a6:0:0: State change requested from 
>> RUNNING --> FINISHED for
>>  23 2015-09-22 23:04:57,364 [29fddb17-700b-68e8-a75c-76010ba852a6:frag:0:0] 
>> INFO  o.a.d.e.w.f.AbstractStatusReporter - State changed for 
>> 29fddb17-700b-68e8-a75c-76010ba852a6:0:0. New state: FINISHED
>>  24 2015-09-22 23:04:57,369 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - State change requested.  RUNNING --> 
>> COMPLETED
>>  25 2015-09-22 23:04:57,369 [BitServer-4] INFO  
>> o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>>  26 2015-09-22 23:05:05,911 [29fddb0f-4fe2-66ea-09de-6006e1c465f4:foreman] 
>> INFO  o.a.drill.exec.work.foreman.Foreman - State change requested.  PENDING 
>> --> FAILED
>>  27 org.apache.drill.exec.planner.sql.QueryInputException: Failure handling 
>> SQL.
>>  28         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:191)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  29         at 
>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903) 
>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  30         at 
>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242) 
>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  31         at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>  [na:1.7.0_75]
>>  32         at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>  [na:1.7.0_75]
>>  33         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>>  34 Caused by: org.apache.hadoop.fs.s3.S3Exception: 
>> org.jets3t.service.S3ServiceException: Service Error Message. -- 
>> ResponseCode: 403, ResponseStatus: Forbidden, XML Error Message: <?xml 
>> version="1.0" encoding="UTF-8"?><Error><Code>Acce    
>> ssDenied</Code><Message>Access 
>> Denied</Message><RequestId>19C1D88EF4C0ED18</RequestId><HostId>zgvoYkGGGJb3C+krRs6ecdlgh+1Z5MllBgP/gPJubVKZj4kwmqvTFxIjcPYXC4Qe4QqeIICgw7g=</HostId></Error>
>>  35         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.get(Jets3tFileSystemStore.java:175)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  36         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.retrieveINode(Jets3tFileSystemStore.java:221)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  37         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
>> ~[na:1.7.0_75]
>>  38         at 
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>  ~[na:1.7.0_75]
>>  39         at 
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>  ~[na:1.7.0_75]
>>  40         at java.lang.reflect.Method.invoke(Method.java:606) 
>> ~[na:1.7.0_75]
>>  41         at 
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  42         at 
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  43         at com.sun.proxy.$Proxy55.retrieveINode(Unknown Source) ~[na:na]
>>  44         at 
>> org.apache.hadoop.fs.s3.S3FileSystem.listStatus(S3FileSystem.java:192) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  45         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1483) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  46         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1560) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  47         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1540) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  48         at 
>> org.apache.drill.exec.store.dfs.DrillFileSystem.list(DrillFileSystem.java:697)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  49         at 
>> org.apache.drill.exec.planner.sql.handlers.ShowFileHandler.getPlan(ShowFileHandler.java:97)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  50         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  51         ... 5 common frames omitted
>> 
>> 52 Caused by: org.jets3t.service.S3ServiceException: Service Error Message.
>>  53         at org.jets3t.service.S3Service.getObject(S3Service.java:1470) 
>> ~[jets3t-0.9.3.jar:na]
>>  54         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.get(Jets3tFileSystemStore.java:163)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  55         ... 20 common frames omitted
>>  56 2015-09-22 23:05:05,911 [29fddb0f-4fe2-66ea-09de-6006e1c465f4:foreman] 
>> INFO  o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>>  57 2015-09-22 23:05:05,915 [29fddb0f-4fe2-66ea-09de-6006e1c465f4:foreman] 
>> ERROR o.a.drill.exec.work.foreman.Foreman - SYSTEM ERROR: 
>> S3ServiceException: Service Error Message.
>>  58
>>  59
>>  60 [Error Id: c813aea1-aaac-447d-9ce7-1baf19a7a462 on 
>> macbookair-7f99.home:31010]
>>  61 org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: 
>> S3ServiceException: Service Error Message.
>>  62
>>  63
>>  64 [Error Id: c813aea1-aaac-447d-9ce7-1baf19a7a462 on 
>> macbookair-7f99.home:31010]
>>  65         at 
>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>>  ~[drill-common-1.1.0.jar:1.1.0]
>>  66         at 
>> org.apache.drill.exec.work.foreman.Foreman$ForemanResult.close(Foreman.java:737)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>  67         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.processEvent(Foreman.java:839)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>  68         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.processEvent(Foreman.java:781)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>  69         at 
>> org.apache.drill.common.EventProcessor.sendEvent(EventProcessor.java:73) 
>> [drill-common-1.1.0.jar:1.1.0]
>>  70         at 
>> org.apache.drill.exec.work.foreman.Foreman$StateSwitch.moveToState(Foreman.java:783)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>>  71         at 
>> org.apache.drill.exec.work.foreman.Foreman.moveToState(Foreman.java:892) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>  72         at 
>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:251) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>  73         at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>  [na:1.7.0_75]
>>  74         at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>  [na:1.7.0_75]
>>  75         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>>  76 Caused by: org.apache.drill.exec.planner.sql.QueryInputException: 
>> Failure handling SQL.
>>  77         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:191)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  78         at 
>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>  79         at 
>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>>  80         ... 3 common frames omitted
>>  81 Caused by: org.apache.hadoop.fs.s3.S3Exception: 
>> org.jets3t.service.S3ServiceException: Service Error Message. -- 
>> ResponseCode: 403, ResponseStatus: Forbidden, XML Error Message: <?xml 
>> version="1.0" encoding="UTF-8"?><Error><Code>Acce    
>> ssDenied</Code><Message>Access 
>> Denied</Message><RequestId>19C1D88EF4C0ED18</RequestId><HostId>zgvoYkGGGJb3C+krRs6ecdlgh+1Z5MllBgP/gPJubVKZj4kwmqvTFxIjcPYXC4Qe4QqeIICgw7g=</HostId></Error>
>>  82         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.get(Jets3tFileSystemStore.java:175)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  83         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.retrieveINode(Jets3tFileSystemStore.java:221)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  84         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
>> ~[na:1.7.0_75]
>>  85         at 
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>  ~[na:1.7.0_75]
>>  86         at 
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>  ~[na:1.7.0_75]
>>  87         at java.lang.reflect.Method.invoke(Method.java:606) 
>> ~[na:1.7.0_75]
>>  88         at 
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  89         at 
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
>>  ~[hadoop-common-2.4.1.jar:na]
>>  90         at com.sun.proxy.$Proxy55.retrieveINode(Unknown Source) ~[na:na]
>>  91         at 
>> org.apache.hadoop.fs.s3.S3FileSystem.listStatus(S3FileSystem.java:192) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  92         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1483) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  93         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1560) 
>> ~[hadoop-common-2.4.1.jar:na]
>>  94         at 
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1540) 
>> ~[hadoop-common-2.4.1.jar:na]
>> 95         at 
>> org.apache.drill.exec.store.dfs.DrillFileSystem.list(DrillFileSystem.java:697)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  96         at 
>> org.apache.drill.exec.planner.sql.handlers.ShowFileHandler.getPlan(ShowFileHandler.java:97)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  97         at 
>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>>  ~[drill-java-exec-1.1.0.jar:1.1.0]
>>  98         ... 5 common frames omitted
>>  99 Caused by: org.jets3t.service.S3ServiceException: Service Error Message.
>> 100         at org.jets3t.service.S3Service.getObject(S3Service.java:1470) 
>> ~[jets3t-0.9.3.jar:na]
>> 101         at 
>> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.get(Jets3tFileSystemStore.java:163)
>>  ~[hadoop-common-2.4.1.jar:na]
>> 102         ... 20 common frames omitted
>> 103 2015-09-22 23:05:05,932 [Client-1] INFO  
>> o.a.d.j.i.DrillResultSetImpl$ResultsListener - [#2] Query failed:
>> 104 org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: 
>> S3ServiceException: Service Error Message.
>> 105
>> 106
>> 107 [Error Id: c813aea1-aaac-447d-9ce7-1baf19a7a462 on 
>> macbookair-7f99.home:31010]
>> 108         at 
>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:118)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>> 109         at 
>> org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:111) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>> 110         at 
>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:47)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>> 111         at 
>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:32)
>>  [drill-java-exec-1.1.0.jar:1.1.0]
>> 112         at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:61) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>> 113         at 
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:233) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>> 114         at 
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:205) 
>> [drill-java-exec-1.1.0.jar:1.1.0]
>> 115         at 
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> 116         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 117         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 118         at 
>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>>  [netty-handler-4.0.27.Final.jar:4.0.27.Final]
>> 119         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 120         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 121         at 
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> 122         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 123         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 124         at 
>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
>>  [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> 125         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 126         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 127         at 
>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 128         at 
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 129         at 
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 130         at 
>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 131         at 
>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 132         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 133         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>>  [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 134         at 
>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 135         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> 136         at 
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>  [netty-common-4.0.27.Final.jar:4.0.27.Final]
>> 137         at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
>> 138 2015-09-22 23:05:12,270 [Client-1] INFO  
>> o.a.drill.exec.rpc.user.UserClient - Channel closed /192.168.1.4:57794 
>> <--> MACBOOKAIR-7F99.home/192.168.1.4:31010.
>> 139 2015-09-22 23:05:15,402 [BitServer-3] INFO  
>> o.a.d.exec.rpc.control.ControlClient - Channel closed /192.168.1.4:57795 
>> <--> MACBOOKAIR-7F99.home/192.168.1.4:31011.
>> 140 2015-09-22 23:05:18,630 [main] INFO  o.apache.drill.exec.server.Drillbit 
>> - Shutdown completed (6360 ms).
>> 141 2015-09-22 23:05:18,634 [Drillbit-ShutdownHook#0] INFO  
>> o.apache.drill.exec.server.Drillbit - Received shutdown request.
>> 
>> 
>> 
>> ———>>>>> end sqlline.log <<<<———
>> 
>> 
>> 
>> ——>>>>>> 3. S3 File Urls and notes <<<<<<<<<<<<<<--------------------
>> NOTE: Very strange, as the files in my S3 bucket are visible.  scottccote has 
>> open/download, view, and edit permissions for both the bucket and the files 
>> listed below (a quick reachability check follows the links):
>> 
>> https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-55.csv
>> https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-59.csv
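>> 
>> As a quick reachability check (assuming curl is available on the Mac), a HEAD 
>> request against each link only tests anonymous/public access, not what the 
>> scottccote keys can see, but a 200 response at least confirms the objects 
>> exist at those URLs:
>> 
>> # -I sends a HEAD request, so the CSV itself is not downloaded
>> curl -I https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-55.csv
>> curl -I https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-59.csv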
>> 
>> 
>> 
>> 
>> ——>>>>>> 4. content of s3sccapachedrill.sys.drill  
>> <<<<<<<<<<<<<<--------------------
>> The keys included in the core-site.xml are for scottccote and that user has 
>> every permission granted for the bucket and the files directly listed in the 
>> bucket.
>> 
>> Here are the contents of the file 
>> /tmp/drill/sys.storage_plugins/s3sccapachedrill.sys.drill :
>> 
>> {
>>   "type" : "file",
>>   "enabled" : true,
>>   "connection" : "s3n://sccapachedrill <s3n://sccapachedrill>",
>>   "workspaces" : {
>>     "root" : {
>>       "location" : "/",
>>       "writable" : false,
>>       "defaultInputFormat" : null
>>     },
>>     "tmp" : {
>>       "location" : "/tmp",
>>       "writable" : true,
>>       "defaultInputFormat" : null
>>     }
>>   },
>>   "formats" : {
>>     "psv" : {
>>       "type" : "text",
>>       "extensions" : [ "tbl" ],
>>       "delimiter" : "|"
>>     },
>>     "csv" : {
>>       "type" : "text",
>>       "extensions" : [ "csv" ],
>>       "delimiter" : ","
>>     },
>>     "tsv" : {
>>       "type" : "text",
>>       "extensions" : [ "tsv" ],
>>       "delimiter" : "\t"
>>     },
>>     "parquet" : {
>>       "type" : "parquet"
>>     },
>>     "json" : {
>>       "type" : "json"
>>     },
>>     "avro" : {
>>       "type" : "avro"
>>     }
>>   }
>> }
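>> 
>> Since this is just the persisted copy of the storage plugin config that Drill 
>> reads back, a minimal sanity check (assuming the system Python on the Mac) is 
>> to make sure the file still parses as valid JSON:
>> 
>> # pretty-prints the config, or reports a parse error if the JSON is malformed
>> python -m json.tool /tmp/drill/sys.storage_plugins/s3sccapachedrill.sys.drill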
>> 
>> ——>>>>>> 5. content of conf dir and core-site.xml <<<<<<<<<<<<<<——————————
>> 
>> ls -l of conf directory yields:
>> 
>> MACBOOKAIR-7F99:conf scottccote$ ls -l
>> total 40
>> -rw-r--r--  1 scottccote  staff   474 Sep  9 19:33 core-site.xml
>> -rwxr-xr-x  1 scottccote  staff  1234 Jul  1 13:03 drill-env.sh
>> -rw-r--r--  1 scottccote  staff  3835 Jul  1 13:03 
>> drill-override-example.conf
>> -rw-r--r--  1 scottccote  staff  1194 Jul  1 13:03 drill-override.conf
>> -rw-r--r--  1 scottccote  staff  3134 Jul  1 13:03 logback.xml
>> 
>> ———>>>>> start core-site.xml <<<<———
>> 
>> <configuration>
>> <property>
>>   <name>fs.s3.awsAccessKeyId</name>
>>   <value>AKIAIK2N5W6FZH3TZLYA</value>
>> </property>
>> 
>> <property>
>>   <name>fs.s3.awsSecretAccessKey</name>
>>   <value>BsTbKVaS8lkNZrTV9R5IL5dAuPR6vXmuNGDfWuWv</value>
>> </property>
>> 
>> <property>
>>   <name>fs.s3n.awsAccessKeyId</name>
>>   <value>AKIAIK2N5W6FZH3TZLYA</value>
>> </property>
>> 
>> <property>
>>   <name>fs.s3n.awsSecretAccessKey</name>
>>   <value>BsTbKVaS8lkNZrTV9R5IL5dAuPR6vXmuNGDfWuWv</value>
>> </property>
>> </configuration>
>> 
>> ———>>>>> end core-site.xml <<<<———
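>> 
>> To rule Drill out entirely, it may be worth confirming that the same access 
>> key/secret can list the bucket outside of Drill, for example with the AWS CLI 
>> (assuming it is installed and configured with the scottccote keys):
>> 
>> aws configure                                       # enter the same key id / secret as core-site.xml
>> aws s3 ls s3://sccapachedrill/ --region us-west-2   # should list the two csv files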
>> 
>> 
>> ——>>>>>> 6. content of sqlline_queries.json <<<<<<<<<<<<<<——————————
>> 
>> {"queryId":"29fddb17-700b-68e8-a75c-76010ba852a6","schema":"","queryText":"use
>>  
>> s3sccapachedrill.root","start":1442981096676,"finish":1442981097370,"outcome":"COMPLETED","username":"anonymous"}{"queryId":"29fddb0f-4fe2-66ea-09de-6006e1c465f4","schema":"s3sccapachedrill.root","queryText":"show
>>  
>> files","start":1442981104646,"finish":1442981105912,"outcome":"FAILED","username":"anonymous”}
>> 
>> 
>> 
>> 
>> 
>> 
>> > On Sep 22, 2015, at 7:13 PM, Andries Engelbrecht 
>> > <[email protected]> wrote:
>> >
>> > Scott,
>> >
>> > Try the following.
>> >
>> > In the SP config
>> >
>> > "connection": "s3n://sccapachedrill <s3n://sccapachedrill>”,
>> >
>> > or you can also try
>> >
>> > "connection": "s3://sccapachedrill <s3://sccapachedrill>”,
>> >
>> >
>> > For the core-site.xml in the CONF directory
>> >
>> > <property>
>> > <name>fs.s3.awsAccessKeyId</name>
>> > <value>ID</value>
>> > </property>
>> >
>> > <property>
>> > <name>fs.s3.awsSecretAccessKey</name>
>> > <value>SECRET</value>
>> > </property>
>> >
>> > <property>
>> > <name>fs.s3n.awsAccessKeyId</name>
>> > <value>ID</value>
>> > </property>
>> >
>> > <property>
>> > <name>fs.s3n.awsSecretAccessKey</name>
>> > <value>SECRET</value>
>> > </property>
>> >
>> > Where ID and SECRET are the credentials for your AWS account (just verify the 
>> > files/directories are readable with that account), or test with a set of 
>> > files to which you have granted public access.
>> >
>> > Check the permissions on all the conf files - I have my user own all the 
>> > files in the Drill install directory, so there is no need for sudo.
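>> >
>> > For example (assuming the tarball was unpacked under the home directory - 
>> > adjust the path to the actual install location):
>> >
>> > sudo chown -R scottccote:staff ~/apache-drill-1.1.0   # hand ownership back to the login user
>> > ls -l ~/apache-drill-1.1.0/conf                       # core-site.xml etc. should now be owned by scottccote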
>> >
>> >
>> > RESTART Drill using the command below
>> >
>> > $DRILL_HOME/bin/drill-embedded
>> > https://drill.apache.org/docs/starting-drill-on-linux-and-mac-os-x/
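>> >
>> > In embedded mode that just means quitting the current sqlline session and 
>> > launching it again (assuming DRILL_HOME points at the install directory):
>> >
>> > 0: jdbc:drill:zk=local> !quit
>> > $ $DRILL_HOME/bin/drill-embedded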
>> >
>> >
>> > in sqlline
>> >
>> > use s3sccapachedrill.root;
>> >
>> > show files;
>> >
>> >
>> > See if this works.
>> >
>> > —Andries
>> >
>> >
>> >
>> >
>> >> On Sep 22, 2015, at 4:31 PM, scott cote <[email protected]> wrote:
>> >>
>> >> Andries and others,
>> >>
>> >> I launch Drill on my Mac with the command:
>> >>
>> >> sudo bin/sqlline -u jdbc:drill:zk=local
>> >>
>> >> My core-site.xml is located in the conf folder and has not been modified 
>> >> or moved since my last post.
>> >>
>> >> the configuration of the s3sccapachedrill storage plugin is now (based on your 
>> >> suggestion):
>> >>
>> >>
>> >> {
>> >> "type": "file",
>> >> "enabled": true,
>> >> "connection": "s3n://sccapachedrill/ <s3n://sccapachedrill/>",
>> >> "workspaces": {
>> >>   "root": {
>> >>     "location": "/",
>> >>     "writable": false,
>> >>     "defaultInputFormat": null
>> >>   },
>> >>   "tmp": {
>> >>     "location": "/tmp",
>> >>     "writable": true,
>> >>     "defaultInputFormat": null
>> >>   }
>> >> },
>> >> "formats": {
>> >>   "psv": {
>> >>     "type": "text",
>> >>     "extensions": [
>> >>       "tbl"
>> >>     ],
>> >>     "delimiter": "|"
>> >>   },
>> >>   "csv": {
>> >>     "type": "text",
>> >>     "extensions": [
>> >>       "csv"
>> >>     ],
>> >>     "delimiter": ","
>> >>   },
>> >>   "tsv": {
>> >>     "type": "text",
>> >>     "extensions": [
>> >>       "tsv"
>> >>     ],
>> >>     "delimiter": "\t"
>> >>   },
>> >>   "parquet": {
>> >>     "type": "parquet"
>> >>   },
>> >>   "json": {
>> >>     "type": "json"
>> >>   },
>> >>   "avro": {
>> >>     "type": "avro"
>> >>   }
>> >> }
>> >> }
>> >>
>> >>
>> >> Issuing the command:
>> >>
>> >> use s3sccapachedrill
>> >>
>> >> yields:
>> >>
>> >>
>> >> 0: jdbc:drill:zk=local> use s3sccapachedrill
>> >> . . . . . . . . . . . > ;
>> >> +-------+-----------------------------------------------+
>> >> |  ok   |                    summary                    |
>> >> +-------+-----------------------------------------------+
>> >> | true  | Default schema changed to [s3sccapachedrill]  |
>> >> +-------+-----------------------------------------------+
>> >> 1 row selected (1.38 seconds)
>> >> 0: jdbc:drill:zk=local>
>> >>
>> >>
>> >>
>> >> The command:
>> >>
>> >> show tables;
>> >>
>> >> yields:
>> >>
>> >> +--+
>> >> |  |
>> >> +--+
>> >> +--+
>> >> No rows selected (1.251 seconds)
>> >>
>> >> The command:
>> >>
>> >> show files;
>> >>
>> >> yields:
>> >>
>> >> Error: SYSTEM ERROR: NullPointerException
>> >>
>> >>
>> >> [Error Id: d96f65d6-16dd-4e4d-97c4-7bcb49f12295 on 
>> >> macbookair-7f99.home:31010] (state=,code=0)
>> >> 0: jdbc:drill:zk=local>
>> >>
>> >>
>> >>
>> >>
>> >> What am I missing here?
>> >>
>> >> Thanks in advance.
>> >>
>> >> Regards,
>> >>
>> >> SCott
>> >>
>> >>
>> >>
>> >>
>> >>
>> >>> On Sep 22, 2015, at 1:23 PM, Steven Phillips <[email protected]> wrote:
>> >>>
>> >>> You need to change s3 to s3n in the URI:
>> >>>
>> >>> See the discussion in the comments of this blog post:
>> >>>
>> >>> http://drill.apache.org/blog/2014/12/09/running-sql-queries-on-amazon-s3/
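>> >>>
>> >>> Concretely, the only edit needed in the storage plugin JSON would be the 
>> >>> scheme on the connection line, e.g.
>> >>>
>> >>>   "connection": "s3://sccapachedrill/",
>> >>>
>> >>> becomes
>> >>>
>> >>>   "connection": "s3n://sccapachedrill/",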
>> >>>
>> >>> Hopefully that helps. Let me know if you are still having problems.
>> >>>
>> >>> On Tue, Sep 22, 2015 at 8:47 AM, Andries Engelbrecht <[email protected]> wrote:
>> >>>
>> >>>> Scott,
>> >>>>
>> >>>> In your SP configuration change  "connection": "s3://sccapachedrill/", to
>> >>>> "connection": "s3://sccapachedrill/",
>> >>>> I think you may have misunderstood the instructions on the page.
>> >>>>
>> >>>>
>> >>>> Also, when querying the metadata in S3 you are better off using show files
>> >>>> instead of show tables.
>> >>>>
>> >>>> In some cases I used core-site.xml with the credentials and placed it in
>> >>>> $DRILL_HOME/conf
>> >>>>
>> >>>> See if this works for you.
>> >>>>
>> >>>>
>> >>>> —Andries
>> >>>>
>> >>>>
>> >>>>> On Sep 21, 2015, at 7:29 PM, scott cote <[email protected]> wrote:
>> >>>>>
>> >>>>> Drillers,
>> >>>>>
>> >>>>> I run a User Group for MongoDB and am attempting to demonstrate the
>> >>>> ability to join data in S3 with data located in a MongoDB Collection.
>> >>>> Unfortunately, I am unable to properly query a csv file that I placed 
>> >>>> in S3.
>> >>>>>
>> >>>>> As reference, I am using the MapR S3 page. Followed directions with one
>> >>>> modification: one may not place a hyphen within the name of the bucket or
>> >>>> the name of the storage plugin.  Doing so blows up the SQL parser….  So
>> >>>> with that deviation, I did everything else listed on that page.  The
>> >>>> system is configured per
>> >>>> https://drill.apache.org/blog/2014/12/09/running-sql-queries-on-amazon-s3/
>> >>>>>
>> >>>>> Problem:
>> >>>>>
>> >>>>> The s3sccapachedrill storage plugin shows as enabled and I can switch to it
>> >>>> for use, but viewing a list of files/tables returns an empty list, and
>> >>>> attempting to query a specific file by name causes stack traces.
>> >>>>>
>> >>>>>
>> >>>>> Inside this email are the following key pieces of information:
>> >>>>>
>> >>>>> 1. Command/Response of using Apache Drill to access S3 csv log files.
>> >>>>> 2. Configuration of s3sccapachedrill storage plugin
>> >>>>> 3. Path to the s3 file in the sccapachedrill bucket
>> >>>>> 4. Contents of my hadoop_excludes.txt
>> >>>>> 5. View of the end of the sqlline.log
>> >>>>>
>> >>>>>
>> >>>>> Please let me know what I should change.
>> >>>>>
>> >>>>> Thanks,
>> >>>>>
>> >>>>> SCott
>> >>>>> Scott C. Cote
>> >>>>> [email protected]
>> >>>>> 972.672.6484
>> >>>>>
>> >>>>>
>> >>>>> =====>>>>> Part 1 <<<<=========
>> >>>>> What I see when I request for a list of databases:
>> >>>>>
>> >>>>>
>> >>>>> 0: jdbc:drill:zk=local> show databases;
>> >>>>> +---------------------------+
>> >>>>> |        SCHEMA_NAME        |
>> >>>>> +---------------------------+
>> >>>>> | INFORMATION_SCHEMA        |
>> >>>>> | cp.default                |
>> >>>>> | dfs.default               |
>> >>>>> | dfs.root                  |
>> >>>>> | dfs.tmp                   |
>> >>>>> | s3sccapachedrill.default  |
>> >>>>> | s3sccapachedrill.root     |
>> >>>>> | s3sccapachedrill.tmp      |
>> >>>>> | sys                       |
>> >>>>> +---------------------------+
>> >>>>> 9 rows selected (0.083 seconds)
>> >>>>>
>> >>>>>
>> >>>>> Here is what I see when I request a list of tables (first sign of
>> >>>> trouble):
>> >>>>>
>> >>>>>
>> >>>>> 0: jdbc:drill:zk=local> use s3sccapachedrill;
>> >>>>> +-------+-----------------------------------------------+
>> >>>>> |  ok   |                    summary                    |
>> >>>>> +-------+-----------------------------------------------+
>> >>>>> | true  | Default schema changed to [s3sccapachedrill]  |
>> >>>>> +-------+-----------------------------------------------+
>> >>>>> 1 row selected (0.069 seconds)
>> >>>>> 0: jdbc:drill:zk=local> show tables;
>> >>>>> +--+
>> >>>>> |  |
>> >>>>> +--+
>> >>>>> +--+
>> >>>>> No rows selected (1.504 seconds)
>> >>>>>
>> >>>>>
>> >>>>> Here is what I see when I try to query a specific file in s3 -
>> >>>> oletv_server_event.log.2014-12-16-10-55.csv
>> >>>>>
>> >>>>> 0: jdbc:drill:zk=local> select * from
>> >>>> s3sccapachedrill.root.`oletv_server_event.log.2014-12-16-10-55.csv`;
>> >>>>> Sep 16, 2015 12:07:22 PM
>> >>>> org.apache.calcite.sql.validate.SqlValidatorException <init>
>> >>>>> SEVERE: org.apache.calcite.sql.validate.SqlValidatorException: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>> Sep 16, 2015 12:07:22 PM org.apache.calcite.runtime.CalciteException
>> >>>> <init>
>> >>>>> SEVERE: org.apache.calcite.runtime.CalciteContextException: From line 
>> >>>>> 1,
>> >>>> column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>> Error: PARSE ERROR: From line 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>
>> >>>>>
>> >>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d on
>> >>>> macbookair-7f99.home:31010] (state=,code=0)
>> >>>>>
>> >>>>>
>> >>>>> See “Part 3” for the full http url for the file.
>> >>>>>
>> >>>>> =====>>>>> Part 2 <<<<=========
>> >>>>>
>> >>>>> Here is the configuration that I’m using for the s3sccapachedrill
>> >>>> storage plugin
>> >>>>>
>> >>>>>
>> >>>>> {
>> >>>>> "type": "file",
>> >>>>> "enabled": true,
>> >>>>> "connection": "s3://sccapachedrill/ <s3://sccapachedrill/> 
>> >>>>> <s3://sccapachedrill/ <s3://sccapachedrill/>>",
>> >>>>> "workspaces": {
>> >>>>> "root": {
>> >>>>>   "location": "/",
>> >>>>>   "writable": false,
>> >>>>>   "defaultInputFormat": null
>> >>>>> },
>> >>>>> "tmp": {
>> >>>>>   "location": "/tmp",
>> >>>>>   "writable": true,
>> >>>>>   "defaultInputFormat": null
>> >>>>> }
>> >>>>> },
>> >>>>> "formats": {
>> >>>>> "psv": {
>> >>>>>   "type": "text",
>> >>>>>   "extensions": [
>> >>>>>     "tbl"
>> >>>>>   ],
>> >>>>>   "delimiter": "|"
>> >>>>> },
>> >>>>> "csv": {
>> >>>>>   "type": "text",
>> >>>>>   "extensions": [
>> >>>>>     "csv"
>> >>>>>   ],
>> >>>>>   "delimiter": ","
>> >>>>> },
>> >>>>> "tsv": {
>> >>>>>   "type": "text",
>> >>>>>   "extensions": [
>> >>>>>     "tsv"
>> >>>>>   ],
>> >>>>>   "delimiter": "\t"
>> >>>>> },
>> >>>>> "parquet": {
>> >>>>>   "type": "parquet"
>> >>>>> },
>> >>>>> "json": {
>> >>>>>   "type": "json"
>> >>>>> },
>> >>>>> "avro": {
>> >>>>>   "type": "avro"
>> >>>>> }
>> >>>>> }
>> >>>>> }
>> >>>>>
>> >>>>> =====>>>>> Part 3 <<<<=========
>> >>>>> Here is the path to the s3 file
>> >>>>>
>> >>>>>
>> >>>> https://s3-us-west-2.amazonaws.com/sccapachedrill/oletv_server_event.log.2014-12-16-10-55.csv
>> >>>>>
>> >>>>>
>> >>>>> =====>>>>> Part 4 <<<<=========
>> >>>>> Here are the contents of my hadoop_excludes.txt
>> >>>>>
>> >>>>> asm
>> >>>>> jackson
>> >>>>> mockito
>> >>>>> log4j
>> >>>>> logging
>> >>>>> slf4j
>> >>>>> jetty
>> >>>>> jasper
>> >>>>> jersey
>> >>>>> eclipse
>> >>>>> common
>> >>>>> guava
>> >>>>> servlet
>> >>>>> parquet
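>> >>>>>
>> >>>>> A quick way to cross-check the jets3t side of this (assuming the standard 
>> >>>>> layout, with the bundled jars under jars/3rdparty and the excludes file 
>> >>>>> under bin of the install directory):
>> >>>>>
>> >>>>> ls $DRILL_HOME/jars/3rdparty | grep -i jets3t          # which jets3t jar ships with Drill
>> >>>>> grep -in jets3t $DRILL_HOME/bin/hadoop*excludes.txt    # is it being excluded?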
>> >>>>>
>> >>>>> =====>>>>> Part 5 <<<<=========
>> >>>>> A view of the end of the sqlline.log where the command to view the file
>> >>>> as a table blew up looks like:
>> >>>>>
>> >>>>>
>> >>>>> 2015-09-16 12:07:22,093 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>> >>>> INFO  o.a.d.e.planner.sql.DrillSqlWorker - User Error Occurred
>> >>>>> org.apache.drill.common.exceptions.UserException: PARSE ERROR: From 
>> >>>>> line
>> >>>> 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>
>> >>>>>
>> >>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>> >>>>>    at
>> >>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>> >>>> ~[drill-common-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:181)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> >>>> [na:1.8.0_45]
>> >>>>>    at
>> >>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> >>>> [na:1.8.0_45]
>> >>>>>    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>> >>>>> Caused by: org.apache.calcite.tools.ValidationException:
>> >>>> org.apache.calcite.runtime.CalciteContextException: From line 1, column 
>> >>>> 15
>> >>>> to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:176)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validateAndGetType(PlannerImpl.java:185)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:428)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:188)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:157)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    ... 5 common frames omitted
>> >>>>> Caused by: org.apache.calcite.runtime.CalciteContextException: From 
>> >>>>> line
>> >>>> 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> >>>> Method) ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:689)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:674)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:3750)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:106)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2745)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2730)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2953)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:210)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:837)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:552)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:174)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    ... 10 common frames omitted
>> >>>>> Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> >>>> Method) ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:457)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    ... 28 common frames omitted
>> >>>>> 2015-09-16 12:07:22,095 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>> >>>> INFO  o.a.drill.exec.work.foreman.Foreman - State change requested.
>> >>>> PENDING --> FAILED
>> >>>>> org.apache.drill.exec.work.foreman.ForemanException: Unexpected
>> >>>> exception during fragment initialization: PARSE ERROR: From line 1, 
>> >>>> column
>> >>>> 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>
>> >>>>>
>> >>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>> >>>>>    at
>> >>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:253)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> >>>> [na:1.8.0_45]
>> >>>>>    at
>> >>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> >>>> [na:1.8.0_45]
>> >>>>>    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>> >>>>> Caused by: org.apache.drill.common.exceptions.UserException: PARSE
>> >>>> ERROR: From line 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>
>> >>>>>
>> >>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d ]
>> >>>>>    at
>> >>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
>> >>>> ~[drill-common-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:181)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:903)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:242)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    ... 3 common frames omitted
>> >>>>> Caused by: org.apache.calcite.tools.ValidationException:
>> >>>> org.apache.calcite.runtime.CalciteContextException: From line 1, column 
>> >>>> 15
>> >>>> to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:176)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validateAndGetType(PlannerImpl.java:185)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:428)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateAndConvert(DefaultSqlHandler.java:188)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:157)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:178)
>> >>>> ~[drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    ... 5 common frames omitted
>> >>>>> Caused by: org.apache.calcite.runtime.CalciteContextException: From 
>> >>>>> line
>> >>>> 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> >>>> Method) ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:689)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:674)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:3750)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:106)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2745)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2730)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2953)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:86)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:874)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:863)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:210)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:837)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:552)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.prepare.PlannerImpl.validate(PlannerImpl.java:174)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    ... 10 common frames omitted
>> >>>>> Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> >>>> Method) ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >>>> ~[na:1.8.0_45]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:348)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    at
>> >>>> org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:457)
>> >>>> ~[calcite-core-1.1.0-drill-r14.jar:1.1.0-drill-r14]
>> >>>>>    ... 28 common frames omitted
>> >>>>> 2015-09-16 12:07:22,096 [2a065e36-7ead-e8a3-70f3-5a9bf12c4abe:foreman]
>> >>>> INFO  o.a.drill.exec.work.foreman.Foreman - foreman cleaning up.
>> >>>>> 2015-09-16 12:07:22,106 [Client-1] INFO
>> >>>> o.a.d.j.i.DrillResultSetImpl$ResultsListener - [#18] Query failed:
>> >>>>> org.apache.drill.common.exceptions.UserRemoteException: PARSE ERROR:
>> >>>> From line 1, column 15 to line 1, column 30: Table
>> >>>> 's3sccapachedrill.root.oletv_server_event.log.2014-12-16-10-55.csv' not
>> >>>> found
>> >>>>>
>> >>>>>
>> >>>>> [Error Id: fc5d792b-2336-40e2-b89f-df3a593e090d on
>> >>>> macbookair-7f99.home:31010]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:118)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:111)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:47)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:32)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:61)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:233)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:205)
>> >>>> [drill-java-exec-1.1.0.jar:1.1.0]
>> >>>>>    at
>> >>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
>> >>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>> >>>> [netty-handler-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>> >>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
>> >>>> [netty-codec-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
>> >>>> [netty-transport-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at
>> >>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>> >>>> [netty-common-4.0.27.Final.jar:4.0.27.Final]
>> >>>>>    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
>> >>>>
>> >>>>
>> >>
>> >
>> 
>> 
> 
