Try the new binaries out:
https://www.dropbox.com/sm/create/apache-drill-1.0.0-m2-incubating-SNAPSHOT-binary-release.tar.gz

It's ~90 MB.

Extract the tarball into the installation directory as before and run the
queries against it.
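Roughly, something like this should do it (a sketch only; the tarball name and
install path are placeholders, so adjust them to whatever you downloaded):

$ mkdir /opt/drill
$ tar xvzf apache-drill-1.0.0-m2-incubating-SNAPSHOT-binary-release.tar.gz --strip=1 -C /opt/drill
$ cd /opt/drill
$ bin/sqlline -u jdbc:drill:zk=local -n admin -p admin
0: jdbc:drill:zk=local> select * from dfs.`/path/to/data.json`;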

You can also try the old method on the M1 binaries. This was the old way of
querying with SQLLine:
http://www.confusedcoders.com/bigdata/apache-drill/apache-drill-executing-sample-sql-queries-on-json-data

Let me know if either one works.

Peace,
Yash



On Sat, May 24, 2014 at 1:15 PM, Yash Sharma <yash...@gmail.com> wrote:

> Building from source would give you the M2 version. Let me put my binary
> tarball on Dropbox.
>
>
>
> On Sat, May 24, 2014 at 1:10 PM, 南在南方 <i02...@qq.com> wrote:
>
>> Is Apache Drill Milestone 1 (Drill Alpha) the newest version?
>>
>>
>>
>>
>> ------------------ Original Message ------------------
>> From: "南在南方"<i02...@qq.com>;
>> Sent: Saturday, May 24, 2014, 3:34 PM
>> To: "drill-user"<drill-user@incubator.apache.org>;
>> Subject: Re: Re: Re: Re: Re: Re: error when querying a csv file with drill.
>>
>>
>>
>> I don't think so; I have just pulled the newest package from git.
>> Can I use your compiled Drill directly on my machine?
>>
>>
>> ------------------ Original Message ------------------
>> From: "Yash Sharma";<yash...@gmail.com>;
>> Sent: Saturday, May 24, 2014, 3:57 PM
>> To: "drill-user"<drill-user@incubator.apache.org>;
>>
>> Subject: Re: Re: Re: Re: Re: Re: error when querying a csv file with drill.
>>
>>
>>
>> Is this error message from a fresh pull?
>>
>>
>> On Sat, May 24, 2014 at 12:49 PM, 南在南方 <i02...@qq.com> wrote:
>>
>> > ERROR: No known driver to handle "xxxxx". That is why I turned to the binary. ==~
>> >
>> >
>> > bevin@le:~/compiled-drill/bin$ sqlline -u jdbc:drill:zk=local -n admin -p admin
>> > scan complete in 81ms
>> > scan complete in 984ms
>> > No known driver to handle "jdbc:drill:zk=local"
>> > sqlline version 1.0.2 by Marc Prud'hommeaux
>> > sqlline> !quit
>> > bevin@le:~/compiled-drill/bin$ sqlline -u jdbc:drill:schmea=dfs -n admin -p admin
>> > scan complete in 119ms
>> > scan complete in 825ms
>> > No known driver to handle "jdbc:drill:schmea=dfs"
>> > sqlline version 1.0.2 by Marc Prud'hommeaux
>> > sqlline>
>> >
>> > ------------------ Original Message ------------------
>> > From: "Yash Sharma";<yash...@gmail.com>;
>> > Sent: Saturday, May 24, 2014, 3:37 PM
>> > To: "drill-user"<drill-user@incubator.apache.org>;
>> >
>> > Subject: Re: Re: Re: Re: Re: error when querying a csv file with drill.
>> >
>> >
>> >
>> > I am not sure of the root cause, Bevin.
>> > I will try running the queries on the binaries and let you know.
>> >
>> > Peace,
>> > Yash
>> >
>> >
>> > On Sat, May 24, 2014 at 12:31 PM, 南在南方 <i02...@qq.com> wrote:
>> >
>> > > Thanks.
>> > > I compiled it before, but right now I am using Drill in my cluster
>> > > environment, where the network isn't available. I will try again on my
>> > > own Ubuntu machine. So could this be a version problem with the binary
>> > > Drill package?
>> > >
>> > >
>> > >
>> > >
>> > > ------------------ Original Message ------------------
>> > > From: "Yash Sharma"<yash...@gmail.com>;
>> > > Sent: Saturday, May 24, 2014, 3:17 PM
>> > > To: "drill-user"<drill-user@incubator.apache.org>;
>> > > Subject: Re: Re: Re: Re: error when querying a csv file with drill.
>> > >
>> > >
>> > >
>> > > Hi Bevin,
>> > > These are the steps to build directly from the source:
>> > >
>> > > // Get Drill Source
>> > >
>> > > $ git clone https://github.com/apache/incubator-drill.git
>> > > $ cd incubator-drill
>> > >
>> > > // Inside /DRILL DIR/
>> > > $ mvn clean install -DskipTests
>> > >
>> > > // Install Drill
>> > > $ mkdir /opt/drill
>> > > $ tar xvzf distribution/target/*.tar.gz --strip=1 -C /opt/drill
>> > >
>> > > // Switch to installed directory
>> > > $ cd /opt/drill
>> > > $ sudo bin/sqlline -u jdbc:drill:zk=local -n admin -p admin
>> > >
>> > > 0: jdbc:drill:zk=local>
>> > > select * from dfs.`/home/yash/git/Drill/data.json`;
>> > >
>> > > Let me know if this works.
>> > >
>> > > Peace,
>> > > Yash
>> > >
>> > >
>> > > On Sat, May 24, 2014 at 12:10 PM, 南在南方 <i02...@qq.com> wrote:
>> > >
>> > > > Hi, thank you for your reply.
>> > > > The query failed again this morning.
>> > > > I can query a json file correctly. (Oddly, I use the jsonl schema
>> > > > defined in storage-engines.json and query with "select * from
>> > > > "file.json"", i.e. double quotes, not back-ticks.)
>> > > >
>> > > >
>> > > > I use the binary version of Drill. Could that be the reason the query
>> > > > failed? I would appreciate it if you could send your entire Drill
>> > > > directory as a zip package to me: i02...@qq.com or bevi...@gmail.com .
>> > > > THANKS
>> > > >
>> > > >
>> > > > ------------------ Original Message ------------------
>> > > > From: "Yash Sharma";<yash...@gmail.com>;
>> > > > Sent: Saturday, May 24, 2014, 1:52 AM
>> > > > To: "drill-user"<drill-user@incubator.apache.org>;
>> > > >
>> > > > Subject: Re: Re: Re: error when querying a csv file with drill.
>> > > >
>> > > >
>> > > >
>> > > > Hi All,
>> > > > Is the main issue connecting to and querying the json/csv data sources?
>> > > >
>> > > > Here is what I did for querying Json/Csv data sources.
>> > > > Let me know if this is what we are looking for... or if I am missing the
>> > > > context here.
>> > > >
>> > > > 1. I did not modify storage-plugins.json:
>> > > >
>> > > >
>> > > > https://github.com/apache/incubator-drill/blob/master/distribution/src/resources/storage-plugins.json
>> > > >
>> > > > *2. Created Json Data:* at path /home/yash/git/Drill/data.json
>> > > > The Json content is pasted at the end of this mail.
>> > > >
>> > > > *3. Start Drill:* Drill is started with
>> > > > bin/sqlline -u jdbc:drill:zk=local -n admin -p admin
>> > > >
>> > > > *4. Query Json Data:* pass the path as dfs.`/path/to/data.json`; (path in
>> > > > back-ticks)
>> > > > 0: jdbc:drill:zk=local>
>> > > > select * from dfs.`/home/yash/git/Drill/data.json`;
>> > > >
>> > > > *5. Output*
>> > > >
>> > > > +--------------------+--------------+--------+--------+------------+------------+
>> > > > |       title        |     name     | width  | height |   image    |    text    |
>> > > > +--------------------+--------------+--------+--------+------------+------------+
>> > > > | Transparent Window | trn_window   | 500    | 500    | {"alignment":"center","hOffset":250,"name":"sun1","src":"Images/Sun.png","vOffset":250} | {"alignm |
>> > > > | White Window       | white_window | 500    | 500    | {"alignment":"center","hOffset":250,"name":"sun1","src":"Images/Sun.png","vOffset":250} | {"alignment" |
>> > > > | black window       | black_window | 500    | 500    | {"alignment":"center","hOffset":250,"name":"sun1","src":"Images/Sun.png","vOffset":250} | {"alignment" |
>> > > > +--------------------+--------------+--------+--------+------------+------------+
>> > > > 3 rows selected (0.095 seconds)
>> > > >
>> > > >
>> > > > Same procedure for *CSV* Data:
>> > > > *Query:*
>> > > > select * from dfs.`/home/yash/Desktop/Drill/data.csv`;
>> > > > select columns from dfs.`/home/yash/Desktop/Drill/data.csv`;
>> > > > select columns[1] from dfs.`/home/yash/Desktop/Drill/data.csv`;
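>> > > >
>> > > > (A note on the columns form, as I understand it and not something I have
>> > > > re-verified against the docs: for delimited text files Drill exposes each
>> > > > row as a single array named columns, so columns[1] returns the second
>> > > > field of every row, e.g. "purchase" or "view" for the sample data below.)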
>> > > >
>> > > >
>> > > >
>> > > > Hope it helps.
>> > > >
>> > > > Peace,
>> > > > Yash
>> > > >
>> > > >
>> > > >
>> > > > JSON Data:
>> > > >
>> > > > > {
>> > > > > "title": "Transparent Window",
>> > > > > "name": "trn_window",
>> > > > >  "width": 500,
>> > > > > "height": 500,
>> > > > >
>> > > > > "image": {
>> > > > >  "src": "Images/Sun.png",
>> > > > > "name": "sun1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 250,
>> > > > > "alignment": "center"
>> > > > > },
>> > > > >  "text": {
>> > > > > "data": "Click Here",
>> > > > > "size": 36,
>> > > > >  "style": "bold",
>> > > > > "name": "text1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 100,
>> > > > > "alignment": "center",
>> > > > > "onMouseUp": "sun1.opacity = (sun1.opacity / 100) * 90;"
>> > > > >  }
>> > > > > }
>> > > > >
>> > > > > {
>> > > > > "title": "White Window",
>> > > > > "name": "white_window",
>> > > > >  "width": 500,
>> > > > > "height": 500,
>> > > > >
>> > > > > "image": {
>> > > > >  "src": "Images/Sun.png",
>> > > > > "name": "sun1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 250,
>> > > > > "alignment": "center"
>> > > > > },
>> > > > >  "text": {
>> > > > > "data": "Click Here",
>> > > > > "size": 36,
>> > > > >  "style": "bold",
>> > > > > "name": "text1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 100,
>> > > > > "alignment": "center",
>> > > > > "onMouseUp": "sun1.opacity = (sun1.opacity / 100) * 90;"
>> > > > >  }
>> > > > > }
>> > > > >
>> > > > > {
>> > > > > "title": "black window",
>> > > > > "name": "black_window",
>> > > > >  "width": 500,
>> > > > > "height": 500,
>> > > > >
>> > > > > "image": {
>> > > > >  "src": "Images/Sun.png",
>> > > > > "name": "sun1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 250,
>> > > > > "alignment": "center"
>> > > > > },
>> > > > >  "text": {
>> > > > > "data": "Click Here",
>> > > > > "size": 36,
>> > > > >  "style": "bold",
>> > > > > "name": "text1",
>> > > > > "hOffset": 250,
>> > > > >  "vOffset": 100,
>> > > > > "alignment": "center",
>> > > > > "onMouseUp": "sun1.opacity = (sun1.opacity / 100) * 90;"
>> > > > >  }
>> > > > > }
>> > > >
>> > > >
>> > > >
>> > > > CSV Data:
>> > > >
>> > > > u1,purchase,iphone
>> > > > > u1,purchase,ipad
>> > > > > u2,purchase,nexus
>> > > > > u2,purchase,galaxy
>> > > > > u3,purchase,surface
>> > > > > u4,purchase,iphone
>> > > > > u4,purchase,galaxy
>> > > > > u1,view,iphone
>> > > > > u1,view,ipad
>> > > > > u1,view,nexus
>> > > > > u1,view,galaxy
>> > > > > u2,view,iphone
>> > > > > u2,view,ipad
>> > > > > u2,view,nexus
>> > > > > u2,view,galaxy
>> > > > > u3,view,surface
>> > > > > u3,view,nexus
>> > > > > u4,view,iphone
>> > > > > u4,view,ipad
>> > > > > u4,view,galaxy
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > >
>> > > > On Fri, May 23, 2014 at 9:06 PM, Jason Altekruse
>> > > > <altekruseja...@gmail.com> wrote:
>> > > >
>> > > > > Sorry, that was a bit of a roundabout way of describing the process.
>> > > > > The normal connection should just give you a session where you can
>> > > > > query all schemas (in your config file, dfs is the only schema listed).
>> > > > > You simply need to write it as dfs.`filepath`
>> > > > >
>> > > > > I was looking at the connection docs quickly and I couldn't find the
>> > > > > exact option for specifying a schema. I'll reply if I find something
>> > > > > later today (it's likely just schema or default_schema). Doing this
>> > > > > would simply let you leave off the dfs part of the name, as it will
>> > > > > assume your queries default to that schema.
>> > > > >
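>> > > > > For reference, a minimal sketch of the fully qualified form, reusing the
>> > > > > connection string and file path that appear elsewhere in this thread (not
>> > > > > a captured session):
>> > > > >
>> > > > > $ bin/sqlline -u jdbc:drill:zk=local -n admin -p admin
>> > > > > 0: jdbc:drill:zk=local> select * from dfs.`/home/bevin/AllstarFull.csv`;
>> > > > >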
>> > > > > -Jason
>> > > > >
>> > > > >
>> > > > > On Fri, May 23, 2014 at 10:20 AM, 南在南方 <i02...@qq.com> wrote:
>> > > > >
>> > > > > > I found that there are two configuration files in the Drill home
>> > > > > > (storage-engines.json and storage-plugins.json). I tried removing
>> > > > > > storage-engines.json; the connection can be established, but the same
>> > > > > > error occurs.
>> > > > > > I connected to Drill with "sqlline -u jdbc:drill:schema=dfs -n admin
>> > > > > > -p admin". You mean I should connect without giving a schema and then
>> > > > > > query with dfs.`tablename`?
>> > > > > > It's 23:00 now. I will try again tomorrow. ^_^
>> > > > > > thanks
>> > > > > >
>> > > > > >
>> > > > > >
>> > > > > >
>> > > > > > ------------------ Original Message ------------------
>> > > > > > From: "Jason Altekruse"<altekruseja...@gmail.com>;
>> > > > > > Sent: Friday, May 23, 2014, 10:43 PM
>> > > > > > To: "drill-user"<drill-user@incubator.apache.org>;
>> > > > > > Subject: Re: Re: error when querying a csv file with drill.
>> > > > > >
>> > > > > >
>> > > > > >
>> > > > > > It looks like you need to specify the data source you are using
>> > > > > > (appearing in the storage section of your config file). I believe
>> > > > > > there is a way to pass a default schema when you are connecting, but
>> > > > > > it currently isn't set up to bring you right into a schema, even if
>> > > > > > you only have one specified in your config file. So you should be
>> > > > > > able to query from dfs.`table` (where table is the filename in your
>> > > > > > case).
>> > > > > >
>> > > > > > - Jason
>> > > > > >
>> > > > > >
>> > > > > > On Fri, May 23, 2014 at 9:30 AM, 南在南方 <i02...@qq.com> wrote:
>> > > > > >
>> > > > > > > I have tried three ways (back-ticks, single quotes, double quotes).
>> > > > > > > Here is the back-tick error report:
>> > > > > > > I succeeded in querying parquet and json format files before with
>> > > > > > > "double quotes" around the file path. I thought they were the same.
>> > > > > > > 0: jdbc:drill:schema=dfs> select * from `/home/bevin/AllstarFull.csv`;
>> > > > > > > java.lang.RuntimeException: parse failed
>> > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:237)
>> > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare_(OptiqPrepareImpl.java:195)
>> > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepareSql(OptiqPrepareImpl.java:168)
>> > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.parseQuery(OptiqStatement.java:402)
>> > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.execute(OptiqStatement.java:192)
>> > > > > > >         at sqlline.SqlLine$Commands.execute(SqlLine.java:3825)
>> > > > > > >         at sqlline.SqlLine$Commands.sql(SqlLine.java:3738)
>> > > > > > >         at sqlline.SqlLine.dispatch(SqlLine.java:882)
>> > > > > > >         at sqlline.SqlLine.begin(SqlLine.java:717)
>> > > > > > >         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460)
>> > > > > > >         at sqlline.SqlLine.main(SqlLine.java:443)
>> > > > > > > Caused by: org.eigenbase.sql.parser.SqlParseException: Lexical error at line 1, column 15.  Encountered: "`" (96), after : ""
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.convertException(SqlParserImpl.java:281)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.normalizeException(SqlParserImpl.java:44)
>> > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:138)
>> > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:235)
>> > > > > > >         ... 10 more
>> > > > > > > Caused by: org.eigenbase.sql.parser.impl.TokenMgrError: Lexical error at line 1, column 15.  Encountered: "`" (96), after : ""
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImplTokenManager.getNextToken(SqlParserImplTokenManager.java:4924)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_scan_token(SqlParserImpl.java:15281)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3_245(SqlParserImpl.java:14738)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3R_38(SqlParserImpl.java:14750)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3R_42(SqlParserImpl.java:14643)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3_73(SqlParserImpl.java:14951)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_2_73(SqlParserImpl.java:5338)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.TableRef(SqlParserImpl.java:1379)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.FromClause(SqlParserImpl.java:1286)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlSelect(SqlParserImpl.java:633)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQuery(SqlParserImpl.java:399)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQueryOrExpr(SqlParserImpl.java:2055)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.QueryOrExpr(SqlParserImpl.java:2017)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.OrderedQueryOrExpr(SqlParserImpl.java:377)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmt(SqlParserImpl.java:573)
>> > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmtEof(SqlParserImpl.java:599)
>> > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:130)
>> > > > > > >         ... 11 more
>> > > > > > >
>> > > > > > >
>> > > > > > > Thanks.
>> > > > > > >
>> > > > > > >
>> > > > > > > ------------------ Original Message ------------------
>> > > > > > > From: "Jason Altekruse";<altekruseja...@gmail.com>;
>> > > > > > > Sent: Friday, May 23, 2014, 10:20 PM
>> > > > > > > To: "Apache Drill User"<drill-user@incubator.apache.org>;
>> > > > > > >
>> > > > > > > Subject: Re: error when querying a csv file with drill.
>> > > > > > >
>> > > > > > >
>> > > > > > >
>> > > > > > > Hello Bevin,
>> > > > > > >
>> > > > > > > Welcome to the Drill community! For your error, it looks like there
>> > > > > > > is an issue parsing the table name. Table names are expected to be
>> > > > > > > surrounded by back-ticks instead of single quotes; for more info you
>> > > > > > > can read this post on the wiki:
>> > > > > > >
>> > > > > > > https://cwiki.apache.org/confluence/display/DRILL/Apache+Drill+in+10+Minutes
>> > > > > > >
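>> > > > > > > For example, the back-tick form would look like this (a sketch using
>> > > > > > > the path from your error and the dfs schema qualifier discussed
>> > > > > > > elsewhere in this thread, not a verified run):
>> > > > > > >
>> > > > > > > 0: jdbc:drill:schema=dfs> select * from dfs.`/home/bevin/AllstarFull.csv`;
>> > > > > > >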
>> > > > > > > -Jason
>> > > > > > >
>> > > > > > >
>> > > > > > > On Fri, May 23, 2014 at 6:15 AM, 南在南方 <i02...@qq.com> wrote:
>> > > > > > >
>> > > > > > > > Here is my storage-plugins.json:
>> > > > > > > > {
>> > > > > > > >   "storage":{
>> > > > > > > >     dfs: {
>> > > > > > > >       type: "file",
>> > > > > > > >       connection: "file:///"
>> > > > > > > >       },
>> > > > > > > >       formats: {
>> > > > > > > >         "psv" : {
>> > > > > > > >           type: "text",
>> > > > > > > >           extensions: [ "tbl" ],
>> > > > > > > >           delimiter: "|"
>> > > > > > > >         },
>> > > > > > > >         "csv" : {
>> > > > > > > >           type: "text",
>> > > > > > > >           extensions: [ "csv" ],
>> > > > > > > >           delimiter: ","
>> > > > > > > >         },
>> > > > > > > >         "tsv" : {
>> > > > > > > >           type: "text",
>> > > > > > > >           extensions: [ "tsv" ],
>> > > > > > > >           delimiter: "\t"
>> > > > > > > >         },
>> > > > > > > >         "parquet" : {
>> > > > > > > >           type: "parquet"
>> > > > > > > >         },
>> > > > > > > >         "json" : {
>> > > > > > > >           type: "json"
>> > > > > > > >         }
>> > > > > > > >       }
>> > > > > > > >     },
>> > > > > > > >     cp: {
>> > > > > > > >       type: "file",
>> > > > > > > >       connection: "classpath:///"
>> > > > > > > >     }
>> > > > > > > >   }
>> > > > > > > > }
>> > > > > > > >
>> > > > > > > > I connect to Drill with "sqlline -u jdbc:drill:schema=dfs -n admin -p admin"
>> > > > > > > >
>> > > > > > > > bevin@le:/opt/apache-drill-1.0.0-m1/bin$ sqlline -u jdbc:drill:schema=dfs -n admin -p admin
>> > > > > > > >
>> > > > > > > > Loaded singnal handler: SunSignalHandler
>> > > > > > > > /home/bevin/.sqlline/sqlline.properties (No such file or directory)
>> > > > > > > > scan complete in 18ms
>> > > > > > > > scan complete in 2693ms
>> > > > > > > > Connecting to jdbc:drill:schema=dfs
>> > > > > > > > Connected to: Drill (version 1.0)
>> > > > > > > > Driver: Apache Drill JDBC Driver (version 1.0)
>> > > > > > > > Autocommit status: true
>> > > > > > > > Transaction isolation: TRANSACTION_REPEATABLE_READ
>> > > > > > > > sqlline version ??? by Marc Prud'hommeaux
>> > > > > > > >
>> > > > > > > > Then I run a query (the file path is right):
>> > > > > > > > 0: jdbc:drill:schema=dfs> select * from "/home/bevin/AllstarFull.csv";
>> > > > > > > > May 23, 2014 7:09:14 PM org.eigenbase.sql.validate.SqlValidatorException <init>
>> > > > > > > > SEVERE: org.eigenbase.sql.validate.SqlValidatorException: Table '/home/bevin/AllstarFull.csv' not found
>> > > > > > > > May 23, 2014 7:09:14 PM org.eigenbase.util.EigenbaseException <init>
>> > > > > > > > SEVERE: org.eigenbase.util.EigenbaseContextException: From line 1, column 15 to line 1, column 43
>> > > > > > > > org.eigenbase.util.EigenbaseContextException: From line 1, column 15 to line 1, column 43
>> > > > > > > >         at org.eigenbase.resource.EigenbaseResource$_Def12.ex(EigenbaseResource.java:1026)
>> > > > > > > >         at org.eigenbase.sql.SqlUtil.newContextException(SqlUtil.java:739)
>> > > > > > > >         at org.eigenbase.sql.SqlUtil.newContextException(SqlUtil.java:726)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:3830)
>> > > > > > > >         at org.eigenbase.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:78)
>> > > > > > > >         at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:90)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:802)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:790)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2776)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:3013)
>> > > > > > > >         at org.eigenbase.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:69)
>> > > > > > > >         at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:90)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:802)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:790)
>> > > > > > > >         at org.eigenbase.sql.SqlSelect.validate(SqlSelect.java:154)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:753)
>> > > > > > > >         at org.eigenbase.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:444)
>> > > > > > > >         at org.eigenbase.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:445)
>> > > > > > > >         at net.hydromatic.optiq.prepare.Prepare.prepareSql(Prepare.java:160)
>> > > > > > > >         at net.hydromatic.optiq.prepare.Prepare.prepareSql(Prepare.java:129)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:255)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare_(OptiqPrepareImpl.java:195)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepareSql(OptiqPrepareImpl.java:168)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.parseQuery(OptiqStatement.java:402)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.execute(OptiqStatement.java:192)
>> > > > > > > >         at sqlline.SqlLine$Commands.execute(SqlLine.java:3825)
>> > > > > > > >         at sqlline.SqlLine$Commands.sql(SqlLine.java:3738)
>> > > > > > > >         at sqlline.SqlLine.dispatch(SqlLine.java:882)
>> > > > > > > >         at sqlline.SqlLine.begin(SqlLine.java:717)
>> > > > > > > >         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460)
>> > > > > > > >         at sqlline.SqlLine.main(SqlLine.java:443)
>> > > > > > > > Caused by: org.eigenbase.sql.validate.SqlValidatorException: Table '/home/bevin/AllstarFull.csv' not found
>> > > > > > > >         at org.eigenbase.resource.EigenbaseResource$_Def9.ex(EigenbaseResource.java:963)
>> > > > > > > >         ... 27 more
>> > > > > > > >
>> > > > > > > > 0: jdbc:drill:schema=dfs> select * from `/home/bevin/AllstarFull.csv`;
>> > > > > > > > java.lang.RuntimeException: parse failed
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:237)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare_(OptiqPrepareImpl.java:195)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepareSql(OptiqPrepareImpl.java:168)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.parseQuery(OptiqStatement.java:402)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.execute(OptiqStatement.java:192)
>> > > > > > > >         at sqlline.SqlLine$Commands.execute(SqlLine.java:3825)
>> > > > > > > >         at sqlline.SqlLine$Commands.sql(SqlLine.java:3738)
>> > > > > > > >         at sqlline.SqlLine.dispatch(SqlLine.java:882)
>> > > > > > > >         at sqlline.SqlLine.begin(SqlLine.java:717)
>> > > > > > > >         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460)
>> > > > > > > >         at sqlline.SqlLine.main(SqlLine.java:443)
>> > > > > > > > Caused by: org.eigenbase.sql.parser.SqlParseException: Lexical error at line 1, column 15.  Encountered: "`" (96), after : ""
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.convertException(SqlParserImpl.java:281)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.normalizeException(SqlParserImpl.java:44)
>> > > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:138)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:235)
>> > > > > > > >         ... 10 more
>> > > > > > > > Caused by: org.eigenbase.sql.parser.impl.TokenMgrError: Lexical error at line 1, column 15.  Encountered: "`" (96), after : ""
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImplTokenManager.getNextToken(SqlParserImplTokenManager.java:4924)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_scan_token(SqlParserImpl.java:15281)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3_245(SqlParserImpl.java:14738)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3R_38(SqlParserImpl.java:14750)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3R_42(SqlParserImpl.java:14643)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_3_73(SqlParserImpl.java:14951)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_2_73(SqlParserImpl.java:5338)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.TableRef(SqlParserImpl.java:1379)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.FromClause(SqlParserImpl.java:1286)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlSelect(SqlParserImpl.java:633)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQuery(SqlParserImpl.java:399)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQueryOrExpr(SqlParserImpl.java:2055)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.QueryOrExpr(SqlParserImpl.java:2017)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.OrderedQueryOrExpr(SqlParserImpl.java:377)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmt(SqlParserImpl.java:573)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmtEof(SqlParserImpl.java:599)
>> > > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:130)
>> > > > > > > >         ... 11 more
>> > > > > > > >
>> > > > > > > > 0: jdbc:drill:schema=dfs> select * from '/home/bevin/AllstarFull.csv';
>> > > > > > > > java.lang.RuntimeException: parse failed
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:237)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare_(OptiqPrepareImpl.java:195)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepareSql(OptiqPrepareImpl.java:168)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.parseQuery(OptiqStatement.java:402)
>> > > > > > > >         at net.hydromatic.optiq.jdbc.OptiqStatement.execute(OptiqStatement.java:192)
>> > > > > > > >         at sqlline.SqlLine$Commands.execute(SqlLine.java:3825)
>> > > > > > > >         at sqlline.SqlLine$Commands.sql(SqlLine.java:3738)
>> > > > > > > >         at sqlline.SqlLine.dispatch(SqlLine.java:882)
>> > > > > > > >         at sqlline.SqlLine.begin(SqlLine.java:717)
>> > > > > > > >         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460)
>> > > > > > > >         at sqlline.SqlLine.main(SqlLine.java:443)
>> > > > > > > > Caused by: org.eigenbase.sql.parser.SqlParseException: Encountered "\'/home/bevin/AllstarFull.csv\'" at line 1, column 15.
>> > > > > > > > Was expecting one of:
>> > > > > > > >     <IDENTIFIER> ...
>> > > > > > > >     <QUOTED_IDENTIFIER> ...
>> > > > > > > >     <UNICODE_QUOTED_IDENTIFIER> ...
>> > > > > > > >     "LATERAL" ...
>> > > > > > > >     "(" ...
>> > > > > > > >     "UNNEST" ...
>> > > > > > > >     "TABLE" ...
>> > > > > > > >
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.convertException(SqlParserImpl.java:281)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.normalizeException(SqlParserImpl.java:44)
>> > > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:138)
>> > > > > > > >         at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare2_(OptiqPrepareImpl.java:235)
>> > > > > > > >         ... 10 more
>> > > > > > > > Caused by: org.eigenbase.sql.parser.impl.ParseException: Encountered "\'/home/bevin/AllstarFull.csv\'" at line 1, column 15.
>> > > > > > > > Was expecting one of:
>> > > > > > > >     <IDENTIFIER> ...
>> > > > > > > >     <QUOTED_IDENTIFIER> ...
>> > > > > > > >     <UNICODE_QUOTED_IDENTIFIER> ...
>> > > > > > > >     "LATERAL" ...
>> > > > > > > >     "(" ...
>> > > > > > > >     "UNNEST" ...
>> > > > > > > >     "TABLE" ...
>> > > > > > > >
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.generateParseException(SqlParserImpl.java:15443)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.jj_consume_token(SqlParserImpl.java:15272)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.TableRef(SqlParserImpl.java:1423)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.FromClause(SqlParserImpl.java:1286)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlSelect(SqlParserImpl.java:633)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQuery(SqlParserImpl.java:399)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.LeafQueryOrExpr(SqlParserImpl.java:2055)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.QueryOrExpr(SqlParserImpl.java:2017)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.OrderedQueryOrExpr(SqlParserImpl.java:377)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmt(SqlParserImpl.java:573)
>> > > > > > > >         at org.eigenbase.sql.parser.impl.SqlParserImpl.SqlStmtEof(SqlParserImpl.java:599)
>> > > > > > > >         at org.eigenbase.sql.parser.SqlParser.parseStmt(SqlParser.java:130)
>> > > > > > > >         ... 11 more
>> > > > > > > >
>> > > > > > > > I can't figure out what happened during the query. Please give me
>> > > > > > > > some help!
>> > > > > > > > THANKS
>> > > > > > > > BEVIN
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>>
>
>
