Hi Scott,
You definitely don't need to have Hadoop to query your local file system.
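(For anyone reading along: a query against the local file system with Drill's default `dfs` storage plugin looks something like the sketch below. The path is purely illustrative, not from Scott's setup.)

```sql
-- Hypothetical file; any local path the Drill process can read will do
select * from dfs.`/tmp/sample.json` limit 10;
```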
Could you list the exact command to Drill that gave you this error?
Best,
Nathan Griffith
Technical Writer
Dremio
On Tue, May 31, 2016 at 1:26 PM, Scott Kinney <scott.kin...@stem.com> wrote:
Hi, Scott.
That article is a bit dated. Try setting up your core-site.xml file like in
this post:
http://www.dremio.com/blog/how-to-query-s3-data-using-amazons-s3a-library/
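(The core-site.xml entries for Hadoop's s3a client generally take the shape below; the key values are placeholders, and the post above has the full details.)

```xml
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
</configuration>
```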
Best,
Nathan Griffith
Technical Writer
Dremio
On Thu, May 26, 2016 at 2:33 PM, Scott Kinney <scott.kin...@stem.
Hey Tony,
If figuring out how to hook JPam up to LDAP is still giving you
trouble, it looks like this section of the documentation may be useful:
http://jpam.sourceforge.net/JPamUserGuide.html#id.s5
Basically you need to edit a conf file (net-sf-jpam) included in the JPam
distribution to use LDAP and
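(As a rough sketch only — module names and control flags vary by system — a PAM service file for JPam that delegates authentication to LDAP might look like:)

```
auth     required  pam_ldap.so
account  required  pam_ldap.so
```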
Hi Jiang,
Think of it this way: If you had a file that was just the list:
{"id":"1001","type":"Regular"}
{"id":"1002","type":"Chocolate"}
{"id":"1003","type":"Blueberry"}
{"id":"1004","type":"Devil's Food"}
What would you like it to return when you query:
select id from
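(To make the point concrete with a hypothetical path: if those four lines lived in a file, Drill's JSON reader would treat each line as one record, so selecting id would hand back one value per object.)

```sql
-- /tmp/donuts.json is hypothetical; each line is parsed as one record
select id from dfs.`/tmp/donuts.json`;
```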
Hi Rob,
Both are confirmed to work. I've written a small article that
summarizes some of what's required to connect to HDFS here:
http://www.dremio.com/blog/securing-sql-on-hadoop-part-2-installing-and-configuring-drill/
At the moment I don't have any firsthand experience configuring Drill
for
Was going to say my go-to for this kind of issue is the 'tr' command in
unix, but if I understand right you'd rather not have to preprocess,
instead preferring an in-Drill solution.
As I think you're hinting at, a Drill UDF tailored to the data might
be one way to handle it.
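(For anyone who does end up preprocessing: a typical `tr` invocation just deletes or swaps troublesome characters before the data reaches Drill. The carriage-return case below is only an illustration — the thread doesn't say what the actual problem characters were.)

```shell
# Strip carriage returns (e.g. Windows line endings) from a JSON stream
printf '{"a":1}\r\n{"a":2}\r\n' | tr -d '\r'
```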
On Mon, Feb 8, 2016
Hi John,
Looks like this might be as simple as including the necessary '.'
between 'dfs' and the file location. So your query should be:
select * from dfs.`/user/jmill383/data/newWARCDataset.csv` limit 100;
Best,
Nathan
On Mon, Feb 1, 2016 at 11:12 AM, JOHN MILLER wrote:
what
the files (core-site.xml? hdfs-site.xml?) and relevant lines are?
Thanks!
Nathan Griffith
Technical Writer/Evangelist
Dremio
; Ubuntu; Linux x86_64; rv:31.0)
> Gecko/20100101 Firefox/31.0',1) from (values(1));
>
> *Result:*
>
> 0: jdbc:drill:zk=local> select GetBrowserDtl('Mozilla/5.0 (X11; Linux
> x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80
> Safari/537.36',1) from (
Hi Nirav!
A couple thoughts: a.) To help troubleshoot, you may want to prepend
the full path to "wurfl.xml" in the first statement of your code, b.)
So when you try to use the UDF in a query you get absolutely no error
messages from Drill?
Also, the first part of this article
.
>
> On Wed, Dec 30, 2015 at 8:24 PM, Nathan Griffith <ngriff...@dremio.com>
> wrote:
>
>> Hi Peder!
>>
>> Try cleaning the Drill stuff out of your Windows install's Temp
>> directory, and make sure that the environment variables are set as per
>> thi
Hey Peder,
What's in your C:\Windows\Temp directory? Is there something that
looks like it's from Drill? If there is, try deleting it and running
again.
--Nathan
On Thu, Dec 31, 2015 at 1:53 PM, Peder Jakobsen | gmail
wrote:
> Hi Tomer and Jacques,
>
> I tried those
bsen | gmail <pjakob...@gmail.com
>> wrote:
>
>> I deleted everything in C:\Windows\Temp. Note, when I start drill
>> again, nothing gets created in there. I will now try to search the
>> machine for any files that may have been created at the time of start
Hi Peder!
Try cleaning the Drill stuff out of your Windows install's Temp
directory, and make sure that the environment variables are set as per
this article:
http://www.dremio.com/blog/installing-apache-drill-on-microsoft-windows/
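(The variables in question are typically JAVA_HOME and PATH; the values below are placeholders for illustration, not a prescription — check the article for your setup.)

```
rem Illustrative only -- point JAVA_HOME at your actual JDK install
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_65
set PATH=%PATH%;%JAVA_HOME%\bin
```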
Best,
Nathan
On Tue, Dec 29, 2015 at 2:04 PM, Peder Jakobsen |
Hi Josh,
When you say the second query doesn't work, what is the exact
behavior? Does it return zero rows?
--Nathan
On Fri, Dec 18, 2015 at 3:55 PM, Spoutable wrote:
> I am querying json files on s3 using the s3a storage plugin on drill 1.3
>
> The following query works
Alexander,
I'm also successfully querying a directory of (compressed) JSON in Drill 1.4.
This may be a trivial suggestion, but for sanity's sake, what happens when
you specify the standard dfs plugin with:
dfs.`D:\DataFiles\2015\12\16` ?
--Nathan
On Fri, Dec 18, 2015 at 1:44 AM, Holy Alexander
Hi Alex,
Check out the "Setting up Apache Drill for use with S3" section of
this article I wrote:
http://www.dremio.com/blog/using-sql-to-interface-with-google-analytics-data-stored-on-amazon-s3/
It should handle what you're trying to do. In particular, make sure
your storage plugin for s3 is
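(A file-based S3 storage plugin definition in Drill generally follows this shape; the bucket name is a placeholder and the workspace and format lists are trimmed for brevity — the article above walks through the real thing.)

```json
{
  "type": "file",
  "enabled": true,
  "connection": "s3a://your-bucket-name",
  "workspaces": {
    "root": { "location": "/", "writable": false }
  },
  "formats": {
    "json": { "type": "json" }
  }
}
```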