You have to create a tmp object in your bucket to make it work:
once s3://bucket_name/tmp exists, the s3.tmp workspace should work.
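For reference, one quick way to create that placeholder (a sketch, assuming the AWS CLI is installed and credentials are configured; "bucket_name" stands in for your actual bucket):

```shell
# Create a zero-byte "tmp/" placeholder object so the S3A filesystem
# can resolve the /tmp workspace path (S3 has no real directories):
aws s3api put-object --bucket bucket_name --key tmp/

# Confirm the placeholder is visible:
aws s3 ls s3://bucket_name/
```

After that, the CTAS into s3.tmp should be able to resolve its target path.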

On Fri, May 26, 2017 at 5:02 PM, Shuporno Choudhury <
[email protected]> wrote:

> Hi Nitin,
>
> Thanks for the config settings.
>
> Now, after entering those config settings:
>     1. s3.tmp does appear in the "show schemas" result
>     2. Also, it doesn't disappear when I add a custom folder in the
> location attribute
>
> But when I try to run a CTAS statement, I get the following error:
>
> *Error: SYSTEM ERROR: IllegalArgumentException: URI has an authority
> component*
> *Fragment 0:0*
>
> Query that I am trying to run:
> *create table s3.tmp.`abcd` as select 1 from (values(1));*
>
> However, this query runs when I use dfs.tmp instead of s3.tmp
>
> On Fri, May 26, 2017 at 12:44 PM, Nitin Pawar <[email protected]>
> wrote:
>
> > Can you try with the following s3 config?
> >
> > {
> >   "type": "file",
> >   "enabled": true,
> >   "connection": "s3a://bucket_name",
> >   "config": {
> >
> >     "fs.s3a.connection.maximum": "10000",
> >     "fs.s3a.access.key": "access_key",
> >     "fs.s3a.secret.key": "secret_key",
> >     "fs.s3a.buffer.dir": "/tmp",
> >     "fs.s3a.multipart.size": "10485760",
> >     "fs.s3a.multipart.threshold": "104857600"
> >   },
> >   "workspaces": {
> >     "root": {
> >       "location": "/",
> >       "writable": false,
> >       "defaultInputFormat": null
> >     },
> >     "tmp": {
> >       "location": "/tmp",
> >       "writable": true,
> >       "defaultInputFormat": null
> >     }
> >   },
> >   "formats": {
> >     "psv": {
> >       "type": "text",
> >       "extensions": [
> >         "tbl"
> >       ],
> >       "delimiter": "|"
> >     },
> >     "csv": {
> >       "type": "text",
> >       "extensions": [
> >         "csv"
> >       ],
> >       "extractHeader": true,
> >       "delimiter": ","
> >     },
> >     "tsv": {
> >       "type": "text",
> >       "extensions": [
> >         "tsv"
> >       ],
> >       "delimiter": "\t"
> >     },
> >     "parquet": {
> >       "type": "parquet"
> >     },
> >     "json": {
> >       "type": "json",
> >       "extensions": [
> >         "json"
> >       ]
> >     },
> >     "avro": {
> >       "type": "avro"
> >     },
> >     "sequencefile": {
> >       "type": "sequencefile",
> >       "extensions": [
> >         "seq"
> >       ]
> >     },
> >     "csvh": {
> >       "type": "text",
> >       "extensions": [
> >         "csvh"
> >       ],
> >       "extractHeader": true,
> >       "delimiter": ","
> >     }
> >   }
> > }
> >
> > On Fri, May 26, 2017 at 10:29 AM, Shuporno Choudhury <
> > [email protected]> wrote:
> >
> > > Hi,
> > > Can someone at Drill help me with this issue, please?
> > >
> > > On Thu, May 25, 2017 at 1:33 PM, Shuporno Choudhury <
> > > [email protected]> wrote:
> > >
> > > > Hi,
> > > >
> > > > I corrected the "show schemas" output by putting only "/" in the
> > > > "location". Now it shows s3.tmp in the output.
> > > >
> > > > But, it has a weird problem.
> > > > The moment I add a folder to the location, e.g. "/myfolder", then
> > > > s3.tmp vanishes from the "show schemas" output.
> > > >
> > > > Also, when I try to write into s3, I get the following error:
> > > >
> > > > Exception in thread "drill-executor-9" java.lang.UnsatisfiedLinkError:
> > > > org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
> > > >         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
> > > >         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
> > > >
> > > > This is only a snippet of the error associated with writing to s3.
> > > >
> > > > On Thu, May 25, 2017 at 12:41 PM, Shuporno Choudhury <
> > > > [email protected]> wrote:
> > > >
> > > >> My s3 plugin info is as follows:
> > > >>
> > > >> {
> > > >>   "type": "file",
> > > >>   "enabled": true,
> > > >>   "connection": "s3a://abcd",
> > > >>   "config": {
> > > >>     "fs.s3a.access.key": "abcd",
> > > >>     "fs.s3a.secret.key": "abcd"
> > > >>   },
> > > >>   "workspaces": {
> > > >>     "root": {
> > > >>       "location": "/",
> > > >>       "writable": false,
> > > >>       "defaultInputFormat": null
> > > >>     },
> > > >>     "tmp": {
> > > >>       "location": "/",
> > > >>       "writable": true,
> > > >>       "defaultInputFormat": "parquet"
> > > >>     }
> > > >>   }
> > > >> }
> > > >>
> > > >> I have removed the info about the formats to keep the mail small.
> > > >> Also, I am using Drill on *Windows 10*
> > > >>
> > > >> On Mon, May 22, 2017 at 3:57 PM, Shuporno Choudhury <
> > > >> [email protected]> wrote:
> > > >>
> > > >>> Hi,
> > > >>>
> > > >>> Is it possible to write to a folder in an s3 bucket using the
> > *s3.tmp*
> > > >>> workspace?
> > > >>> Whenever I try, it gives me the following error:
> > > >>>
> > > >>> *Error: VALIDATION ERROR: Schema [s3.tmp] is not valid with respect
> > to
> > > >>> either root schema or current default schema.*
> > > >>> *Current default schema:  s3.root*
> > > >>>
> > > >>> Also, s3.tmp doesn't appear while using the command "*show schemas*",
> > > >>> though the tmp workspace exists in the web console.
> > > >>>
> > > >>> I am using Drill Version 1.10; embedded mode on my local system.
> > > >>>
> > > >>> However, I have no problem reading from an s3 bucket; the problem
> > > >>> is only with writing to an s3 bucket.
> > > >>> --
> > > >>> Regards,
> > > >>> Shuporno Choudhury
> > > >>>
> > > >>
> > > >>
> > > >>
> > > >> --
> > > >> Regards,
> > > >> Shuporno Choudhury
> > > >>
> > > >
> > > >
> > > >
> > > > --
> > > > Regards,
> > > > Shuporno Choudhury
> > > >
> > >
> > >
> > >
> > > --
> > > Regards,
> > > Shuporno Choudhury
> > >
> >
> >
> >
> > --
> > Nitin Pawar
> >
>
>
>
> --
> Regards,
> Shuporno Choudhury
>



-- 
Nitin Pawar
