Hi, I recompiled and retried; now it's looking like this with s3a:
com.amazonaws.AmazonClientException: Unable to load AWS credentials
from any provider in the chain
s3n is working fine (the only remaining problem is the endpoint).
With s3n, try this out:
*s3service.s3-endpoint*: The host name of the S3 service. You should only
ever change this value from the default if you need to contact an
alternative S3 endpoint for testing purposes.
Default: s3.amazonaws.com
Thanks
Best Regards
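For context, *s3service.s3-endpoint* is a JetS3t property (the s3n filesystem is built on the JetS3t library), and JetS3t normally reads it from a jets3t.properties file found on the classpath rather than from the Hadoop configuration. A sketch, with a placeholder host name:

```properties
# jets3t.properties -- must be on the classpath of the driver and executors.
# The host below is a placeholder for your S3-compatible service.
s3service.s3-endpoint=objects.example.com
# Assumption: disable HTTPS-only mode if the service only speaks plain HTTP.
s3service.https-only=false
```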
On Tue, Jul 28, 2015 at 1:54 PM, Schmirr Wurst schmirrwu...@gmail.com wrote:
I tried those 3 possibilities, and everything behaves exactly as before =
the endpoint parameter is not taking effect:
sc.hadoopConfiguration.set("s3service.s3-endpoint", "test")
sc.hadoopConfiguration.set("fs.s3n.endpoint", "test")
sc.hadoopConfiguration.set("fs.s3n.s3-endpoint", "test")
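For what it's worth, the s3a connector in recent hadoop-aws versions does expose a supported endpoint key, and the "Unable to load AWS credentials" error above usually means the s3a-specific credential keys were never set (s3a does not read the s3n ones). A sketch, assuming a recent hadoop-aws jar is on the classpath; the endpoint, keys, bucket, and path are all placeholders:

```scala
// Sketch: point the s3a connector at a non-AWS, S3-compatible store.
// All values below are placeholders, not working credentials.
sc.hadoopConfiguration.set("fs.s3a.endpoint", "objects.example.com")
sc.hadoopConfiguration.set("fs.s3a.access.key", "MY_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "MY_SECRET_KEY")

// Then read with the s3a scheme instead of s3n:
val lines = sc.textFile("s3a://my-bucket/some/path")
```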
2015-07-28 10:28 GMT+02:00 Akhil Das ...@gmail.com wrote:
Hi,
I wonder how to use S3-compatible storage in Spark?
If I'm using the s3n:// URL schema, then it will point to Amazon; is there
a way I can specify the host somewhere?
Thanks
Best Regards
On Fri, Jul 17, 2015 at 2:06 PM, Schmirr Wurst schmirrwu...@gmail.com wrote:
Hi,
I wonder how to use S3-compatible storage in Spark?
If I'm using the s3n:// URL schema, then it will point to Amazon; is there
a way I can specify the host somewhere?
While submitting your Spark job, add --jars path/to/thejar
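A minimal sketch of such a submission, assuming the jars in question are hadoop-aws and its matching AWS SDK (all paths, version numbers, and class names below are placeholders to adapt to your setup):

```shell
# Submit with the S3 connector jars on the classpath.
# Jar names and versions are placeholders; match them to your Hadoop version.
spark-submit \
  --jars /path/to/hadoop-aws-2.7.1.jar,/path/to/aws-java-sdk-1.7.4.jar \
  --class com.example.MyJob \
  my-job-assembly.jar
```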
From: Schmirr Wurst schmirrwu...@gmail.com
Sent: Wednesday, July 22, 2015 12:06 PM
To: Thomas Demoor
Subject: Re: use S3-Compatible Storage with spark
Hi Thomas, thanks, could you just tell me what exactly I
Could you name the storage service that you are using? Most of them
provide an S3-like REST API endpoint for you to hit.
Thanks
Best Regards
}[/${subfolder_name}]*. Bucket names are
unique across S3, so the resulting path is also unique. There is no concept
of a hostname in S3 URLs as far as I know.
-sujit
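In code, that point means an s3n path names only a bucket and a key prefix; there is nowhere in the URL itself to put a host, which is why the endpoint has to come from configuration. A minimal sketch with a placeholder bucket and folder:

```scala
// An s3n URL carries a bucket name and key prefix, never a hostname.
// "my-bucket" and "subfolder" are placeholder names.
val rdd = sc.textFile("s3n://my-bucket/subfolder/*")
println(rdd.count())
```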