[jira] [Commented] (HADOOP-14832) Listing s3a bucket without credentials gives Interrupted error
[ https://issues.apache.org/jira/browse/HADOOP-14832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16363793#comment-16363793 ]

Steve Loughran commented on HADOOP-14832:
-

{code}
> bin/hadoop fs -ls s3a://landsat-pds/
... many lines excluded ...
ls: doesBucketExist on landsat-pds: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
{code}

The full stack shows lots of retries before things give up; this could be reduced by recognizing that there is no point retrying when no credentials are available. Filed HADOOP-15232 for it.

I also tested with the network cable pulled out, to see if that causes the error you see. It doesn't; you get a different failure:

{code}
Caused by: java.net.ConnectException: Network is unreachable (connect failed)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
	at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
	at sun.net.www.http.HttpClient.New(HttpClient.java:308)
	at sun.net.www.http.HttpClient.New(HttpClient.java:326)
{code}

John, I'm going to close this as cannot-reproduce.
If you can try again and do see it, maybe it's related to some network timeouts on the retries, so that the retry loop was taking so long that the operation was failing.

> Listing s3a bucket without credentials gives Interrupted error
> --
>
> Key: HADOOP-14832
> URL: https://issues.apache.org/jira/browse/HADOOP-14832
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.0.0-beta1
> Reporter: John Zhuge
> Priority: Minor
>
> In trunk pseudo-distributed mode, without setting s3a credentials, listing an
> s3a bucket only gives an "Interrupted" error:
> {noformat}
> $ hadoop fs -ls s3a://bucket/
> ls: Interrupted
> {noformat}
> In comparison, branch-2 gives a much better error message:
> {noformat}
> (branch-2)$ hadoop_env hadoop fs -ls s3a://bucket/
> ls: doesBucketExist on hdfs-cce: com.amazonaws.AmazonClientException: No AWS
> Credentials provided by BasicAWSCredentialsProvider
> EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider :
> com.amazonaws.SdkClientException: Unable to load credentials from service
> endpoint
> {noformat}

--
This message was sent by Atlassian JIRA (v7.6.3#76005)

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
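For reference (this is not part of the issue itself): the usual fix for the underlying "No AWS Credentials provided" failure is to supply credentials to the S3A connector, e.g. in core-site.xml. A minimal sketch, with placeholder values:

{code}
<!-- Sketch: static credentials picked up by BasicAWSCredentialsProvider,
     the first provider in the chain named in the error. Values are
     placeholders, not real credentials. -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
{code}

For a public bucket like landsat-pds, an alternative is to set fs.s3a.aws.credentials.provider to org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider, so that no credentials are needed at all.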
[jira] [Commented] (HADOOP-14832) Listing s3a bucket without credentials gives Interrupted error
[ https://issues.apache.org/jira/browse/HADOOP-14832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16363757#comment-16363757 ]

Steve Loughran commented on HADOOP-14832:
-

I don't see that, not with my diagnostics entry point. This stack is something we need in the troubleshooting docs, though:

{code}
org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on hwdev-steve-ireland-new: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
	at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:174)
	at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
	at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260)
	at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:314)
	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256)
	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:365)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:301)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3354)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
	at org.apache.hadoop.fs.store.diag.StoreDiag.executeFileSystemOperations(StoreDiag.java:256)
	at org.apache.hadoop.fs.store.diag.StoreDiag.run(StoreDiag.java:197)
	at org.apache.hadoop.fs.store.diag.StoreDiag.run(StoreDiag.java:139)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.store.diag.StoreDiag.exec(StoreDiag.java:333)
	at org.apache.hadoop.fs.store.diag.StoreDiag.main(StoreDiag.java:343)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
Caused by: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
	at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:139)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1163)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4229)
	at com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:4990)
	at com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:4964)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4213)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4176)
	at com.amazonaws.services.s3.AmazonS3Client.getAcl(AmazonS3Client.java:3381)
	at com.amazonaws.services.s3.AmazonS3Client.getBucketAcl(AmazonS3Client.java:1160)
	at com.amazonaws.services.s3.AmazonS3Client.getBucketAcl(AmazonS3Client.java:1150)
	at
{code}
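The "lots of retries before things give up" behaviour described in the first comment can be bounded through configuration while HADOOP-15232 is open. A sketch for recent trunk builds, using the standard S3A retry options; the values are illustrative, not recommendations:

{code}
<!-- Sketch: tightening S3A attempt/retry limits so credential failures
     surface sooner instead of cycling through the retry loop. -->
<property>
  <name>fs.s3a.attempts.maximum</name>
  <value>1</value>
</property>
<property>
  <name>fs.s3a.retry.limit</name>
  <value>2</value>
</property>
{code}

Note that fs.s3a.attempts.maximum bounds retries inside the AWS SDK, while fs.s3a.retry.limit bounds the S3A Invoker retry loop visible in the stack above.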
[jira] [Commented] (HADOOP-14832) Listing s3a bucket without credentials gives Interrupted error
[ https://issues.apache.org/jira/browse/HADOOP-14832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16153319#comment-16153319 ]

Steve Loughran commented on HADOOP-14832:
-

Any idea as to why things have changed?

--
This message was sent by Atlassian JIRA (v6.4.14#64029)

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org