[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-05 Thread Hudson (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16945042#comment-16945042
 ] 

Hudson commented on HADOOP-16626:
-

SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #17488 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/17488/])
HADOOP-16626. S3A ITestRestrictedReadAccess fails without S3Guard. (stevel: rev 
b8086bf54d977199879a0f0dcff6058cdf56ded2)
* (edit) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestRestrictedReadAccess.java
* (edit) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/AbstractS3ATestBase.java
* (edit) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/contract/s3a/S3AContract.java
* (edit) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/S3ATestUtils.java


> S3A ITestRestrictedReadAccess fails
> ---
>
> Key: HADOOP-16626
> URL: https://issues.apache.org/jira/browse/HADOOP-16626
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs/s3
>Reporter: Siddharth Seth
>Assignee: Steve Loughran
>Priority: Major
> Fix For: 3.3.0
>
>
> Just tried running the S3A test suite. Consistently seeing the following.
> Command used:
> {code}
> mvn -T 1C  verify -Dparallel-tests -DtestsThreadCount=12 -Ds3guard -Dauth 
> -Ddynamo -Dtest=moo -Dit.test=ITestRestrictedReadAccess
> {code}
> cc [~ste...@apache.org]
> {code}
> ---
> Test set: org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess
> ---
> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 5.335 s <<< 
> FAILURE! - in org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess
> testNoReadAccess[raw](org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess)
>   Time elapsed: 2.841 s  <<< ERROR!
> java.nio.file.AccessDeniedException: 
> test/testNoReadAccess-raw/noReadDir/emptyDir/: getFileStatus on 
> test/testNoReadAccess-raw/noReadDir/emptyDir/: 
> com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon 
> S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 
> FE8B4D6F25648BCD; S3 Extended Request ID: 
> hgUHzFskU9CcEUT3DxgAkYcWLl6vFoa1k7qXX29cx1u3lpl7RVsWr5rp27/B8s5yjmWvvi6hVgk=),
>  S3 Extended Request ID: 
> hgUHzFskU9CcEUT3DxgAkYcWLl6vFoa1k7qXX29cx1u3lpl7RVsWr5rp27/B8s5yjmWvvi6hVgk=:403
>  Forbidden
> at 
> org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:244)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:2777)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:2705)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:2589)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.innerListStatus(S3AFileSystem.java:2377)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$listStatus$10(S3AFileSystem.java:2356)
> at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:110)
> at 
> org.apache.hadoop.fs.s3a.S3AFileSystem.listStatus(S3AFileSystem.java:2356)
> at 
> org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess.checkBasicFileOperations(ITestRestrictedReadAccess.java:360)
> at 
> org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess.testNoReadAccess(ITestRestrictedReadAccess.java:282)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at 
> org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:298)
> at 
> org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:292)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: 

[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-03 Thread Siddharth Seth (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16944139#comment-16944139
 ] 

Siddharth Seth commented on HADOOP-16626:
-

bq. When you call Configuration.addResource() it reloads all configs, so all 
settings you've previously cleared get set again.
Interesting. Any properties which have been explicitly set using 
conf.set(...) are retained after an addResource() call. However, properties 
which have been explicitly unset via conf.unset() are lost after an 
addResource(). This is probably a bug in 'Configuration'.
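
A minimal sketch of the sequence being described (illustrative only; the 
property names are real S3A keys, but this is not code from the patch):
{code}
Configuration conf = new Configuration();

// Explicit set(): survives a later addResource() reload.
conf.set("fs.s3a.metadatastore.authoritative", "false");

// Explicit unset(): forgotten on the next reload.
conf.unset("fs.s3a.metadatastore.impl");

// Reloads all resources. Anything declared in the XML files, including
// per-bucket overrides, reappears; only the unset() is lost.
conf.addResource("core-site.xml");
{code}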

To check my understanding: this specific call in createConfiguration()
{code}
removeBucketOverrides(bucketName, conf,
    S3_METADATA_STORE_IMPL,
    METADATASTORE_AUTHORITATIVE);
{code}
has all of its unsets lost, and since your config files have bucket-level 
overrides set up, those overrides come back into effect as a result?


[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-03 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16943880#comment-16943880
 ] 

Steve Loughran commented on HADOOP-16626:
-

OK. I have now learned something.

When you call Configuration.addResource() it reloads all configs, so
all settings you've previously cleared get set again.

And we force in the contract/s3a.xml settings, don't we?

I'm going to change how we load that file (which declares the expected FS 
behaviour in the common contract tests): make the load optional, and only 
perform it in the S3A contract tests, not the other S3A tests.
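
A sketch of one way to do that; the two-argument constructor and flag name 
here are assumptions about the shape of the change, not the committed code:
{code}
public class S3AContract extends AbstractBondedFSContract {

  /** Resource declaring the expected FS behaviour for the contract tests. */
  public static final String CONTRACT_XML = "contract/s3a.xml";

  /** Contract tests: force in the behaviour declarations, as before. */
  public S3AContract(Configuration conf) {
    this(conf, true);
  }

  /** Other S3A tests can pass false and skip loading contract/s3a.xml. */
  public S3AContract(Configuration conf, boolean addContractResource) {
    super(conf);
    if (addContractResource) {
      addConfResource(CONTRACT_XML);
    }
  }
}
{code}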


(pause)
Actually, that's not enough! The first call to FileSystem.get() will force 
service discovery of all filesystems, forcing their class instantiation; 
any class which then forces in a config (HDFS) triggers this.

{code}
Breakpoint reached
  at 
org.apache.hadoop.conf.Configuration.addDefaultResource(Configuration.java:893)
  at 
org.apache.hadoop.mapreduce.util.ConfigUtil.loadResources(ConfigUtil.java:43)
  at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:123)
  at java.lang.Class.forName0(Class.java:-1)
  at java.lang.Class.forName(Class.java:348)
  at 
org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2603)
  at 
org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:96)
  at 
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:79)
  at 
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
  at org.apache.hadoop.security.Groups.<init>(Groups.java:106)
  at org.apache.hadoop.security.Groups.<init>(Groups.java:102)
  at 
org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:451)
  at 
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:355)
  at 
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:317)
  at 
org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1989)
  at 
org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:746)
  at 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:696)
  at 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
  at 
org.apache.hadoop.fs.viewfs.ViewFileSystem.<init>(ViewFileSystem.java:230)
  at 
sun.reflect.NativeConstructorAccessorImpl.newInstance0(NativeConstructorAccessorImpl.java:-1)
  at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at java.lang.Class.newInstance(Class.java:442)
  at 
java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
  at 
org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3310)
  at 
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3355)
  at 
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3394)
  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:500)
  at 
org.apache.hadoop.fs.contract.AbstractBondedFSContract.init(AbstractBondedFSContract.java:72)
  at 
org.apache.hadoop.fs.contract.AbstractFSContractTestBase.setup(AbstractFSContractTestBase.java:178)
  at 
org.apache.hadoop.fs.s3a.AbstractS3ATestBase.setup(AbstractS3ATestBase.java:55)
  at 
org.apache.hadoop.fs.s3a.auth.ITestRestrictedReadAccess.setup(ITestRestrictedReadAccess.java:233)
  at 
sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
  at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
  at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
  at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
  at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
  at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
  at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
  at 

[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-03 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16943699#comment-16943699
 ] 

Steve Loughran commented on HADOOP-16626:
-

Looking at this.

The issue is not just that the test fails for Sid;
it is that it works for me. Why? The code which tries to disable S3Guard
isn't being picked up: the per-bucket settings are overriding what we've chosen.

This is unfortunate, because we're trying to unset those in 
removeBaseAndBucketOverrides(). I'm going to look at this in more detail.
Only once I fix the test setup to replicate the problem will I look at fixing 
it, which is simply a matter of "lists will fail without read access on raw".
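
For the raw case, that fix amounts to expecting the failure. A hedged sketch 
using LambdaTestUtils.intercept (the filesystem and path variable names are 
illustrative, not the test's actual fields):
{code}
// On a raw bucket the LIST is preceded by a HEAD, so a restricted-read
// caller should see the 403 surface as an AccessDeniedException.
intercept(java.nio.file.AccessDeniedException.class, () ->
    restrictedFS.listStatus(noReadDir));
{code}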


[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-03 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16943488#comment-16943488
 ] 

Steve Loughran commented on HADOOP-16626:
-

caused by HADOOP-16458 


[jira] [Commented] (HADOOP-16626) S3A ITestRestrictedReadAccess fails

2019-10-03 Thread Steve Loughran (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16626?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16943464#comment-16943464
 ] 

Steve Loughran commented on HADOOP-16626:
-

The test filesystem created here has list access but not HEAD/GET access.

Looking at the stack, I don't see how the raw check could work at all here, 
because we call getFileStatus before the LIST. With S3Guard, fine,
provided the entry is in the table. But raw? It should always fail.

So why don't I see that? Is it because I am clearing the bucket settings?
I will look with a debugger.

FWIW, I do hope/plan to actually remove those getFileStatus calls before 
list operations which are normally called against directories (the list* 
operations, essentially). They should do the list first, and only if that 
fails to find anything, fall back to the getFileStatus probes for a file or 
marker. This should make a big difference during query planning, and stop 
markers being mistaken for empty directories.
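
A sketch of that proposed ordering (hypothetical helper names, not the 
current S3AFileSystem code):
{code}
public FileStatus[] listStatus(Path path) throws IOException {
  // Try the LIST first: no HEAD/GET needed, so restricted-read callers
  // can still list directories.
  List<FileStatus> listing = listObjectsUnderPath(path); // assumed helper
  if (!listing.isEmpty()) {
    return listing.toArray(new FileStatus[0]);
  }
  // Empty listing: only now fall back to the file/marker probes.
  FileStatus status = getFileStatus(path); // throws FileNotFoundException
  if (status.isFile()) {
    return new FileStatus[] { status };    // listing a file returns itself
  }
  return new FileStatus[0];                // empty directory marker
}
{code}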

This means whatever changes I make to fix this regression will have to be 
rolled back later. Never mind.

Thanks for finding this. 
