thisisnic commented on PR #49619:
URL: https://github.com/apache/arrow/pull/49619#issuecomment-4158788314
```
══ Warnings ════════════════════════════════════════════════════════════════════
── Warning ('test-gcs.R:122:1'): (code run outside of `test_that()`) ───────────
error in running command
Backtrace:
▆
1. ├─testthat::skip_if_not(system("storage-testbench -h") == 0, message = "googleapis-storage-testbench is not installed.") at test-gcs.R:122:1
2. │ └─base::isTRUE(condition)
3. └─base::system("storage-testbench -h")
══ Failed tests ════════════════════════════════════════════════════════════════
── Error ('test-s3.R:49:5'): read/write Feather on S3 ──────────────────────────
Error: IOError: When initiating multiple part upload for key '1774901280.20806/test.feather' in bucket 'arrow-datasets': AWS Error ACCESS_DENIED during CreateMultipartUpload operation: User: arn:aws:iam::855673865593:user/crossbow is not authorized to perform: s3:PutObject on resource: "arn:aws:s3:::arrow-datasets/1774901280.20806/test.feather" because no identity-based policy allows the s3:PutObject action (Request ID: YHWYK1PAJ4PQ4RTW)
Backtrace:
▆
1. └─arrow::write_feather(example_data, bucket_uri(now, "test.feather")) at test-s3.R:49:5
2. └─arrow:::make_output_stream(sink)
3. └─fs_and_path$fs$OpenOutputStream(fs_and_path$path)
4. └─arrow:::fs___FileSystem__OpenOutputStream(self, clean_path_rel(path))
── Error ('test-s3.R:55:5'): read/write Parquet on S3 ──────────────────────────
Error: IOError: When initiating multiple part upload for key '1774901280.20806/test.parquet' in bucket 'arrow-datasets': AWS Error ACCESS_DENIED during CreateMultipartUpload operation: User: arn:aws:iam::855673865593:user/crossbow is not authorized to perform: s3:PutObject on resource: "arn:aws:s3:::arrow-datasets/1774901280.20806/test.parquet" because no identity-based policy allows the s3:PutObject action (Request ID: YHWZWQHNDRM7CJ24)
Backtrace:
▆
1. └─arrow::write_parquet(example_data, bucket_uri(now, "test.parquet")) at test-s3.R:55:5
2. └─arrow:::make_output_stream(sink)
3. └─fs_and_path$fs$OpenOutputStream(fs_and_path$path)
4. └─arrow:::fs___FileSystem__OpenOutputStream(self, clean_path_rel(path))
── Error ('test-s3.R:60:5'): RandomAccessFile$ReadMetadata() works for S3FileSystem ──
Error: IOError: Path does not exist 'arrow-datasets/1774901280.20806/test.parquet'. Detail: [errno 2] No such file or directory
Backtrace:
▆
1. └─bucket$OpenInputFile(paste0(now, "/", "test.parquet")) at test-s3.R:60:5
2. └─arrow:::fs___FileSystem__OpenInputFile(self, clean_path_rel(path))
── Error ('test-s3.R:44:1'): (code run outside of `test_that()`) ───────────────
Error: IOError: Path does not exist 'arrow-datasets/1774901280.20806/'. Detail: [errno 2] No such file or directory
Backtrace:
▆
1. └─bucket$DeleteDir(now) at test-s3.R:44:1
2. └─arrow:::fs___FileSystem__DeleteDir(self, clean_path_rel(path))
[ FAIL 4 | WARN 1 | SKIP 30 | PASS 8080 ]
Error:
! Test failures.
Warning messages:
1: In for (i in seq_len(n)) { :
closing unused connection 4 (/tmp/Rtmp0rl4Xk/file8f9832b0767c/part-0.csv)
2: In for (i in seq_len(n)) { :
closing unused connection 5 (/tmp/Rtmp0rl4Xk/file8f986ad76486/part-0.tsv)
3: In for (i in seq_len(n)) { :
closing unused connection 4 (/tmp/Rtmp0rl4Xk/file8f986bbb0db0/part-0.csv)
Execution halted
1 error ✖ | 0 warnings ✔ | 2 notes ✖
```
The failures look unrelated to this PR; I need to update some bucket settings. Probably worth getting that done and retriggering CI here, though, to make sure this isn't a new issue in 4.5.2.
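For reference, the ACCESS_DENIED errors say "no identity-based policy allows the s3:PutObject action" for `user/crossbow`, so the fix is presumably on the IAM side rather than in the tests. A minimal sketch of the kind of identity-based policy statement that would unblock these tests, assuming the user and bucket names from the log (the exact set of actions the suite needs, e.g. `s3:DeleteObject` and `s3:ListBucket` for the `DeleteDir` cleanup, may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossbowTestObjectWrites",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::arrow-datasets/*"
    }
  ]
}
```

This is just a sketch of the shape of the change, not the actual policy attached to the crossbow user.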
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]