[
https://issues.apache.org/jira/browse/HIVE-16983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16888170#comment-16888170
]
Hive QA commented on HIVE-16983:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12876365/HIVE-16983-branch-2.1.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results:
https://builds.apache.org/job/PreCommit-HIVE-Build/18088/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/18088/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-18088/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-07-18 17:07:19.907
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-18088/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z branch-2.1 ]]
+ [[ -d apache-github-branch-2.1-source ]]
+ [[ ! -d apache-github-branch-2.1-source/.git ]]
+ [[ ! -d apache-github-branch-2.1-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-07-18 17:07:19.956
+ cd apache-github-branch-2.1-source
+ git fetch origin
From https://github.com/apache/hive
7fecb6f..7534f82 branch-1 -> origin/branch-1
fd2f7c8..2039350 branch-2 -> origin/branch-2
93163cb..9fb2238 branch-2.3 -> origin/branch-2.3
31a417e..91c243c branch-3 -> origin/branch-3
3e16420..378083e branch-3.1 -> origin/branch-3.1
e7f2fccd..374f361 master -> origin/master
* [new branch] revert-648-hive21783 -> origin/revert-648-hive21783
+ git reset --hard HEAD
HEAD is now at 292a98f HIVE-16480: Empty vector batches of floats or doubles
gets EOFException (Owen O'Malley via Jesus Camacho Rodriguez)
+ git clean -f -d
+ git checkout branch-2.1
Already on 'branch-2.1'
Your branch is up-to-date with 'origin/branch-2.1'.
+ git reset --hard origin/branch-2.1
HEAD is now at 292a98f HIVE-16480: Empty vector batches of floats or doubles
gets EOFException (Owen O'Malley via Jesus Camacho Rodriguez)
+ git merge --ff-only origin/branch-2.1
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-07-18 17:07:37.288
+ rm -rf ../yetus_PreCommit-HIVE-Build-18088
+ mkdir ../yetus_PreCommit-HIVE-Build-18088
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-18088
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-18088/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh
/data/hiveptest/working/scratch/build.patch
error: a/pom.xml: does not exist in index
error: patch failed: pom.xml:168
Falling back to three-way merge...
Applied patch to 'pom.xml' with conflicts.
Going to apply patch with: git apply -p1
error: patch failed: pom.xml:168
Falling back to three-way merge...
Applied patch to 'pom.xml' with conflicts.
U pom.xml
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-18088
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12876365 - PreCommit-HIVE-Build
> getFileStatus on accessible s3a://[bucket-name]/folder: throws
> com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon
> S3; Status Code: 403; Error Code: 403 Forbidden;
> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: HIVE-16983
> URL: https://issues.apache.org/jira/browse/HIVE-16983
> Project: Hive
> Issue Type: Bug
> Components: Hive
> Affects Versions: 2.1.1
> Environment: Hive 2.1.1 on Ubuntu 14.04 AMI in AWS EC2, connecting to
> S3 using s3a:// protocol
> Reporter: Alex Baretto
> Assignee: Vlad Gudikov
> Priority: Major
> Fix For: 2.1.1
>
> Attachments: HIVE-16983-branch-2.1.patch
>
>
> I've followed various published documentation on integrating Apache Hive
> 2.1.1 with AWS S3 using the `s3a://` scheme, configuring `fs.s3a.access.key`
> and `fs.s3a.secret.key` in `hadoop/etc/hadoop/core-site.xml` and
> `hive/conf/hive-site.xml`.
> I am at the point where `hdfs dfs -ls s3a://[bucket-name]/` works properly
> (it returns the s3 listing of that bucket), so I know my creds, bucket
> access, and overall Hadoop setup are valid.
> hdfs dfs -ls s3a://[bucket-name]/
>
> drwxrwxrwx - hdfs hdfs 0 2017-06-27 22:43
> s3a://[bucket-name]/files
> ...etc.
> hdfs dfs -ls s3a://[bucket-name]/files
>
> drwxrwxrwx - hdfs hdfs 0 2017-06-27 22:43
> s3a://[bucket-name]/files/my-csv.csv
> However, when I attempt to access the same s3 resources from hive, e.g. run
> any `CREATE SCHEMA` or `CREATE EXTERNAL TABLE` statements using `LOCATION
> 's3a://[bucket-name]/files/'`, it fails. For example:
> >CREATE EXTERNAL TABLE IF NOT EXISTS mydb.my_table ( my_table_id string,
> >my_tstamp timestamp, my_sig bigint ) ROW FORMAT DELIMITED FIELDS TERMINATED
> >BY ',' LOCATION 's3a://[bucket-name]/files/';
> I keep getting this error:
> >FAILED: Execution Error, return code 1 from
> >org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception:
> >java.nio.file.AccessDeniedException s3a://[bucket-name]/files: getFileStatus
> >on s3a://[bucket-name]/files:
> >com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service:
> >Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID:
> >C9CF3F9C50EF08D1), S3 Extended Request ID:
> >T2xZ87REKvhkvzf+hdPTOh7CA7paRpIp6IrMWnDqNFfDWerkZuAIgBpvxilv6USD0RSxM9ymM6I=)
> This makes no sense. I have access to the bucket, as the hdfs test shows,
> and I've added the proper creds to hive-site.xml.
> Anyone have any idea what's missing from this equation?
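For reference, the `fs.s3a.access.key` / `fs.s3a.secret.key` properties mentioned in the description are normally set like this (a minimal sketch; the values are placeholders, not real credentials):
{code:xml}
<!-- core-site.xml and/or hive-site.xml: s3a credential properties -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_AWS_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
</property>
{code}
When `hdfs dfs -ls` succeeds but Hive gets a 403 with the same bucket, one thing worth verifying is that the Hive services (HiveServer2 / metastore) are actually reading the same configuration files as the `hdfs` client.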
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)