[ https://issues.apache.org/jira/browse/HIVE-14373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15450573#comment-15450573 ]

Abdullah Yousufi commented on HIVE-14373:
-----------------------------------------

Thanks for the comments [~spena] and [~poeppt]. Currently, the user passes in 
the path of the directory where the tests should run, so the user already 
chooses the test location, much like passing in a unique test ID.

But I agree, it's a good idea for the directory to be explicitly created at 
the beginning of the run and removed at the end.

Unfortunately, I currently do not have access to a developer environment. Would 
anyone be interested in finishing up this ticket?



> Add integration tests for hive on S3
> ------------------------------------
>
>                 Key: HIVE-14373
>                 URL: https://issues.apache.org/jira/browse/HIVE-14373
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Sergio Peña
>            Assignee: Abdullah Yousufi
>         Attachments: HIVE-14373.02.patch, HIVE-14373.03.patch, 
> HIVE-14373.04.patch, HIVE-14373.patch
>
>
> With Hive doing improvements to run on S3, it would be ideal to have better 
> integration testing on S3.
> These S3 tests cannot be executed by HiveQA because they require Amazon 
> credentials. We need to write a test suite based on ideas from the Hadoop 
> project where:
> - an xml file is provided with the S3 credentials
> - a committer must run these tests manually to verify they pass
> - the xml file must not be part of the commit, and HiveQA must not run 
> these tests.
> https://wiki.apache.org/hadoop/HowToContribute#Submitting_patches_against_object_stores_such_as_Amazon_S3.2C_OpenStack_Swift_and_Microsoft_Azure
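Following the linked Hadoop guidance, the uncommitted credentials file could look roughly like the sketch below. The filename and the property names are assumptions borrowed from Hadoop's s3a connector; the actual Hive suite may read different keys:

```xml
<!-- auth-keys.xml: kept out of version control (e.g. via .gitignore).
     A sketch assuming Hadoop-style s3a credential properties. -->
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_AWS_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
  </property>
</configuration>
```

In the Hadoop model, the test harness skips the S3 tests entirely when this file is absent, which is what keeps HiveQA unaffected.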



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
