-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/52283/#review151382
-----------------------------------------------------------



Took a quick pass. Have some questions and suggestions.

Will take another pass once those are addressed.


common/src/java/org/apache/hadoop/hive/conf/Constants.java (lines 34 - 35)
<https://reviews.apache.org/r/52283/#comment219968>

    Please provide some hint in the name that this stores the name of an
environment variable.
    
    For example: using ENVVAR instead of NAME
    
    HIVE_SERVER2_JOB_CREDSTORE_PASSWORD_ENVVAR
    HADOOP_CREDENTIAL_PASSWORD_ENVVAR
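    
    For illustration, roughly (the string values below are my assumption of the
env var names used in the patch, not necessarily the actual ones):
    
        // values assumed for illustration
        public static final String HIVE_SERVER2_JOB_CREDSTORE_PASSWORD_ENVVAR =
            "HIVE_JOB_CREDSTORE_PASSWORD";
        public static final String HADOOP_CREDENTIAL_PASSWORD_ENVVAR =
            "HADOOP_CREDSTORE_PASSWORD";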



common/src/java/org/apache/hadoop/hive/conf/Constants.java (line 36)
<https://reviews.apache.org/r/52283/#comment219969>

    Same here. This is the name of the config property, not the path itself, right?
    
    HADOOP_CREDENTIAL_PROVIDER_PATH_CONFIG
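    
    Roughly (assuming this constant holds the Hadoop property name, which is my
guess, not necessarily what the patch stores):
    
        // value assumed for illustration
        public static final String HADOOP_CREDENTIAL_PROVIDER_PATH_CONFIG =
            "hadoop.security.credential.provider.path";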



common/src/java/org/apache/hadoop/hive/conf/HiveConf.java (line 2522)
<https://reviews.apache.org/r/52283/#comment219970>

    Does this work for Spark? Tez? It would be good to explicitly list all
engines that are known to support this functionality. If an engine has not been
tested, I think it's ok to state that "this functionality has not been tested
against <engine>".



common/src/java/org/apache/hadoop/hive/conf/HiveConfUtil.java (lines 136 - 137)
<https://reviews.apache.org/r/52283/#comment219972>

    Worth adding why environment variables are used instead of Hive configs.
    
    Since this is the main method, could you summarize everything it does?
    
    For example:
    
    If the driver is configured with HIVE_SERVER2_JOB_CREDSTORE_LOCATION
    AND
    the HIVE_JOB_CREDSTORE_PASSWORD environment variable is set, then:
    (1) JobConf.MAPRED_MAP_TASK_ENV should contain
        HIVE_SERVER2_JOB_CREDSTORE_LOCATION
    (2) Set hadoop.security.credential.provider.path to
        HIVE_SERVER2_JOB_CREDSTORE_LOCATION
    
    What do you mean by HIVE_JOB_CREDSTORE_PASSWORD or HADOOP_CREDSTORE_PASSWORD?
    Do you mean you'll take either one if set? What if both are set?
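    
    For illustration only, a rough sketch of the flow I'd expect the javadoc to
spell out (the config key, the precedence between the two env vars, and whether
it's the location or the password that goes into the task env are assumptions
on my part, not necessarily what the patch does):
    
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.mapred.JobConf;
    
        // Hypothetical sketch, not the patch's code.
        static void addJobCredstoreToConf(Configuration hiveConf, JobConf jobConf) {
          String credstorePath = hiveConf.get("hive.server2.job.credstore.location"); // assumed key
          String password = System.getenv("HIVE_JOB_CREDSTORE_PASSWORD");
          if (password == null) {
            // which one wins when both are set is exactly what needs documenting
            password = System.getenv("HADOOP_CREDSTORE_PASSWORD");
          }
          if (credstorePath != null && password != null) {
            // (1) make the credstore password visible to the launched tasks via the task env
            jobConf.set(JobConf.MAPRED_MAP_TASK_ENV, "HADOOP_CREDSTORE_PASSWORD=" + password);
            // (2) point the tasks at the job-specific credential provider
            jobConf.set("hadoop.security.credential.provider.path", credstorePath);
          }
        }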



ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java (line 422)
<https://reviews.apache.org/r/52283/#comment219705>

    Check if the string is empty?
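    
    For example (assuming commons-lang is already available in this class, and
"credStorePasswd" is a placeholder for whatever the local variable is called):
    
        if (StringUtils.isNotBlank(credStorePasswd)) {
          // only then pass it along
        }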



ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
(lines 35 - 36)
<https://reviews.apache.org/r/52283/#comment219706>

    nit: please prefix with the word "test"
    
    nit: HIVE_JOB_CREDSTORE_PASSWORD_ENVVAR_VAL
    HADOOP_CREDSTORE_PASSWORD_ENVVAR_VAL



ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
(lines 81 - 82)
<https://reviews.apache.org/r/52283/#comment219703>

    I would use assertEquals here and get rid of the message. It's making these 
statements difficult to read without adding much value. If this assert fails, 
we know exactly what happened.  
    
    Same for all other instances in this file.
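    
    i.e. something like (the names here are placeholders, not the actual test
code):
    
        assertEquals(expectedProviderPath,
            jobConf.get("hadoop.security.credential.provider.path"));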



ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
(lines 88 - 94)
<https://reviews.apache.org/r/52283/#comment219975>

    Are you testing the same thing twice?



ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
(lines 109 - 112)
<https://reviews.apache.org/r/52283/#comment219977>

    This is super unreadable... please make this into one line.



ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
(lines 333 - 335)
<https://reviews.apache.org/r/52283/#comment219974>

    Not needed, based on the prior comment above.


- Mohit Sabharwal


On Sept. 30, 2016, 6:58 p.m., Vihang Karajgaonkar wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/52283/
> -----------------------------------------------------------
> 
> (Updated Sept. 30, 2016, 6:58 p.m.)
> 
> 
> Review request for hive, Mohit Sabharwal and Sergio Pena.
> 
> 
> Bugs: HIVE-14822
>     https://issues.apache.org/jira/browse/HIVE-14822
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This change adds support for a credential provider for jobs launched from 
> HiveServer2. It also adds support for a job-specific credential provider and 
> password, which are passed to the jobs via the job configs when they are 
> launched from HS2. The Hive-specific credential provider location is 
> specified by a new configuration property specific to HiveServer2, and the 
> password for the job credential provider is provided via an environment variable.
> 
> 
> Diffs
> -----
> 
>   common/src/java/org/apache/hadoop/hive/common/EnvironmentUtils.java 
> PRE-CREATION 
>   common/src/java/org/apache/hadoop/hive/common/FileUtils.java 
> 3ed2d086fd8e14b9a70550c02082c1456f905a08 
>   common/src/java/org/apache/hadoop/hive/conf/Constants.java 
> 77c6aa5663c2c5358715801bc039c0fe8035e3dc 
>   common/src/java/org/apache/hadoop/hive/conf/HiveConf.java 
> 43a16d7fed1d38400793e7c96a7febebe42d351b 
>   common/src/java/org/apache/hadoop/hive/conf/HiveConfUtil.java 
> 16c2eafdb6888ed62168dd00f76335fa2484753b 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java 
> fd640567124e1bb8b558359732764a07c8b0f2ed 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java 
> cea9582c8ccb0c79700ac7913735d4cdf52f0c7e 
>   
> ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java 
> 784b9c9fa769eeb088e3a6408a0b29147a8b4086 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java 
> c75333d7022f776aab184a4b804fd7ad7befac64 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java 
> a9f70c4100219c8929abd4e31b30d340e6ba98bd 
>   ql/src/test/org/apache/hadoop/hive/ql/exec/TestHiveCredentialProviders.java 
> PRE-CREATION 
>   
> spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
> 936fdafdb37c461a7a5deb97cba72d4db54a49e1 
> 
> Diff: https://reviews.apache.org/r/52283/diff/
> 
> 
> Testing
> -------
> 
> Tested by running multiple queries on S3 with keys stored in a credential 
> provider.
> 
> 
> Thanks,
> 
> Vihang Karajgaonkar
> 
>
