[ https://issues.apache.org/jira/browse/HIVE-14822?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15570576#comment-15570576 ]
Hive QA commented on HIVE-14822:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12832972/HIVE-14822.06.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 10568 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[reloadJar]
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1516/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1516/console
Test logs: http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1516/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12832972 - PreCommit-HIVE-Build

> Add support for credential provider for jobs launched from Hiveserver2
> ----------------------------------------------------------------------
>
>                 Key: HIVE-14822
>                 URL: https://issues.apache.org/jira/browse/HIVE-14822
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>            Reporter: Vihang Karajgaonkar
>            Assignee: Vihang Karajgaonkar
>        Attachments: HIVE-14822.01.patch, HIVE-14822.02.patch, HIVE-14822.03.patch, HIVE-14822.05.patch, HIVE-14822.06.patch
>
> When using encrypted passwords via the Hadoop Credential Provider, HiveServer2 currently does not correctly forward enough information to the job configuration for jobs to read those secrets. If your job needs to access any secrets, like S3 credentials, then there is no convenient and secure way to configure this today.
> You could specify the decryption key in files like mapred-site.xml that HiveServer2 uses, but this would place the encryption password on local disk in plaintext, which is a security concern.
>
> To solve this problem, HiveServer2 should modify the job configuration to include the environment variable settings needed to decrypt the passwords. Specifically, it will need to modify:
> * For MR2 jobs:
> ** yarn.app.mapreduce.am.admin.user.env
> ** mapreduce.admin.user.env
> * For Spark jobs:
> ** spark.yarn.appMasterEnv.HADOOP_CREDSTORE_PASSWORD
> ** spark.executorEnv.HADOOP_CREDSTORE_PASSWORD
>
> HiveServer2 can get the decryption password from its own environment, the same way it does for its own credential provider store today.
>
> Additionally, it can be desirable for HiveServer2 to have a separate encrypted password file from the one used by the job. HiveServer2 may have secrets that the job should not have, such as the metastore database password or the password to decrypt its private SSL certificate. It is also best practice to keep separate passwords in separate files. To facilitate this, Hive will also accept:
> * A configuration for the path to a credential store to use for jobs, which should already be uploaded to HDFS (hive.server2.job.keystore.location, or a better name). If this is not specified, then HS2 will simply use the value of hadoop.security.credential.provider.path.
> * An environment variable for the password to decrypt the credential store (HIVE_JOB_KEYSTORE_PASSWORD, or better). If this is not specified, then HS2 will simply use the standard environment variable for decrypting the Hadoop Credential Provider.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
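The fallback rules and per-engine settings described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual HIVE-14822 patch: the class and method names are hypothetical, and a plain Map stands in for Hadoop's Configuration object so the example is self-contained.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Hypothetical sketch of the resolution/injection logic described in the
 * issue: pick the job credential store and password (with fallbacks), then
 * propagate them into the MR2 and Spark job configuration properties.
 */
public class JobCredstoreSketch {

    static void configureJobCredstore(Map<String, String> jobConf, Map<String, String> hs2Env) {
        // Job-specific store wins; otherwise fall back to HS2's own provider path.
        String providerPath = jobConf.getOrDefault(
                "hive.server2.job.keystore.location",
                jobConf.get("hadoop.security.credential.provider.path"));
        if (providerPath == null) {
            return; // no credential provider configured at all
        }
        jobConf.put("hadoop.security.credential.provider.path", providerPath);

        // Job-specific password wins; otherwise fall back to the standard Hadoop env var.
        String password = hs2Env.get("HIVE_JOB_KEYSTORE_PASSWORD");
        if (password == null) {
            password = hs2Env.get("HADOOP_CREDSTORE_PASSWORD");
        }
        if (password == null) {
            return; // nothing to propagate
        }

        // MR2 jobs: expose the password to the AM and the task JVMs.
        String envSetting = "HADOOP_CREDSTORE_PASSWORD=" + password;
        appendEnv(jobConf, "yarn.app.mapreduce.am.admin.user.env", envSetting);
        appendEnv(jobConf, "mapreduce.admin.user.env", envSetting);

        // Spark jobs: same for the Spark AM and the executors.
        jobConf.put("spark.yarn.appMasterEnv.HADOOP_CREDSTORE_PASSWORD", password);
        jobConf.put("spark.executorEnv.HADOOP_CREDSTORE_PASSWORD", password);
    }

    /** Append a KEY=VALUE entry to a comma-separated env-var property. */
    static void appendEnv(Map<String, String> conf, String key, String envSetting) {
        String existing = conf.get(key);
        conf.put(key, (existing == null || existing.isEmpty())
                ? envSetting
                : existing + "," + envSetting);
    }
}
```

Note the append behavior for the MR2 env properties: those keys commonly already carry entries such as LD_LIBRARY_PATH, so the sketch appends rather than overwrites.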