[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-22 Thread Prasanth Jayachandran (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14903797#comment-14903797
 ] 

Prasanth Jayachandran commented on HIVE-11762:
--

This broke the branch-1 build. DFSClient is not imported. Is there a jira for this already?

> TestHCatLoaderEncryption failures when using Hadoop 2.7
> ---
>
> Key: HIVE-11762
> URL: https://issues.apache.org/jira/browse/HIVE-11762
> Project: Hive
>  Issue Type: Bug
>  Components: Shims, Tests
>Reporter: Jason Dere
>Assignee: Jason Dere
> Fix For: 1.3.0, 2.0.0
>
> Attachments: HIVE-11762.1.patch, HIVE-11762.2.patch, 
> HIVE-11762.3.patch, HIVE-11762.4.patch
>
>
> When running TestHCatLoaderEncryption with -Dhadoop23.version=2.7.0, we get 
> the following error during setup():
> {noformat}
> testReadDataFromEncryptedHiveTableByPig[5](org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption)
>   Time elapsed: 3.648 sec  <<< ERROR!
> java.lang.NoSuchMethodError: 
> org.apache.hadoop.hdfs.DFSClient.setKeyProvider(Lorg/apache/hadoop/crypto/key/KeyProviderCryptoExtension;)V
>   at 
> org.apache.hadoop.hive.shims.Hadoop23Shims.getMiniDfs(Hadoop23Shims.java:534)
>   at 
> org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption.initEncryptionShim(TestHCatLoaderEncryption.java:252)
>   at 
> org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption.setup(TestHCatLoaderEncryption.java:200)
> {noformat}
> It looks like between Hadoop 2.6 and Hadoop 2.7, the argument to 
> DFSClient.setKeyProvider() changed:
> {noformat}
>@VisibleForTesting
> -  public void setKeyProvider(KeyProviderCryptoExtension provider) {
> -this.provider = provider;
> +  public void setKeyProvider(KeyProvider provider) {
> {noformat}
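
The incompatibility above can be papered over with reflection: instead of linking against one signature at compile time, look up whichever single-argument setKeyProvider overload the running DFSClient declares and invoke it. A minimal, self-contained sketch of that technique (not the actual HIVE-11762 patch); Provider, CryptoProvider, OldClient, and NewClient are stand-ins invented here because the real Hadoop classes are not assumed on the classpath:

```java
import java.lang.reflect.Method;

public class KeyProviderShim {
  // Stand-ins mimicking KeyProvider and its subclass KeyProviderCryptoExtension.
  public static class Provider {}
  public static class CryptoProvider extends Provider {}

  // Stand-in for Hadoop 2.6's DFSClient: the setter takes the subclass.
  public static class OldClient {
    public Provider provider;
    public void setKeyProvider(CryptoProvider p) { this.provider = p; }
  }

  // Stand-in for Hadoop 2.7's DFSClient: the setter takes the base class.
  public static class NewClient {
    public Provider provider;
    public void setKeyProvider(Provider p) { this.provider = p; }
  }

  /** Invoke target.setKeyProvider(arg) against whichever parameter type is declared. */
  public static void setKeyProvider(Object target, Object arg) throws Exception {
    for (Method m : target.getClass().getMethods()) {
      if (m.getName().equals("setKeyProvider")
          && m.getParameterTypes().length == 1
          && m.getParameterTypes()[0].isAssignableFrom(arg.getClass())) {
        m.invoke(target, arg);
        return;
      }
    }
    throw new NoSuchMethodException("no compatible setKeyProvider overload");
  }
}
```

Passing a KeyProviderCryptoExtension through such a helper satisfies both the 2.6 and 2.7 signatures, since the reflective lookup matches by assignability rather than by the exact descriptor recorded in compiled bytecode.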



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-21 Thread Jason Dere (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14900304#comment-14900304
 ] 

Jason Dere commented on HIVE-11762:
---

Test results look good. Does this one look ok [~spena]?



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-21 Thread Sergio Peña (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14900765#comment-14900765
 ] 

Sergio Peña commented on HIVE-11762:


It is correct [~jdere] :)
+1





[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-19 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14877310#comment-14877310
 ] 

Hive QA commented on HIVE-11762:




{color:red}Overall{color}: -1 at least one tests failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12761245/HIVE-11762.4.patch

{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 9452 tests executed
*Failed tests:*
{noformat}
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation
{noformat}

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/5345/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/5345/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-5345/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12761245 - PreCommit-HIVE-TRUNK-Build



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-12 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14741918#comment-14741918
 ] 

Hive QA commented on HIVE-11762:




{color:red}Overall{color}: -1 at least one tests failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12755278/HIVE-11762.3.patch

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 9424 tests executed
*Failed tests:*
{noformat}
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation
org.apache.hive.hcatalog.streaming.TestStreaming.testEndpointConnection
org.apache.hive.hcatalog.streaming.TestStreaming.testTransactionBatchEmptyCommit
{noformat}

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/5249/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/5249/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-5249/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12755278 - PreCommit-HIVE-TRUNK-Build



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-11 Thread Xuefu Zhang (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14741074#comment-14741074
 ] 

Xuefu Zhang commented on HIVE-11762:


Could that be caused by adding  on the dependency? From the 
stacktrace, it appears that KeyProvider class is not found. Is that class 
expected in hadoop-hdfs.jar?
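
One way to answer jar-location questions like this from the JVM itself: a class's ProtectionDomain records the jar or directory it was loaded from (for what it's worth, org.apache.hadoop.crypto.key.KeyProvider ships in hadoop-common, not hadoop-hdfs). A small self-contained helper; ClassLocator is a name invented here:

```java
import java.security.CodeSource;

// Report where a class was loaded from; bootstrap/platform classes have no
// CodeSource and yield null.
public class ClassLocator {
  public static String locate(Class<?> clazz) {
    CodeSource src = clazz.getProtectionDomain().getCodeSource();
    return src == null ? null : src.getLocation().toString();
  }
}
```

On the failing classpath, ClassLocator.locate(KeyProvider.class) would show whether the class resolves at all and, if so, from which artifact.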



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-11 Thread Sergio Peña (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14741020#comment-14741020
 ] 

Sergio Peña commented on HIVE-11762:


I don't know what could be causing the issue. However, the 
KeyProviderCryptoExtension is only used on TestEncryptedHDFSCliDriver.
[~xuefuz] Any idea why TestSparkCliDriver is failing with the KeyProvider tests?



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-10 Thread Sergio Peña (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14738889#comment-14738889
 ] 

Sergio Peña commented on HIVE-11762:


It looks good [~jdere]
+1



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-10 Thread Jason Dere (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739647#comment-14739647
 ] 

Jason Dere commented on HIVE-11762:
---

Whoa, a lot of failures... I ran TestSparkCliDriver and saw the following error:
{noformat}
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - Exception in thread "main" 
java.lang.NoClassDefFoundError: org/apache/hadoop/crypto/key/KeyProvider
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hadoop.hive.shims.Hadoop23Shims.(Hadoop23Shims.java:1058)
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at java.lang.Class.forName0(Native 
Method)
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.lang.Class.forName(Class.java:190)
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hadoop.hive.conf.HiveConf$ConfVars.(HiveConf.java:369)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hive.spark.client.rpc.RpcConfiguration.(RpcConfiguration.java:46)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hive.spark.client.RemoteDriver.(RemoteDriver.java:146)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.lang.reflect.Method.invoke(Method.java:606)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - Caused by: java.lang.ClassNotFoundException: 
org.apache.hadoop.crypto.key.KeyProvider
2015-09-10 14:14:31,971 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.net.URLClassLoader$1.run(URLClassLoader.java:366)
2015-09-10 14:14:31,972 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.net.URLClassLoader$1.run(URLClassLoader.java:355)
2015-09-10 14:14:31,972 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.security.AccessController.doPrivileged(Native Method)
2015-09-10 14:14:31,972 INFO  [stderr-redir-1] client.SparkClientImpl 
(SparkClientImpl.java:run(588)) - at 
java.net.URLClassLoader.findClass(URLClassLoader.java:354)
2015-09-10 14:14:31,972 INFO  [stderr-redir-1] client.SparkClientImpl 

[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-10 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739556#comment-14739556
 ] 

Hive QA commented on HIVE-11762:




{color:red}Overall{color}: -1 at least one tests failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12755035/HIVE-11762.2.patch

{color:red}ERROR:{color} -1 due to 592 failed/errored test(s), 9424 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_add_part_multiple
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_alter_merge_orc
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_alter_merge_stats_orc
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_annotate_stats_join
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join0
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join1
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join10
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join11
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join12
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join13
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join14
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join15
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join16
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join17
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join18
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join18_multi_distinct
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join19
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join2
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join20
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join21
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join22
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join23
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join24
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join26
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join27
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join28
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join29
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join3
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join30
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join31
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join32
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join4
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join5
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join6
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join7
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join8
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join9
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_filters
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_nulls
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_reordering_values
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_stats
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_stats2
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_join_without_localtask
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_smb_mapjoin_14
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_1
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_10
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_12
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_13
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_14
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_15
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_16
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_2
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_3
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_4
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_5
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_6
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_7
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_auto_sortmerge_join_8

[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-09 Thread Sergio Peña (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14737118#comment-14737118
 ] 

Sergio Peña commented on HIVE-11762:


Hi [~asuresh]

The KeyProvider is used for running tests with encrypted tables. It is also 
used on Hive to check the encrypted key strength when attempting to copy tables 
between different encryption zones.

But I see the error here is only when configuring the MiniDFS for Hive tests. 
If you go to {{Hadoop23Shims.getMiniDfs()}}, you will see that the KeyProvider 
is obtained from {{miniDFSCluster}}, and then set again using the same 
{{miniDFSCluster}} object. That's the only place where we use the 
{{KeyProviderCryptoExtension}}. It is weird we get and set again, but HDFS 2.6 
does not have the keyProvider assigned even if the get() method returns one (it 
might be a bug). However, this KeyProvider is needed on the MiniDFS to be able 
to run TestEncryptedHDFSCliDriver.

Is there a way to get the HDFS version on Hadoop23Shims so so that we use 
KeyProvider instead of KeyProviderCryptoExtension?
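
On the version question: Hadoop does expose its build version as a string at runtime (org.apache.hadoop.util.VersionInfo.getVersion() is the usual entry point), so a shim could branch on it before choosing an overload. A self-contained sketch of just the comparison; the VersionInfo call itself is left out since Hadoop is not assumed on the classpath here:

```java
// Branch on a Hadoop version string such as "2.7.0". A real shim would feed
// this the result of org.apache.hadoop.util.VersionInfo.getVersion().
public class VersionCompat {
  /** True iff version "x.y[.z...]" is at least major.minor. */
  public static boolean atLeast(String version, int major, int minor) {
    String[] parts = version.split("\\.");
    int maj = Integer.parseInt(parts[0]);
    int min = parts.length > 1 ? Integer.parseInt(parts[1]) : 0;
    return maj > major || (maj == major && min >= minor);
  }
}
```

With this, atLeast(VersionInfo.getVersion(), 2, 7) would select the KeyProvider signature; probing for the overload via reflection is the other option and avoids version-string parsing entirely.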



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-08 Thread Arun Suresh (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14735538#comment-14735538
 ] 

Arun Suresh commented on HIVE-11762:


Hmmm.. that's funny... KeyProviderCryptoExtension extends KeyProviderExtension, 
which extends KeyProvider, so I don't see why there is a problem..
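
One explanation for the puzzle: the subtype relation is exactly what makes it fail. JVM method resolution matches the parameter descriptor recorded in the compiled call site exactly, and Class.getMethod behaves the same way, so bytecode compiled against setKeyProvider(KeyProviderCryptoExtension) does not link to 2.7's setKeyProvider(KeyProvider), even though any argument accepted by the old signature is accepted by the new one. A self-contained demo of the exact-match rule, with stand-in classes invented here:

```java
// Method lookup is by exact parameter type, not by assignability; this is
// why NoSuchMethodError appears despite Sub extending Base.
public class ExactDescriptorDemo {
  public static class Base {}             // ~ KeyProvider
  public static class Sub extends Base {} // ~ KeyProviderCryptoExtension

  // Like Hadoop 2.7's DFSClient, declares only the Base overload.
  public static class Client {
    public void setKeyProvider(Base p) {}
  }

  /** True iff Client declares setKeyProvider with exactly this parameter type. */
  public static boolean hasOverload(Class<?> paramType) {
    try {
      Client.class.getMethod("setKeyProvider", paramType);
      return true;
    } catch (NoSuchMethodException e) {
      return false;
    }
  }
}
```

Looking up the method with Sub as the parameter type fails even though setKeyProvider(Base) would happily accept a Sub instance, which mirrors the NoSuchMethodError reported in this issue.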



[jira] [Commented] (HIVE-11762) TestHCatLoaderEncryption failures when using Hadoop 2.7

2015-09-08 Thread Jason Dere (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14735588#comment-14735588
 ] 

Jason Dere commented on HIVE-11762:
---

This is used within Hive for testing as well (when setting up MiniMR tests), if 
that makes it any more acceptable to use within Hive.
Someone else might have to comment if it's possible to eliminate its use in 
Hive - [~spena]?
