[ https://issues.apache.org/jira/browse/HIVE-8394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14191772#comment-14191772 ]

Hive QA commented on HIVE-8394:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12678359/HIVE-8394.2.patch

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1573/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1573/console
Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-1573/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.15.0-SNAPSHOT/hive-contrib-0.15.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HBase Handler 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-handler ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-hbase-handler ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-hbase-handler ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
[INFO] Compiling 37 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseSerDe.java: Some input files use or override a deprecated API.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseSerDe.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseSerDeParameters.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseSerDeParameters.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- avro-maven-plugin:1.7.6:protocol (default) @ hive-hbase-handler ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.7:add-test-source (add-test-sources) @ hive-hbase-handler ---
[INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/gen/avro/gen-java added.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-hbase-handler ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
     [copy] Copying 8 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
[INFO] Compiling 16 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/test-classes
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/org/apache/hadoop/hive/hbase/TestHBaseKeyFactory2.java: Some input files use or override a deprecated API.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/org/apache/hadoop/hive/hbase/TestHBaseKeyFactory2.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/org/apache/hadoop/hive/hbase/avro/ContactInfo.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/org/apache/hadoop/hive/hbase/avro/ContactInfo.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-handler ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.15.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-hbase-handler ---
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hbase-handler ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.15.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-handler ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.15.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.15.0-SNAPSHOT/hive-hbase-handler-0.15.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.15.0-SNAPSHOT/hive-hbase-handler-0.15.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.15.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.15.0-SNAPSHOT/hive-hbase-handler-0.15.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-hcatalog ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
     [copy] Copying 8 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-hcatalog ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog/0.15.0-SNAPSHOT/hive-hcatalog-0.15.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Core 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-core ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-hcatalog-core ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-hcatalog-core ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
[INFO] Compiling 79 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/JsonSerDe.java: Some input files use or override a deprecated API.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/JsonSerDe.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/HCatBaseOutputFormat.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/HCatBaseOutputFormat.java: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/DynamicPartitionFileRecordWriterContainer.java:[104,44] method register in class org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry cannot be applied to given types;
  required: org.apache.hadoop.mapreduce.TaskAttemptContext,org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry.TaskCommitterProxy
  found: org.apache.hadoop.mapreduce.TaskAttemptID,<anonymous org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry.TaskCommitterProxy>
  reason: actual argument org.apache.hadoop.mapreduce.TaskAttemptID cannot be converted to org.apache.hadoop.mapreduce.TaskAttemptContext by method invocation conversion
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [12.650s]
[INFO] Hive Shims Common ................................. SUCCESS [7.257s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [3.670s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [5.452s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [2.454s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [8.050s]
[INFO] Hive Shims ........................................ SUCCESS [1.852s]
[INFO] Hive Common ....................................... SUCCESS [26.256s]
[INFO] Hive Serde ........................................ SUCCESS [21.837s]
[INFO] Hive Metastore .................................... SUCCESS [37.088s]
[INFO] Hive Ant Utilities ................................ SUCCESS [1.887s]
[INFO] Hive Query Language ............................... SUCCESS [1:37.469s]
[INFO] Hive Service ...................................... SUCCESS [12.555s]
[INFO] Hive Accumulo Handler ............................. SUCCESS [6.831s]
[INFO] Hive JDBC ......................................... SUCCESS [1:28.444s]
[INFO] Hive Beeline ...................................... SUCCESS [1.530s]
[INFO] Hive CLI .......................................... SUCCESS [1.709s]
[INFO] Hive Contrib ...................................... SUCCESS [1.603s]
[INFO] Hive HBase Handler ................................ SUCCESS [8.545s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.600s]
[INFO] Hive HCatalog Core ................................ FAILURE [2.134s]
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog Streaming ........................... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5:52.804s
[INFO] Finished at: Fri Oct 31 09:00:12 EDT 2014
[INFO] Final Memory: 125M/832M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-hcatalog-core: Compilation failure
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/DynamicPartitionFileRecordWriterContainer.java:[104,44] method register in class org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry cannot be applied to given types;
[ERROR] required: org.apache.hadoop.mapreduce.TaskAttemptContext,org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry.TaskCommitterProxy
[ERROR] found: org.apache.hadoop.mapreduce.TaskAttemptID,<anonymous org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry.TaskCommitterProxy>
[ERROR] reason: actual argument org.apache.hadoop.mapreduce.TaskAttemptID cannot be converted to org.apache.hadoop.mapreduce.TaskAttemptContext by method invocation conversion
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-hcatalog-core
+ exit 1
{noformat}
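
For reference, the break is a straightforward signature mismatch: {{TaskCommitContextRegistry.register()}} is declared to take a {{TaskAttemptContext}}, but the call site at {{DynamicPartitionFileRecordWriterContainer.java:104}} passes a {{TaskAttemptID}}. A minimal sketch of the shape of the failure, using a simplified stand-in for the registry (not Hive's actual class):

{code:java}
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.TaskAttemptID;

// Simplified stand-in for org.apache.hive.hcatalog.mapreduce.TaskCommitContextRegistry.
class RegistrySketch {
  interface TaskCommitterProxy {
    void commitTask() throws java.io.IOException;
  }

  // register() is declared against the full task context...
  void register(TaskAttemptContext context, TaskCommitterProxy proxy) {
    // (registration elided)
  }

  void callSite(TaskAttemptContext context) {
    TaskAttemptID id = context.getTaskAttemptID();
    // ...so passing only the ID fails exactly as the log shows:
    //   register(id, proxy);  // javac: TaskAttemptID cannot be converted to TaskAttemptContext
    register(context, new TaskCommitterProxy() {  // passing the context compiles
      @Override
      public void commitTask() {
      }
    });
  }
}
{code}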

This message is automatically generated.

ATTACHMENT ID: 12678359 - PreCommit-HIVE-TRUNK-Build

> HIVE-7803 doesn't handle Pig MultiQuery, can cause data-loss.
> -------------------------------------------------------------
>
>                 Key: HIVE-8394
>                 URL: https://issues.apache.org/jira/browse/HIVE-8394
>             Project: Hive
>          Issue Type: Bug
>          Components: HCatalog
>    Affects Versions: 0.12.0, 0.14.0, 0.13.1
>            Reporter: Mithun Radhakrishnan
>            Assignee: Mithun Radhakrishnan
>            Priority: Critical
>         Attachments: HIVE-8394.1.patch, HIVE-8394.2.patch
>
>
> We've found situations in production where Pig queries using {{HCatStorer}}, 
> dynamic partitioning, and {{opt.multiquery=true}} produce partitions in the 
> output table, but the corresponding directories contain no data files (in 
> spite of Pig reporting a non-zero number of records written to HDFS). I don't 
> yet have a distilled test case for this.
> Here's the code from FileOutputCommitterContainer after HIVE-7803:
> {code:java|title=FileOutputCommitterContainer.java|borderStyle=dashed|titleBGColor=#F7D6C1|bgColor=#FFFFCE}
>   @Override
>   public void commitTask(TaskAttemptContext context) throws IOException {
>     String jobInfoStr = context.getConfiguration().get(FileRecordWriterContainer.DYN_JOBINFO);
>     if (!dynamicPartitioningUsed) {
>       // See HCATALOG-499
>       FileOutputFormatContainer.setWorkOutputPath(context);
>       getBaseOutputCommitter().commitTask(HCatMapRedUtil.createTaskAttemptContext(context));
>     } else if (jobInfoStr != null) {
>       ArrayList<String> jobInfoList = (ArrayList<String>) HCatUtil.deserialize(jobInfoStr);
>       org.apache.hadoop.mapred.TaskAttemptContext currTaskContext =
>           HCatMapRedUtil.createTaskAttemptContext(context);
>       for (String jobStr : jobInfoList) {
>         OutputJobInfo localJobInfo = (OutputJobInfo) HCatUtil.deserialize(jobStr);
>         FileOutputCommitter committer =
>             new FileOutputCommitter(new Path(localJobInfo.getLocation()), currTaskContext);
>         committer.commitTask(currTaskContext);
>       }
>     }
>   }
> {code}
> The serialized jobInfoList can't be retrieved, and hence the commit never 
> completes. This is because Pig's MapReducePOStoreImpl deliberately clones 
> both the TaskAttemptContext and the contained Configuration instance, thus 
> separating the Configuration instances passed to 
> {{FileOutputCommitterContainer::commitTask()}} and 
> {{FileRecordWriterContainer::close()}}. Anything set by the RecordWriter is 
> unavailable to the Committer.
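> To make that concrete, here's a minimal, hypothetical sketch (the property 
> key is invented; the real one is {{FileRecordWriterContainer.DYN_JOBINFO}}) 
> of why a value set on a cloned {{Configuration}} never reaches the committer:
> {code:java}
> import org.apache.hadoop.conf.Configuration;
>
> public class ClonedConfDemo {
>   public static void main(String[] args) {
>     Configuration original = new Configuration(false);
>     // Roughly what Pig's MapReducePOStoreImpl does: a copy of the conf.
>     Configuration clone = new Configuration(original);
>
>     // RecordWriter side: stashes serialized job info on its (cloned) conf.
>     clone.set("hcat.dyn.jobinfo", "serialized-OutputJobInfo");
>
>     // Committer side: looks at the other copy and finds nothing.
>     System.out.println(original.get("hcat.dyn.jobinfo"));  // prints "null"
>   }
> }
> {code}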
> One approach would have been to store the state in the {{FileOutputFormatContainer}}. 
> But that won't work, since the container is constructed via reflection in 
> {{HCatOutputFormat}} (itself constructed reflectively by {{PigOutputFormat}}, by way 
> of {{HCatStorer}}); there's no guarantee that the same instance survives between the 
> write and the commit.
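> A small hypothetical demo (class and field names invented for illustration) of 
> why instance state doesn't survive reflective construction:
> {code:java}
> // Each reflective construction yields a fresh object, so instance fields
> // set on one copy are invisible to the next.
> public class ReflectionDemo {
>   public String stashedState;
>
>   public static void main(String[] args) throws Exception {
>     Class<?> clazz = Class.forName("ReflectionDemo");
>     ReflectionDemo first = (ReflectionDemo) clazz.newInstance();
>     first.stashedState = "serialized-jobInfo";
>     ReflectionDemo second = (ReflectionDemo) clazz.newInstance();
>     System.out.println(second.stashedState);  // prints "null": nothing carried over
>   }
> }
> {code}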
> My only recourse seems to be to use a Singleton to store shared state. I'm 
> loath to indulge in this brand of shenanigans. (Statics and container-reuse 
> in Tez might not play well together, for instance.) It might work if we're 
> careful about tearing down the singleton.
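> For what it's worth, a rough sketch of what such a singleton might look like 
> (names illustrative, not a patch), keyed by {{TaskAttemptID}} so the writer and 
> committer can rendezvous even after their Configurations diverge:
> {code:java}
> import java.io.IOException;
> import java.util.concurrent.ConcurrentHashMap;
> import java.util.concurrent.ConcurrentMap;
> import org.apache.hadoop.mapreduce.TaskAttemptID;
>
> public final class CommitStateRegistry {
>   private static final CommitStateRegistry INSTANCE = new CommitStateRegistry();
>   private final ConcurrentMap<String, String> jobInfoByAttempt =
>       new ConcurrentHashMap<String, String>();
>
>   private CommitStateRegistry() {}
>
>   public static CommitStateRegistry getInstance() { return INSTANCE; }
>
>   // RecordWriter side: called from close(), before the task commits.
>   public void register(TaskAttemptID id, String serializedJobInfo) {
>     jobInfoByAttempt.put(id.toString(), serializedJobInfo);
>   }
>
>   // Committer side: remove-on-read doubles as teardown, so state can't
>   // leak across task/container reuse (the Tez concern above).
>   public String take(TaskAttemptID id) throws IOException {
>     String info = jobInfoByAttempt.remove(id.toString());
>     if (info == null) {
>       throw new IOException("No commit state registered for " + id);
>     }
>     return info;
>   }
> }
> {code}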
> Any other ideas? 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
