[ https://issues.apache.org/jira/browse/HIVE-10727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14549751#comment-14549751 ]

Hive QA commented on HIVE-10727:
--------------------------------



{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12733649/HIVE-10727.patch

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 8946 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udaf_context_ngrams
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udaf_percentile_approx_23
org.apache.hadoop.hive.cli.TestEncryptedHDFSCliDriver.testCliDriver_encryption_insert_partition_static
{noformat}

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/3936/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/3936/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-3936/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12733649 - PreCommit-HIVE-TRUNK-Build

> Import throws error message "org.apache.thrift.protocol.TProtocolException: 
> Required field 'filesAdded' is unset!"
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-10727
>                 URL: https://issues.apache.org/jira/browse/HIVE-10727
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.2.0
>            Reporter: Balu Vellanki
>            Assignee: Sushanth Sowmyan
>         Attachments: HIVE-10727.patch, hive.log
>
>
> Here are the steps to reproduce. Set up two Hive warehouses with 
> hive.metastore.event.listeners set to 
> org.apache.hive.hcatalog.listener.DbNotificationListener.
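> As a sketch, the listener can be configured in each metastore's 
> hive-site.xml (the property name and listener class are from this report; 
> the file layout is the standard Hadoop configuration format):
> {code}
> <property>
>   <name>hive.metastore.event.listeners</name>
>   <value>org.apache.hive.hcatalog.listener.DbNotificationListener</value>
> </property>
> {code}
> On warehouse 1, do the following as user hive: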
> {code}
> -- create table
> CREATE TABLE page_view4(viewTime INT, userid BIGINT,
>      page_url STRING, referrer_url STRING,
>      ip STRING COMMENT 'IP Address of the User')
>  COMMENT 'This is the page view table'
>  PARTITIONED BY(dt STRING, country STRING)
>  STORED AS SEQUENCEFILE;
> -- Add partitions
> alter table page_view4 add partition (dt="1", country="usa");
> alter table page_view4 add partition (dt="2", country="india");
> insert into table page_view4 PARTITION (dt="1", country="usa") VALUES (1, 1, 
> "url1", "referurl1", "ip1");
> -- Export table
> export table page_view4 to '/tmp/export4' for replication('foo');
> {code}
> '/tmp/export4' is created with owner hive and group hdfs. The warehouse 
> directory '/apps/hive/warehouse/page_view4/' is created with owner hive and 
> group users.
> Copy the exported data in '/tmp/export4' to HDFS on warehouse 2; the copied 
> data is still owned by hive and belongs to group hdfs.
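> One way to do the copy is with distcp; a sketch, where the NameNode host 
> names of the two warehouses are hypothetical:
> {code}
> bash# hadoop distcp hdfs://warehouse1-nn:8020/tmp/export4 \
>           hdfs://warehouse2-nn:8020/tmp/export4
> {code}
> Then change the group of the '/tmp/export4' directory to users: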
> {code}
> bash# su - hdfs
> hdfs : bash# hadoop fs -chown -R hive:users /tmp/export4
> {code}
> As user hive, do the following:
> {code}
> hive> import table page_view4 from '/tmp/export4' ;
> Copying data from hdfs://node-4.example.com:8020/tmp/export4/dt=1/country=usa
> ....
> Loading data to table default.page_view4 partition (country=usa, dt=1)
> Failed with exception org.apache.hadoop.hive.ql.metadata.HiveException: 
> org.apache.thrift.protocol.TProtocolException: Required field 'filesAdded' is 
> unset! Struct:InsertEventRequestData(filesAdded:null)
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.MoveTask
> {code}
> The import failed. The logs from /tmp/hive/hive.log are attached for 
> further debugging.
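>
> The exception comes from Thrift's required-field validation: 'filesAdded' 
> is a required field of InsertEventRequestData, so serializing the struct 
> while that field is still null throws TProtocolException at write time. A 
> minimal Java sketch of a defensive guard, assuming only the 
> Thrift-generated metastore API class; this is an illustration of the 
> failure mode, not the committed fix:
> {code}
> import java.util.ArrayList;
> import java.util.List;
>
> import org.apache.hadoop.hive.metastore.api.InsertEventRequestData;
>
> public class FilesAddedGuard {
>   // Hypothetical helper: build the insert-event payload without ever
>   // leaving the required 'filesAdded' field unset.
>   public static InsertEventRequestData toEventData(List<String> newFiles) {
>     InsertEventRequestData data = new InsertEventRequestData();
>     // If no files were moved (newFiles is null), substitute an empty list;
>     // a null required field fails Thrift validation with
>     // "Required field 'filesAdded' is unset!".
>     data.setFilesAdded(newFiles == null ? new ArrayList<String>() : newFiles);
>     return data;
>   }
> }
> {code}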



