Build failed in Hudson: Hive-trunk-h0.20 #386

2010-10-08  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/386/changes

Changes:

[namit] HIVE-1570 referencing an added file by its name in a transform script
does not work in hive local mode (Joydeep Sen Sarma via namit)
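
For context, HIVE-1570 is about a script that has been registered with ADD FILE and is then referenced by its bare name in a TRANSFORM clause while the query runs in Hive local mode. A minimal HiveQL sketch of that pattern, assuming the standard ADD FILE / TRANSFORM syntax and the auto local-mode switch; the script path, script name, and output column names are illustrative, not taken from the patch:

    ADD FILE /tmp/my_script.py;              -- hypothetical script path
    SET hive.exec.mode.local.auto=true;      -- allow Hive to run small jobs in local mode
    SELECT TRANSFORM (key, value)
        USING 'my_script.py'                 -- script referenced by bare name, not full path
        AS (k, v)
    FROM src;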

--
[...truncated 14092 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK

Hudson build is back to normal : Hive-trunk-h0.17 #563

2010-10-09  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.17/563/changes




Build failed in Hudson: Hive-trunk-h0.19 #564

2010-10-09  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/564/changes

Changes:

[namit] HIVE-537. Hive TypeInfo/ObjectInspector to support union
(Amareshwari Sriramadasu via namit)

[namit] HIVE-1546 Ability to plug custom Semantic Analyzers for Hive Grammar
(Ashutosh Chauhan via namit)
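
For context, HIVE-537 adds union support to TypeInfo/ObjectInspector, which surfaces at the DDL level as a uniontype column. A minimal HiveQL sketch, assuming the UNIONTYPE<...> syntax this patch introduces; the table and column names are made up for illustration:

    CREATE TABLE union_demo (                          -- hypothetical table
        id INT,
        payload UNIONTYPE<INT, DOUBLE, ARRAY<STRING>>  -- holds one of these types plus a tag
    );

HIVE-1546 makes the semantic-analysis stage pluggable through a hook. A rough sketch of how such a hook might be wired in; the configuration key and the class name are assumptions for illustration, not confirmed from the patch:

    SET hive.semantic.analyzer.hook=com.example.MyAnalyzerHook;  -- assumed key and hook class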

--
[...truncated 12205 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to 

Build failed in Hudson: Hive-trunk-h0.19 #565

2010-10-10  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/565/

--
[...truncated 12280 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.19 #566

2010-10-11  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/566/

--
[...truncated 12225 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.19/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.20 #389

2010-10-11  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/389/

--
[...truncated 14197 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.18 #567

2010-10-12  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/567/changes

Changes:

[jvs] HIVE-1264. Make Hive work with Hadoop security
(Todd Lipcon via jvs)
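
For context, HIVE-1264 is about running Hive against a Kerberos-secured Hadoop cluster. A rough sketch of the kind of metastore security settings involved; the property names and values are assumptions for illustration, and in practice they would live in hive-site.xml rather than be SET per session:

    SET hive.metastore.sasl.enabled=true;                           -- assumed key: use SASL/Kerberos for metastore calls
    SET hive.metastore.kerberos.keytab.file=/etc/hive/hive.keytab;  -- assumed key and path
    SET hive.metastore.kerberos.principal=hive/_HOST@EXAMPLE.COM;   -- assumed key and principal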

--
[...truncated 31063 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Build failed in Hudson: Hive-trunk-h0.20 #397

2010-10-19  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/397/changes

Changes:

[namit] spelling mistake in Siying's name

[namit] HIVE-1638. convert commonly used udfs to generic udfs
(Siyong Dong via namit)

[jvs] HIVE-1726. Update README file for 0.6.0 release
(Carl Steinbach via jvs)

[jvs] HIVE-1725. Include metastore upgrade scripts in release tarball
(Carl Steinbach via jvs)

--
[...truncated 15079 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: 

Hudson build is back to normal : Hive-trunk-h0.20 #398

2010-10-20  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/398/changes




Build failed in Hudson: Hive-trunk-h0.20 #403

2010-10-25 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/403/

--
[...truncated 15087 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Build failed in Hudson: Hive-trunk-h0.20 #406

2010-10-28  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/406/changes

Changes:

[namit] HIVE-1757 Test cleanup for 1641
(Liyin Tang via namit)

[namit] HIVE-1755 Update broken test outputs due to 1641
(He Yongqiang via namit)

[namit] HIVE-474 Support for distinct selection on two or more columns
(Amareshwari Sriramadasu via namit)
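
For context, HIVE-474 allows DISTINCT aggregation over more than one column. A minimal HiveQL sketch against the src test table that these runs load; the query itself is illustrative, not taken from the test suite:

    -- Count distinct (key, value) pairs; multi-column DISTINCT is what HIVE-474 enables.
    SELECT COUNT(DISTINCT key, value) FROM src;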

--
[...truncated 15243 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data 

Build failed in Hudson: Hive-trunk-h0.20 #410

2010-11-02  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/410/

--
[...truncated 15162 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Hudson build is back to normal : Hive-trunk-h0.20 #411

2010-11-03  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/411/changes




Build failed in Hudson: Hive-trunk-h0.20 #413

2010-11-05  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/413/changes

Changes:

[namit] HIVE-1767 Bug in merging files for dynamic partitions
(He Yongqiang via namit)

[namit] HIVE-1751. Optimize ColumnarStructObjectInspector.getStructFieldData()
(Siying Dong via namit)
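
For context, HIVE-1767 concerns the small-file merge step that runs after a dynamic-partition insert. A minimal HiveQL sketch of a statement that exercises that path, assuming the standard dynamic-partition and merge switches; the target table srcpart_copy is hypothetical, while srcpart and its ds/hr partition columns appear in the test logs above:

    -- hypothetical copy of srcpart with the same layout
    CREATE TABLE srcpart_copy (key STRING, value STRING) PARTITIONED BY (ds STRING, hr STRING);
    SET hive.exec.dynamic.partition=true;             -- assumed standard dynamic-partition switches
    SET hive.exec.dynamic.partition.mode=nonstrict;
    SET hive.merge.mapfiles=true;                     -- merge small files produced by the insert
    INSERT OVERWRITE TABLE srcpart_copy PARTITION (ds, hr)
    SELECT key, value, ds, hr FROM srcpart;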

--
[...truncated 16480 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 

Hudson build is back to normal : Hive-trunk-h0.20 #416

2010-11-08  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/416/changes




Hudson build is back to normal : Hive-trunk-h0.20 #426

2010-11-18  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/426/changes




Build failed in Hudson: Hive-trunk-h0.20 #430

2010-11-22  Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/430/changes

Changes:

[namit] HIVE-1787 optimize the code path when there are no outer joins
(Siying Dong via namit)

--
[...truncated 15439 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: 

Hudson build is back to normal : Hive-trunk-h0.20 #431

2010-11-23 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/431/




Hudson build is back to normal : Hive-trunk-h0.20 #434

2010-12-06 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/434/




Build failed in Hudson: Hive-trunk-h0.20 #440

2010-12-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/440/

--
[...truncated 15041 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Hudson build is back to normal : Hive-trunk-h0.20 #441

2010-12-13 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/441/




Build failed in Hudson: Hive-trunk-h0.20 #444

2010-12-16 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/444/changes

Changes:

[heyongqiang] HIVE-842 Authentication Infrastructure for Hive. (Ashutosh 
Chauhan via He Yongqiang)

--
[...truncated 15542 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: 

Build failed in Hudson: Hive-trunk-h0.20 #445

2010-12-17 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/445/changes

Changes:

[namit] HIVE-1853 Downgrade JDO (Paul Yang via namit)

--
[...truncated 5861 lines...]
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap

Build failed in Hudson: Hive-trunk-h0.20 #446

2010-12-18 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/446/

--
[...truncated 5845 lines...]
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
  

Build failed in Hudson: Hive-trunk-h0.20 #447

2010-12-19 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/447/

--
[...truncated 5845 lines...]
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
  

Build failed in Hudson: Hive-trunk-h0.20 #448

2010-12-20 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/448/

--
[...truncated 5845 lines...]
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
  

Build failed in Hudson: Hive-trunk-h0.20 #449

2010-12-21 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/449/changes

Changes:

[pauly] HIVE-1857 mixed case tablename on lefthand side of LATERAL VIEW results 
in
query failing with confusing error message (John Sichi via pauly)

[namit] HIVE-1855 Include Process ID in the log4j log file name
(Ning Zhang via namit)

[nzhang] HIVE-1835. Better auto-complete for Hive (Paul Butler via Ning Zhang)

[jssarma] commit second diff for hive-1846 (rvadali via jssarma)

[jssarma] Reversing erroneous commit

[jssarma] commit second diff for hive-1846 (rvadali via jssarma)

[namit] HIVE-1854 Temporarily disable metastore tests for 
listPartitionsByFilter()
(Paul Yang via namit)

--
[...truncated 15104 lines...]
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] rmr: cannot remove 
phttps://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12:
 No such file or directory.
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt

Build failed in Hudson: Hive-trunk-h0.20 #451

2010-12-23 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/451/changes

Changes:

[jssarma] HIVE-1852 Reduce unnecessary DFSClient.rename() calls (Ning Zhang via 
jssarma)

--
[...truncated 14785 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Build failed in Hudson: Hive-trunk-h0.20 #452

2010-12-24 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/452/changes

Changes:

[namit] HIVE-1806 Merge per dynamic partition based on size of each dynamic 
partition
(Ning Zhang via namit)

[namit] HIVE-1456 No need to check for LOG as null in sort-merge join
(Alexey Diomin via namit)

--
[...truncated 18843 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to 

Hudson build is back to normal : Hive-trunk-h0.20 #453

2010-12-25 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/453/changes




Build failed in Hudson: Hive-trunk-h0.20 #455

2010-12-27 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/455/changes

Changes:

[cws] HIVE-1790 Support HAVING clause in Hive (Vaibhav Aggarwal via cws)

--
[...truncated 14714 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Hudson build is back to normal : Hive-trunk-h0.20 #456

2010-12-28 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/456/changes




Build failed in Hudson: Hive-trunk-h0.20 #457

2010-12-28 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/457/

--
[...truncated 14806 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.20 #458

2010-12-28 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/458/changes

Changes:

[namit] HIVE-1870 Add TestRemoteHiveMetaStore deleted accidentally
(Carl Steinbach via namit)

--
[...truncated 14826 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Build failed in Hudson: Hive-trunk-h0.20 #459

2010-12-29 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/459/

--
[...truncated 5886 lines...]
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.872 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.553 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.339 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.59 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test TestLazyBinarySerDe:

Hudson build is back to normal : Hive-trunk-h0.20 #461

2010-12-31 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/461/




Build failed in Hudson: Hive-trunk-h0.20 #462

2011-01-01 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/462/

--
[...truncated 14843 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.20 #463

2011-01-02 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/463/

--
[...truncated 5886 lines...]
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.847 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.484 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.438 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.606 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test 

Build failed in Hudson: Hive-trunk-h0.20 #464

2011-01-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/464/

--
[...truncated 14764 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.20 #465

2011-01-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/465/changes

Changes:

[namit] HIVE-1874 fix HBase filter pushdown broken by HIVE-1638
(John Sichi via namit)

[namit] HIVE-1873 Fix 'tar' build target broken in HIVE-1526
(Carl Steinbach via namit)
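[Note] HIVE-1874 above restores the HBase predicate pushdown that HIVE-1638 introduced. As a rough HiveQL illustration (the table, column, and row key are made up, not taken from the patch), pushdown lets a key-equality filter on an HBase-backed table be evaluated by HBase instead of inside Hive:

    -- hypothetical HBase-backed table; names are illustrative only
    CREATE EXTERNAL TABLE hbase_demo (key string, val string)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:val');

    -- with pushdown working, this filter is applied by HBase rather than by a Hive filter operator
    SELECT val FROM hbase_demo WHERE key = 'row42';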

--
[...truncated 6178 lines...]
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.733 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.559 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.397 sec
[junit] Running 

Build failed in Hudson: Hive-trunk-h0.20 #467

2011-01-04 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/467/

--
[...truncated 14626 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Hudson build is back to normal : Hive-trunk-h0.20 #469

2011-01-05 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/469/




Build failed in Hudson: Hive-trunk-h0.20 #470

2011-01-06 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/470/changes

Changes:

[namit] HIVE-1840 Support ALTER DATABASE to change database properties
(Ning Zhang via namit)

[namit] HIVE-1881 Add an option to use FsShell to delete dir in warehouse
(He Yongqiang via namit)
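[Note] HIVE-1840 above adds ALTER DATABASE support for changing database properties. A minimal sketch of the new statement (database name and properties are placeholders, not taken from the patch):

    -- placeholder database and properties, for illustration only
    ALTER DATABASE demo_db SET DBPROPERTIES ('owner-team' = 'analytics', 'purpose' = 'testing');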

--
[...truncated 18671 lines...]
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_function2.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_function2.q.out
[junit] Done query: unknown_function2.q
[junit] Begin query: unknown_function3.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_function3.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_function3.q.out
[junit] Done query: unknown_function3.q
[junit] Begin query: unknown_function4.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #471

2011-01-06 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/471/

--
[...truncated 14567 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table 

Build failed in Hudson: Hive-trunk-h0.20 #472

2011-01-07 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/472/changes

Changes:

[namit] HIVE-1889 add an option (hive.index.compact.file.ignore.hdfs)
to ignore HDFS location stored in index files
(Yongqiang He via namit)
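[Note] HIVE-1889 above adds hive.index.compact.file.ignore.hdfs so that the absolute HDFS locations recorded inside compact index files can be ignored, e.g. after warehouse data has moved. A hedged sketch of how the option fits into compact-index usage (index and table names are illustrative, not from the patch):

    -- illustrative compact index on the src fixture table
    CREATE INDEX src_key_idx ON TABLE src (key) AS 'COMPACT' WITH DEFERRED REBUILD;
    ALTER INDEX src_key_idx ON src REBUILD;

    -- ignore the HDFS paths stored in the index data files when reading the index
    SET hive.index.compact.file.ignore.hdfs=true;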

--
[...truncated 15166 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
  

Build failed in Hudson: Hive-trunk-h0.20 #473

2011-01-07 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/473/

--
[...truncated 7225 lines...]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.861 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.367 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.436 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.376 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test TestLazyBinarySerDe:
[junit] Test TestLazyBinarySerDe passed!
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.536 sec
[junit] Running 

Build failed in Hudson: Hive-trunk-h0.20 #476

2011-01-08 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/476/

--
[...truncated 7224 lines...]
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.912 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.557 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.54 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.513 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test TestLazyBinarySerDe:
[junit] Test TestLazyBinarySerDe passed!

Build failed in Hudson: Hive-trunk-h0.20 #477

2011-01-09 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/477/

--
[...truncated 7224 lines...]
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 1.014 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.561 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.454 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.509 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test TestLazyBinarySerDe:
[junit] Test TestLazyBinarySerDe passed!

Build failed in Hudson: Hive-trunk-h0.20 #478

2011-01-10 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/478/

--
[...truncated 7225 lines...]
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TBinaryProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: org.apache.thrift.protocol.TJSONProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
[junit] bytes in text 
={-1:{i32:234},-2:{lst:[str,2,firstString,secondString]},-3:{map:[str,i32,2,{firstKey:1,secondKey:2}]},-4:{i32:-234},-5:{dbl:1.0},-6:{dbl:-2.5}}
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
[junit] bytes in text 
=234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Beginning Test testTBinarySortableProtocol:
[junit] Testing struct test { double hello}
[junit] Testing struct test { i32 hello}
[junit] Testing struct test { i64 hello}
[junit] Testing struct test { string hello}
[junit] Testing struct test { string hello, double another}
[junit] Test testTBinarySortableProtocol passed!
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
[junit] bytes in text =234  firstStringsecondString
firstKey1secondKey2
[junit] compare to=234  firstStringsecondString
firstKey1secondKey2
[junit] o class = class java.util.ArrayList
[junit] o size = 3
[junit] o = [234, null, {firstKey=1, secondKey=2}]
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.936 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyArrayMapStruct
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.5 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazyPrimitive
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.339 sec
[junit] Running org.apache.hadoop.hive.serde2.lazy.TestLazySimpleSerDe
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.574 sec
[junit] Running org.apache.hadoop.hive.serde2.lazybinary.TestLazyBinarySerDe
[junit] Beginning Test TestLazyBinarySerDe:
[junit] Test TestLazyBinarySerDe passed!

Build failed in Hudson: Hive-trunk-h0.20 #480

2011-01-11 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/480/

--
[...truncated 15632 lines...]
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #481

2011-01-11 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/481/changes

Changes:

[jvs] HIVE-1829. Fix intermittent failures in TestRemoteMetaStore
(Carl Steinbach via jvs)

--
[...truncated 16127 lines...]
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.seq
[junit] Loading data to table src_sequencefile
[junit] POSTHOOK: Output: defa...@src_sequencefile
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/complex.seq
[junit] Loading data to table src_thrift
[junit] POSTHOOK: Output: defa...@src_thrift
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: Output: defa...@src_json
[junit] OK
[junit] diff 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ql/test/logs/negative/unknown_table1.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ql/src/test/results/compiler/errors/unknown_table1.q.out
[junit] Done query: unknown_table1.q
[junit] Begin query: unknown_table2.q
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: Output: defa...@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket0.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] Copying data from 

Build failed in Hudson: Hive-trunk-h0.20 #482

2011-01-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/482/changes

Changes:

[namit] HIVE-78 Authorization model for Hive
(Yongqiang He via namit)

--
[...truncated 24035 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] PREHOOK: 

Build failed in Hudson: Hive-trunk-h0.20 #483

2011-01-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/483/changes

Changes:

[heyongqiang] HIVE-1907 Store jobid in ExecDriver (namit via He Yongqiang)

--
[...truncated 21400 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] PREHOOK: 

Build failed in Hudson: Hive-trunk-h0.20 #484

2011-01-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/484/

--
[...truncated 21975 lines...]
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: defa...@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #485

2011-01-13 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/485/changes

Changes:

[heyongqiang] HIVE-1865 redo zookeeper hive lock manager (namit via He 
Yongqiang)

--
[...truncated 21347 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] 

Build failed in Hudson: Hive-trunk-h0.20 #490

2011-01-15 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/490/

--
[...truncated 24515 lines...]
[junit] PREHOOK: Input: default@testhivejdbcdriverview
[junit] PREHOOK: Output: default@testhivejdbcdriverview
[junit] POSTHOOK: query: drop view testHiveJdbcDriverView
[junit] POSTHOOK: type: DROPVIEW
[junit] POSTHOOK: Input: default@testhivejdbcdriverview
[junit] POSTHOOK: Output: default@testhivejdbcdriverview
[junit] OK
[junit] PREHOOK: query: create view testHiveJdbcDriverView comment 'Simple 
view' as select * from testHiveJdbcDriverTable
[junit] PREHOOK: type: CREATEVIEW
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-15_11-10-48_869_4770481205834447959/-mr-1
[junit] POSTHOOK: query: create view testHiveJdbcDriverView comment 'Simple 
view' as select * from testHiveJdbcDriverTable
[junit] POSTHOOK: type: CREATEVIEW
[junit] POSTHOOK: Output: default@testHiveJdbcDriverView
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-15_11-10-48_869_4770481205834447959/-mr-1
[junit] OK
[junit] PREHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testdatatypetable@dt=20090619
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-15_11-10-48_898_8335693208673389672/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-01-15 11:10:51,726 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testdatatypetable@dt=20090619
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-15_11-10-48_898_8335693208673389672/-mr-1
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdrivertable
[junit] PREHOOK: Output: default@testhivejdbcdrivertable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdrivertable
[junit] POSTHOOK: Output: default@testhivejdbcdrivertable
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] PREHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] OK
[junit] PREHOOK: query: drop table testDataTypeTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testdatatypetable
[junit] PREHOOK: Output: default@testdatatypetable
[junit] POSTHOOK: query: drop table testDataTypeTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testdatatypetable
[junit] POSTHOOK: Output: default@testdatatypetable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101151110_1474355649.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101151110_1121452554.txt
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testHiveJdbcDriverTable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testHiveJdbcDriverTable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivejdbcdrivertable
[junit] POSTHOOK: query: load data local inpath 

Build failed in Hudson: Hive-trunk-h0.20 #491

2011-01-16 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/491/

--
[...truncated 24584 lines...]
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_11-04-17_648_6366922926584381024/-mr-1
[junit] POSTHOOK: query: create view testHiveJdbcDriverView comment 'Simple 
view' as select * from testHiveJdbcDriverTable
[junit] POSTHOOK: type: CREATEVIEW
[junit] POSTHOOK: Output: default@testHiveJdbcDriverView
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_11-04-17_648_6366922926584381024/-mr-1
[junit] OK
[junit] PREHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testdatatypetable@dt=20090619
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_11-04-17_675_4693918222610335762/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-01-16 11:04:20,500 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testdatatypetable@dt=20090619
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_11-04-17_675_4693918222610335762/-mr-1
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdrivertable
[junit] PREHOOK: Output: default@testhivejdbcdrivertable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdrivertable
[junit] POSTHOOK: Output: default@testhivejdbcdrivertable
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] PREHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] OK
[junit] PREHOOK: query: drop table testDataTypeTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testdatatypetable
[junit] PREHOOK: Output: default@testdatatypetable
[junit] POSTHOOK: query: drop table testDataTypeTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testdatatypetable
[junit] POSTHOOK: Output: default@testdatatypetable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101161104_730754518.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101161104_1033128959.txt
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testHiveJdbcDriverTable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testHiveJdbcDriverTable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivejdbcdrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testHiveJdbcDriverTable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivejdbcdrivertable
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testHiveJdbcDriverPartitionedTable 
(key int, 

Build failed in Hudson: Hive-trunk-h0.20 #492

2011-01-16 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/492/changes

Changes:

[namit] HIVE-1917 CTAS (create-table-as-select) throws exception when showing
results (Ning Zhang via namit)

--
[...truncated 24582 lines...]
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_23-23-12_353_4609221742616710084/-mr-1
[junit] POSTHOOK: query: create view testHiveJdbcDriverView comment 'Simple 
view' as select * from testHiveJdbcDriverTable
[junit] POSTHOOK: type: CREATEVIEW
[junit] POSTHOOK: Output: default@testHiveJdbcDriverView
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_23-23-12_353_4609221742616710084/-mr-1
[junit] OK
[junit] PREHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testdatatypetable@dt=20090619
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_23-23-12_381_5138612469464623136/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-01-16 23:23:15,204 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select c1, c2, c3, c4, c5 as a, c6, c7, c8, c9, 
c10, c11, c12, c1*2, sentences(null, null, null) as b from testDataTypeTable 
limit 1
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testdatatypetable@dt=20090619
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-01-16_23-23-12_381_5138612469464623136/-mr-1
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdrivertable
[junit] PREHOOK: Output: default@testhivejdbcdrivertable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdrivertable
[junit] POSTHOOK: Output: default@testhivejdbcdrivertable
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] PREHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivejdbcdriverpartitionedtable
[junit] POSTHOOK: Output: default@testhivejdbcdriverpartitionedtable
[junit] OK
[junit] PREHOOK: query: drop table testDataTypeTable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testdatatypetable
[junit] PREHOOK: Output: default@testdatatypetable
[junit] POSTHOOK: query: drop table testDataTypeTable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testdatatypetable
[junit] POSTHOOK: Output: default@testdatatypetable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101162323_1072159952.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/jdbc/tmp/hive_job_log_hudson_201101162323_712238333.txt
[junit] PREHOOK: query: drop table testHiveJdbcDriverTable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testHiveJdbcDriverTable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testHiveJdbcDriverTable (key int 
comment 'the key', value string) comment 'Simple table'
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testHiveJdbcDriverTable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testHiveJdbcDriverTable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivejdbcdrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testHiveJdbcDriverTable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivejdbcdrivertable
[junit] OK
[junit] PREHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testHiveJdbcDriverPartitionedTable
[junit] 

Build failed in Hudson: Hive-trunk-h0.20 #493

2011-01-17 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/493/

--
[...truncated 21415 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #494

2011-01-18 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/494/

--
[...truncated 25417 lines...]
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src_thrift
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt'
 INTO TABLE src_json
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt
[junit] Loading data to table src_json
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/json.txt'
 INTO TABLE src_json
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src_json
[junit] OK
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hbase-handler/test/data/warehouse/stats_src
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hbase-handler/test/logs/hbase-handler/hbase_stats.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/hbase-handler/src/test/results/hbase_stats.q.out
[junit] 17c17
[junit] < PREHOOK: type: QUERY
[junit] ---
[junit] > PREHOOK: type: null
[junit] 21c21
[junit] < POSTHOOK: type: QUERY
[junit] ---
[junit] > POSTHOOK: type: null
[junit] 97c97
[junit] < PREHOOK: type: QUERY
[junit] ---
[junit] > PREHOOK: type: null
[junit] 101c101
[junit] < POSTHOOK: type: QUERY
[junit] ---
[junit] > POSTHOOK: type: null
[junit] 111c111
[junit] < PREHOOK: type: QUERY
[junit] ---
[junit] > PREHOOK: type: null
[junit] 115c115
[junit] < POSTHOOK: type: QUERY
[junit] ---
[junit] > POSTHOOK: type: null
[junit] Exception: Client execution results failed with error code = 1
[junit] junit.framework.AssertionFailedError: Client execution results 
failed with error code = 1
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at 
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver_hbase_stats(TestHBaseCliDriver.java:235)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:154)
[junit] at junit.framework.TestCase.runBare(TestCase.java:127)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:118)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:208)
[junit] at junit.framework.TestSuite.run(TestSuite.java:203)
[junit] at 
junit.extensions.TestDecorator.basicRun(TestDecorator.java:22)
[junit] at junit.extensions.TestSetup$1.protect(TestSetup.java:19)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.extensions.TestSetup.run(TestSetup.java:23)
[junit] at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:422)
[junit] at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:931)
[junit] at 
org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:785)
[junit] Tests run: 4, Failures: 1, Errors: 0, Time elapsed: 320.852 sec
[junit] Test org.apache.hadoop.hive.cli.TestHBaseCliDriver FAILED
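
The hbase_stats.q failure above is a golden-file mismatch: the generated output
records the hook type as "null" where the checked-in result expects "QUERY".
The test exercises Hive's HBase-backed statistics publisher. Below is a minimal
HiveQL sketch of the kind of statements such a test runs; the table name
stats_src is taken from the deleted warehouse path above, but the exact
contents of hbase_stats.q are an assumption here.

    -- Hedged sketch, not the actual hbase_stats.q contents.
    SET hive.stats.dbclass=hbase;                 -- publish table stats to HBase
    CREATE TABLE stats_src (key INT, value STRING);
    LOAD DATA LOCAL INPATH 'data/files/kv1.txt' INTO TABLE stats_src;
    ANALYZE TABLE stats_src COMPUTE STATISTICS;   -- drives the stats publisher/aggregator
    DESCRIBE EXTENDED stats_src;                  -- row counts show up in the table parameters
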
[junit] Running org.apache.hadoop.hive.cli.TestHBaseMinimrCliDriver
[junit] Starting DataNode 0 with dfs.data.dir: 
build/test/data/dfs/data/data1,build/test/data/dfs/data/data2
[junit] Starting DataNode 1 with dfs.data.dir: 
build/test/data/dfs/data/data3,build/test/data/dfs/data/data4
[junit] Starting DataNode 2 with dfs.data.dir: 
build/test/data/dfs/data/data5,build/test/data/dfs/data/data6
[junit] Starting DataNode 3 with dfs.data.dir: 
build/test/data/dfs/data/data7,build/test/data/dfs/data/data8
[junit] Generating rack names for tasktrackers
[junit] Generating host names for tasktrackers
[junit] Hive history 

Build failed in Hudson: Hive-trunk-h0.20 #495

2011-01-19 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/495/

--
[...truncated 21356 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #497

2011-01-19 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/497/changes

Changes:

[cws] HIVE-1915 Authorization on database level is broken (He Yongqiang via cws)
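
HIVE-1915 concerns Hive's SQL-style authorization at the database level. A
hedged HiveQL sketch of the statements in question follows; the principal name
hive_test_user is hypothetical:

    -- Database-level grant/revoke (illustrative only).
    GRANT SELECT ON DATABASE default TO USER hive_test_user;
    REVOKE SELECT ON DATABASE default FROM USER hive_test_user;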

--
[...truncated 7276 lines...]
compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1183ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/2ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 2471ms :: artifacts dl 2ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/2ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)

Build failed in Hudson: Hive-trunk-h0.20 #498

2011-01-19 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/498/changes

Changes:

[pauly] HIVE-1862 Revive partition filtering in the Hive MetaStore
(Mac Yang via pauly)
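
Partition filtering here means the MetaStore can evaluate a predicate on
partition keys itself, so a client fetches only the matching partitions instead
of listing them all. A hedged HiveQL illustration with a hypothetical
partitioned table (part_src and its ds column are made up for this sketch):

    CREATE TABLE part_src (key INT, value STRING) PARTITIONED BY (ds STRING);
    -- The ds = '2008-04-08' predicate can be evaluated by the MetaStore's
    -- partition-filtering path rather than after fetching every partition.
    SELECT key, value FROM part_src WHERE ds = '2008-04-08';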

--
[...truncated 4238 lines...]

jar:
 [echo] Jar: shims
  [jar] Building jar: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/shims/hive-shims-0.7.0-SNAPSHOT.jar

create-dirs:
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/classes
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/test
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/test/src
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#common;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1315ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: common
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 5 source files to 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/classes
[javac] Note: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
 uses or overrides a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.

jar:
 [echo] Jar: common
  [jar] Building jar: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/common/hive-common-0.7.0-SNAPSHOT.jar

create-dirs:
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/classes
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/src
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable 

Build failed in Hudson: Hive-trunk-h0.20 #501

2011-01-20 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/501/

--
[...truncated 7224 lines...]
create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 758ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/0ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1974ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/shims/build.xml:53:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

ivy-init-dirs:

ivy-download:
   

Build failed in Hudson: Hive-trunk-h0.20 #502

2011-01-21 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/502/

--
[...truncated 7224 lines...]
create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 710ms :: artifacts dl 0ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/1ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 2009ms :: artifacts dl 0ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/shims/build.xml:53:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

ivy-init-dirs:

ivy-download:
   

Build failed in Hudson: Hive-trunk-h0.20 #504

2011-01-23 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/504/

--
[...truncated 7277 lines...]
compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1187ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 2422ms :: artifacts dl 2ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/3ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/shims/build.xml:53:
 

Build failed in Hudson: Hive-trunk-h0.20 #505

2011-01-23 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/505/changes

Changes:

[namit] HIVE-1923 Allow any type of stats publisher and aggregator in addition to
HBase and JDBC (Ning Zhang via namit)
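
HIVE-1923 generalizes how the statistics publisher/aggregator is selected. A
hedged sketch of the configuration involved; the com.example classes are
hypothetical, and hive.stats.default.publisher/aggregator are assumed to be the
properties this change introduces:

    -- Built-in choices.
    SET hive.stats.dbclass=jdbc:derby;   -- JDBC-backed stats
    SET hive.stats.dbclass=hbase;        -- HBase-backed stats
    -- With HIVE-1923, any implementation can be plugged in.
    SET hive.stats.dbclass=custom;
    SET hive.stats.default.publisher=com.example.MyStatsPublisher;
    SET hive.stats.default.aggregator=com.example.MyStatsAggregator;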

--
[...truncated 7276 lines...]
compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1182ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/2ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@minerva
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 2568ms :: artifacts dl 2ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/3ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 

Build failed in Hudson: Hive-trunk-h0.20 #506

2011-01-24 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/506/

--
[...truncated 7224 lines...]
create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 752ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/1ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1982ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/shims/build.xml:53:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

ivy-init-dirs:

ivy-download:
   

Build failed in Hudson: Hive-trunk-h0.20 #507

2011-01-24 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/507/changes

Changes:

[namit] HIVE-1908 FileHandler leak on partial iteration of the resultset
(Chinna Rao Lalam via namit)

[namit] HIVE-1897 Alter command execution when HDFS is down results in holding
stale data in MetaStore (Chinna Rao Lalam via namit)
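
HIVE-1897 is about DDL whose filesystem side fails while the MetaStore side has
already been updated. A hedged HiveQL example of the kind of command affected;
the table name and location below are hypothetical:

    -- If the HDFS rename/location change cannot be performed, the MetaStore
    -- entry should not be left pointing at the new (non-existent) path.
    ALTER TABLE src_copy RENAME TO src_copy_renamed;
    ALTER TABLE src_copy_renamed SET LOCATION 'hdfs://namenode:8020/warehouse/src_copy_renamed';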

--
[...truncated 7223 lines...]
create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 863ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/1ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1981ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 

Build failed in Hudson: Hive-trunk-h0.20 #514

2011-01-28 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/514/

--
[...truncated 7232 lines...]
create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#contrib;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 736ms :: artifacts dl 0ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   1   |   0   |   0   |   0   ||   1   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/1ms)

install-hadoopcore-internal:

setup:

compile:
 [echo] Compiling: hbase-handler
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:283:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:
 [echo] Jar: hbase-handler

test:

test-shims:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

jar:

init:

compile:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-retrieve-hadoop-source:
:: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: 
org.apache.hadoop.hive#shims;working@vesta.apache.org
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.20.0 in hadoop-source
[ivy:retrieve]  found hadoop#core;0.20.3-CDH3-SNAPSHOT in hadoop-source
[ivy:retrieve] :: resolution report :: resolve 1962ms :: artifacts dl 1ms
-
|  |modules||   artifacts   |
|   conf   | number| search|dwnlded|evicted|| number|dwnlded|
-
|  default |   2   |   0   |   0   |   0   ||   2   |   0   |
-
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)

install-hadoopcore-internal:

build_shims:
 [echo] Compiling shims against hadoop 0.20.0 
(https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/hadoopcore/hadoop-0.20.0)
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/shims/build.xml:53:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

ivy-init-dirs:

ivy-download:
   

Build failed in Hudson: Hive-trunk-h0.20 #516

2011-01-30 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/516/

--
[...truncated 3813 lines...]
A ql/src/java/org/apache/hadoop/hive/ql/parse/QBMetaData.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ASTNodeOrigin.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/OpParseContext.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseContext.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticException.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/PrunedPartitionList.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseError.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/RowResolver.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/TableSample.java
A 
ql/src/java/org/apache/hadoop/hive/ql/parse/ExplainSemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseUtils.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
A 
ql/src/java/org/apache/hadoop/hive/ql/parse/HiveSemanticAnalyzerHookContextImpl.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckCtx.java
A ql/src/java/org/apache/hadoop/hive/ql/Driver.java
A ql/src/java/org/apache/hadoop/hive/ql/util
A ql/src/java/org/apache/hadoop/hive/ql/util/DosToUnix.java
A ql/src/java/org/apache/hadoop/hive/ql/udf
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFYear.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToInteger.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSign.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFWeekOfYear.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToLong.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFLog2.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAbs.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFConv.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToByte.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBin.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPI.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToDouble.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDate.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitOr.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMultiply.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDegrees.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFSubstr.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAtan.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCos.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFHex.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRound.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFLower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSqrt.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFUpper.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericOp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitNot.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPosMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFMinute.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateDiff.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitXor.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFConcat.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSecond.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericUnaryOp.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFType.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRadians.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToBoolean.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPLongDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExpExtract.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFromUnixTime.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateSub.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAsin.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFExp.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic
A 

Build failed in Hudson: Hive-trunk-h0.20 #519

2011-01-31 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/519/changes

Changes:

[jvs] HIVE-1927. Fix TestHadoop20SAuthBridge failure on Hudson
(Devaraj Das via jvs)

--
[...truncated 3812 lines...]
AUql/src/java/org/apache/hadoop/hive/ql/parse/JoinCond.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/QBMetaData.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ASTNodeOrigin.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/OpParseContext.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseContext.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticException.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/PrunedPartitionList.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseError.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/RowResolver.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/TableSample.java
A 
ql/src/java/org/apache/hadoop/hive/ql/parse/ExplainSemanticAnalyzer.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/ParseUtils.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
A 
ql/src/java/org/apache/hadoop/hive/ql/parse/HiveSemanticAnalyzerHookContextImpl.java
A ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckCtx.java
A ql/src/java/org/apache/hadoop/hive/ql/Driver.java
A ql/src/java/org/apache/hadoop/hive/ql/util
A ql/src/java/org/apache/hadoop/hive/ql/util/DosToUnix.java
A ql/src/java/org/apache/hadoop/hive/ql/udf
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFYear.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToInteger.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSign.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFWeekOfYear.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToLong.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFLog2.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAbs.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFConv.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToByte.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBin.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPI.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToDouble.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDate.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitOr.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMultiply.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDegrees.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFSubstr.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAtan.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCos.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFHex.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRound.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFLower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSqrt.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFUpper.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericOp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitNot.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPosMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFMinute.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateDiff.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitXor.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFConcat.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSecond.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericUnaryOp.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFType.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRadians.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToBoolean.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPLongDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExpExtract.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFromUnixTime.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateSub.java
A 

Build failed in Hudson: Hive-trunk-h0.20 #520

2011-01-31 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/520/changes

Changes:

[cws] HIVE-1931 Improve the implementation of the METASTORE_CACHE_PINOBJTYPES 
config (Mac Yang via cws)
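
For reference, METASTORE_CACHE_PINOBJTYPES corresponds to the
hive.metastore.cache.pinobjtypes property, which lists the JDO object types the
MetaStore pins in its cache. It is normally set in hive-site.xml; the value
below is illustrative, shown as a HiveQL SET line for brevity:

    SET hive.metastore.cache.pinobjtypes=Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order;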

--
[...truncated 3835 lines...]
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSign.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFWeekOfYear.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToLong.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFLog2.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAbs.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFConv.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToByte.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBin.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPI.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToDouble.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDate.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitOr.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMultiply.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDegrees.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFSubstr.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAtan.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCos.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFHex.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRound.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFLower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSqrt.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFUpper.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericOp.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitNot.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPosMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFMinute.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateDiff.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPBitXor.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFConcat.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSecond.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericUnaryOp.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMod.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFType.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRadians.java
AUql/src/java/org/apache/hadoop/hive/ql/udf/UDFToBoolean.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPLongDivide.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFRegExpExtract.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFromUnixTime.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDateSub.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFAsin.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/UDFExp.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFSum.java
AU ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFIndex.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFStringToMap.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDTFJSONTuple.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPNull.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/UDTFCollector.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFArrayContains.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPOr.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFStruct.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFMax.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPNotEqual.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFVarianceSample.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFStd.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFBridge.java
A 
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFBridge.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFUtils.java
A ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NGramEstimator.java

Build failed in Hudson: Hive-trunk-h0.20 #521

2011-01-31 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/521/

--
[...truncated 7544 lines...]

jar:

init:

dynamic-serde:

ivy-init-dirs:

ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/ivy/lib/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve:
[ivy:resolve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:resolve] :: loading settings :: file = 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ivy/ivysettings.xml

ivy-retrieve:

compile:
 [echo] Compiling: hive
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/serde/build.xml:52:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

compile-test:
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:317:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 20 source files to 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/serde/src/test/org/apache/hadoop/hive/serde2/dynamic_type/TestDynamicSerDe.java
 uses unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build-common.xml:330:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

test-jar:
  [jar] Building MANIFEST-only jar: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/test-udfs.jar

test-init:
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/data
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/logs/clientpositive
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/logs/clientnegative
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/logs/positive
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/logs/negative
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/data/warehouse
[mkdir] Created dir: 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/serde/test/data/metadb

test:
[junit] Running org.apache.hadoop.hive.serde2.TestTCTLSeparatedProtocol
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.187 sec
[junit] Running 
org.apache.hadoop.hive.serde2.binarysortable.TestBinarySortableSerDe
[junit] Beginning Test testBinarySortableSerDe:
[junit] Test testTBinarySortableProtocol passed!
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.842 sec
[junit] Running org.apache.hadoop.hive.serde2.dynamic_type.TestDynamicSerDe
[junit] input struct = [234, [firstString, secondString], {firstKey=1, 
secondKey=2}, -234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class java.lang.Integer
[junit] o[1] class = class java.util.ArrayList
[junit] o[2] class = class java.util.HashMap
[junit] o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, 
-234, 1.0, -2.5]
[junit] Testing protocol: 
org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
[junit] TypeName = 
struct<_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double>
[junit] bytes 
=xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
[junit] o class = class java.util.ArrayList
[junit] o size = 6
[junit] o[0] class = class 

Build failed in Hudson: Hive-trunk-h0.20 #524

2011-02-01 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/524/

--
[...truncated 22339 lines...]
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes2.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes2.q.out
[junit] Done query: serde_typedbytes2.q
[junit] Begin query: serde_typedbytes3.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes3.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes3.q.out
[junit] Done query: serde_typedbytes3.q
[junit] Begin query: serde_typedbytes4.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes4.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes4.q.out
[junit] Done query: serde_typedbytes4.q
[junit] Begin query: serde_typedbytes5.q
[junit] dummy
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes5.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes5.q.out
[junit] Done query: serde_typedbytes5.q
[junit] Begin query: serde_typedbytes6.q
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes6.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes6.q.out
[junit] Done query: serde_typedbytes6.q
[junit] Begin query: serde_typedbytes_null.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/table1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes_null.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes_null.q.out
[junit] Done query: serde_typedbytes_null.q
[junit] Begin 

Build failed in Hudson: Hive-trunk-h0.20 #525

2011-02-01 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/525/changes

Changes:

[namit] HIVE-1934 Alter table rename messes the location
(Paul Yang via namit)

M metastore/src/java/org/apache/hadoop/hive/metastore/HiveAlterHandler.java
M CHANGES.txt
M ql/src/test/results/clientpositive/alter3.q.out
M ql/src/test/queries/clientpositive/alter3.q
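
(For context: alter3.q exercises ALTER TABLE ... RENAME TO. A rough sketch of
that kind of sequence, using hypothetical table names rather than the ones in
the actual test, looks like:

    hive -e "
      CREATE TABLE alter_src (key INT, value STRING);
      ALTER TABLE alter_src RENAME TO alter_dst;
      DESCRIBE EXTENDED alter_dst;  -- Location should now point at the renamed table
    "
)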

--
[...truncated 21257 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #526

2011-02-01 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/526/changes

Changes:

[jvs] Add reviewboard property.

--
[...truncated 22216 lines...]
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes2.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes2.q.out
[junit] Done query: serde_typedbytes2.q
[junit] Begin query: serde_typedbytes3.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes3.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes3.q.out
[junit] Done query: serde_typedbytes3.q
[junit] Begin query: serde_typedbytes4.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes4.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes4.q.out
[junit] Done query: serde_typedbytes4.q
[junit] Begin query: serde_typedbytes5.q
[junit] dummy
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes5.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes5.q.out
[junit] Done query: serde_typedbytes5.q
[junit] Begin query: serde_typedbytes6.q
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes6.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes6.q.out
[junit] Done query: serde_typedbytes6.q
[junit] Begin query: serde_typedbytes_null.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/table1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes_null.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes_null.q.out
[junit] Done 

Build failed in Hudson: Hive-trunk-h0.20 #528

2011-02-02 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/528/changes

Changes:

[namit] HIVE-1944 Dynamic partition insert creating different directories for 
the
same partition during merge (Ning Zhang via namit)
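
(For context, a dynamic partition insert of the kind this change concerns can
be sketched as follows; the table and column names are hypothetical, only the
SET properties and the PARTITION clause are standard Hive:

    hive -e "
      SET hive.exec.dynamic.partition=true;
      SET hive.exec.dynamic.partition.mode=nonstrict;
      INSERT OVERWRITE TABLE dest PARTITION (ds)
      SELECT key, value, ds FROM source_table;
    "

Each distinct value of ds selected should land in a single directory under the
dest table, which is the invariant the fix above restores when merge jobs run.)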

--
[...truncated 22323 lines...]
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes2.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes2.q.out
[junit] Done query: serde_typedbytes2.q
[junit] Begin query: serde_typedbytes3.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes3.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes3.q.out
[junit] Done query: serde_typedbytes3.q
[junit] Begin query: serde_typedbytes4.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes4.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes4.q.out
[junit] Done query: serde_typedbytes4.q
[junit] Begin query: serde_typedbytes5.q
[junit] dummy
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes5.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes5.q.out
[junit] Done query: serde_typedbytes5.q
[junit] Begin query: serde_typedbytes6.q
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes6.q.out
 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/contrib/src/test/results/clientpositive/serde_typedbytes6.q.out
[junit] Done query: serde_typedbytes6.q
[junit] Begin query: serde_typedbytes_null.q
[junit] Deleted 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/data/warehouse/table1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I 
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I 
Location -I transient_lastDdlTime -I last_modified_ -I 
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused 
by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/contrib/test/logs/contribclientpositive/serde_typedbytes_null.q.out
 

Build failed in Hudson: Hive-trunk-h0.20 #529

2011-02-02 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/529/changes

Changes:

[heyongqiang] HIVE-1942. change the value of hive.input.format to 
CombineHiveInputFormat for tests (namit via He Yongqiang)
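
(The setting referenced above can also be tried per session; a minimal sketch,
where the query is hypothetical and the fully qualified class name is assumed
to be the usual org.apache.hadoop.hive.ql.io one:

    hive -e "
      SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;
      SELECT count(*) FROM src;
    ")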

--
[...truncated 22669 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #530

2011-02-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/530/

--
[...truncated 22570 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #531

2011-02-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/531/changes

Changes:

[namit] HIVE-1716 Make TestHBaseCliDriver use dynamic ports to avoid conflicts 
with
already-running services (John Sichi via namit)

--
[...truncated 22563 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: 

Build failed in Hudson: Hive-trunk-h0.20 #532

2011-02-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/532/changes

Changes:

[namit] HIVE-1951 input16_cc.q is failing in testminimrclidriver
(He Yongqiang via namit)

--
[...truncated 22598 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK

Build failed in Hudson: Hive-trunk-h0.20 #533

2011-02-03 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/533/changes

Changes:

[heyongqiang] HIVE-1952. fix some outputs and make some tests deterministic 
(namit via He Yongqiang)
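
(One common way to make a .q test's output deterministic, not necessarily what
this particular patch did, is to force an ordering on the query result, e.g.:

    hive -e "SELECT key, value FROM src ORDER BY key, value;"

so that the diff against the checked-in .q.out file no longer depends on the
order in which map outputs happen to arrive.)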

--
[...truncated 21915 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1

Build failed in Hudson: Hive-trunk-h0.20 #534

2011-02-04 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/534/changes

Changes:

[namit] HIVE-1956 Provide DFS initialization script for Hive
(Bruno Mahe via namit)
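
(A DFS initialization script for Hive usually only needs to create and open up
the scratch and warehouse directories. A minimal sketch using the conventional
default paths, which are an assumption here rather than taken from the patch:

    hadoop fs -mkdir /tmp
    hadoop fs -mkdir /user/hive/warehouse
    hadoop fs -chmod g+w /tmp
    hadoop fs -chmod g+w /user/hive/warehouse
)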

--
[...truncated 22372 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] 

Build failed in Hudson: Hive-trunk-h0.20 #535

2011-02-04 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/535/

--
[...truncated 22397 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #536

2011-02-05 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/536/

--
[...truncated 22456 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #538

2011-02-06 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/538/changes

Changes:

[namit] HIVE-1961 Make Stats gathering more flexible with timeout and atomicity
(Ning Zhang via namit)

--
[...truncated 19174 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1

Build failed in Hudson: Hive-trunk-h0.20 #539

2011-02-07 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/539/

--
[...truncated 21902 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #540

2011-02-07 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/540/changes

Changes:

[heyongqiang] HIVE-1900 a mapper should be able to span multiple partitions 
(namit via He Yongqiang)

[namit] HIVE-1964 Add fully deterministic ORDER BY in test union22.q and 
input40.q
(John Sichi via namit)

--
[...truncated 20010 lines...]
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt
[junit] Loading data to table srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket1.txt'
 INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) 
CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket20.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket21.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket22.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt
[junit] Loading data to table srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/srcbucket23.txt'
 INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt'
 INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv3.txt
[junit] Loading data to table src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH 

Build failed in Hudson: Hive-trunk-h0.20 #542

2011-02-08 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/542/changes

Changes:

[pauly] HIVE-1818 Call frequency and duration metrics for HiveMetaStore via jmx
(Sushanth Sowmyan via pauly)

[jvs] HIVE-1970. Modify build to run all tests regardless of subproject 
failures.
(Carl Steinbach via jvs)

--
[...truncated 22603 lines...]
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (num int)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (num int)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102081943_1314495100.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102081943_474408860.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
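
Note that the DROP TABLE above produces no PREHOOK Input/Output lines: at this
point the table does not yet exist in the test's metastore state, so the
statement resolves to no read or write entities, whereas the earlier drop
(issued after the table had been created) lists default@testhivedrivertable
as both input and output. A small sketch of the usual way to make such fixture
resets explicit, assuming the IF EXISTS form available in Hive of this
vintage:

    -- tolerate a missing table when resetting the fixture
    DROP TABLE IF EXISTS testhivedrivertable;
    CREATE TABLE testhivedrivertable (key INT, value STRING);
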
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-08_19-43-39_257_7044498286683889035/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-08 19:43:41,823 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-08_19-43-39_257_7044498286683889035/-mr-1
[junit] OK
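
The "Job running in-process (local Hadoop)" and "Ended Job = job_local_0001"
lines show the MapReduce job executing inside the client JVM rather than on a
cluster, which is how these JUnit runs are configured. A hedged sketch of the
settings that typically select this mode in a Hive session of this era (exact
defaults vary by build):

    -- run every job in the local JobTracker inside the client JVM
    SET mapred.job.tracker=local;

    -- or let Hive pick local mode per query based on estimated input size
    SET hive.exec.mode.local.auto=true;
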
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-08_19-43-41_974_4178845283739002672/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-08 19:43:44,534 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-08_19-43-41_974_4178845283739002672/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-08_19-43-44_715_3499711993732994623/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-08 19:43:47,282 null map = 100%,  reduce = 0%
[junit] Ended Job = 

Build failed in Hudson: Hive-trunk-h0.20 #544

2011-02-09 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/544/changes

Changes:

[nzhang] HIVE-1971. Verbose/echo mode for the Hive CLI (Jonathan Natkins via 
Ning Zhang)

--
[...truncated 24135 lines...]
[junit] 2011-02-09 07:10:23,095 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_07-10-20_442_6861395604252812301/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102090710_1196282200.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_07-10-23_896_6998938275565321940/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 07:10:26,605 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_07-10-23_896_6998938275565321940/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_07-10-26_846_6491588939046426848/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 07:10:29,566 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_07-10-26_846_6491588939046426848/-mr-1
[junit] OK
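
The hints printed before the count(1) job list the three knobs that decide
reducer parallelism: Hive estimates the count from the input size and
hive.exec.reducers.bytes.per.reducer, caps it at hive.exec.reducers.max, and
an explicit mapred.reduce.tasks overrides the estimate entirely. A hedged
sketch of how they might be set in a session; the values are illustrative
only:

    -- target roughly 256 MB of input per reducer when Hive estimates the count
    SET hive.exec.reducers.bytes.per.reducer=268435456;

    -- never launch more than 32 reducers, whatever the estimate says
    SET hive.exec.reducers.max=32;

    -- or bypass the estimate with a fixed reducer count
    SET mapred.reduce.tasks=8;
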
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.847 sec
[junit] Testcase: testNonHiveCommand took 0.951 sec
[junit] Testcase: testMetastore took 0.299 sec
[junit] Testcase: testGetClusterStatus took 0.119 sec
[junit] Testcase: testFetch took 9.549 sec
[junit] Testcase: testDynamicSerde took 6.5 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for 

Build failed in Hudson: Hive-trunk-h0.20 #545

2011-02-09 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/545/

--
[...truncated 25392 lines...]
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (num int)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (num int)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102091108_1713412541.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102091108_426386900.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_11-08-31_327_4767368152041876847/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 11:08:33,881 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_11-08-31_327_4767368152041876847/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_11-08-34_049_8416191857889998057/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 11:08:36,591 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_11-08-34_049_8416191857889998057/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_11-08-36_758_8861724302206474949/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 11:08:39,297 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #547

2011-02-09 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/547/changes

Changes:

[namit] HIVE-1948 Add audit logging in the metastore
(Devaraj Das via namit)

--
[...truncated 25316 lines...]
[junit] 2011-02-09 21:10:55,934 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_21-10-53_352_5080872800286226991/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102092110_1326956834.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_21-10-56_816_9051942042303308962/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 21:10:59,514 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_21-10-56_816_9051942042303308962/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_21-10-59_709_2751416800073562522/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-09 21:11:02,451 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-09_21-10-59_709_2751416800073562522/-mr-1
[junit] OK
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.7 sec
[junit] Testcase: testNonHiveCommand took 1.027 sec
[junit] Testcase: testMetastore took 0.31 sec
[junit] Testcase: testGetClusterStatus took 0.099 sec
[junit] Testcase: testFetch took 9.362 sec
[junit] Testcase: testDynamicSerde took 6.528 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds


Build failed in Hudson: Hive-trunk-h0.20 #550

2011-02-10 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/550/changes

Changes:

[nzhang] HIVE-1978 Hive SymlinkTextInputFormat does not estimate input size 
correctly (He Yongqiang via Ning Zhang)

--
[...truncated 25317 lines...]
[junit] 2011-02-10 20:38:46,547 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-10_20-38-43_965_6626046934112576239/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102102038_2010009206.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-10_20-38-47_467_4739685777001142634/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-10 20:38:50,198 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-10_20-38-47_467_4739685777001142634/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-10_20-38-50_401_8789120560465472230/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-10 20:38:53,165 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-10_20-38-50_401_8789120560465472230/-mr-1
[junit] OK
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.359 sec
[junit] Testcase: testNonHiveCommand took 0.912 sec
[junit] Testcase: testMetastore took 0.275 sec
[junit] Testcase: testGetClusterStatus took 0.124 sec
[junit] Testcase: testFetch took 9.831 sec
[junit] Testcase: testDynamicSerde took 7.238 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 

Build failed in Hudson: Hive-trunk-h0.20 #551

2011-02-11 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/551/

--
[...truncated 25849 lines...]
[junit] 2011-02-11 11:49:22,076 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-11_11-49-19_502_7629701857959757269/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_20110249_1587749582.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-11_11-49-22_795_6893776407798048013/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-11 11:49:25,466 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-11_11-49-22_795_6893776407798048013/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-11_11-49-25_621_3732103169707422420/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-11 11:49:28,328 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-11_11-49-25_621_3732103169707422420/-mr-1
[junit] OK
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.152 sec
[junit] Testcase: testNonHiveCommand took 0.803 sec
[junit] Testcase: testMetastore took 0.256 sec
[junit] Testcase: testGetClusterStatus took 0.089 sec
[junit] Testcase: testFetch took 9.564 sec
[junit] Testcase: testDynamicSerde took 6.255 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

deploy-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 

Build failed in Hudson: Hive-trunk-h0.20 #552

2011-02-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/552/

--
[...truncated 25352 lines...]
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (num int)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (num int)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102121107_47545355.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102121107_879646061.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_11-07-31_404_5058133033610434163/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-12 11:07:33,994 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_11-07-31_404_5058133033610434163/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_11-07-34_174_8582527589076619709/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-12 11:07:36,689 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_11-07-34_174_8582527589076619709/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_11-07-36_829_5726493308979473263/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-12 11:07:39,339 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #553

2011-02-12 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/553/changes

Changes:

[jvs] HIVE-1465. hive-site.xml ${user.name} not replaced for local-file derby
metastore connection URL
(Carl Steinbach via jvs)

--
[...truncated 25848 lines...]
[junit] 2011-02-12 14:57:59,804 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_14-57-57_234_5359577069398398626/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102121458_1068583066.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key < 10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_14-58-00_640_4959342169698642431/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-12 14:58:03,298 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key  10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_14-58-00_640_4959342169698642431/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_14-58-03_471_5710578895006890672/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-12 14:58:06,173 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-12_14-58-03_471_5710578895006890672/-mr-1
[junit] OK
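As an aside, the three settings echoed in the log above are the standard knobs for reducer parallelism. A minimal sketch of overriding them in a Hive session before re-running the aggregate (the values are illustrative only, not taken from this build):

    set hive.exec.reducers.bytes.per.reducer=256000000;
    set hive.exec.reducers.max=8;
    set mapred.reduce.tasks=2;
    select count(1) as c from testhivedrivertable;

Setting mapred.reduce.tasks pins an exact reducer count; otherwise Hive estimates one from the input size via bytes.per.reducer, capped by reducers.max.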
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.414 sec
[junit] Testcase: testNonHiveCommand took 0.928 sec
[junit] Testcase: testMetastore took 0.31 sec
[junit] Testcase: testGetClusterStatus took 0.094 sec
[junit] Testcase: testFetch took 9.685 sec
[junit] Testcase: testDynamicSerde took 6.92 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds

Build failed in Hudson: Hive-trunk-h0.20 #555

2011-02-13 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/555/

--
[...truncated 25293 lines...]
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (num int)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (num int)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102131107_1168142543.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102131107_1070614485.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-13_11-07-25_137_8017462829348998262/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-13 11:07:27,689 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-13_11-07-25_137_8017462829348998262/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-13_11-07-27_850_3285606576066926987/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-13 11:07:30,363 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-13_11-07-27_850_3285606576066926987/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-13_11-07-30_505_3522795126917262640/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-13 11:07:33,037 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-trunk-h0.20 #557

2011-02-14 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/557/changes

Changes:

[cws] Preparing for 0.8.0 development

[jvs] HIVE-1882. Remove CHANGES.txt
(Carl Steinbach via jvs)

--
[...truncated 25879 lines...]
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (num int)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (num int)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102141649_681273817.txt
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102141649_1413713185.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-trunk-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-14_16-49-52_633_6099625099767784404/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-14 16:49:55,165 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-14_16-49-52_633_6099625099767784404/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-14_16-49-55_297_7872579519500927534/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-14 16:49:57,841 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-14_16-49-55_297_7872579519500927534/-mr-1
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-14_16-49-58_006_139498592294053/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-14 16:50:00,530 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 

Build failed in Hudson: Hive-0.7.0-h0.20 #2

2011-02-15 Thread Apache Hudson Server
See https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/2/

--
[...truncated 25841 lines...]
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-15 11:46:57,586 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-15_11-46-55_052_142901053124351264/-mr-1
[junit] OK
[junit] Hive history 
file=https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/ws/hive/build/service/tmp/hive_job_log_hudson_201102151146_1492191894.txt
[junit] PREHOOK: query: drop table testhivedrivertable
[junit] PREHOOK: type: DROPTABLE
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: default@testhivedrivertable
[junit] POSTHOOK: query: drop table testhivedrivertable
[junit] POSTHOOK: type: DROPTABLE
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: create table testhivedrivertable (key int, value 
string)
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] PREHOOK: type: LOAD
[junit] Copying data from 
https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/ws/hive/data/files/kv1.txt
[junit] Loading data to table testhivedrivertable
[junit] POSTHOOK: query: load data local inpath 
'https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/ws/hive/data/files/kv1.txt'
 into table testhivedrivertable
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@testhivedrivertable
[junit] OK
[junit] PREHOOK: query: select key, value from testhivedrivertable where 
key  10
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-15_11-46-58_452_3858282071504824997/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks is set to 0 since there's no reduce operator
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-15 11:47:01,162 null map = 100%,  reduce = 0%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select key, value from testhivedrivertable where 
key  10
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-15_11-46-58_452_3858282071504824997/-mr-1
[junit] OK
[junit] PREHOOK: query: select count(1) as c from testhivedrivertable
[junit] PREHOOK: type: QUERY
[junit] PREHOOK: Input: default@testhivedrivertable
[junit] PREHOOK: Output: 
file:/tmp/hudson/hive_2011-02-15_11-47-01_341_8593372343534106635/-mr-1
[junit] Total MapReduce jobs = 1
[junit] Launching Job 1 out of 1
[junit] Number of reduce tasks determined at compile time: 1
[junit] In order to change the average load for a reducer (in bytes):
[junit]   set hive.exec.reducers.bytes.per.reducer=number
[junit] In order to limit the maximum number of reducers:
[junit]   set hive.exec.reducers.max=number
[junit] In order to set a constant number of reducers:
[junit]   set mapred.reduce.tasks=number
[junit] Job running in-process (local Hadoop)
[junit] 2011-02-15 11:47:04,033 null map = 100%,  reduce = 100%
[junit] Ended Job = job_local_0001
[junit] POSTHOOK: query: select count(1) as c from testhivedrivertable
[junit] POSTHOOK: type: QUERY
[junit] POSTHOOK: Input: default@testhivedrivertable
[junit] POSTHOOK: Output: 
file:/tmp/hudson/hive_2011-02-15_11-47-01_341_8593372343534106635/-mr-1
[junit] OK
[junit] -  ---
[junit] 
[junit] Testcase: testExecute took 10.319 sec
[junit] Testcase: testNonHiveCommand took 0.977 sec
[junit] Testcase: testMetastore took 0.297 sec
[junit] Testcase: testGetClusterStatus took 0.105 sec
[junit] Testcase: testFetch took 9.564 sec
[junit] Testcase: testDynamicSerde took 6.731 sec

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
 [echo] Compiling: anttasks
[javac] 
https://hudson.apache.org/hudson/job/Hive-0.7.0-h0.20/ws/hive/ant/build.xml:40:
 warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds
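The 'includeantruntime' warning is emitted by Ant 1.8+ when a javac task does not say whether Ant's own runtime jars belong on the compile classpath. A minimal sketch of silencing it (attribute placement and property names are assumed, not taken from the actual build.xml):

    <javac srcdir="${src.dir}" destdir="${build.classes}"
           includeantruntime="false">
      <classpath refid="classpath"/>
    </javac>

With includeantruntime="false" the compile classpath no longer depends on whichever Ant installation runs the build, which is what the "set to false for repeatable builds" hint refers to.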

deploy-ant-tasks:

create-dirs:

init:

