I have never used 'int' partition keys.

You might want to try writing the transform output to an HDFS directory using 
'INSERT OVERWRITE DIRECTORY' and then use 'LOAD DATA INPATH ... INTO TABLE' to 
load that file into the correct partition.
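As a rough sketch of that two-step workaround (the staging path and partition
key below are made up for illustration; they are not from this thread):

```sql
-- Hypothetical sketch: stage the transform output in HDFS,
-- then load the staged file into the target partition.
INSERT OVERWRITE DIRECTORY '/tmp/percentiles_staging'
SELECT TRANSFORM(actor_id) USING '/my/script.rb'
       AS (actor_id, percentile, count)
FROM (SELECT actor_id FROM activities CLUSTER BY actor_id) actors;

LOAD DATA INPATH '/tmp/percentiles_staging'
INTO TABLE percentiles PARTITION (pkey=1);
```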


________________________________
From: Josh Ferguson <j...@besquared.net>
Reply-To: <hive-user@hadoop.apache.org>
Date: Mon, 12 Jan 2009 21:48:17 -0800
To: <hive-user@hadoop.apache.org>
Subject: Re: INSERT OVERWRITE not working with map/reduce transform

https://gist.github.com/3cb4be29625442c90140

Josh

On Jan 12, 2009, at 9:39 PM, Zheng Shao wrote:

Should be tab separated.
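For reference, a minimal transform script along those lines (hypothetical;
`format_row` and the sample rows are illustrative, not from '/my/script.rb')
would emit one tab-separated output row per input row, with the columns in
the same order as the AS (...) clause:

```ruby
# Hypothetical sketch of a Hive TRANSFORM script's output format:
# Hive pipes tab-separated input rows to the script's stdin and
# expects tab-separated rows back on stdout.
def format_row(actor_id, percentile, count)
  [actor_id, percentile, count].join("\t")
end

# Sample rows standing in for stdin; a real script would iterate
# STDIN.each_line and compute percentile/count per actor.
%w[a1 a2].each do |actor_id|
  puts format_row(actor_id, 0.5, 1)
end
```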

if you run it without insert, is there any data on the screen?

Zheng

On Mon, Jan 12, 2009 at 9:32 PM, Josh Ferguson <j...@besquared.net> wrote:

The only thing I can figure is that my output is incorrect. Is the output 
from the transform script supposed to be tab separated, or separated by the 
delimiters of the table you're trying to insert into? It doesn't seem to make 
any difference (my table is still empty no matter which one I try), but I'd 
better make sure just in case.

Josh

On Jan 12, 2009, at 9:11 PM, Zheng Shao wrote:

Here are some examples:

[zs...@xxx /hive.root] find ./ql/src/test/queries/clientpositive -name '*.q' | xargs grep TRANSFORM
./ql/src/test/queries/clientpositive/input14_limit.q:  SELECT TRANSFORM(src.key, src.value)
./ql/src/test/queries/clientpositive/input14_limit.q:  SELECT TRANSFORM(src.key, src.value)
./ql/src/test/queries/clientpositive/input14.q:  SELECT TRANSFORM(src.key, src.value)
./ql/src/test/queries/clientpositive/input14.q:  SELECT TRANSFORM(src.key, src.value)
./ql/src/test/queries/clientpositive/input18.q:  SELECT TRANSFORM(src.key, src.value, 1+2, 3+4)
./ql/src/test/queries/clientpositive/input18.q:  SELECT TRANSFORM(src.key, src.value, 1+2, 3+4)
./ql/src/test/queries/clientpositive/scriptfile1.q:  SELECT TRANSFORM(src.key, src.value)
./ql/src/test/queries/clientpositive/input5.q:  SELECT TRANSFORM(src_thrift.lint, src_thrift.lintstring)
./ql/src/test/queries/clientpositive/input5.q:  SELECT TRANSFORM(src_thrift.lint, src_thrift.lintstring)
./ql/src/test/queries/clientpositive/input17.q:  SELECT TRANSFORM(src_thrift.aint + src_thrift.lint[0], src_thrift.lintstring[0])
./ql/src/test/queries/clientpositive/input17.q:  SELECT TRANSFORM(src_thrift.aint + src_thrift.lint[0], src_thrift.lintstring[0])


On Mon, Jan 12, 2009 at 8:58 PM, Josh Ferguson <j...@besquared.net> wrote:

Anyone have any word on why this might not work? Can someone give me an example 
of a query they use to INSERT OVERWRITE a table from a map and/or reduce job 
that I could use as a reference?

 Josh F.



 On Jan 11, 2009, at 9:48 PM, Josh Ferguson wrote:


I have a query that returns the proper results:

SELECT TRANSFORM(actor_id) USING '/my/script.rb' AS (actor_id, percentile, count)
FROM (SELECT actor_id FROM activities CLUSTER BY actor_id) actors;

 But when I do

INSERT OVERWRITE TABLE percentiles
SELECT TRANSFORM(actor_id) USING '/my/script.rb' AS (actor_id, percentile, count)
FROM (SELECT actor_id FROM activities CLUSTER BY actor_id) actors;

 It says it loads data into the percentiles table but when I ask for data from 
that table I get:

 hive> SELECT actor_id, percentile, count FROM percentiles;
FAILED: Error in semantic analysis: org.apache.hadoop.hive.ql.metadata.HiveException: Path /user/hive/warehouse/percentiles not a valid path

 $ hadoop fs -ls /user/hive/warehouse/percentiles/
 Found 1 items
-rw-r--r--   1 Josh supergroup          0 2009-01-11 21:45 /user/hive/warehouse/percentiles/attempt_200901112100_0010_r_000000_0

 It's nothing but an empty file.

 Am I doing something wrong?

 Josh Ferguson






--
Yours,
Zheng






