You could also specify a fully qualified HDFS path in the CREATE TABLE command.
It could look like:

create external table test (key string)
row format delimited
fields terminated by '\000'
collection items terminated by ' '
location 'hdfs://new_master_host:port/table_path';

Then you can use the 'insert overwrite' command to move all the data into the new table.
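For a non-partitioned table, a minimal sketch could look like this (assuming 'old_table' is the existing table and 'test' is the new external table created above; both names are just placeholders):

-- copy all rows from the old table into the new external table's location
insert overwrite table test
select key from old_table;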
You can use dynamic partitioning to move partitioned tables.
Please use the following options:

SET hive.exec.dynamic.partition=true;

SET hive.exec.dynamic.partition.mode=nonstrict;

That way you will have the data copied and all the partitions created correctly 
for you.
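Put together, a partitioned copy could look roughly like the following (table, column, and partition names are only illustrative):

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- the partition column (dt) must come last in the select list;
-- Hive then creates each partition under the new table's location
insert overwrite table new_table partition (dt)
select key, value, dt from old_table;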

Thanks
Vaibhav

From: Bhupesh Bansal [mailto:bhup...@groupon.com]
Sent: Friday, August 19, 2011 2:27 PM
To: user@hive.apache.org
Subject: Alter table Set Locations for all partitions

Hey Folks,

I am wondering what the easiest way is to migrate data from one Hadoop/Hive
cluster to another.

I distcp'd all the data to the new cluster and then copied the metadata directory
over as well.
Hive comes up fine and 'show tables' etc. works, but the table locations are still
pointing to the old cluster.

There is one command:
alter table table_name set location 'new_location';

but it doesn't work for partitioned tables. Is there a way we can do it for
*ALL* partitions easily?
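(There is a per-partition form, roughly like the sketch below with placeholder table, partition, and path values, but it would have to be repeated for every single partition.)

alter table table_name partition (dt='2011-08-19')
set location 'hdfs://new_master_host:port/table_path/dt=2011-08-19';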

Best
Bhupesh
